I'm trying to benchmark standard Java deserialization, and I have a question about whether my approach is correct. I wrote the following class:
//{"first", 1234.1234, 21341234, 234123412341234124L, "fifth"}
public class ArrayInputStreamStub extends InputStream{
public int[] arr = new int[260];
private int reader = 0;
public ArrayInputStreamStub(){
reader[0] = -84;
//...
}
@Override
public int read() throws IOException {
return arr[reader++];
}
public void reset() {
reader = 0;
}
}
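As an aside, the hard-coded bytes don't have to be typed in by hand. A sketch of one way to obtain them (the class and method names here are made up for illustration): serialize the array once with ObjectOutputStream and copy the resulting byte[] into the stub's int[].

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;

public class CaptureBytes {
    // Serialize the sample array once and return the raw stream bytes.
    static byte[] capture() throws IOException {
        Object[] data = {"first", 1234.1234, 21341234, 234123412341234124L, "fifth"};
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(data);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] bytes = capture();
        // The first two bytes are the stream magic 0xACED,
        // so bytes[0] is -84 as a signed byte.
        System.out.println(bytes[0] + ", total " + bytes.length + " bytes");
    }
}
```

The length of the returned array also tells you how large the stub's buffer actually needs to be.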
I have a similar stub for a HashMap:
//{("first", 1), ("second", 2), ("third", 3), ("fourth", 4), ("fifth", 5)}
public class HashMapInputStreamStub extends InputStream{
//...
}
My benchmarks look like this:
public HashMapInputStreamStub is;
public ArrayInputStreamStub ais;

@Setup
public void setup() {
    is = new HashMapInputStreamStub();
    ais = new ArrayInputStreamStub();
}
@Benchmark
@OutputTimeUnit(TimeUnit.MICROSECONDS)
@BenchmarkMode(Mode.AverageTime)
public void measureDeserializeHashMap(Blackhole bh) {
    try {
        ObjectInputStream ois = new ObjectInputStream(is);
        Map<String, Integer> schema = (Map<String, Integer>) ois.readObject();
        is.reset();
        bh.consume(schema);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
@Benchmark
@OutputTimeUnit(TimeUnit.MICROSECONDS)
@BenchmarkMode(Mode.AverageTime)
public void measureDeserializeArray(Blackhole bh) {
    try {
        ObjectInputStream ois = new ObjectInputStream(ais);
        Object[] array = (Object[]) ois.readObject();
        ais.reset();
        bh.consume(array);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
The results I got are the following:
Benchmark Mode Cnt Score Error Units
MyBenchmark.measureDeserializeArray avgt 5 15.940 ± 0.044 us/op
MyBenchmark.measureDeserializeHashMap avgt 5 12.118 ± 0.057 us/op
That's just over ten microseconds per deserialization of these tiny objects. I also ran with -prof gc, and the results were pretty much the same. So my question is: is this measurement correct? Does deserializing such simple objects really take ~10 microseconds?
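For what it's worth, one thing each invocation pays for besides the deserialization itself is constructing a fresh ObjectInputStream, whose constructor reads and validates the stream header. A crude cross-check outside JMH can be sketched like this (DeserLoop and its helpers are made-up names, a ByteArrayInputStream stands in for the stubs, and a hand-rolled nanoTime loop is far less rigorous than JMH, so treat the numbers as a sanity check only):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class DeserLoop {
    // Serialize any object graph to raw stream bytes.
    static byte[] serialize(Object obj) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return bos.toByteArray();
    }

    // A fresh ObjectInputStream per call, mirroring the benchmark body:
    // the constructor itself reads and validates the stream header.
    static Object deserialize(byte[] bytes) throws Exception {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] bytes = serialize(
                new Object[]{"first", 1234.1234, 21341234, 234123412341234124L, "fifth"});

        // Warm up so the JIT has a chance to compile the hot path.
        for (int i = 0; i < 20_000; i++) deserialize(bytes);

        int iterations = 100_000;
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) deserialize(bytes);
        long perOp = (System.nanoTime() - start) / iterations;
        System.out.println(perOp + " ns/op");
    }
}
```

If this loop lands in the same low-microsecond ballpark as the JMH scores, that would suggest the measurement itself is sound and Java serialization is simply that expensive for small graphs.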