I'm using HDF5DotNet to write a generic data logging API, DataLog&lt;T&gt;. The idea is to use reflection to automatically create an H5 compound data type containing the fields of T. The user can then easily add data to the data log using a write(T[] data) method.
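The intended call pattern looks roughly like this (the constructor arguments and the Sample type are placeholders for illustration; only write(T[]) is the real API described above):

```csharp
// Hypothetical usage sketch of the DataLog<T> API; the constructor
// signature and the Sample type are placeholders, not real code.
var log = new DataLog<Sample>("log.h5", "samples");

// Reflection has already built the H5 compound type from Sample's
// fields, so callers just hand over plain objects.
log.write(new[]
{
    new Sample { Time = 0.0, Reading = 1.25 },
    new Sample { Time = 0.1, Reading = 1.31 },
});
```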
In order to automatically create the H5 types, the class or structure must be decorated with [StructLayoutAttribute], and some fields with [MarshalAsAttribute]. Each field is then mapped to an H5 type and added to the H5 compound data type. Types that contain enumerations or other user-defined structs are a little more complicated, but they still work.
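For example, a loggable type ends up looking something like this (SensorSample and its fields are illustrative, not my actual type):

```csharp
using System.Runtime.InteropServices;

// Sequential layout lets reflection walk the fields in declaration
// order and map each one to a member of the H5 compound type.
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Ansi)]
public struct SensorSample
{
    public double Timestamp;
    public int Status;

    // Fields without a fixed unmanaged size, such as strings, need
    // MarshalAs so the compound member's size can be computed.
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)]
    public string Channel;
}
```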
From examples I've found on the web, I've been successful in creating the H5 type, creating a dataset, and adding some data for a simple struct I created.
My problem occurs when I change the type from a struct to a class. The H5 type is still created and no exceptions are thrown; however, when I open the file in HDFView I can see the correct fields, but the data is garbage.
I suspect that the problem occurs because in C# structures are value types and classes are reference types. I used the following code to determine what the underlying structure of the data looked like:
public static byte[] GetBytes(dynamic obj)
{
    var size = Marshal.SizeOf(obj);

    // Both managed and unmanaged buffers are required.
    var bytes = new byte[size];
    var ptr = Marshal.AllocHGlobal(size);

    // Copy the object byte-for-byte into unmanaged memory.
    Marshal.StructureToPtr(obj, ptr, false);

    // Copy the data from unmanaged memory into the managed buffer.
    Marshal.Copy(ptr, bytes, 0, size);

    // Release the unmanaged memory.
    Marshal.FreeHGlobal(ptr);

    return bytes;
}
It turned out that it doesn't matter whether I use a class or a struct: the number of bytes and their order come out of this function exactly the same.
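That said, marshalling one object at a time may hide the real difference. A struct array is one contiguous block of field data, while a class array is an array of references whose field data lives elsewhere on the heap. Pinning, which native interop like H5Array ultimately depends on as far as I can tell, exposes this; the PointStruct/PointClass types here are illustrative:

```csharp
using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
public struct PointStruct { public double X; public double Y; }

[StructLayout(LayoutKind.Sequential)]
public class PointClass { public double X; public double Y; }

public static class PinningDemo
{
    public static void Main()
    {
        // An array of structs is one contiguous block of field data,
        // so it can be pinned and handed to native code directly.
        var structs = new PointStruct[4];
        var h = GCHandle.Alloc(structs, GCHandleType.Pinned);
        h.Free();

        // An array of classes stores references, not field data.
        // Pinning it for native use fails with ArgumentException.
        var classes = new PointClass[4];
        try
        {
            GCHandle.Alloc(classes, GCHandleType.Pinned);
        }
        catch (ArgumentException)
        {
            // Non-blittable element type: this is the contiguity
            // difference the per-object GetBytes test cannot detect.
        }
    }
}
```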
For reference (and because there aren't many code examples on the web) I'll provide my code to append to an existing data set:
public static void AppendToDataSet&lt;DataType&gt;(H5DataSetId dataSetId, DataType[] data)
{
    if (data.Length < 1)
    {
        return;
    }

    var dataSpaceId = H5D.getSpace(dataSetId);
    var rank = H5S.getSimpleExtentNDims(dataSpaceId);
    var dims = H5S.getSimpleExtentDims(dataSpaceId);

    /* Extend the dataset to make room for the new elements */
    var dims_extended = new long[] { dims[0] + data.Length };
    H5D.setExtent(dataSetId, dims_extended);

    /* Re-read the dataspace (it changed with the extent) and select a
       hyperslab covering only the newly added region */
    var fileSpaceId = H5D.getSpace(dataSetId);
    H5S.selectHyperslab(fileSpaceId, H5S.SelectOperator.SET,
                        new long[] { dims[0] }, new long[] { data.Length });

    /* Define a memory space matching just the data being written */
    var memSpaceId = H5S.create_simple(rank, new long[] { data.Length });

    /* Write the data to the extended portion of the dataset */
    var dataTypeId = H5D.getType(dataSetId);
    H5D.write(dataSetId,
              dataTypeId,
              memSpaceId,
              fileSpaceId,
              new H5PropertyListId(H5P.Template.DEFAULT),
              new H5Array&lt;DataType&gt;(data));

    H5T.close(dataTypeId);
    H5S.close(memSpaceId);
    H5S.close(fileSpaceId);
    H5S.close(dataSpaceId);
}
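For completeness, here is a sketch of creating a dataset that AppendToDataSet can extend; the dataset must be chunked with an unlimited max dimension or H5D.setExtent will fail. The exact overloads are from my reading of the HDF5DotNet docs, and compoundTypeId/samples stand in for the reflected compound type and the data to log:

```csharp
// Sketch: create an extendable, chunked dataset (assumed overloads).
var fileId = H5F.create("log.h5", H5F.CreateMode.ACC_TRUNC);

// Start empty, allow unlimited growth along dimension 0
// (-1 stands in for H5S_UNLIMITED here).
var spaceId = H5S.create_simple(1, new long[] { 0 }, new long[] { -1 });

// Chunking is required for extendable datasets.
var createPlist = H5P.create(H5P.PropertyListClass.DATASET_CREATE);
H5P.setChunk(createPlist, new long[] { 64 });

// compoundTypeId is the H5 compound type built via reflection.
var dataSetId = H5D.create(fileId, "/data", compoundTypeId, spaceId,
                           new H5PropertyListId(H5P.Template.DEFAULT),
                           createPlist,
                           new H5PropertyListId(H5P.Template.DEFAULT));

// samples is a DataType[] of the logged type.
AppendToDataSet(dataSetId, samples);
```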
I'd love to be able to use classes for this, because switching to structures would require major changes to the rest of the application. Does anyone know why the H5 write isn't working for classes, and is there any way I can fix it?