Hello,
I am trying to use TFLM on a big-endian platform. Tensor allocation fails because, according to commit #689, support for big-endian targets was removed, and a custom tool that converts the endianness of the .tflite file is proposed as the solution. Since the model in .tflite format is a FlatBuffer, I rebuilt schema_generated.h from the TFLite schema with the mutable option enabled, then tried to use the non-const accessors to load the data, byte-swap it, and save it again. However, I can't seem to make it work.
I was trying something like this:
// model_as_char_array obtained using the xxd command
model = tflite::GetMutableModel(model_as_char_array);
auto vec = model->mutable_buffers();
for (auto it = vec->begin(); it != vec->end(); ++it) {
  auto d = it->mutable_data();  // <---- the problem is on this line
}
which fails to compile due to invalid conversion (const to non-const):
error: invalid conversion from 'flatbuffers::IndirectHelper<flatbuffers::Offset<tflite::Buffer> >::return_type' {aka 'const tflite::Buffer*'} to 'tflite::Buffer*' [-fpermissive]
74 | IT operator->() const { return IndirectHelper<T>::Read(data_, 0); }
| ~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~
| |
| flatbuffers::IndirectHelper<flatbuffers::Offset<tflite::Buffer> >::return_type {aka const tflite::Buffer*}
I am obviously missing something and have gotten thoroughly stuck on this. Can anyone propose a way to achieve this conversion? Any help is highly appreciated.
Thank you!
Note: While FlatBuffers performs the byte swap when the accessors are used, TFLM (probably) reads the buffer directly, so the byte swap is never applied.
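To make that note concrete, here is a minimal host-side sketch of the difference between the two read paths. EndianScalar is the helper the generated accessors use internally; the raw bytes are just an example value:
#include <cstdint>
#include <cstdio>
#include <cstring>

#include "flatbuffers/flatbuffers.h"

int main() {
  // 42 as it is stored inside a .tflite file (FlatBuffers scalars are always
  // serialized little-endian).
  uint8_t raw[4] = {0x2A, 0x00, 0x00, 0x00};

  int32_t direct;
  std::memcpy(&direct, raw, sizeof(direct));  // what a raw buffer read sees

  // What the generated accessors return: identity on a little-endian host,
  // a byte swap on a big-endian one.
  int32_t via_accessor = flatbuffers::EndianScalar(direct);

  std::printf("direct=%d accessor=%d\n", direct, via_accessor);
  return 0;
}
On a big-endian host, "direct" and "accessor" differ, which is exactly the mismatch TFLM runs into when it reads buffer contents directly.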
After some further investigation, I managed to resolve the problem. Posting the solution in case anyone else runs into something similar. The steps assume that the model parameters use only INT32 (or other 4-byte data types such as FLOAT).
- Swap the bytes inside each buffer using the FlatBuffers object API (ModelT / UnPackTo / Pack), then write the re-packed model back out; an optional verification sketch follows the code below:
tflite::ModelT model;
// MODEL is the model as a char array (e.g. generated with xxd), still in its
// original little-endian form.
tflite::GetModel(MODEL)->UnPackTo(&model);
uint32_t sizeBuffer = model.buffers.size();
for (uint32_t i = 0; i < sizeBuffer; i++)
{
  uint32_t sizeData = model.buffers.at(i)->data.size();
  // Reverse every 4-byte group in place.
  for (uint32_t j = 0; j < sizeData; j += 4)
  {
    uint8_t vals[] = {
        model.buffers.at(i)->data[j + 0],
        model.buffers.at(i)->data[j + 1],
        model.buffers.at(i)->data[j + 2],
        model.buffers.at(i)->data[j + 3],
    };
    model.buffers.at(i)->data[j + 0] = vals[3];
    model.buffers.at(i)->data[j + 1] = vals[2];
    model.buffers.at(i)->data[j + 2] = vals[1];
    model.buffers.at(i)->data[j + 3] = vals[0];
  }
}
// Re-serialize the modified model and write it to disk ('name' is the output
// path).
flatbuffers::FlatBufferBuilder builder;
builder.Finish(tflite::Model::Pack(builder, &model));
uint8_t *buf = builder.GetBufferPointer();
int size = builder.GetSize();
std::ofstream ofile(name, std::ios::binary);
ofile.write((char *)buf, size);
ofile.close();
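As an optional sanity check, the re-packed buffer can be run through the FlatBuffers verifier before it is used. A minimal sketch: IsValidModelBuffer is just an illustrative name, VerifyModelBuffer is the helper flatc generates for the schema, and the include path may differ in your setup:
#include <cstdint>

#include "flatbuffers/flatbuffers.h"
#include "schema_generated.h"  // the regenerated TFLite schema header

// Returns true if buf/size (e.g. builder.GetBufferPointer()/GetSize() from
// above, or the written file read back in) still form a structurally valid
// .tflite model after the unpack/re-pack round trip.
bool IsValidModelBuffer(const uint8_t* buf, size_t size) {
  flatbuffers::Verifier verifier(buf, size);
  return tflite::VerifyModelBuffer(verifier);
}
Note that this only checks the FlatBuffer structure; it does not tell you whether the byte swap itself produced the values you expect.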
- Edit flatbuffer_utils.cc by modifying the FlatBufferVectorToTfLiteTypeArray method (and similarly its float overload; see the sketch after the code below):
#include <map>  // needed for the cache below

// Cache of the allocated TfLiteIntArrays, to avoid leaking memory when the
// same flatbuffer vector is converted more than once.
std::map<const flatbuffers::Vector<int32_t>*, TfLiteIntArray*>
    TfliteIntArrayMap;

TfLiteIntArray* FlatBufferVectorToTfLiteTypeArray(
    const flatbuffers::Vector<int32_t>* flatbuffer_array) {
  // If the key doesn't exist yet, build a native-endian copy.
  if (TfliteIntArrayMap.count(flatbuffer_array) == 0) {
    int size = flatbuffer_array->size();
    // Allocate memory for it.
    size_t alloc_size = TfLiteIntArrayGetSizeInBytes(size);
    if (alloc_size <= 0) return nullptr;
    TfLiteIntArray* ret = (TfLiteIntArray*)malloc(alloc_size);
    if (!ret) return ret;
    ret->size = size;
    // Get() byte-swaps on a big-endian host, so the copy ends up native-endian.
    for (int i = 0; i < size; i++) {
      ret->data[i] = flatbuffer_array->Get(i);
    }
    // Save it to the map.
    TfliteIntArrayMap.insert(
        std::pair<const flatbuffers::Vector<int32_t>*, TfLiteIntArray*>(
            flatbuffer_array, ret));
  }
  // Return the cached element for the given flatbuffers::Vector.
  return TfliteIntArrayMap[flatbuffer_array];
}
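The float overload mentioned above can follow the same pattern. A sketch mirroring the int version, assuming TfLiteFloatArrayGetSizeInBytes from tensorflow/lite/c/common.h:
std::map<const flatbuffers::Vector<float>*, TfLiteFloatArray*>
    TfliteFloatArrayMap;

TfLiteFloatArray* FlatBufferVectorToTfLiteTypeArray(
    const flatbuffers::Vector<float>* flatbuffer_array) {
  if (TfliteFloatArrayMap.count(flatbuffer_array) == 0) {
    int size = flatbuffer_array->size();
    int alloc_size = TfLiteFloatArrayGetSizeInBytes(size);
    if (alloc_size <= 0) return nullptr;
    TfLiteFloatArray* ret = (TfLiteFloatArray*)malloc(alloc_size);
    if (!ret) return ret;
    ret->size = size;
    for (int i = 0; i < size; i++) {
      // Get() byte-swaps on a big-endian host, so the copy is native-endian.
      ret->data[i] = flatbuffer_array->Get(i);
    }
    TfliteFloatArrayMap[flatbuffer_array] = ret;
  }
  return TfliteFloatArrayMap[flatbuffer_array];
}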
Note that byte-swapping the subgraphs as well won't work: TFLM both loads vectors from those buffers through the FlatBuffers accessors and reinterpret_casts them to TfLiteIntArray, so whichever byte order is stored, one of those two access paths ends up with a byte-swapped size anyway.
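To spell that out, here is a sketch of the two conflicting access paths (IllustrateAccessPaths is just a hypothetical name, and the common.h include path may differ in your tree):
#include "flatbuffers/flatbuffers.h"
#include "tensorflow/lite/c/common.h"  // TfLiteIntArray

// 'v' stands for a subgraph vector such as a tensor shape.
void IllustrateAccessPaths(const flatbuffers::Vector<int32_t>* v) {
  // Path 1: accessor reads byte-swap on a big-endian host, so they expect the
  // serialized data to stay little-endian.
  int32_t first = v->Get(0);

  // Path 2: the reinterpret_cast path assumes the bytes, including the
  // vector's length field, are already in native (big-endian) order.
  const TfLiteIntArray* arr = reinterpret_cast<const TfLiteIntArray*>(v);

  // No single byte order in the file satisfies both paths, which is why the
  // copy-based FlatBufferVectorToTfLiteTypeArray above is used instead.
  (void)first;
  (void)arr;
}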
It's not the cleanest solution, but it seems to give correct results.