Hello,
In my React Native app using TensorFlow.js, I retrieve an image file as a base64-encoded string with:
const img64 = await FileSystem.readAsStringAsync(result.uri, { encoding: FileSystem.EncodingType.Base64 });
This returns something like this:
iVBORw0KGgoAAAANSUhEUgAAAFoAAABaCAIAAAC3ytZVAAAAAXNSR0IArs4c6QAAAERlWElmTU0AKgAAAAgAAYdpAAQAAAABAAAAGgAAAAAAA6ABAAMAAAABAAEAAKACAAQAAAABAAAAWqADAAQAAAABAAAAWgAAAABJfIu3AAAA......
But then when I call:
const imgBuffer = tf.util.encodeString(img64, 'base64').buffer
it returns an empty array…
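For context: `tf.util.encodeString(str, 'base64')` base64-decodes the string into a `Uint8Array`, so an empty result usually means the input wasn't plain base64, e.g. a lingering `data:image/...;base64,` prefix or embedded whitespace/newlines. A minimal Node sketch of that decoding step, using the built-in `Buffer` instead of tfjs (the `base64ToBytes` helper is hypothetical, for illustration only):

```javascript
// Simulate the base64 -> bytes step, tolerating the two common
// pitfalls: a data-URI header and stray whitespace/newlines.
function base64ToBytes(b64) {
  const cleaned = b64.replace(/^data:.*?;base64,/, '').replace(/\s/g, '');
  return Uint8Array.from(Buffer.from(cleaned, 'base64'));
}

// PNG data starts with the magic bytes 0x89 'P' 'N' 'G'
const sample = 'iVBORw0KGgo='; // first 8 bytes of a PNG, base64-encoded
const bytes = base64ToBytes(sample);
console.log(bytes[0].toString(16), String.fromCharCode(bytes[1], bytes[2], bytes[3]));
// → 89 PNG
```

If the same check on your `img64` yields zero bytes, inspecting the raw string for a prefix or line breaks is a good first step.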
Hi @aymeric_B ,
I’ve implemented base64-string-to-buffer conversion in my React Native app using the following code:
const resizedImage = await ImageManipulator.manipulateAsync(
  selectedImage,
  [{ resize: { width: 640, height: 640 } }],
  { format: ImageManipulator.SaveFormat.PNG, base64: true }
);
const resizedImageBase64 = resizedImage.base64 ?? '';
console.log("resizedImage base64:", resizedImageBase64);
const imgBuffer = tf.util.encodeString(resizedImageBase64, 'base64').buffer;
console.log("imgBuffer:", imgBuffer);
Let me know if this helps. Thank you!
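For what it's worth, the usual next step (my assumption about the surrounding pipeline, not shown in the post) is to wrap the resulting `ArrayBuffer` in a `Uint8Array` and sanity-check its length before handing it to a decoder such as `decodeJpeg`/`decodePng` from `@tensorflow/tfjs-react-native`. A small Node sketch with an illustrative truncated PNG header:

```javascript
// Sketch: ArrayBuffer -> Uint8Array, with a length check that catches
// the "empty buffer" symptom before it reaches the image decoder.
const b64 = 'iVBORw0KGgoAAAANSUhEUg=='; // truncated PNG header, illustration only
const imgBuffer = Uint8Array.from(Buffer.from(b64, 'base64')).buffer;
const raw = new Uint8Array(imgBuffer);
console.log(raw.length > 0); // zero length here means the base64 input was bad
// In the app (not runnable here): const imageTensor = decodePng(raw);
```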