I have a dataset of several hundred measurements, each represented by a 2x60 array of datapoints. The measurements are labeled with one of 8 categories.
The data is structured as follows:
DATA = [
    {'labelA': np.array([[0.0, 0.0], [1.1, 3.1], [2.5, 6.5], …, [0.1, 0.0]])},
    {'labelB': np.array([[0.0, 0.0], [2.1, 1.1], [3.2, 4.5], …, [0.2, 0.0]])},
    {'labelA': np.array([[0.0, 0.0], [1.1, 3.2], [2.5, 6.6], …, [0.1, 0.0]])},
    …
    {'labelH': np.array([[0.0, 0.0], [3.1, 1.1], [4.2, 4.2], …, [0.1, 0.0]])},
]
A single measurement can be visualized as a graph (plot omitted here).
I have two questions:
- What is the best way to load this data into a TensorFlow dataset?
- Which TensorFlow model is best suited to this type of data? Since expanding my dataset is very time-consuming, I would like to make the most of what I have.
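For reference, here is a minimal sketch of how I currently imagine unpacking the list of single-key dicts into feature and label arrays (variable names are illustrative, and the toy data below stands in for my real measurements):

```python
import numpy as np

# Toy stand-in for the real DATA list (values here are random placeholders)
rng = np.random.default_rng(0)
DATA = [
    {'labelA': rng.random((60, 2))},
    {'labelB': rng.random((60, 2))},
    {'labelA': rng.random((60, 2))},
]

# Unpack each single-key dict into parallel label/array sequences
labels, arrays = zip(*(next(iter(d.items())) for d in DATA))
X = np.stack(arrays).astype('float32')   # shape (N, 60, 2)

# Integer-encode the string labels
classes = sorted(set(labels))
label_to_idx = {c: i for i, c in enumerate(classes)}
y = np.array([label_to_idx[lbl] for lbl in labels])  # shape (N,)

# From here I assume a tf.data pipeline would be one call away, e.g.:
# import tensorflow as tf
# ds = tf.data.Dataset.from_tensor_slices((X, y)).shuffle(len(X)).batch(32)
```

Is this a reasonable preprocessing step, or is there a more idiomatic way to go straight from the list of dicts to a `tf.data.Dataset`?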
Any reference to similar problems is appreciated!
Many thanks in advance.