Hi @Kiran_Sai_Ramineni ,
Thank you for your reply.
I have made some progress but am still struggling with this problem.
I am trying to classify signals with a one-dimensional convolutional neural network (Conv1D).
I have made a minimal example where I train the model by assigning the following labels to the signals.
[1, 1, 1, 1, 1, 1, 1, 1] label: 1
[2, 2, 2, 2, 2, 2, 2, 2] label: 2
[3, 3, 3, 3, 3, 3, 3, 3] label: 3
[4, 4, 4, 4, 4, 4, 4, 4] label: 4
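For reference, the four constant signals listed above can be generated with a short numpy sketch (the variable names here are mine, not from my actual code):

```python
import numpy as np

# Four constant signals of length 8, one per class, as listed above.
signals = np.array([np.full(8, v) for v in (1, 2, 3, 4)])
labels = np.array([1, 2, 3, 4])

print(signals.shape)  # (4, 8): four signals, eight samples each
print(signals[0])     # [1 1 1 1 1 1 1 1], label 1
```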
I have organised the training data in this way:
data=np.array([[[1, 2, 3, 4],
[1, 2, 3, 4],
[1, 2, 3, 4],
[1, 2, 3, 4],
[1, 2, 3, 4],
[1, 2, 3, 4],
[1, 2, 3, 4],
[1, 2, 3, 4]]])
From what I have understood, the first dimension corresponds to the batch, the second to the signal (the direction along which the convolution is applied), and the third to the category.
The labels are organised this way:
labels=np.array([[0, 1, 2, 3]])
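As a quick sanity check, these are the shapes that end up being fed to fit (a sketch reproducing the arrays above in compact form):

```python
import numpy as np

data = np.array([[[1, 2, 3, 4]] * 8])   # the same array as above, written compactly
labels = np.array([[0, 1, 2, 3]])

print(data.shape)    # (1, 8, 4): one batch element, 8 steps, 4 values per step
print(labels.shape)  # (1, 4): one label row with four entries
```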
The structure of the neural network is as follows:
model=tf.keras.models.Sequential([
tf.keras.layers.Conv1D(1, 1, input_shape=(8, 4), activation="relu"),
tf.keras.layers.MaxPooling1D(pool_size=8, strides=8),
tf.keras.layers.Flatten(),
tf.keras.layers.Dense(1, activation="relu"),
tf.keras.layers.Dense(4, activation="softmax")
])
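To make the shapes concrete, here is a numpy sketch tracing what I believe each layer does to the input (my own hand trace, assuming channels_last and ignoring biases; random weights stand in for the trained ones):

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.random((1, 8, 4))                  # input: (batch, steps, channels)
x = np.maximum(x @ rng.random((4, 1)), 0)  # Conv1D(1, kernel_size=1) is pointwise + relu: (1, 8, 1)
x = x.reshape(1, 1, 8, 1).max(axis=2)      # MaxPooling1D(pool_size=8, strides=8): (1, 1, 1)
x = x.reshape(1, -1)                       # Flatten: (1, 1)
x = np.maximum(x @ rng.random((1, 1)), 0)  # Dense(1, relu): (1, 1)
x = np.exp(x @ rng.random((1, 4)))         # Dense(4) logits, then softmax:
x = x / x.sum(axis=1, keepdims=True)       # (1, 4), each row sums to 1
print(x.shape)  # (1, 4)
```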
Of course, this is a silly network but the purpose is simply to reproduce the problem I am currently facing.
I fit the model in this way:
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-6), loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(data, labels, epochs=100000)
This is where I have had lots of problems related to the shapes of the data and the labels. However, as things stand now, no more shape-related exceptions are thrown and the code runs.
Unfortunately, the loss diverges. If I use a tiny learning rate, it converges, but towards a constant value, and when I look at the model's predictions, it seems to assign a probability of roughly 1/4 to each category.
I suppose there is something wrong with the way I have set up the data and the labels, and perhaps with the loss function I have chosen.
Thank you very much!