I am training on a dataset which has three classes. Counts per label:
0: 3132
1: 492
-1: 12
As you can see there is a huge imbalance here, so I wanted to fix it using class weights (maybe there is a better way). So I created this dict: {-1: 85.27777777777777, 0: 0.3919315715562364, 1: 2.2893363161819535}
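For comparison, here is what the standard "balanced" inverse-frequency formula (n_samples / (n_classes * count), the same thing sklearn's compute_class_weight does with class_weight='balanced') gives for these counts; the numbers come out slightly different from my dict above:

```python
# label counts from the dataset above
counts = {0: 3132, 1: 492, -1: 12}

n_samples = sum(counts.values())   # 3636
n_classes = len(counts)            # 3

# "balanced" inverse-frequency weights: n_samples / (n_classes * count)
class_weight_dict = {label: n_samples / (n_classes * count)
                     for label, count in counts.items()}
# the rare class -1 gets 3636 / (3 * 12) = 101.0
```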
and passed it to .fit:
model.fit(self.Xs, self.ys, epochs=epoch, batch_size=batch, class_weight=class_weight_dict)
This raises:
ValueError: Expected `class_weight` to be a dict with keys from 0 to one less than the number of classes, found {-1: 85.27777777777777, 0: 0.3919315715562364, 1: 2.2893363161819535}
So I changed class_weight_dict to {0: 85.27777777777777, 1: 0.3919315715562364, 2: 2.2893363161819535}. It feels wrong (I don't see how Keras is supposed to know which index corresponds to which label), and I still get an error, although it gets further this time (it reaches epoch 1/15):
2 root error(s) found.
(0) INVALID_ARGUMENT: indices[49] = -1 is not in [0, 3) [[{{node GatherV2}}]] [[IteratorGetNext]] [[Cast/_16]]
(1) INVALID_ARGUMENT: indices[49] = -1 is not in [0, 3) [[{{node GatherV2}}]] [[IteratorGetNext]]
0 successful operations. 0 derived errors ignored. [Op:__inference_train_function_5684]
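If I understand the error, the labels in self.ys themselves still contain -1, so on top of re-keying the dict I would also need to remap the labels into [0, 3). A sketch of what I mean (ys here is a toy array, not my real data):

```python
import numpy as np

ys = np.array([0, 1, -1, 0, 0, -1])  # toy labels using the same three values

# shift -1 into the valid range [0, 3): -1 -> 2, 0 and 1 stay as they are
ys_mapped = np.where(ys == -1, 2, ys)

# the class_weight keys then refer to the remapped indices
class_weight_dict = {0: 0.3919315715562364,   # was label 0
                     1: 2.2893363161819535,   # was label 1
                     2: 85.27777777777777}    # was label -1
```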
This is my output layer:
model.add(Dense(3, activation='softmax'))
I wanted to use Dense(1, activation='tanh') instead, but ChatGPT said that is not a good idea and was not able to explain why. Maybe you could shed some light on that?
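(What I gathered so far: a single tanh unit outputs one scalar in (-1, 1), while sparse_categorical_crossentropy over three classes needs three scores per sample, and squeezing three classes onto one axis imposes an artificial ordering -1 < 0 < 1 on them. A quick NumPy illustration of the difference, no Keras involved:)

```python
import numpy as np

logits = np.array([0.2, 1.5, -0.3])      # three raw scores, one per class

# softmax turns them into a probability distribution over the 3 classes
probs = np.exp(logits) / np.exp(logits).sum()
assert abs(probs.sum() - 1.0) < 1e-9     # sums to 1, one entry per class

# a single tanh unit gives just one number in (-1, 1):
single = np.tanh(0.7)
# there is no principled way to read 3 class probabilities out of one scalar,
# and thresholding it forces the classes onto an artificial ordering
```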
Model compilation:
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
Thanks in advance for any explanations/solutions/ideas.