I am new to Keras, and I don't really understand how many units I should use for each layer. Is the number related to my input_shape, or can I just pick an arbitrary number? For example, why did they choose 32 units for the first layer and 10 units for the second layer in the code below? Thanks
from keras import models, layers

model = models.Sequential()
model.add(layers.Dense(32, activation='relu', input_shape=(784,)))
model.add(layers.Dense(10, activation='softmax'))
The number of units in the first layer (32) is a hyperparameter: it sets how many neurons the layer has, and therefore the size of its output. The number of units in the last layer (10) is typically dictated by the problem, usually the number of classes in a classification task, so this network appears to be built for 10-class classification.
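To make the relationship concrete, here is a small sketch (plain Python, no Keras required) that computes the parameter counts those unit choices imply. The numbers match what Keras would report via model.summary() for the model above:

```python
# Sketch: how the unit counts in the example model determine each
# layer's parameter count. A Dense layer has one weight per
# (input, unit) pair plus one bias per unit.
def dense_params(n_in, n_units):
    return n_in * n_units + n_units

# Layer 1: input_shape=(784,) feeding 32 units
p1 = dense_params(784, 32)   # 784*32 + 32 = 25120
# Layer 2: the 32 outputs of layer 1 feeding 10 units
p2 = dense_params(32, 10)    # 32*10 + 10 = 330

print(p1, p2)  # 25120 330
```

Note how the 32 appears twice: as the output size of the first layer and as the input size of the second. Only input_shape and the last layer's size are fixed by the data and the task; the 32 in the middle is a free choice.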
Does that mean it should be fine to put a random number for the first layer? And the number of units in the last layer depends on the result I want? For example, if I want the result to be 0 or 1, should the last layer have 1 unit?
Does that mean it should be fine to put a random number for the first layer?
The number of units in the first layer should not be chosen at random; it should reflect the complexity of the data. Too few units can underfit and hurt performance, while too many can overfit. It is generally advisable to start with a small number of units and increase the width only if performance demands it.
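One reason to start small: the total parameter count, and with it the overfitting risk, grows roughly linearly with the hidden width. A quick sketch, assuming the same 784-input, 10-class setup as above:

```python
# Sketch: total trainable parameters for candidate first-layer widths
# in a 784-input, 10-class network (weights + biases for both layers).
def total_params(n_in, hidden, n_out):
    return (n_in * hidden + hidden) + (hidden * n_out + n_out)

for hidden in (16, 32, 64, 128):
    print(hidden, total_params(784, hidden, 10))
```

Doubling the hidden width roughly doubles the model's capacity, so sweeping a few widths like this and comparing validation performance is a common way to pick the number.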
if I want the result to be 0 or 1, should the last layer have 1 unit?
Yes. Use model.add(layers.Dense(1, activation='sigmoid')) as the final layer of your neural network, and compile the model with a binary_crossentropy loss. The single sigmoid output is the predicted probability of class 1.
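To see why one unit suffices for a 0/1 result, here is a minimal sketch of what that final layer computes: the sigmoid squashes the unit's raw output into (0, 1), and predictions are made by thresholding at 0.5.

```python
import math

# Sketch: the single sigmoid output is a probability of class 1;
# thresholding it at 0.5 yields the 0/1 label.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_label(z, threshold=0.5):
    return 1 if sigmoid(z) >= threshold else 0

print(predict_label(2.0))   # 1  (sigmoid(2.0) is about 0.88)
print(predict_label(-1.5))  # 0  (sigmoid(-1.5) is about 0.18)
```

The equivalent alternative is Dense(2, activation='softmax') with two output units, but for binary problems the single sigmoid unit is the conventional choice.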