ValueError: logits and labels must have the same shape ((None, 1) vs (None, 2))
# Reshape your labels into a 2-D tensor:
# the first dimension is the batch dimension, the second holds the scalar label.
import numpy as np

y_train = np.asarray(train_labels).astype('float32').reshape((-1, 1))
y_test = np.asarray(test_labels).astype('float32').reshape((-1, 1))
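For context, a minimal sketch of how the reshaped labels line up with a single-unit sigmoid output. The layer sizes and data below are placeholders, not taken from the original question:

import numpy as np
from tensorflow import keras

# Hypothetical binary labels, e.g. loaded as a plain Python list.
train_labels = [0, 1, 1, 0]
x_train = np.random.rand(4, 8).astype('float32')   # placeholder features

# reshape((-1, 1)) gives shape (batch, 1), matching the model's (None, 1) output.
y_train = np.asarray(train_labels).astype('float32').reshape((-1, 1))

model = keras.Sequential([
    keras.layers.Dense(16, activation='relu', input_shape=(8,)),
    keras.layers.Dense(1, activation='sigmoid'),    # one output unit -> (None, 1)
])
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['acc'])
model.fit(x_train, y_train, epochs=1, verbose=0)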
ValueError: `logits` and `labels` must have the same shape, received ((None, 10) vs (None, 1)).
# This error is usually caused by a mismatch between the loss function and the label format.
# Check whether the problem is binary or multi-class, and how the labels are encoded:
LOSS = 'binary_crossentropy'              # binary (2 classes), single sigmoid output, labels of shape (None, 1)
LOSS = 'categorical_crossentropy'         # multi-class with one-hot labels of shape (None, num_classes)
LOSS = 'sparse_categorical_crossentropy'  # multi-class with integer labels of shape (None,) or (None, 1)
model.compile(loss=LOSS,
              optimizer='adam',
              metrics=['acc'])
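To illustrate the (None, 10) vs (None, 1) case above, here is a minimal sketch (model and data are placeholders): with integer class labels and a 10-unit softmax output, sparse_categorical_crossentropy accepts labels of shape (None, 1), while categorical_crossentropy expects one-hot labels of shape (None, 10):

import numpy as np
from tensorflow import keras

x_train = np.random.rand(32, 8).astype('float32')    # placeholder features
y_train = np.random.randint(0, 10, size=(32, 1))      # integer labels, shape (None, 1)

model = keras.Sequential([
    keras.layers.Dense(16, activation='relu', input_shape=(8,)),
    keras.layers.Dense(10, activation='softmax'),      # 10 classes -> output of shape (None, 10)
])

# Integer labels + sparse_categorical_crossentropy: the shapes are compatible.
model.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['acc'])
model.fit(x_train, y_train, epochs=1, verbose=0)

# Alternatively, one-hot encode the labels to shape (None, 10) and use categorical_crossentropy.
y_train_onehot = keras.utils.to_categorical(y_train, num_classes=10)
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])
model.fit(x_train, y_train_onehot, epochs=1, verbose=0)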