We could not get the results we wanted from the Few-Shot Learning approach we tried last week, so we turned our attention to Siamese Neural Networks.
First of all, we used the Synthetic Minority Oversampling Technique (SMOTE), a widely used technique for imbalanced datasets. Before giving up on the plain MLP model, we obtained the following results using SMOTE:
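To illustrate the idea behind SMOTE, here is a minimal NumPy-only sketch (not the library implementation we used, and the data here is a toy stand-in for our track features): each synthetic minority sample is created by interpolating between a minority point and one of its nearest minority-class neighbors.

```python
import numpy as np

def smote_oversample(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples by interpolating
    between each sample and one of its k nearest minority-class
    neighbors (the core idea behind SMOTE)."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # Pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # a point is not its own neighbor
    neighbors = np.argsort(d, axis=1)[:, :k]
    out = []
    for _ in range(n_new):
        i = rng.integers(n)                              # pick a minority sample
        j = neighbors[i, rng.integers(min(k, n - 1))]    # pick one of its neighbors
        gap = rng.random()                               # random point on the segment
        out.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(out)

# Toy minority class: 20 points in 2-D
X_min = np.random.default_rng(0).normal(size=(20, 2))
X_new = smote_oversample(X_min, n_new=80, rng=0)
print(X_new.shape)  # (80, 2)
```

In practice a library such as imbalanced-learn does this (plus edge-case handling) in a single `fit_resample` call.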
This week, we adjusted the data by increasing the number of top tracks and normal tracks. We now have 5255 top tracks and 20558 normal tracks.
We used this data to train a TensorFlow Keras Sequential model and tested our first model. We split the data into three portions: train, validation, and test.
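A three-way split like this can be done with two calls to scikit-learn's `train_test_split`. The sketch below uses placeholder features and an assumed 70/15/15 ratio; the dataset sizes match our track counts, but the feature matrix and split ratios are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Placeholder feature matrix: 5255 + 20558 tracks, 10 dummy features
X = np.random.rand(25813, 10)
y = np.array([1] * 5255 + [0] * 20558)  # 1 = top track, 0 = normal track

# First carve off the test set, then split the rest into train/validation.
# Stratifying keeps the top/normal ratio the same in every portion.
X_tmp, X_test, y_tmp, y_test = train_test_split(
    X, y, test_size=0.15, stratify=y, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(
    X_tmp, y_tmp, test_size=0.15 / 0.85, stratify=y_tmp, random_state=42)

print(len(X_train), len(X_val), len(X_test))
```

Stratification matters here because the classes are imbalanced: without it, a random split could leave the small top-track class underrepresented in the validation or test portion.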
You can see the model code below:
We used “ReLU” as the activation function in our hidden layers and “Softmax” in the output layer. We tried a variety of hyperparameters to verify that the model was working properly.
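For reference, a Sequential model with ReLU hidden layers and a softmax output over the two classes (top vs. normal track) might look like the sketch below; the layer sizes and the 10-feature input are placeholder assumptions, not our exact architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(10,)),                # placeholder: 10 input features
    layers.Dense(64, activation="relu"),      # hidden layers use ReLU
    layers.Dense(32, activation="relu"),
    layers.Dense(2, activation="softmax"),    # softmax over the two classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

With softmax, the two output units sum to one, so each prediction can be read directly as class probabilities.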
After trying our…