Siamese Twins

Few-Shot Learning, which we tried last week, did not give us the results we wanted, so we turned our attention to Siamese Neural Networks.

First of all, we applied the Synthetic Minority Oversampling Technique (SMOTE), a well-known method for handling imbalanced datasets. Before giving up on the plain MLP model, we obtained the following results using SMOTE:
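The core idea of SMOTE is to synthesize new minority-class samples by interpolating between a minority point and one of its nearest minority-class neighbours. A minimal NumPy sketch of that idea (the real experiments would typically use a library such as imbalanced-learn; the toy data and parameters here are only illustrative):

```python
import numpy as np

def smote(minority, n_new, k=5, seed=0):
    """Minimal SMOTE sketch: create n_new synthetic samples by
    interpolating between a random minority sample and one of its
    k nearest minority-class neighbours."""
    rng = np.random.default_rng(seed)
    n = len(minority)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)
        # distances from sample i to every other minority sample
        d = np.linalg.norm(minority - minority[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip the sample itself
        j = rng.choice(neighbours)
        gap = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(minority[i] + gap * (minority[j] - minority[i]))
    return np.array(synthetic)

# toy minority class: 20 points in 2-D
minority = np.random.default_rng(1).normal(size=(20, 2))
new_points = smote(minority, n_new=30)
print(new_points.shape)  # (30, 2)
```

Because each synthetic point lies on the segment between two real minority samples, SMOTE enlarges the minority class without simply duplicating rows.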


This week, we also adjusted the data by increasing the number of both top tracks and normal tracks. For now, we have 5255 top tracks and 20558 normal tracks.

We used this data to train a TensorFlow Keras Sequential model and tested our first model. We split the data into three portions: train, validation, and test.
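A three-way split can be done with two slices over a shuffled index; the post does not state the exact ratios, so the 70/15/15 split and the 16-dimensional placeholder features below are assumptions:

```python
import numpy as np

def train_val_test_split(X, y, val_frac=0.15, test_frac=0.15, seed=42):
    """Shuffle the indices once, then slice off test, validation,
    and train partitions (assumed 70/15/15 ratios)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_test = int(len(X) * test_frac)
    n_val = int(len(X) * val_frac)
    test_i = idx[:n_test]
    val_i = idx[n_test:n_test + n_val]
    train_i = idx[n_test + n_val:]
    return (X[train_i], y[train_i]), (X[val_i], y[val_i]), (X[test_i], y[test_i])

# 5255 top tracks (label 1) and 20558 normal tracks (label 0);
# the feature dimension (16) is a placeholder.
X = np.zeros((5255 + 20558, 16))
y = np.concatenate([np.ones(5255), np.zeros(20558)])
(X_tr, y_tr), (X_va, y_va), (X_te, y_te) = train_val_test_split(X, y)
print(len(X_tr), len(X_va), len(X_te))  # 18071 3871 3871
```

Shuffling before slicing matters here: the two classes arrive grouped, so an unshuffled split would put almost no top tracks into the test set.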

You can see the model code below:

TensorFlow Keras Sequential Model
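The original code was shared as an image, so the layer sizes and input dimension below are only an illustrative reconstruction of a Sequential classifier with ReLU hidden layers and a softmax output:

```python
import tensorflow as tf

# Hypothetical architecture: 16 input features and two hidden layers
# (64 and 32 units) are assumptions, not the post's actual values.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(16,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),  # top track vs normal track
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

With integer labels (0 for normal, 1 for top), `sparse_categorical_crossentropy` pairs naturally with the two-unit softmax output.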

We used ReLU as the activation function in the hidden layers and softmax in the output layer. We tried a variety of hyperparameters to verify that the model was working properly.

After trying our…

Mehmet Ali Korkmaz
