My main interests lie in the domains of Computer Vision, Natural Language Processing and Data Science.
Computer Vision

Trained a deep learning model that classifies real and AI-generated images of faces. The dataset used to train the model can be found here.
Utilized ResNet50 as the base layer and added two dense layers on top. Calculated the loss with categorical cross-entropy and optimized the model with the Adam optimizer; a rough sketch of this setup follows below.
Completed on: September 2023
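A minimal Keras sketch of this setup. Only the ResNet50 base, the two dense layers, categorical cross-entropy, and the Adam optimizer come from the write-up; the 224×224 input, the frozen ImageNet-pretrained backbone, the 256-unit hidden layer, and the learning rate are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

# ResNet50 backbone pretrained on ImageNet, used here as a frozen feature extractor
# (input size and freezing are assumptions, not stated in the write-up).
base = ResNet50(include_top=False, weights="imagenet",
                input_shape=(224, 224, 3), pooling="avg")
base.trainable = False

# Two dense layers on top of the backbone; softmax over the two classes
# (real vs. AI-generated) so that categorical cross-entropy applies to one-hot labels.
model = models.Sequential([
    base,
    layers.Dense(256, activation="relu"),
    layers.Dense(2, activation="softmax"),
])

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# train_ds / val_ds would be tf.data.Dataset objects yielding (image, one_hot_label) pairs:
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```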


Trained a deep learning model that detects pneumonia in lung CT scans. The dataset used to train the model can be found here.
Generated masks for all 5,233 images in the dataset by applying smoothing and the Roberts cross operator (a sketch of this masking step follows below). Utilized ResNet152 as the base layer and added two dense layers to the model. Calculated the loss with binary cross-entropy and optimized the model with the Adam optimizer.
Completed on: September 2023
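A sketch of the mask-generation step. The write-up only specifies smoothing followed by the Roberts cross operator; scikit-image as the implementation, Gaussian smoothing, the edge threshold, and the directory names are assumptions.

```python
import pathlib
import numpy as np
from skimage import io
from skimage.color import rgb2gray
from skimage.filters import gaussian, roberts

def make_mask(path, sigma=1.0, threshold=0.05):
    """Smooth the scan, apply the Roberts cross operator, and threshold the edge map."""
    image = io.imread(path)
    if image.ndim == 3:                      # collapse RGB scans to grayscale
        image = rgb2gray(image)
    smoothed = gaussian(image, sigma=sigma)  # smoothing before edge detection
    edges = roberts(smoothed)                # Roberts cross gradient magnitude
    return (edges > threshold).astype(np.uint8) * 255

# Directory names are placeholders for wherever the dataset is stored.
out_dir = pathlib.Path("dataset/masks")
out_dir.mkdir(parents=True, exist_ok=True)
for path in pathlib.Path("dataset/images").glob("*.jpeg"):
    io.imsave(out_dir / path.name, make_mask(path))
```

The same smoothing plus Roberts cross routine is what the shoe-imprint and chest-CT projects below refer to when they mention masked images.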
Trained a deep learning model in TensorFlow, based on ResNet152, that classifies scans of shoe imprints by shoe size. The dataset used to train the model can be found here.
The model was trained twice, once with the raw images and once with masks of the images; a sketch of this comparison follows below. The masks were generated by applying smoothing and the Roberts cross edge detector.
Completed on: May 2023
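A rough sketch of the raw-versus-masked comparison, assuming per-class (per-size) subfolders readable by image_dataset_from_directory, a dense head and loss carried over from the projects above, and placeholder directory names, image size, and epoch count; none of these details are specified in the write-up.

```python
import tensorflow as tf

IMG_SIZE = (224, 224)

def build_model(num_classes):
    """ResNet152 backbone with a small dense head, mirroring the setup described above."""
    base = tf.keras.applications.ResNet152(include_top=False, weights="imagenet",
                                           input_shape=IMG_SIZE + (3,), pooling="avg")
    base.trainable = False
    return tf.keras.Sequential([
        base,
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

def load_split(root):
    """80/20 train/validation split from a folder of per-class subfolders."""
    kwargs = dict(validation_split=0.2, seed=42, image_size=IMG_SIZE, label_mode="categorical")
    return (tf.keras.utils.image_dataset_from_directory(root, subset="training", **kwargs),
            tf.keras.utils.image_dataset_from_directory(root, subset="validation", **kwargs))

# Train the same architecture on the raw scans and on the masked scans, then compare.
results = {}
for name, root in [("raw", "shoe_imprints/raw"), ("masked", "shoe_imprints/masks")]:
    train_ds, val_ds = load_split(root)
    model = build_model(num_classes=len(train_ds.class_names))
    model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
    model.fit(train_ds, validation_data=val_ds, epochs=10, verbose=0)
    results[name] = model.evaluate(val_ds, verbose=0)[1]   # keep validation accuracy

print(results)
```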
Trained a deep learning model in TensorFlow that classifies chest CT scans as cancerous or non-cancerous. The dataset used to train the model was obtained from Kaggle.
The model was trained twice, once with the raw images and once with masks of the images. The masks were generated by applying smoothing and the Roberts cross edge detector.
Completed on: May 2023
Natural Language Processing
Trained a deep learning model using a combination of LSTM and GRU layers that classifies news headlines as sarcastic or non-sarcastic. The dataset, obtained from Kaggle, was preprocessed by removing stopwords and tokenizing the headlines. The headlines were then converted to word vectors using a Word2Vec model. The activation function used in the hidden layers was ‘tanh’, while the output layer used ‘sigmoid’; a sketch of this pipeline follows below.
Completed on: May 2023
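A sketch of this pipeline. The LSTM/GRU stack, stopword removal, tokenization, Word2Vec embeddings, and the tanh/sigmoid activations come from the write-up; gensim as the Word2Vec implementation, the vector size, layer widths, sequence length, and binary cross-entropy as the loss are assumptions, and the headlines and labels below are placeholders for the Kaggle data.

```python
import numpy as np
import tensorflow as tf
from gensim.models import Word2Vec
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize

# Placeholders for the Kaggle headlines and their sarcasm labels.
# Requires nltk.download("punkt") and nltk.download("stopwords").
headlines = ["example sarcastic headline goes here", "example plain headline goes here"]
labels = np.array([1, 0])

# Remove stopwords and tokenize each headline.
stop_words = set(stopwords.words("english"))
tokenized = [[t for t in word_tokenize(h.lower()) if t.isalpha() and t not in stop_words]
             for h in headlines]

# Train Word2Vec on the tokenized headlines and build an embedding matrix (index 0 = padding).
w2v = Word2Vec(sentences=tokenized, vector_size=100, window=5, min_count=1)
vocab = {w: i + 1 for i, w in enumerate(w2v.wv.index_to_key)}
embedding_matrix = np.vstack([np.zeros(100), w2v.wv.vectors])

sequences = [[vocab[t] for t in h if t in vocab] for h in tokenized]
padded = tf.keras.preprocessing.sequence.pad_sequences(sequences, maxlen=30)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(vocab) + 1, 100, trainable=False,
                              embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix)),
    tf.keras.layers.LSTM(64, activation="tanh", return_sequences=True),  # 'tanh' in the hidden layers
    tf.keras.layers.GRU(32, activation="tanh"),
    tf.keras.layers.Dense(1, activation="sigmoid"),                      # 'sigmoid' output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(padded, labels, validation_split=0.2, epochs=5)
```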
Trained a deep learning model using TensorFlow that responds to a sentence from a movie with a relevant quote from the same movie. The custom dataset was created from the Cornell Movie-Dialogs Corpus and was cleaned and preprocessed with NLTK; a sketch of the preprocessing step follows below.
Completed on: April 2023
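A sketch of the NLTK cleaning step, assuming the publicly distributed movie_lines.txt layout of the Cornell Movie-Dialogs Corpus and simple lowercasing plus punctuation stripping; the write-up only states that the dataset was cleaned and preprocessed with NLTK.

```python
import string
from nltk.tokenize import word_tokenize   # requires nltk.download("punkt")

SEP = " +++$+++ "   # field separator used in the corpus files

def load_lines(path="movie_lines.txt"):
    """Map each line ID to (movie ID, utterance text) from the corpus."""
    lines = {}
    with open(path, encoding="iso-8859-1") as f:
        for row in f:
            fields = row.strip().split(SEP)
            if len(fields) == 5:
                line_id, _speaker, movie_id, _name, text = fields
                lines[line_id] = (movie_id, text)
    return lines

def clean(text):
    """Lowercase, tokenize with NLTK, and drop bare punctuation tokens."""
    return [t for t in word_tokenize(text.lower()) if t not in string.punctuation]

cleaned = {line_id: (movie_id, clean(text))
           for line_id, (movie_id, text) in load_lines().items()}
```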
Data Science
iChurn
Analyzed the performance of different classifiers for predicting customer churn (a sketch of such a comparison follows below). The dataset used to train the models was obtained from Kaggle and was also used to train a convolutional neural network.
Completed on: May 2023
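A sketch of such a classifier comparison with scikit-learn. The file name, the Churn column, and the particular classifiers are assumptions, since the write-up only says different classifiers were compared on the Kaggle dataset.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# "churn.csv" and the "Churn" column are hypothetical names for the Kaggle dataset.
df = pd.read_csv("churn.csv")
X = pd.get_dummies(df.drop(columns=["Churn"]))
y = (df["Churn"] == "Yes").astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Fit each classifier on the same split and compare test accuracy.
classifiers = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(random_state=42),
    "random forest": RandomForestClassifier(random_state=42),
    "SVM": SVC(),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    print(name, accuracy_score(y_test, clf.predict(X_test)))
```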
IPLay
Developed a web scraper, using Selenium and Beautiful Soup, that scrapes the IPL website for the tables of all previous seasons (a sketch follows below). Trained multiple machine learning models that predict the probability of a team in the Indian Premier League qualifying for the playoffs; the decision tree model achieved the highest accuracy.
Completed on: April 2023
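A sketch of the scraping step, rendering each season's page with headless Selenium and parsing the first table with Beautiful Soup; the URL is a placeholder and the table structure is an assumption.

```python
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

def scrape_table(url):
    """Render the JavaScript-heavy page with headless Selenium, then parse the table with Beautiful Soup."""
    options = Options()
    options.add_argument("--headless")
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        soup = BeautifulSoup(driver.page_source, "html.parser")
    finally:
        driver.quit()

    rows = []
    for tr in soup.find("table").find_all("tr")[1:]:   # skip the header row
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if cells:
            rows.append(cells)
    return rows

# "SEASON_URL" is a placeholder; the real per-season IPL table URLs would be substituted here.
print(scrape_table("SEASON_URL"))
```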
