Project gallery


Google's Stock Price Prediction
📈 Stock Price Prediction using Facebook Prophet
Forecasting | Time Series Analysis | Python
Overview
This project demonstrates how to predict Google's stock prices using the Facebook Prophet model. Historical data obtained from Yahoo Finance is used to fit a time series model that forecasts the stock's trend over the next 365 days.
Objective
Utilize Facebook Prophet to forecast future stock prices and uncover seasonal trends using time series modeling techniques.
Tools & Libraries
Python – Pandas, NumPy, Matplotlib
Facebook Prophet – Time series forecasting
Jupyter Notebook – For data analysis and model building
Project Steps
Loaded historical Google stock data
Preprocessed data to meet Prophet’s input format
Trained the Prophet model on historical trends
Forecasted stock prices for 365 future days
Visualized both predictions and seasonal components
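The preprocessing step above comes down to one detail: Prophet requires its input as a two-column DataFrame named ds (datestamp) and y (value). A minimal sketch of that step, with the fit/forecast calls shown as comments so the snippet runs without Prophet installed (column names and prices here are illustrative, not the project's actual data):

```python
import pandas as pd

# Raw data as it typically arrives from Yahoo Finance (illustrative values)
raw = pd.DataFrame({
    "Date": pd.date_range("2020-01-01", periods=5, freq="D"),
    "Close": [1367.4, 1360.7, 1394.2, 1393.3, 1404.3],
})

# Prophet expects exactly two columns: 'ds' (datestamp) and 'y' (value)
df = raw.rename(columns={"Date": "ds", "Close": "y"})[["ds", "y"]]

# Fitting and forecasting would then look like this
# (from prophet import Prophet; 'fbprophet' in older releases):
# m = Prophet()
# m.fit(df)
# future = m.make_future_dataframe(periods=365)  # 365 days ahead
# forecast = m.predict(future)                   # yhat, yhat_lower, yhat_upper
# m.plot(forecast)
# m.plot_components(forecast)                    # trend, weekly, yearly

print(df.columns.tolist())  # ['ds', 'y']
```

The components plot in the last commented line is what produces the trend/weekly/yearly charts listed under Output Highlights.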
Output Highlights
📊 Line chart: Historical vs. Predicted Stock Prices
⏳ Seasonal Components: Trend, weekly, and yearly patterns
📅 Forecast graph for next 1 year of stock movement
Outcome
Successfully built a robust forecasting model using Prophet. This project illustrates my ability to work with real-world financial datasets, perform time series modeling, and extract business-relevant insights.


Lightning Strike Data Analysis
⚡ U.S. Lightning Strike Analysis
Google Advanced Data Analytics | Tableau
Overview
Analyzed lightning strike data across the U.S. using Tableau. Developed an interactive dashboard to explore spatial and temporal patterns based on strike counts, dates, and locations.
Objective
Visualize lightning activity by region and time to uncover trends and provide actionable insights for weather analysis and public safety.
Tools Used
Tableau – Dashboard creation & interactive visuals
Tableau Prep – Data cleaning & structuring
Key Features
Two interactive bar charts for lightning counts by date and location
Dynamic filters to refine visual output by region and time
Clean and responsive layout optimized for user exploration
Outcome
Delivered a responsive, filterable dashboard that helps users explore lightning strike trends across different regions and time periods. Strengthened skills in data storytelling, geospatial visualization, and interactive design.


NYC Taxi Fare Estimation Analysis
🚖 NYC Taxi Fare Estimation
Google Advanced Data Analytics | Python, Pandas, Seaborn
Overview
Developed as part of the Google Advanced Data Analytics course in collaboration with Automatidata, this project focused on cleaning and analyzing NYC taxi trip data to support fare estimation for an upcoming predictive app.
Objective
Prepare and analyze real-world taxi trip data to generate insights that will later feed into a regression model for fare prediction.
Tools Used
Python – Pandas, Seaborn, Matplotlib, Scikit-learn, Statsmodels
Jupyter Notebook – For analysis, visualizations, and reporting
Key Contributions
Cleaned and joined large-scale NYC TLC datasets
Performed EDA to uncover patterns in fares, trip duration, and distances
Conducted hypothesis testing for fare-related assumptions
Created impactful visualizations for stakeholders
Delivered an executive summary with findings and recommendations
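The EDA and hypothesis-testing steps can be sketched with a toy frame. The column names follow the NYC TLC trip-record schema (fare_amount, trip_distance, payment_type, where 1 = credit card and 2 = cash), but the values below are made up for illustration:

```python
import pandas as pd

# Toy stand-in for the NYC TLC trip records (values are illustrative only)
trips = pd.DataFrame({
    "fare_amount":   [7.0, 12.5, 9.0, 30.0, 14.5, 6.5],
    "trip_distance": [1.1,  3.2, 2.0,  9.8,  4.1, 0.9],
    "payment_type":  [1, 2, 1, 1, 2, 2],   # 1 = credit card, 2 = cash
})

# EDA: does fare scale with distance?
corr = trips["fare_amount"].corr(trips["trip_distance"])

# EDA: average fare per payment type, the quantity behind the
# hypothesis test on whether card fares differ from cash fares
mean_fare = trips.groupby("payment_type")["fare_amount"].mean()

print(round(corr, 3))
print(mean_fare.to_dict())
```

An actual two-sample test (e.g. scipy.stats.ttest_ind with equal_var=False) would then compare the two payment-type groups on the full dataset rather than eyeballing the group means.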
Outcome
Provided a structured dataset and deep insights to support regression-based fare predictions. Strengthened expertise in EDA, statistical analysis, and stakeholder-focused reporting.


CivX
CivX is an innovative project focused on detecting and mitigating cyberbullying through advanced machine learning models. As part of this initiative, we created a social media platform, CivX, where users can post images and interact by commenting on them.
The core functionality of CivX revolves around the analysis of comments and replies on these posts, using machine learning models such as Chained LSTM and BERT. These models assess whether a comment contains cyberbullying content and, if so, classify the type of bullying (e.g., ethnicity-based, age-based, or other categories).
In addition, the project incorporates bystander dynamics, which helps to understand the role of individuals who are not directly involved in the bullying but may influence the situation. This multi-faceted approach aims to not only detect harmful interactions but also provide insights into the broader social dynamics on online platforms.
I was responsible for completing the *Machine Learning* section of the project, where I developed, trained, and fine-tuned the models to effectively identify and classify cyberbullying content. This work was crucial in ensuring the accuracy and reliability of the system's predictions.
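The two-stage flow described above (first detect whether a comment is cyberbullying, then classify its type) can be sketched structurally. The detect() and classify_type() functions below are keyword placeholders standing in for the trained Chained LSTM / BERT classifiers, not the project's actual code:

```python
# Structural sketch of the two-stage moderation pipeline.

def detect(comment: str) -> bool:
    """Stage 1: binary cyberbullying detection (placeholder rule;
    the real system scores the text with a trained model)."""
    return "stupid" in comment.lower()

def classify_type(comment: str) -> str:
    """Stage 2: bullying-type classification (placeholder rule)."""
    return "age" if "old" in comment.lower() else "other"

def moderate(comment: str) -> dict:
    flagged = detect(comment)
    return {
        "comment": comment,
        "cyberbullying": flagged,
        # Chaining: the type classifier runs only when stage 1 fires
        "type": classify_type(comment) if flagged else None,
    }

print(moderate("You are stupid and old"))
print(moderate("Nice photo!"))
```

The chaining is the point of the design: the second, multi-class model never sees comments the binary detector has already cleared.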
Credits
This project was a collaborative effort, and each team member played a crucial role in bringing CivX to life:
Shanis K – Led the Machine Learning module. Responsible for developing, training, and fine-tuning advanced ML models such as Chained LSTM and BERT for cyberbullying detection and classification, including bystander analysis.
Salman – Handled the entire Frontend and Backend development of the CivX platform. He built the user interface and managed the server-side logic, ensuring seamless integration of ML models and smooth user experience.
Sarang – Managed all Documentation work. He was responsible for preparing technical reports, project overviews, and ensuring the clarity and completeness of all written deliverables.