Career Profile
I am an analytical, detail-oriented, self-motivated individual and a keen learner. Capable of turning dry analysis into a compelling visual story using data visualization tools that influence the direction of the business, and of communicating with diverse teams to take a project from start to finish.
Actively looking for full-time opportunities in Data Science and Machine Learning as a Data Scientist, Data Analyst, or Business Intelligence Analyst.
Experiences
- Create daily, weekly, monthly, quarterly, and annual interactive dashboards to monitor platform health using Splunk and Tableau. Analyze incidents by root cause and predict potential failures for a selected application.
- Outcome: The model improved resource utilization by 50% and cut weekend maintenance time from 12 hours to 4 hours.
- Create live interactive single-platform-view dashboards to monitor emergency fixes, status updates for critical processes and job runs, abends by root cause, and ad-hoc requests to be addressed.
- Outcome: Instead of using 5 different applications to monitor a job or an application, this dashboard platform lets users view all of the requirements on a single page, saving tremendous time and manual effort.
- Design a database schema to store SLA compliance metrics for any given job/application at a specified time of day. Create interactive dashboards to monitor SLA compliance.
- Outcome: This dashboard gives users a clear picture of jobs that are about to miss an SLA and notifies them by email when less than 25% of the SLA compliance window remains.
- Work with NLP libraries such as NLTK, OpenNLP, Stanford NLP, WordNet, and SAS Text Miner to track system progress and improve existing methodologies by developing new data sources, testing model enhancements, and fine-tuning model parameters.
- HR Predictive Modelling
- Outcome: Suggested a new tactic to persuade departing employees to stay with the company, resulting in a 5% decrease in attrition.
- Gender Diversity Analysis
- Outcome: Introduced a live connection to the dashboard with automatic fetching of data from the database, eliminating manual data fetch and load operations.
- Telecommute (Work from Home) Analysis
- Outcome: After introducing priority scheduling, the telecommute rate increased by 0.8% within a month.
Fetched employee data from the server to clean, pre-process, and analyze attrition. Predicted the likelihood of employee attrition and created dashboard views of the analysis and predictions.
Performed ETL operations on the data using complex SQL queries. Interpreted and analyzed the results using statistical tools such as MS Excel and Tableau. Provided ongoing reports displayed in front-end dashboards built with Tableau, R Shiny, Python, and MySQL.
Scripted a priority scheduling algorithm in Python to rank work-from-home eligibility. Analyzed the telecommute utilization ratio and trends in the work-from-home days used.
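The scheduling logic above is not spelled out in the résumé; a minimal sketch of one plausible approach, using a standard-library heap and a hypothetical scoring rule (favoring longer tenure and more remaining work-from-home quota), might look like:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class WFHRequest:
    priority: float                       # lower value = granted first
    employee: str = field(compare=False)  # not used for ordering

def score(tenure_years: float, days_used: int, quota: int) -> float:
    """Hypothetical priority: longer tenure and more unused quota rank higher."""
    remaining = max(quota - days_used, 0)
    return -(tenure_years + remaining)

def schedule(requests, slots):
    """Grant the available work-from-home slots to the highest-priority requests.

    requests: iterable of (name, tenure_years, days_used, quota) tuples.
    """
    heap = [WFHRequest(score(t, u, q), name) for name, t, u, q in requests]
    heapq.heapify(heap)
    return [heapq.heappop(heap).employee for _ in range(min(slots, len(heap)))]
```

The scoring weights are illustrative only; the point is that a heap makes granting the top-k requests straightforward.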
- Scraped data from the web using Python, designed the database schema, and created a NoSQL database. Introduced dynamic interactive dashboards that replaced traditional static reports, increasing sales productivity and enabling a better understanding of data insights.
- Outcome: Trend and competitive analysis on Continental’s market pricing data resulted in a 16% increase in winter tire sales.
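The scraping step can be illustrated with Python's standard-library `HTMLParser`; the markup and the `class="price"` attribute below are hypothetical stand-ins for whatever structure the actual pricing pages used:

```python
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Collect text inside elements tagged class="price" (assumed markup)."""
    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if ("class", "price") in attrs:
            self._in_price = True

    def handle_endtag(self, tag):
        self._in_price = False

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

parser = PriceParser()
parser.feed('<ul><li class="price">$79.99</li><li class="price">$89.99</li></ul>')
print(parser.prices)  # ['$79.99', '$89.99']
```

A real scraper would fetch pages over HTTP and likely use a dedicated library, but the parse-and-extract pattern is the same.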
- Performed database schema design, data collection, database creation, and web design using JavaScript, MySQL, HTML, and CSS.
- Outcome: Created a MySQL database holding information on about 10,000 students and 280 staff members, with separate logins for each.
Projects
- Performed data preprocessing on Twitter political data from 2009–2010 using Microsoft Excel, Tableau, and R, and applied topic modelling with LDA (using LDA tuning to select the number of topics) to discover the topics in the datasets.
- Applied lexicon-based sentiment analysis using the ‘SentimentR’ and ‘Syuzhet’ packages to classify the topics discovered by LDA as positive, neutral, or negative.
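SentimentR and Syuzhet are R packages with curated lexicons; the core idea of lexicon-based classification can be sketched in Python with a toy lexicon (the words and weights below are illustrative, not from either package):

```python
# Toy polarity lexicon; real lexicons (e.g. those in Syuzhet) are far larger.
LEXICON = {"good": 1, "great": 2, "win": 1, "bad": -1, "awful": -2, "lose": -1}

def classify(text: str) -> str:
    """Sum word polarities and bucket the total into a sentiment label."""
    score = sum(LEXICON.get(w.strip(".,!?").lower(), 0) for w in text.split())
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify("A great win for the party!"))  # positive
```

The packages add refinements this sketch omits, such as valence shifters ("not good") and sentence-level aggregation.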
- Applied predictive data analytics to attrition, absenteeism, and time-to-hire data from Continental AG. Cleaned the data using R and MS Excel.
- Applied a Random Forest algorithm for prediction, achieving 76% accuracy.
- Created an interactive dashboard with the Python framework Dash by Plotly and deployed it on Heroku for external access.
- Trained the robot to sketch the scene on canvas from the user’s voice input, and further developed the program to update the image based on user feedback. Used spaCy in R and Python NLTK for grouping words and creating clusters from the voice inputs.
- Built the feedback module using sentiment analysis on vector representations of text together with response classification.