Adetunji Fasiku
Data Analyst | 50M+ Data Points | 95% Model Accuracy | Team Leader
Parkville, MD | tunchiiefashh@gmail.com | linkedin.com/in/adetunji-fasiku | github.com/Tunchiie
Professional Summary
Data analyst with experience in machine learning, statistical analysis, and team leadership. Built models reaching 95% accuracy, engineered pipelines processing 50M+ data points, and led teams of 25+. Currently completing a Data Analytics certification at Correlation One while building algorithmic trading systems.
Technical Skills
Programming & Analytics
- Python (Advanced)
- SQL (Expert - 5+ Platforms)
- HTML/CSS
- Pandas & NumPy
- Statistical Analysis
- SHAP (ML Explainability)
- XGBoost & Scikit-learn
Data & Visualization
- Tableau & Looker
- Streamlit Dashboards
- Plotly & Seaborn
- Excel (Advanced)
- Statistical Testing (ANOVA)
- A/B Testing
- Data Pipeline Design
Database & Infrastructure
- Oracle SQL (Enterprise)
- SQL Server
- PostgreSQL & MySQL
- Database Architecture
- ETL Pipeline Development
- AWS (Cloud Computing)
- API Integration
Key Projects
Bungaku – Hit Song Pattern Analyzer
June 2025
• Analyzed 130K+ Billboard-charting tracks to identify the audio features most associated with chart success
• Applied statistical tests (ANOVA, Tukey HSD) to isolate significant drivers of hit probability
• Identified seasonal and era trends: winter hits skew quieter, while modern hits are louder and less acoustic
• Created visualizations to help music producers understand the patterns behind successful songs
• Technologies: Python, Seaborn, SciPy, Exportify API, Pandas, ANOVA, t-tests
CandleThrob – Enterprise Financial Data Pipeline & Trading System
March 2025 - Ongoing
• Engineered comprehensive financial data pipeline processing 510+ securities (S&P 500, 18 ETFs, 10 macroeconomic indicators) with 24+ years of historical data (Jan 2000-present)
• Implemented dual-source data integration: Yahoo Finance for historical analysis and Polygon.io for real-time minute-level data spanning 2+ years
• Built enterprise-grade Oracle database architecture with 50M+ data points, featuring incremental loading and bulk operations for high-performance analytics
• Developed 113+ technical indicators using TA-Lib with advanced pattern recognition algorithms for algorithmic trading strategies
• Designed scalable system architecture supporting neural networks and ensemble techniques for future trading strategy implementation
• Technologies: Python, Oracle SQL, TA-Lib, Polygon.io API, Yahoo Finance, Pandas, NumPy, Enterprise Database Design
Customer Purchase Predictor
January 2025
• Analyzed 3.9M+ customer behavioral events to identify purchase timing patterns (peak activity 6 AM to midnight)
• Engineered behavioral features including clumpiness and buyer segmentation to improve model performance
• Built a classifier that predicts purchase triggers with optimized threshold for business decision-making
• Used SHAP to provide model explainability for marketing team implementation
• Technologies: Python, SHAP, Streamlit
Loan Default Predictor – Lending Club
September 2024
• Processed 2.2M+ borrower records and engineered custom features (e.g., average days between payments, payment count) to improve model signal
• Built and benchmarked classification models to predict default risk: logistic regression baseline (66% accuracy) vs. tuned XGBoost model (77% accuracy, 0.76 F1-score)
• Visualized precision-recall tradeoffs and optimized classification threshold (0.87) to reduce false approvals
• Applied SHAP to explain top risk factors (e.g., loan grade, annual income) and support risk-based lending decisions
• Designed for use in PD segmentation, credit approval workflows, and adverse action reasoning
• Technologies: Python, XGBoost, Logistic Regression, SHAP, Scikit-learn, Pandas
Breast Cancer Classifier – Logistic Regression
April 2024
• Coded logistic regression from scratch; achieved 95% test accuracy and visualized learning convergence
• Technologies: Python (NumPy, Matplotlib)
Education
Bachelor of Science
University of Maryland Baltimore County
2019 - 2022
Coursework in Python programming, machine learning, statistical inference, AI, and data visualization.
Certifications & Learning
Machine Learning Course
Andrew Ng - Stanford University (Coursera)
2024
Completed comprehensive ML course covering supervised/unsupervised learning, neural networks, and practical implementation. Applied the material in a from-scratch logistic regression project.
Additional Coursework
Various Platforms
Ongoing
• Data Science Specializations
• Advanced Python Programming
• Statistical Analysis and Modeling
• Financial Markets and Trading
Professional Experience
Data Analytics Trainee
Correlation One
January 2025 - July 2025
• Built analytics solutions for supply chain, HR, and finance teams, processing 180K+ records into clear, decision-ready dashboards
• Applied statistical testing (ANOVA, hypothesis testing) to identify patterns relevant to business decisions
• Examined 130K+ Billboard entries to uncover hit-song patterns, sharpening insight into the characteristics of charting music
• Delivered analyses across automotive, education, and entertainment datasets, demonstrating transferable data skills
Process Guide (Operations Team Lead)
Amazon
May 2023 - February 2025
• Led a team of 25+ associates handling 240K+ items daily; boosted productivity 20% through data-driven process analysis
• Reduced downtime by 30% by identifying and systematically addressing operational bottlenecks
• Partnered with IT and Facilities to resolve workflow bottlenecks that slowed throughput
• Kept a high-pressure operation running smoothly while maintaining team motivation and focus