Hi! I'm Bhavana Jaiswal, a Cloud Data Engineer with 3+ years of experience. I thrive at the intersection of data, technology, and innovation, leveraging my expertise in Microsoft Fabric, Azure Data Factory, Databricks, ADLS, SQL, and PySpark to deliver impactful solutions.
Key Achievements
Top 5% on Topmate, Medium Author, 5+ End-to-End Data Engineering Projects, Received Multiple Managerial Recommendations, Praised for Fast Learning and Real-World Problem Solving
Certifications
1x GCP & Databricks Certified Data Engineer Associate, Essentials in GenAI Professional Certificate, Microsoft Fabric Engineering with Azure Badge - Microsoft
Goals
Earning the Google Professional Cloud Data Engineer Certification and the AWS Golden Jacket.
Recommendations from Colleagues, Director & Managers
Hear what colleagues and managers have to say about Bhavana's dedication, expertise, and collaborative spirit.
Arun Kumar G
Director - Technology (Managed Directly, Mar 2025)
Bhavana is a dedicated and hardworking individual with an exceptional work ethic. She proactively explores new tools, grasps complex concepts quickly, and solves problems with a positive attitude. Her enthusiasm fosters a culture of continuous learning.
Kamalutheen Abdul Rasheed
Manager - Service Delivery Automation (Managed Directly, Jan 2025)
As an Associate Consultant, Bhavana consistently showed a strong willingness to learn and contribute effectively. Her attention to detail, collaborative spirit, and technical skills made a significant positive impact on our projects. A valuable asset to any team.
Rahul Kumar
Data Engineer
Bhavana's dedication, expertise in SQL, Python, and Azure, and her contributions to our Azure Synapse Analytics project were exceptional. Collaborating with her was a pleasure, and I highly recommend her for any data engineering role.
Karishma Mohammed
Data Engineer
Despite being a fresher, Bhavana displayed remarkable talent and potential in Data Science and Data Engineering. Her unwavering determination, quick grasp of concepts, and collaborative spirit make her an invaluable team player. Highly promising future.
Nnamdi .S
Data Engineer
Bhavana is a very smart, hard-working, and highly focused individual. She's resourceful and always open to learning, making her a great fit for any team looking to maximize data insights.
Shilpa Arya
Very good in technology.
Core Skills & Expertise
Cloud & Data Platforms
Microsoft Fabric
Azure Data Factory
Databricks
ADLS Gen 2
GCP
AWS
Programming & Scripting
Python
PySpark
SQL
Shell Scripting
Data Warehousing & Modeling
Data Warehousing
Data Modeling
dbt
ETL & Data Integration
ETL/ELT Pipeline Development
Data Ingestion
Data Transformation
Platforms
MySQL
PostgreSQL
SSIS/SSMS
Fivetran
Tools & Methodologies
Power BI
Gen AI
Git/GitHub
Excel, PowerPoint, Word
Content Writing
Current Role & Vision
As an Associate Consultant at BeauRoi Technology Pvt Ltd, I specialize in designing and deploying cutting-edge data solutions across various industries. My mission is to empower businesses with scalable, data-driven solutions that enhance efficiency and drive growth.
Industry Focus
Inventory Management, Healthcare, E-commerce, Ed-tech, Logistics, Pet Care, Medicare, Insurance, Energy.
Leadership & Collaboration
Leading teams, solving technical challenges, ensuring project success through collaboration and innovation.
Philosophy
Continuous learning, innovation, and leveraging data to unlock new opportunities.
Looking for a detail-oriented, dedicated, and innovative Data Engineer to transform your data into meaningful business insights? Let's connect and explore opportunities to collaborate!
Professional Experience
1
Topmate Mentor
June 2025 - Present | Remote
Helping freshers with resume reviews, LinkedIn branding, role clarity, project help, and interview communication.
2
Cloud Data Engineer, BeauRoi
March 2025 - June 2025 (4 months)
Built scalable ETL pipelines using Databricks, PySpark, and cloud-native services across GCP and AWS. Mentored 10+ interns.
3
Associate Consultant, BeauRoi
October 2024 - June 2025 (9 months) | Remote
Designed, architected, built, deployed, and tested demo application products. Led client calls and managed reporting.
4
Software Analyst, BeauRoi
March 2024 - September 2024 (7 months) | Remote
Played a key role in the end-to-end design, architecture, build, deployment, and testing of demo application products across various domains.
5
Databricks Data Engineer, Vrahad Analytics
September 2022 - December 2023 | Remote
Built and optimized end-to-end ETL pipelines using Databricks for structured (CSV/XLS) and semi-structured (JSON) datasets.
Designed ER diagrams and relational data models to ensure efficient database architecture.
Connected to Azure data sources and ingested the data into Databricks for downstream processing (a brief illustrative sketch follows).
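For illustration only, here is a minimal PySpark sketch of the kind of ingest-and-clean step such a pipeline involves. The storage account, paths, and column names are hypothetical placeholders, not the actual project code.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists as `spark`; created here to keep the sketch self-contained.
spark = SparkSession.builder.appName("vrahad-etl-sketch").getOrCreate()

# Hypothetical ADLS paths; real ingestion would use the project's storage account and containers.
orders_csv = (
    spark.read.option("header", True).option("inferSchema", True)
    .csv("abfss://raw@examplestorage.dfs.core.windows.net/orders/*.csv")
)
events_json = spark.read.json("abfss://raw@examplestorage.dfs.core.windows.net/events/*.json")

# Basic cleanup: drop exact duplicates, standardize a date column, and filter out bad rows.
orders_clean = (
    orders_csv.dropDuplicates()
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .filter(F.col("order_id").isNotNull())
)

# Persist the curated layer as Parquet for downstream modeling.
orders_clean.write.mode("overwrite").parquet(
    "abfss://curated@examplestorage.dfs.core.windows.net/orders/"
)
```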
Internship Highlights
Upwork Azure Data Engineer
August 2023 - December 2023 (5 months) | Remote
Bharat Intern - Data Science
August 2023 - September 2023 (2 months)
Stock Prediction with LSTM
Titanic Survival Classification
Handwritten Number Recognition with Neural Networks
KPMG Internship: Data Analytics
During my internship at KPMG, I gained valuable skills in Data Quality Analysis, analytical dashboard creation, and customer segmentation.
Data Quality Assessment
Ensured accuracy and completeness of data for subsequent analysis.
High-Value Customer Identification
Analyzed customer demographics and attributes to identify key segments.
Impactful Presentations
Utilized data visualizations to present key findings and insights effectively.
Accenture Internship: Data Analyst & Visualization
My internship at Accenture provided a solid foundation in data management and analysis, focusing on Excel and Power BI.
Excel & Power BI Proficiency
Gained valuable skills in data cleaning, integration using VLOOKUP, and extracting meaningful insights.
Data Accuracy & Consistency
Responsible for connecting datasets across multiple sheets and applying data cleaning techniques.
Valuable Insights Presentation
Developed a strong understanding of data analysis and visualization to present insights to the team.
Key Projects & Data Solutions
Healthcare Data Pipeline
Built an end-to-end data pipeline in Databricks for healthcare data, processing JSON and Parquet from cloud storage using PySpark. Focused on normalization and efficient querying, leveraging Unity Catalog, Structured Streaming, and Delta Lake (sketch below).
Data Engineering
Databricks Platform
PySpark
SQL
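As a minimal illustration of the Structured Streaming and Delta Lake pattern mentioned above, the sketch below incrementally ingests JSON files into a Delta table with Databricks Auto Loader. The catalog, schema, and paths are assumed placeholders, not the project's actual configuration, and Auto Loader (`cloudFiles`) requires a Databricks runtime.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("healthcare-pipeline-sketch").getOrCreate()

# Hypothetical raw landing path and Unity Catalog target table.
raw_path = "abfss://raw@examplehealth.dfs.core.windows.net/claims/"
bronze_table = "main.healthcare.claims_bronze"

# Incrementally pick up new JSON files with Auto Loader.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/tmp/schemas/claims")  # placeholder path
    .load(raw_path)
)

# Land the stream in a Delta table; availableNow processes the backlog and stops.
(
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/claims")  # placeholder path
    .trigger(availableNow=True)
    .toTable(bronze_table)
)
```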
California Housing Price Analysis
Explored California housing data with Python, analyzing how income, location, and housing age affect prices. Included data cleaning, feature engineering, and visualizations in Google Colab to identify price correlations and clusters (see the sketch below).
Python
Pandas
Matplotlib
Seaborn
Data Analysis
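A small pandas/Matplotlib excerpt of the kind of exploratory step described above. The file path and column names follow the commonly used California housing schema and are assumptions, not the project's exact notebook.

```python
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

# Hypothetical local copy of the California housing dataset.
df = pd.read_csv("housing.csv")

# Basic cleaning and a simple engineered feature.
df = df.dropna(subset=["median_house_value", "median_income"])
df["rooms_per_household"] = df["total_rooms"] / df["households"]

# How strongly do income, housing age, and the engineered feature correlate with price?
corr = df[["median_income", "housing_median_age",
           "rooms_per_household", "median_house_value"]].corr()
print(corr["median_house_value"].sort_values(ascending=False))

# Visualize the income-price relationship, colored by location category if present.
sns.scatterplot(data=df, x="median_income", y="median_house_value",
                hue=df.get("ocean_proximity"), alpha=0.3)
plt.tight_layout()
plt.show()
```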
End-to-End ETL Pipeline (AWS & Databricks)
Developed a secure, scalable ETL pipeline using AWS (S3, IAM) and Databricks with PySpark. Focused on reusable notebooks, permission-controlled workflows, and optimized orchestration, reducing manual effort and improving cost efficiency (illustrative sketch below).
AWS
Databricks Platform
PySpark
Data Ingestion
Cloud Cost Optimization
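An illustrative PySpark read-from-S3 and aggregate step of the sort this pipeline would contain. Bucket names and columns are assumptions; on Databricks, S3 access would typically be granted through an IAM instance profile or a Unity Catalog external location rather than credentials in code.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("aws-databricks-etl-sketch").getOrCreate()

# Hypothetical S3 locations; access is assumed to come from an IAM role, not embedded keys.
raw_path = "s3a://example-raw-bucket/sales/"
curated_path = "s3a://example-curated-bucket/sales_daily/"

sales = spark.read.parquet(raw_path)

# Reusable transformation: aggregate to a daily grain before writing the curated layer.
daily = (
    sales.withColumn("sale_date", F.to_date("sale_ts"))
    .groupBy("sale_date", "region")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

daily.write.mode("overwrite").partitionBy("sale_date").parquet(curated_path)
```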
Social Media Dataset Using SQL
Built a complete relational database from a social media dataset, covering data extraction, cleaning, normalization (3NF), ER diagram design, and SQL schema creation. Developed queries for insights like top users, engagement trends, and post patterns, delivering a fully normalized, query-ready database with clear documentation (a brief sketch follows the tags below).
SQL
Data Modeling
Normalization
MySQL
Data Analysis
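To give a flavor of the normalized schema and engagement queries described above, here is a self-contained sketch using Python's built-in sqlite3 (the actual project used MySQL). Table names, columns, and sample rows are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Simplified 3NF-style schema: users, posts, and likes as separate relations.
cur.executescript("""
CREATE TABLE users (user_id INTEGER PRIMARY KEY, username TEXT NOT NULL);
CREATE TABLE posts (post_id INTEGER PRIMARY KEY,
                    user_id INTEGER REFERENCES users(user_id),
                    created_at TEXT);
CREATE TABLE likes (post_id INTEGER REFERENCES posts(post_id),
                    user_id INTEGER REFERENCES users(user_id));
""")

cur.executemany("INSERT INTO users VALUES (?, ?)", [(1, "asha"), (2, "ravi")])
cur.executemany("INSERT INTO posts VALUES (?, ?, ?)",
                [(10, 1, "2024-01-05"), (11, 1, "2024-01-06"), (12, 2, "2024-01-06")])
cur.executemany("INSERT INTO likes VALUES (?, ?)", [(10, 2), (11, 2), (12, 1), (10, 1)])

# Engagement-style insight: top users by likes received on their posts.
cur.execute("""
SELECT u.username, COUNT(l.user_id) AS likes_received
FROM users u
JOIN posts p ON p.user_id = u.user_id
LEFT JOIN likes l ON l.post_id = p.post_id
GROUP BY u.username
ORDER BY likes_received DESC;
""")
print(cur.fetchall())
```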
Pet Consulting Data Pipeline
Designed and implemented a data pipeline for a Pet Consulting app using Databricks and PySpark. Normalized raw client, pet, and service data, reducing redundancy and creating clear entity-relationship diagrams. Built modular PySpark notebooks for reusable, scalable transformations, improving data integrity and preparing structured outputs for analytics (sketch below).
Normalization
Data Modeling
PySpark
Databricks Platform
Google Cloud Platform
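A hypothetical example of the modular, reusable PySpark transformation style mentioned above. Function names, columns, and mount paths are placeholders rather than the project's real schema.

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pet-consulting-sketch").getOrCreate()


def clean_clients(raw: DataFrame) -> DataFrame:
    """Deduplicate clients and standardize contact fields."""
    return (
        raw.dropDuplicates(["client_id"])
        .withColumn("email", F.lower(F.trim("email")))
    )


def split_pets(raw: DataFrame) -> DataFrame:
    """Pull pet attributes out of the denormalized client feed into their own entity."""
    return (
        raw.select("client_id", "pet_id", "pet_name", "species", "breed")
        .dropDuplicates(["pet_id"])
    )


# Hypothetical raw feed that mixes client and pet columns in one flat file.
raw = spark.read.option("header", True).csv("/mnt/raw/pet_consulting/clients_pets.csv")

clients = clean_clients(raw)
pets = split_pets(raw)

clients.write.mode("overwrite").parquet("/mnt/curated/clients/")
pets.write.mode("overwrite").parquet("/mnt/curated/pets/")
```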
Tokyo Olympic Data Analytics
An end-to-end data analytics project focused on the Tokyo Olympics, leveraging Azure Data Factory (ADF) for data ingestion, Azure Logic Apps for orchestration, and Databricks with PySpark for robust data processing and analysis, demonstrating comprehensive pipeline development (see the sketch below the tags).
Azure Data Factory (ADF)
Azure Logic Apps
Databricks
PySpark
End-to-End Pipeline
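The ADF and Logic Apps orchestration cannot be shown as code here, but a minimal PySpark snippet can illustrate the Databricks analysis layer. The ADLS location and column names follow the commonly used Tokyo 2021 dataset layout and are assumptions, not the project's actual files.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tokyo-olympics-sketch").getOrCreate()

# Hypothetical ADLS location where ADF lands the raw CSVs.
base = "abfss://raw@exampleolympics.dfs.core.windows.net/tokyo/"

medals = spark.read.option("header", True).option("inferSchema", True).csv(base + "medals.csv")
athletes = spark.read.option("header", True).option("inferSchema", True).csv(base + "athletes.csv")

# Simple analysis: medal totals per country alongside athlete counts.
medal_totals = medals.groupBy("team_country").agg(
    F.sum("gold").alias("gold"),
    F.sum("silver").alias("silver"),
    F.sum("bronze").alias("bronze"),
)
athlete_counts = athletes.groupBy("country").agg(F.count("*").alias("athletes"))

summary = medal_totals.join(
    athlete_counts, medal_totals.team_country == athlete_counts.country, "left"
).orderBy(F.desc("gold"))

summary.show(10)
```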
Building on this foundation, these projects highlight my ability to design and implement robust data solutions across diverse platforms and domains.
Education & Languages
Education
OdinSchool Bootcamp, Data Science (Aug 2022 - Dec 2025)
NIELIT, Diploma in Computer Science (July 2021 - July 2023)
University of Allahabad, BA Economics (July 2019 - Aug 2022)
Jeevan Jyoti Public School, Intermediate (Aug 2017 - Aug 2019)
I look forward to connecting with you and discussing potential collaborations or opportunities!
Thank You for Visiting!
I appreciate your time and interest in my professional journey and projects. Feel free to connect or explore my work further through the links provided.