
Krishna Deepika
Engineering / Architecture
About Krishna Deepika:
📊 My first script was probably a `SELECT * FROM life` — and I've been optimizing ever since…
What started as a curiosity for “how data flows behind the scenes” during my undergrad years quickly evolved into a full-blown passion for data engineering. Courses in databases, big data, and cloud computing sparked my interest — but it was the thrill of building scalable systems and optimizing pipelines that truly hooked me.
My first step into data came with a Data Analyst internship, where I saw firsthand the power of transforming messy raw data into real business impact. That internship turned into a full-time role, and over the next few years, I immersed myself in SQL-heavy reporting, building dashboards, and automating analytics pipelines to improve efficiency and insights across domains.
Driven by a deeper curiosity and a love for math, I pursued a Master’s in Data Science and Business Analytics. During grad school, I interned again — this time focused on cloud data engineering — and I knew I had found my niche. I loved the challenge of building systems that weren’t just analytical but also scalable, automated, and built for real-time decision-making. Currently, I’m a Data Engineer at Acer America, where I’ve:
- Designed ETL pipelines that process 1.2TB+ of data weekly
- Automated 85+ Airflow jobs across departments
- Worked closely with data scientists to deploy ML models that improved failure prediction accuracy by 23%
- Supported real-time analytics for over 5 million customer records across Snowflake, Redshift, and AWS/GCP stacks
🛠 Core Skills: ETL Pipelines | Data Modeling | Airflow | SQL | Python | Spark
☁️ Cloud & Tools: Snowflake | Redshift | BigQuery | dbt | Kafka | SageMaker | Power BI
Outside work, I’m passionate about sustainability. I research the environmental impact of recycled and organic waste, enjoy gardening, and love creating homemade eco-friendly products — it’s my way of balancing tech with the earth 🌱
If you’re building something impactful or just want to talk data or sustainability — I’d love to connect!
Experience
- 4+ years of end-to-end experience in data engineering and analytics, progressing from a Data Analyst internship in India to a full-time Data Engineer role in the U.S.
- Hands-on expertise building ETL pipelines using Apache Spark, AWS Glue, and Python, processing over 1.2TB/week and supporting analytics for 5M+ customer records.
- Designed and optimized SQL-heavy data transformations using Teradata SQL, T-SQL, and Snowflake, enabling robust BI outputs and advanced reporting layers.
- Automated 85+ data workflows via Apache Airflow, simulating Control-M-style job orchestration to reduce manual intervention by over 90%.
- Collaborated with data scientists to deploy ML models (AWS SageMaker), improving predictive accuracy and integrating insights into business processes.
- Conducted data cleaning, validation, and root cause analysis using Python, Pandas, Jupyter, and Power BI, supporting inventory, marketing, and customer data strategies.
- Skilled in cloud-native data engineering across AWS (Redshift, S3, Lambda), Azure (Data Factory, Blob), and Google Cloud (BigQuery, Dataflow, Pub/Sub).
- Additional technologies: dbt, Kafka, Docker, Kubernetes, Terraform, SSIS, Looker, Tableau, Power BI, Notion, JIRA, Confluence.
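The extract-transform-load pattern behind the pipeline work above can be sketched in miniature. This is a purely illustrative example (not taken from any of the pipelines mentioned; the table and column names are invented): it parses raw CSV, cleans and type-casts rows, and loads them into SQLite, using only the Python standard library:

```python
import csv
import io
import sqlite3

# Illustrative raw input; a real pipeline would read from S3, Kafka, an API, etc.
RAW_CSV = """customer_id,region,spend
101,us-east, 250.00
102,us-west,99.5
103,us-east,not_available
"""

def extract(text):
    """Parse CSV text into a list of dicts (the 'E' step)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Clean and type-cast rows, dropping unparseable records (the 'T' step)."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["customer_id"]),
                          row["region"].strip(),
                          float(row["spend"])))
        except ValueError:
            continue  # e.g. spend == "not_available" is quarantined here
    return clean

def load(records, conn):
    """Insert cleaned records into a SQLite table (the 'L' step)."""
    conn.execute("CREATE TABLE IF NOT EXISTS spend "
                 "(customer_id INTEGER, region TEXT, spend REAL)")
    conn.executemany("INSERT INTO spend VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(spend) FROM spend").fetchone()
print(total)  # → (2, 349.5): the malformed third row is dropped in transform
```

In a production setting each of these steps would typically become its own Airflow task, so failures can be retried and monitored per stage rather than per pipeline.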
Education
🎓 Master of Science in Data Science and Business Analytics – University of North Carolina at Charlotte
🎓 B.Tech in Engineering – SRM University, India
🌏 AI & ML Winter Program – Asia University, Taiwan
Certifications:
✅ Microsoft Certified: Azure Data Engineer Associate
✅ Machine Learning in Production – Coursera
✅ Machine Learning/AI Intern – Microsoft Verzeo
✅ Python (Crash, Intermediate, Advanced) – Coursera