
Deekshith Ravula
Accounting / Finance
Experience
- Involved in database design, data architecture, data modeling, development, implementation, ETL, and reporting with SQL Server 2000/2005/2008.
- Extensive experience in database development for both OLTP (batch and online processing) and OLAP systems (SSRS, SSIS, SSAS), as well as Microsoft Power BI, using SQL Server 2000/2005/2008.
- Expert in providing enterprise reporting solutions using SSRS 2000/2005, Crystal Reports, PerformancePoint Server, ProClarity, and SharePoint Services.
- Extensive experience in developing ETL mappings, scripts, and data integrations using Informatica PowerCenter.
- Worked with business owners and stakeholders to gather reporting requirements, implemented them in Power BI, and built visuals that surface meaningful insights for the business.
- Extensive ETL experience using DTS/SSIS for data extraction, transformation, and loading.
- Expertise in developing applications using .NET, C#, and SQL Server.
- Over 6 years of extensive hands-on experience in the IT industry specializing in Data Engineering and Cloud Solutions with proficiency in Spark, PySpark, AWS, Azure, Databricks, and Snowflake. Proven expertise in big data processing, cloud data services, and ETL processes, alongside strong skills in SQL, data modeling, and data visualization.
- Strong working experience in Spark and PySpark for big data processing, including writing complex transformations, actions, and optimizations.
- Proficient in AWS services such as EC2, S3, EMR, Redshift, Athena, and Glue for scalable data storage and processing.
- Expertise in Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake Storage, and Azure SQL Database for comprehensive data engineering solutions.
- Strong experience writing scripts against the Python, PySpark, and Spark APIs to analyze data.
- Hands-on use of the Spark and Scala APIs to compare the performance of Spark with Hive and SQL, and of Spark SQL to manipulate DataFrames in Scala.
- Experienced working on data warehouse and analytics projects running in different environments, including AWS (Redshift, S3, and QuickSight) and other big data technologies.
- Automated ETL processes using Unix shell, Perl, and Python scripting.
- Experience in developing web applications by using Python, Django, C++, XML, CSS, HTML, JavaScript and jQuery.
- Strong experience in Teradata, Informatica, Python, UNIX shell scripting for processing large volumes of data from varied sources and loading into databases like Teradata, Oracle.
- Implemented various components of OLAP systems using ETL tools such as OWB, WhereScape RED, Pentaho, and Informatica PowerCenter.
- Experienced in designing star and snowflake schemas for data warehouses using tools such as Erwin Data Modeler, PowerDesigner, and Embarcadero ER/Studio.
- Extensive experience as a solution architect for business information systems, focusing on Data Architecture, Data Stores and Data Mart/Data Warehouse concepts.
- Extensive experience in Relational and Dimensional Data modeling for creating Logical and Physical Design of Database and ER Diagrams using multiple data modeling tools like Erwin.
- Experienced in Logical Data Model (LDM) and Physical Data Models (PDM) using Erwin data modeling tool.
- Good knowledge of data marts, OLAP, and dimensional data modeling with the Ralph Kimball methodology (star schema and snowflake modeling for fact and dimension tables) using Analysis Services.
- Experienced with Jira for project management, Git for source code management, Jenkins for continuous integration, and Crucible for code reviews.
- Used MongoDB to store data in JSON format, and developed and tested many dashboard features using Python, Bootstrap, CSS, and JavaScript.
- Experience in the full Software Development Life Cycle (SDLC), including analysis, design, coding, testing, implementation, and production support to quality standards, using Waterfall and Agile methodologies.
- Designed dimensional models, data lake architecture, and Data Vault 2.0 on Snowflake, and used Snowflake's logical data warehouse for compute.
- Performed data profiling of Data Vault hubs, links, and satellites using Erwin-generated SQL scripts; designed the Physical Data Model (PDM) with Erwin; and managed metadata for data models using SQL and T-SQL.
- Experienced in analyzing and designing components of data warehouse systems, keeping the artifacts in a Confluence/SharePoint repository.
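As one illustration of the ETL scripting described above, the sketch below shows a minimal extract/transform/load pipeline in pure Python (standard library only, with SQLite standing in for a real warehouse). All names and data here (the `staging_orders` table, the sample CSV) are hypothetical and not drawn from any project listed on this profile.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize customer names and cast amounts to integer cents."""
    return [
        {"customer": r["customer"].strip().title(),
         "amount_cents": int(round(float(r["amount"]) * 100))}
        for r in rows
    ]

def load(rows, conn):
    """Load: append transformed rows into a staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging_orders "
        "(customer TEXT, amount_cents INTEGER)")
    conn.executemany(
        "INSERT INTO staging_orders VALUES (:customer, :amount_cents)", rows)
    conn.commit()

# Hypothetical source feed; a real job would read files from disk or S3.
raw = "customer,amount\n alice ,10.50\nBOB,3.25\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT SUM(amount_cents) FROM staging_orders").fetchone()[0]
```

In a scheduled job, each stage would typically log row counts so failed loads can be detected and replayed.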
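The Kimball-style star schema modeling mentioned above can be sketched with a tiny in-memory example: a fact table carries surrogate keys into a date dimension, and a report query joins the two to aggregate by a dimension attribute (here, quarter). The tables and figures are invented sample data, not from any actual warehouse described here.

```python
# Date dimension: surrogate key -> descriptive attributes (Kimball style).
dim_date = {
    1: {"date": "2023-01-15", "quarter": "2023-Q1"},
    2: {"date": "2023-04-02", "quarter": "2023-Q2"},
    3: {"date": "2023-05-20", "quarter": "2023-Q2"},
}

# Fact table: one row per sale, holding only keys and additive measures.
fact_sales = [
    {"date_key": 1, "amount": 100.0},
    {"date_key": 2, "amount": 250.0},
    {"date_key": 3, "amount": 50.0},
]

def sales_by_quarter(facts, dates):
    """Join each fact row to the date dimension and sum amounts per quarter."""
    totals = {}
    for row in facts:
        quarter = dates[row["date_key"]]["quarter"]
        totals[quarter] = totals.get(quarter, 0.0) + row["amount"]
    return totals

totals = sales_by_quarter(fact_sales, dim_date)
```

Keeping measures in the fact table and descriptions in dimensions is what lets the same fact rows be rolled up by quarter, month, or any other dimension attribute without restructuring the data.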