Snowflake Cloud Administrator - Chicago, United States - Ansys

Ansys
Chicago, United States

2 weeks ago

Posted by: Mark Lane, beBee recruiter


Description

Requisition #: 14292


Our Mission:

Powering Innovation That Drives Human Advancement:

When visionary companies need to know how their world-changing ideas will perform, they close the gap between design and reality with Ansys simulation.

For more than 50 years, Ansys software has enabled innovators across industries to push boundaries by using the predictive power of simulation.

From sustainable transportation to advanced semiconductors, from satellite systems to life-saving medical devices, the next great leaps in human advancement will be powered by Ansys.


Innovate With Ansys, Power Your Career.

Summary / Role Purpose:


You will work with other architects, developers, and stakeholders to ensure that the data warehouse supports the organization's overall digital transformation goals and data mesh strategy.

Experience with Data Vault 2.0 is preferred, along with demonstrable success managing large-scale transactional and operational data marts and data lakes.


You will also be responsible for creating and maintaining technical documentation, ensuring compliance with data governance and security policies, and providing technical guidance and configuration support to the team.

This role requires a deep understanding of Snowflake's capabilities, as well as expertise in data modeling, ETL (specifically Fivetran HVR, the High-Volume Replicator), and SQL.

This position is remote, although it requires travel to corporate HQ as business needs dictate.


Key Duties and Responsibilities:


  • Expertise in Snowflake platform: You should have a deep understanding of the Snowflake platform and its capabilities. You should be able to design and implement complex data architectures on Snowflake and have a thorough understanding of Snowflake's performance optimization techniques.
  • ETL/ELT and data integration: You should be familiar with ETL/ELT and data integration tools and techniques, specifically Fivetran HVR (High-Volume Replicator). You should be able to design and implement data pipelines that extract data from various sources and load it into Snowflake with minimal storage hops.
  • Performance Optimization: You should be able to implement strategies to optimize the performance of Snowflake databases, including query tuning, defining clustering keys, and leveraging the search optimization service (Snowflake manages micro-partitioning automatically rather than relying on traditional indexes).
  • Security Management: You should be able to implement and maintain security best practices within the Snowflake environment, including user access control, data encryption, masking, and compliance with regulatory requirements.
  • Capacity Planning: You should be able to monitor resource utilization and plan for capacity expansion to accommodate future growth and workload demands.
  • Cost Management: You should be able to identify opportunities for cost optimization within the Snowflake environment, such as optimizing resource utilization, implementing auto-scaling policies, and leveraging cost-effective storage options.
  • Cloud expertise: You should have experience working with cloud-based data platforms such as AWS, Azure, or GCP. You should be familiar with cloud security best practices and have experience with cloud infrastructure automation tools.
  • Data governance and security: You should be familiar with data governance and security policies and be able to design solutions to ensure compliance using Snowflake capabilities.
  • Collaboration and communication: You should have excellent communication skills and the ability to work collaboratively with data engineering and analytics teams.
  • Strong advocacy for the Snowflake platform, its capabilities, and the associated technical know-how.
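
Several of the duties above (security management, cost management) ultimately come down to issuing Snowflake DDL. As a minimal illustrative sketch only — the warehouse, role, database, and schema names are hypothetical, and the statements shown are standard Snowflake `ALTER WAREHOUSE` and `GRANT` syntax — a small Python helper that generates such statements might look like:

```python
# Sketch: generate Snowflake admin DDL strings for two common tasks.
# Object names (ANALYTICS_WH, REPORTING_ROLE, etc.) are hypothetical examples.

def suspend_policy_ddl(warehouse: str, seconds: int = 60) -> str:
    """Auto-suspend an idle warehouse to limit credit consumption."""
    return (
        f"ALTER WAREHOUSE {warehouse} SET "
        f"AUTO_SUSPEND = {seconds} AUTO_RESUME = TRUE;"
    )

def grant_read_ddl(role: str, database: str, schema: str) -> list:
    """Least-privilege read-only access for a reporting role."""
    fq = f"{database}.{schema}"
    return [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {fq} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {fq} TO ROLE {role};",
    ]

if __name__ == "__main__":
    print(suspend_policy_ddl("ANALYTICS_WH"))
    for stmt in grant_read_ddl("REPORTING_ROLE", "ANALYTICS", "MARTS"):
        print(stmt)
```

In practice these statements would be executed through a Snowflake session (for example, via the Snowflake Python connector) rather than printed; generating them as versioned text keeps the access and cost policies reviewable.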

Minimum Education/Certification Requirements and Experience

  • Bachelor's degree or equivalent experience in Information Technology, Computer Science, Engineering, or related field.
  • 7+ years of overall experience, including:
  • 5 years of experience in Snowflake database management and administration.
  • 2 years in a data architect role with the Snowflake Architect Certification, or 3 years in a data architect role without it.
  • SnowPro certification required.
  • Experience with distributed systems and big data technologies.
  • Familiarity with Agile development methodologies.
  • Excellent communication skills and the ability to work in a collaborative team environment.

Preferred Qualifications and Skills

  • Strong understanding of data modeling, data warehousing, and data integration principles, with the ability to architect scalable and efficient data analytics solutions.
  • Demonstrated expertise in leveraging cloud technologies to enhance data analytics.
  • dbt (data build tool) experience automating platform code in a versioned CI/CD process.
  • A broad set of working skills and knowledge spanning hardware, software systems, and solutions development in more than one technical domain.
  • Exceptional written and verbal communication skills, capable of articulating complex technical ideas and analytical findings in a concise and compelling manner.