Data Engineer, Autodesk - Snowflake, United States

    Position Overview


    We are looking for an exceptional data engineer to transform, optimize, test, and maintain architectures for enterprise analytics databases, data pipelines, and processing systems, as well as to optimize data flow and collection for teams across Autodesk.

    The mission of the team is to empower decision makers and the broader data communities through trusted data assets and scalable self-serve analytics.

    Your work will focus on engineering new pipelines, creating and maintaining frameworks, and enhancing existing data pipelines with new features to ensure accurate and timely data delivery to stakeholders. You will also support ad-hoc reporting requirements that facilitate data-driven, actionable insights at Autodesk.

    Responsibilities

    Maintain and develop data pipelines required for the extraction, transformation, cleaning, pre-processing, aggregation, and loading of data from a wide variety of sources using Python, SQL, DBT, and other data technologies

    Design, implement, test, and maintain data pipelines and new features based on stakeholders' requirements

    Develop and maintain scalable, available, quality-assured analytical building blocks and datasets in close coordination with data analysts

    Optimize and maintain workflows and scripts on existing data warehouses and ETL processes

    Design, develop, and maintain components of data processing frameworks

    Build and maintain data quality and durability tracking mechanisms to provide visibility into and address inevitable changes in data ingestion, processing, and storage

    Translate technical designs into business-appropriate representations, and analyze business needs and requirements to ensure that the implementation of data services directly supports the strategy and growth of the business

    Focus on automation use cases, CI/CD approaches, and self-service modules relevant to data domains

    Address questions from downstream data consumers through appropriate channels

    Create data tools for analytics and BI teams that help them build and optimize our product into an innovative industry leader

    Stay up to date with data engineering best practices and patterns; evaluate and analyze new technologies, capabilities, and open-source software in the context of our data strategy to ensure we are adapting, extending, or replacing our core technologies to stay ahead of the industry

    Contribute to the analytics engineering process

    Minimum Qualifications

    Bachelor's degree in computer science, information systems, or a related discipline

    3+ years in a data engineering role

    Built processes supporting data transformation, data structures, metadata, dependency, data quality, and workload management

    Experience with Snowflake; hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe; must have worked on Snowflake cost-optimization scenarios

    Solid overall programming skills and the ability to write modular, maintainable code, preferably in Python and SQL

    Experience with workflow management solutions such as Airflow

    Experience with data transformation tools such as DBT

    Experience working with Git

    Experience working with big data environments such as Hive, Spark, and Presto

    Strong analytical, problem-solving, and interpersonal skills

    Familiar with Scrum

    Ready to work flexible European hours

    Preferred Qualifications

    Snowflake

    DBT

    Fivetran

    Airflow

    CI/CD (Jenkins)

    Basic understanding of Power BI

    AWS environment, for example S3, Lambda, Glue, and CloudWatch

    Basic understanding of Salesforce

    Experience working with remote teams spread across multiple time zones

    Attention to detail

    Have a hunger to learn and the ability to operate in a self-guided manner
