Sr. Data Engineer - Los Angeles, United States - Revel IT

    Description
    We are seeking a Data Engineer in Los Angeles, California, for a remote contract opportunity.

    A data engineer with GCP, Airflow, and Python experience.
    Media / streaming data experience is strongly preferred.
    This is a remote position.


    Top 3 Skills:
    Looking for a mid-level engineer: 3+ years of overall experience, including 1+ year with GCP, BigQuery, Python, and Airflow.

    Other Skills/Nice to Haves:
    Data modeling, Spark, Adobe Analytics, and Snowflake/AWS experience are nice to have.

    Team Description:


    A mixed-level Data Engineering team serving many business units with analytical datasets and pipelines used for Analytics and Data Science.


    Role Details:


    The Data Engineer should possess a deep sense of curiosity, a passion for building smart data pipelines, data structures, and data products, and the ability to communicate those data structures and tools throughout the Client's Streaming organization.

    The candidate for this role will use their skills in reverse engineering and analytics, along with creative, experimental thinking, to devise data and BI solutions.

    This engineer supports the development of data pipelines, including machine learning pipelines that draw on disparate data sources.

    The ideal candidate will work closely with BI, Research, Engineering, Marketing, Finance, and Product teams to implement data-driven plans that drive the business.

    They will have good communication skills and be able to convey knowledge of data structures and tools throughout the Client's Digital Media organization.

    This candidate will be expected to lead projects from inception to completion, as well as to help mentor junior members of the team on data best practices and approaches.


    Your Day-to-Day:
    Work with large volumes of traffic data and user behaviors to build pipelines that enhance raw data.
    Break down highly complex data problems and communicate simple, feasible solutions.
    Extract patterns from large datasets and transform data into an informational advantage.

    Find answers to business questions through hands-on exploration of data sets using Jupyter, SQL, dashboards, statistical analysis, and data visualizations.

    Partner with the internal product and business intelligence teams to determine the best approach to data ingestion, structure, and storage, then work with the team to ensure these approaches are implemented correctly.

    Contribute ideas on how to make our data more effective, and work with other members of engineering, BI teams, and business units to implement changes.

    Develop technical solutions on an ongoing basis while creating and maintaining documentation, at times training impacted teams.
    Collaborate early with the team on internal initiatives to create strategies that improve company processes.

    Improve efficiency by staying current on the latest technologies and trends and introducing them to team members.

    Develop prototypes to prove out strategies for data pipelines and products.
    Mentor members of the team and department on best practices and approaches.

    Lead initiatives to improve the quality and effectiveness of our data, working with other members of engineering, BI teams, and business units to implement changes.


    Qualifications:
    What you bring to the team: You have

    A Bachelor's degree and 3+ years of work experience in Data Engineering and Analytics fields, or in consulting roles with a focus on digital analytics implementations.

    1+ years of experience with large-scale data warehouse management systems such as BigQuery, with an advanced understanding of warehouse cost management and query optimization.
    Proficient in Python.
    Experience with Apache Airflow or equivalent tools for orchestration of pipelines.
    Able to write SQL to perform common types of analysis and transformations.
    Strong problem-solving and creative-thinking skills.
    A demonstrated record of developing ongoing technical solutions while creating and maintaining documentation, at times training impacted teams.
    Experience developing solutions to business requirements via hands-on discovery and exploration of data.
    Exceptional written and verbal communication skills, including the ability to communicate technical concepts to non-technical audiences and to translate business requirements into data solutions.
    Experience with ETL & ELT.
    Experience building and deploying applications on the GCP cloud platform.
    The ability to build strong commitment within the team to support the appropriate team priorities.
    A habit of staying current with new and evolving technologies via formal training and self-directed education.


    You might also have:
    Experience with Snowflake, Redshift, and other AWS technologies.
    Experience with Docker and container deployment.
    Experience influencing and applying data standards, policies, and procedures.
    Experience with data modeling of performant table structures.
    Experience with Marketing tools like Kochava, Braze, Branch, Salesforce Marketing Cloud is a plus.
    Experience with exploratory data analysis using tools like IPython/Jupyter notebooks, pandas, and matplotlib.
    Familiarity with Hadoop pipelines using Spark and Kafka.
    Familiar with Git.
    Familiar with Adobe Analytics (Omniture) or Google Analytics.
    Experience with digital marketing strategy, including site, video, social media, SEM, SEO, and display advertising.
