Data Engineer GCP - Lansing, United States - Ciber

    Description
    HTC Global Services wants you. Come build new things with us and advance your career. At HTC Global, you'll collaborate with experts and join successful teams contributing to our clients' success.

    You'll work side by side with our clients and have long-term opportunities to advance your career with the latest emerging technologies.

    At HTC Global Services our consultants have access to a comprehensive benefits package.

    Benefits can include Paid Time Off, Paid Holidays, 401(k) matching, Life and Accidental Death Insurance, Short- and Long-Term Disability Insurance, and a variety of other perks.


    Position Description:
    We're seeking a Data Engineer who has experience building data products on a cloud analytics platform.

    You will work on ingesting, transforming, and analyzing large datasets to support the Enterprise in the Data Factory on Google Cloud Platform (GCP). Experience with large-scale solutioning and operationalization of data lakes, data warehouses, and analytics platforms on Google Cloud Platform or other cloud environments is a must.

    We are looking for candidates who have a broad set of technical skills across these areas.
    You will work in a collaborative environment that leverages pair programming.
    Work on a small agile team to deliver curated data products.
    Work effectively with fellow data engineers, product owners, data champions and other technical experts.
    Demonstrate technical knowledge and communication skills with the ability to advocate for well-designed solutions.

    Develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform, applying solid data warehouse principles.

    Be the Subject Matter Expert in Data Engineering, with a focus on GCP native services and other well-integrated third-party technologies.


    Skills Required:


    Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.

    Experience implementing methods to automate all parts of the pipeline, minimizing manual effort in development and production.

    Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple sources to build analytical domains and reusable data products.

    Experience working with architects to evaluate and productionize data pipelines for data ingestion, curation, and consumption.

    Experience working with stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured, in collaboration with product management.


    Skills Preferred:
    Strong drive for results and ability to multi-task and work independently.
    Self-starter with proven innovation skills.
    Ability to communicate and work with cross-functional teams and all levels of management.
    Demonstrated commitment to quality and project timing.
    Demonstrated ability to document complex systems.


    Experience Required:


    5+ years of SQL development experience.
    5+ years of analytics/data product development experience.
    3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.

    Experience working with GCP native (or equivalent) services such as BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Build, etc.

    Experience working with Airflow for scheduling and orchestration of data pipelines.
    Experience working with Terraform to provision Infrastructure as Code.
    2+ years of professional development experience in Java or Python.


    Experience Preferred:
    In-depth understanding of Google Cloud's products (or another cloud platform's) and their underlying architectures.
    Experience working with dbt/Dataform.
    Experience with Dataplex or other data catalogs.
    Experience with development ecosystem tools such as Tekton, Git, and Jenkins for CI/CD pipelines.
    Exceptional problem-solving and communication skills.
    Experience working with Agile and Lean methodologies.
    Team player with attention to detail.
    Experience performance-tuning SQL queries.


    Education Required:
    Bachelor's degree in computer science or a related scientific field.

    Education Preferred:
    GCP Professional Data Engineer Certified.
    Master's degree in computer science or related field.
    2+ years mentoring engineers.
    In-depth software engineering knowledge.
