Data Engineer/ ETL Developer - Mead, United States - Global Alliant Inc



    Description
    ETL Developer/ Senior Data Engineer

    Candidates local to Maryland highly preferred (hybrid: 2 days a week in the Baltimore, MD office)

    One-year contract (with possible two-year extension)

    Up to $70/hr on C2C

    Job Description


    Job Requisition:
    Sr. Data Engineer / Extract, Load, Transform (ETL) Developer

    Job Description:
    Seeking a Sr. Data Engineer / ETL Developer with strong systems, software, and AWS cloud experience to support a Data Analytics Platform initiative for a federal customer.

    Working within a DevOps framework, participate in and/or direct major project deliverables through all phases of the software development lifecycle, including scope and work estimation, architecture and design, coding, and unit testing.

    Primary Responsibilities

    Develop data pipelines using cloud-native tools such as AWS Glue (built on Apache Spark) and Step Functions.
    Assemble large, complex data sets that meet functional / non-functional business requirements.
    Leverage serverless cloud services to prepare (extract and transform) and load large volumes of datasets for data processing.
    Extend standard ETL tool capabilities using Glue, Python/PySpark, Step Functions, SQS and Athena.
    Implement the overall solution, comprising ETL jobs, Lambda functions, and Python code.
    Support the implementation of data analytics products.
    Develop and integrate custom developed software solutions to leverage automated deployment technologies.
    Develop, prototype, and deploy solutions in AWS Cloud.
    Coordinate closely with the functional team to ensure requirements are clearly understood.

    Analyze (through proof-of-concept, performance, and end-to-end testing) and effectively coordinate infrastructure/service needs, working with the architecture and Data Center teams.

    Work closely with the architecture team to review the design and ETL code.
    Use industry-leading DevOps tools such as AWS CloudFormation.
    Communicate key project data to team members and build team cohesion and effectiveness.
    Leverage the Atlassian tool suite (Jira, Confluence) to track activities.
    Identify and apply best practices and standard operating procedures.
    Create innovative solutions to meet the technical needs of customers.

    Basic Qualifications

    Experience working across the software development lifecycle, with strong experience in ETL-based development.
    Experience working with databases like DynamoDB. Experience querying data with SQL.
    Experience with container orchestration tools like Kubernetes.
    Experience using Delta Lake.
    Experience with data catalog tools like AWS Glue Catalog and DataHub.
    Experience working with programming languages (Python required).
    Experience working in a fast-paced development environment with drive to completion.
    Experience with development using Amazon Web Services (AWS) big data technologies.
    Well versed in version control systems (CodeCommit preferred).
    Well versed in issue/problem tracking systems (Jira preferred).

    Candidates must have a bachelor's degree with 8–12 years of prior relevant experience or a master's degree with 6–10 years of prior relevant experience.

    Preferred Qualifications

    Working experience with AWS Glue and Glue Studio.
    Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
    Experience with Amazon QuickSight.
    Prior experience working on federal government contracts.

    If interested in applying for the position, please reach out to me at