Sr. Data Engineer/Architect - Alexandria, United States - CALIBRE Systems

    Job Description


    CALIBRE Systems Inc., an employee-owned Management Consulting and Digital Transformation company, is hiring a Senior Data Architect as a key member of a technology solution delivery team supporting a Department of Defense (DoD) client in improving its business applications.

    This position can be performed remotely.

    This position also requires an active Secret security clearance (or the ability to obtain one within 60 days) and an active CompTIA Security+ certification (or the ability to obtain one within 60 days of hire).


    JOB RESPONSIBILITIES:


    • Partner with senior cloud architects, business process experts, and data scientists to apply experience-based insight and knowledge of modern cloud ETL/ELT pipeline and data storage concepts during daily, interactive collaboration sessions and agile sprints with clients.

    • Responsibilities include, but are not limited to, researching new cloud-native data technologies, creatively architecting data pipeline solutions, building prototypes, documenting ETL processes, and evaluating alternative cloud products/services.

    • This includes reengineering all ETL pipelines from existing SQL Server and Neo4j stored procedures and functions into serverless data pipelines using Python and AWS services (a minimal, illustrative sketch of this pattern follows the responsibilities below).


    • Work closely with our clients to modernize legacy ETL pipelines: facilitate client sessions to understand their business application data needs and data quality challenges, and guide the complete refactoring of data pipelines from raw source to production, incorporating automation, data quality detection and correction, and machine learning throughout the pipeline.
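
    The following is a minimal, illustrative sketch of one such refactoring step in Python (boto3 + Pandas): a Lambda function that replaces a stored-procedure transform by reading a raw file from S3, cleaning it, and writing curated output for a downstream load. All bucket names, keys, and columns are hypothetical, and this is one possible approach rather than the project's actual implementation.

    # Illustrative sketch only; bucket names, keys, and columns are hypothetical.
    import io
    import os

    import boto3
    import pandas as pd

    s3 = boto3.client("s3")

    RAW_BUCKET = os.environ.get("RAW_BUCKET", "raw-landing-bucket")        # hypothetical
    CURATED_BUCKET = os.environ.get("CURATED_BUCKET", "curated-bucket")    # hypothetical

    def handler(event, context):
        """Lambda step invoked by Step Functions (or an S3 event) with the object key to process."""
        key = event["key"]

        # Extract: pull the raw file that a SQL Server stored procedure previously staged.
        raw = s3.get_object(Bucket=RAW_BUCKET, Key=key)["Body"].read()
        df = pd.read_csv(io.BytesIO(raw))

        # Transform: the kind of cleanup formerly buried in stored-procedure logic.
        df = df.drop_duplicates()
        df["load_date"] = pd.Timestamp.now(tz="UTC").date().isoformat()

        # Load: write curated output for a downstream Redshift COPY or Glue job.
        out = io.StringIO()
        df.to_csv(out, index=False)
        s3.put_object(Bucket=CURATED_BUCKET, Key=f"curated/{key}", Body=out.getvalue().encode("utf-8"))

        return {"rows": len(df), "output_key": f"curated/{key}"}
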
    Required Skills

    • 4-5 years of demonstrated proficiency in the design and implementation of cloud-native data architectures and solutions for data processing and pipeline orchestration using AWS/Azure services common to serverless architectures (AWS Lambda, API Gateway, Step Functions, etc.)


    • Hands-on data engineering experience building solutions to extract, load, and transform raw data in the form of several hundred small data files and web scrapings in a variety of formats, using a combination of Python scripting and AWS cloud services such as S3, Glue (including Crawlers), Lambda, Step Functions, Redshift, DynamoDB, etc.
    • Experience with containerization (Docker, etc.) and container orchestration (Kubernetes, etc.) in deployment and operations.
    • Demonstrable skill in AWS networking, with a proven skillset in the latest AWS networking services.

    • Extensive experience in anomaly detection, logging, monitoring & alerting, deep troubleshooting, and the integration of machine learning tools to build in data quality throughout the ETL/ELT data pipeline (an illustrative data-quality sketch follows this list).

    • Proficiency in one or more scripting languages (Python, Ruby, Java, or similar) and use of Pandas or an equivalent data manipulation library.


    • Experience working with enterprise databases/ERPs (e.g., Microsoft SQL Server, Oracle, SAP): process data extraction, transformation, analysis, reporting, etc.
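
    As a minimal illustration of the kind of in-pipeline data-quality and anomaly checks referenced above, the Pandas sketch below flags high null rates and simple statistical outliers before data moves to the next stage; thresholds, column names, and the alerting mechanism are hypothetical placeholders, not the project's actual tooling.

    # Illustrative sketch only; thresholds and column names are hypothetical.
    import logging

    import pandas as pd

    logger = logging.getLogger(__name__)

    def quality_check(df: pd.DataFrame, column: str, max_null_rate: float = 0.05) -> pd.DataFrame:
        """Flag data-quality issues before data advances to the next pipeline stage."""
        null_rate = df[column].isna().mean()
        if null_rate > max_null_rate:
            # In a real pipeline this could also publish a CloudWatch metric or SNS alert.
            logger.warning("Null rate %.1f%% in column %s exceeds threshold", 100 * null_rate, column)

        # Simple statistical anomaly detection: flag values more than 3 standard deviations from the mean.
        mean, std = df[column].mean(), df[column].std()
        df["is_anomaly"] = (df[column] - mean).abs() > 3 * std
        return df
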
    Required Experience


    Bachelor's Degree in Business Administration, Engineering, Computer Science, or a related discipline; an Associate's Degree together with two (2) years of documented relevant experience may be substituted for the Bachelor's Degree.

    Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP) certifications
    CompTIA Security+ certification

    We are unable to sponsor candidates for this position. Candidates must have a DoD Secret clearance (or be able to obtain an interim clearance within 60 days).

    Candidates must have or be able to obtain CompTIA Security+ certification within 60 days.