ETL Data Engineer - Washington, United States - The Midtown Group

    2 weeks ago
    Description


    Our client is hiring an ETL Data Engineer to support data operations for its Cloud Data Exchange, onsite in Washington, DC, at $60/hour.

    Because this is a federal position, you MUST be a US Citizen or Green Card holder.


    Responsibilities:
    Analyzes, designs, develops, implements, replicates, and supports complex enterprise data projects.

    Interfaces with other agencies; consults with and informs user departments on system requirements; advises on environment constraints and operating difficulties in the current state; resolves problems using cloud solutions; and develops and replicates future enhancements to the District's data systems.

    Applies strong knowledge of Extraction, Transformation, and Loading (ETL) processes using frameworks such as Azure Data Factory, Synapse, Databricks, and Informatica; gathers requirements from stakeholders or analyzes existing code to perform enhancements or new development.

    Establishes cloud and on-premises connectivity across systems such as ADLS, ADF, Synapse, and Databricks. Hands-on experience with Azure cloud services such as Azure Data Factory or Azure Synapse, MS SQL databases, Azure SQL Database, Azure Data Lake Storage Gen2, Blob Storage, Python, etc.

    Creates end-to-end pipelines that read data from multiple sources or source systems and load it into a landing layer or SQL tables.
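As a rough illustration of the end-to-end pipeline work described above, here is a minimal sketch using Python's standard library, with SQLite standing in for the Azure SQL landing layer; the source extracts, table name, and field names are all hypothetical:

```python
import csv
import io
import sqlite3

# Hypothetical source extracts; in a real pipeline these would be read
# from ADLS blobs, Oracle, SQL Server, or other source systems.
SOURCES = {
    "crm": "id,name\n1,Ada\n2,Grace\n",
    "billing": "id,name\n3,Alan\n",
}

def load_to_landing(conn: sqlite3.Connection) -> int:
    """Read every source, tag each row with its origin, and load the
    combined result into a landing-layer table. Returns rows loaded."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS landing_customers "
        "(source TEXT, id INTEGER, name TEXT)"
    )
    rows = []
    for source_name, payload in SOURCES.items():
        for record in csv.DictReader(io.StringIO(payload)):
            rows.append((source_name, int(record["id"]), record["name"]))
    conn.executemany("INSERT INTO landing_customers VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
loaded = load_to_landing(conn)
```

In production the same extract-tag-load shape would typically be expressed as an Azure Data Factory or Databricks pipeline rather than hand-rolled Python.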

    Familiarity/experience with data integration and data pipeline tools (e.g., Informatica, Synapse, Apache NiFi, Apache Airflow).

    Familiarity/experience with various data formats, including database-specific (Oracle, SQL Server, DB2, Quickbase), text (CSV, XML), and binary (Parquet, Avro).

    Develops, standardizes, and optimizes existing data workflows/pipelines, adhering to best practices.
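To illustrate what normalizing records across the text formats named above (CSV, XML) can look like, a minimal Python sketch; the extracts and field names are hypothetical:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical extracts in two of the text formats mentioned.
CSV_EXTRACT = "id,name\n1,Ada\n"
XML_EXTRACT = "<rows><row id='2' name='Grace'/></rows>"

def from_csv(text: str) -> list[dict]:
    """Parse a CSV extract into a list of flat record dicts."""
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def from_xml(text: str) -> list[dict]:
    """Parse an XML extract into the same flat record shape."""
    return [dict(row.attrib) for row in ET.fromstring(text).iter("row")]

# Both formats converge on one record layout for downstream loading.
records = from_csv(CSV_EXTRACT) + from_xml(XML_EXTRACT)
```

Binary formats such as Parquet and Avro would follow the same pattern but require third-party libraries (e.g., pyarrow, fastavro) rather than the standard library.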

    Adheres to and contributes to enterprise data governance standards by ensuring data accuracy, consistency, security, and reliability.
    Automates, monitors, and manages data pipelines and workflows, including alerting.
    Analyzes and evaluates system changes to determine feasibility; provides alternative solutions, backup, and rollback procedures.

    Works on the development of new systems, as well as upgrades and enhancements to existing systems, ensuring systems follow approved standards and remain consistent after changes.

    Develops complex programs and reports in database query language.
    Familiarity/experience with data visualization tools.

    Familiarity/experience handling and securing sensitive data based on its level of sensitivity.

    Requirements:
    • 15 years - Strong knowledge of developing Extract-Transform-Load (ETL) processes, including end-to-end pipelines loading data from multiple sources
    • 15 years - Ability to gather and document requirements for data extraction, transformation, and load processes
    • 15 years - Understanding of data warehousing, data lake, business intelligence, and information management concepts and standards
    • 15 years - Ability to advise internal and external customers on appropriate tools and systems for complex data processing challenges
    • 11 years - Knowledge and use of SQL for relational databases
    • 11 years - Experience with various data formats, including database-specific (Oracle, SQL, Postgres, DB2), text (CSV, XML), and binary (Parquet, Avro)
    • 7 years - Contributing to enterprise data governance standards by ensuring accuracy, consistency, security, and reliability
    • 5 years - Strong experience with Microsoft Azure tools such as Azure Data Factory, Synapse, SQL Database, Data Lake Storage Gen2, and Blob Storage
    • 5 years - Experience with data integration and data pipeline tools such as Informatica PowerCenter, Apache NiFi, Apache Airflow, and FME
    • 3 years - Experience with visualization and reporting software such as MicroStrategy, Tableau, and Esri ArcGIS
    • 3 years - Strong communication skills, both oral and written
    • 3 years - Ability to work independently or as part of a larger team
    • 3 years - Experience performing data functions with Databricks

    Company Description
    Our client has a great internal culture, and they are a company you can grow with.