Data Engineer - Bloomfield Hills, United States - Unified Tech Group Inc

    Job Description

    Data ingestion and data engineering are critical skill sets for the growth of the DCAS data platform. The scope of the deliverables therefore includes helping the team improve the platform's current functionality as well as supporting the business use cases that already run on the solution. The DCAS Data Platform is a production solution within the Upstream BU and requires a professional, customer-focused mindset as well as the technical skills to improve current platform functionality.

    Top Skills' Details

    (1) 2+ years of experience with Snowflake Data Warehouse (including SnowSQL or dbt [data build tool] for data transformation)
    (2) 5+ years of Data Engineering experience - building data ingestion pipelines (via tools like Fivetran, ADF, Qlik, etc.)
    (3) Years of Database Development/Administration experience

    Candidates must be highly efficient at extracting/ingesting data (RDBMS, flat files, structured, semi-structured, unstructured, etc.) from numerous data sources (MS SQL, Oracle, MySQL, PostgreSQL, SAP, etc.) and at building out data pipelines to a cloud-based data warehouse (Snowflake) for analytics.
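
    As a rough illustration of this kind of ingestion work, the minimal sketch below reads one table from a PostgreSQL source with pandas and lands it in Snowflake via the Python Connector's write_pandas helper. All credentials, hostnames, and object names are placeholder assumptions.

```python
# Minimal extract-and-load sketch: PostgreSQL source -> Snowflake target.
# All credentials, hostnames, and object names are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas
from sqlalchemy import create_engine

# Extract: read a source table into a DataFrame (hypothetical source connection string).
source_engine = create_engine("postgresql+psycopg2://etl_user:***@source-host:5432/erp")
df = pd.read_sql("SELECT * FROM public.customer_orders", source_engine)

# Load: append the DataFrame into Snowflake via the Python Connector.
# The target table RAW.ERP.CUSTOMER_ORDERS is assumed to already exist.
conn = snowflake.connector.connect(
    account="my_account",   # placeholder Snowflake account identifier
    user="ETL_SERVICE",     # placeholder service account
    password="***",
    warehouse="INGEST_WH",
    database="RAW",
    schema="ERP",
)
try:
    success, n_chunks, n_rows, _ = write_pandas(conn, df, table_name="CUSTOMER_ORDERS")
    print(f"Loaded {n_rows} rows in {n_chunks} chunk(s); success={success}")
finally:
    conn.close()
```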

    Any experience with API automation via Python (Snowflake Python Connector API) and knowledge of Red Hat OpenShift containerization (Docker/Kubernetes) would be highly preferred. Knowledge of and experience with data visualization (PowerBI / Tableau) would also be a huge plus.
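
    For the API-automation piece, a common pattern with the Snowflake Python Connector is to wrap administrative SQL in small functions that can run from a script or a containerized job. The sketch below assumes a simple role-grant automation; the role, user, and connection details are illustrative only.

```python
# Minimal sketch of access automation with the Snowflake Python Connector.
# Role/user names and connection details are illustrative placeholders.
import snowflake.connector

def grant_role_to_user(conn, role: str, user: str) -> None:
    """Grant an existing role to an existing user."""
    # Identifiers cannot be bound like ordinary values, so role/user names
    # should be validated or allow-listed before being interpolated here.
    sql = f'GRANT ROLE "{role}" TO USER "{user}"'
    cur = conn.cursor()
    try:
        cur.execute(sql)
    finally:
        cur.close()

conn = snowflake.connector.connect(
    account="my_account",       # placeholder account identifier
    user="SECURITYADMIN_SVC",   # placeholder service account
    password="***",
    role="SECURITYADMIN",
)
try:
    grant_role_to_user(conn, "ANALYST_READ", "JDOE_SVC")
finally:
    conn.close()
```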

    Additional Skills & Qualifications

    The Data Engineer will be responsible for the following deliverables:

    • Support in onboarding Upstream use cases into the Azure-hosted Upstream IT data platform built on Snowflake

    • Creation of data ingestion tasks to migrate additional on-premises and cloud data sources into the Upstream IT DCAS Platform

    • Architecture and consulting to aid customers in their data extraction and transformation pipelines in Snowflake Data Warehouse

    • Identifying and implementing data warehouse access workflows through ServiceNow automation as well as progressing efforts to improve the data approval-to-grant flow within our current RBAC process

    • Managing and automating user and service account access to data sources within the Upstream data platform (Kepler) through API automation and standard DBA best practices

    • Building out data and platform security monitoring within the platform to ensure the ongoing appropriate use of data within the platform

    • Data Engineering support for Upstream Digital Data Foundation and Upstream use cases

    • Delivery of a task monitoring and notification system for data pipeline status (a minimal monitoring sketch follows this list)

    • Supporting additional data ingestion and curation tasks to enable ongoing business use cases

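    For the task monitoring and notification deliverable referenced above, one possible starting point is to poll Snowflake's TASK_HISTORY table function and raise an alert for any failed runs. This is an assumed approach, not the platform's actual design; the connection details and the notification hook are placeholders.

```python
# Hedged sketch: poll Snowflake task history and flag failed pipeline tasks.
# Connection details are placeholders; the notification step is just a print here,
# but could post to email, Teams, or an ITSM tool in a real implementation.
import snowflake.connector

FAILED_TASKS_SQL = """
    SELECT name, scheduled_time, error_message
    FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(
        SCHEDULED_TIME_RANGE_START => DATEADD('hour', -24, CURRENT_TIMESTAMP()),
        RESULT_LIMIT => 1000))
    WHERE state = 'FAILED'
    ORDER BY scheduled_time DESC
"""

def notify(task_name, scheduled_time, error_message):
    # Placeholder notification hook.
    print(f"[ALERT] Task {task_name} failed at {scheduled_time}: {error_message}")

conn = snowflake.connector.connect(
    account="my_account", user="MONITOR_SVC", password="***",
    warehouse="MONITOR_WH", database="RAW", schema="ERP",
)
try:
    cur = conn.cursor()
    for task_name, scheduled_time, error_message in cur.execute(FAILED_TASKS_SQL):
        notify(task_name, scheduled_time, error_message)
finally:
    conn.close()
```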

    Technical Stack:

    • ANSI SQL and Database Administration

    • Snowflake Data Warehousing

    • Data Engineering and Modeling experience

    • Python Development

    • PowerBI, Tableau, Excel, Azure PaaS, Azure SaaS, and PowerApps integration experience