Sample Tasks
- Build, create, maintain, and optimize data pipelines from development to production for various use cases.
- Drive automation through effective metadata management: use innovative and modern tools, techniques, and architectures to partially or completely automate the most common, repeatable, and tedious data preparation and integration tasks, minimizing manual, error-prone processes and improving productivity.
- Use modern data preparation, integration and AI-enabled metadata management tools and techniques.
- Tracking data consumption patterns.
- Performing intelligent sampling and caching.
- Monitoring schema changes.
- Recommending, and sometimes even automating, existing and future integration flows.
- Proposing appropriate (and innovative) data ingestion, preparation, integration, and operationalization techniques to optimally address these data requirements.
- Train counterparts such as data scientists, data analysts, and other data consumers in these data pipelining and preparation techniques, making it easier for them to integrate and consume the data they need for their own use cases.
- Ensure data users and consumers use the data provisioned to them responsibly through data governance and compliance initiatives. Provide knowledge transfer.
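As one concrete illustration of the schema-monitoring task above, a pipeline can compare a table's current columns against the last recorded snapshot and report differences. This is a minimal sketch; the "orders" table and its columns are hypothetical examples, not part of any specific stack named in this posting.

```python
# Minimal sketch of schema-change monitoring: diff the current column
# snapshot of a table against the previously recorded one.
# Table and column names below are hypothetical.

def diff_schema(previous: dict, current: dict) -> dict:
    """Return added, removed, and type-changed columns between two snapshots."""
    added = {c: t for c, t in current.items() if c not in previous}
    removed = {c: t for c, t in previous.items() if c not in current}
    changed = {
        c: (previous[c], current[c])
        for c in previous.keys() & current.keys()
        if previous[c] != current[c]
    }
    return {"added": added, "removed": removed, "changed": changed}

# Example: the hypothetical "orders" table gained a column and changed a type.
previous = {"order_id": "bigint", "amount": "numeric", "status": "varchar"}
current = {"order_id": "bigint", "amount": "float", "status": "varchar", "channel": "varchar"}

report = diff_schema(previous, current)
print(report)
```

A report like this can feed an alert, or gate a downstream load until the schema change is reviewed.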
Qualifications
- Strong experience with various data management architectures such as Data Warehouse, Data Lake, and Data Hub, and the supporting processes such as Data Integration, Governance, and Metadata Management.
- Strong ability to design, build and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata and workload management.
- Strong experience in working with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures and integrated datasets using traditional data integration technologies.
- These should include ETL/ELT, data replication/CDC, message-oriented data movement, API design and access, and emerging data ingestion and integration technologies such as stream data integration, CEP (complex event processing), and data virtualization.
- Strong experience supporting and collaborating with cross-functional teams in a dynamic business and IT environment.
- Highly creative and collaborative in working with both business and IT teams to define the business problem, refine the requirements, and design and develop data deliverables accordingly.
- Ability to interface with, and gain the respect of, stakeholders at all levels and roles within the company.
- Good judgment, a sense of urgency, and a demonstrated commitment to high standards of ethics, regulatory compliance, customer service, and business integrity.
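Of the integration patterns listed above, incremental extraction (the simplest form of change data capture) can be sketched as follows. This is a minimal illustration under stated assumptions: the `updated_at` watermark column and the sample rows are hypothetical, not tied to any system named in this posting.

```python
# Minimal sketch of watermark-based incremental extraction, a simple
# CDC-style pattern: pull only rows modified since the last successful run.
# The "updated_at" watermark column and sample rows are hypothetical.

from datetime import datetime

def extract_incremental(rows, last_watermark):
    """Return rows newer than the watermark, plus the advanced watermark."""
    new_rows = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 5)},
    {"id": 3, "updated_at": datetime(2024, 1, 9)},
]

# Last run processed everything up to 2024-01-03, so only ids 2 and 3 qualify.
rows, wm = extract_incremental(source, datetime(2024, 1, 3))
print(len(rows), wm)
```

Persisting the returned watermark between runs is what makes the load repeatable and idempotent, in contrast to a full reload on every run.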
Data Engineer - Saint Paul, United States - TechNix LLC
Job Description
Position: Data Engineer
Duration: 2 Years
Location: St. Paul, Minnesota (Onsite from Day 1)