Data Integration Engineer - Denver, United States - The Judge Group
Description
Are you a skilled and passionate Data Integration Engineer looking to leverage your expertise in a dynamic, industry-leading organization?
Our client seeks a talented Data Integration Engineer to join their team in Denver, CO. This position is critical to supporting the data warehousing project, further enhancing reporting and analysis for informed decision-making.
Responsibilities:
- Design and develop ETL workflow solutions to build historical data layers, manage complex data loads, and support metadata framework methodologies.
- Collaborate with internal stakeholders and third-party vendors to develop and maintain data pipelines, integrating data from diverse sources.
- Build and maintain Python data pipelines and manage infrastructure monitoring.
- Aggregate, transform, and interpret data from multiple sources to structure it within a centralized data warehouse.
- Analyze processes and data to identify improvement opportunities and recommend system modifications.
- Support project teams in optimizing business efficiencies for Northwood and participate in special technology initiatives.
- Maintain documentation best practices for written code and procedures, including mapping documents.
- Assist with developing and deploying business intelligence and data visualization tools.
- Support change management around existing data warehouse reports based on business feedback.
Qualifications:
- Bachelor's degree in Information Systems, Software Engineering, or a similar technical field.
- Proven experience writing complex SQL queries, with a strong focus on SQL script tuning and performance optimization.
- Experience using Office 365, particularly Excel (Power Pivot, Power Query) and Power Automate.
- Expertise in ETL tools such as SSIS, Alteryx, Informatica, DataStage, Pentaho, dbt, or Azure Data Factory.
- Ability to prepare user-facing reports with Power BI or similar tools.
Desired Skills:
- Experience developing and deploying AWS microservices, particularly with Lambda and EC2.
- Experience developing, supporting, and troubleshooting Visual Basic code, shell scripting, and Python/JavaScript.
- Knowledge of cloud architecture patterns and cloud platforms, especially AWS.
- Familiarity with the financial industry and investment data management concepts.
- Understanding of business intelligence tools (SSRS, Power BI, OBIEE, Tableau, etc.).
- Experience with Snowflake data warehousing environments, Oracle DB, MySQL, and PostgreSQL.
- Knowledge of MRI, Yardi, Investran, or similar ERP systems.
- Excellent interpersonal, leadership, presentation, and collaboration skills.
- Strong written and verbal communication skills to translate technical concepts into business-friendly language.
- Ability to promote a positive, collaborative work environment while maintaining intellectual curiosity and a drive to learn.
- Strong analytical thinking, with a balance of creativity and organization when problem-solving.
- Ability to handle multiple tasks with ownership and responsibility.
- Excellent time management and prioritization skills.
- Adaptability to evolving business needs and a commitment to process improvement.
Additional Considerations:
- This is an open-ended contract position, not a full-time role.
- NO C2C. If you require C2C, or are a business development resource seeking C2C engagements, please save us all the time, as we will decline your outreach.
- We are not able to accommodate a work visa/H-1B transfer at this time. Viable candidates must be either a permanent resident (green card holder) of the United States or a U.S. citizen.
- Viable candidates MUST be located in the Denver, CO area and open to a hybrid work schedule.
If you are a highly motivated, results-oriented Data Integration Engineer with a passion for building robust data pipelines, we encourage you to apply.
PLEASE NOTE:
If you fit the requirements noted above, please apply; you can also DM Charles Herman or reach out via