- Consistent track record of working in collaborative teams to deliver high-quality data solutions in a multi-developer Agile environment, following coding-standard methodologies and CI/CD pipelines.
- Outstanding SQL skills and experience performing deep data analysis on multiple database platforms.
- At least 5 years of experience building and deploying data movement and data transformation workflows and pipelines.
- At least 2 years of proven web-service development experience with Java and/or Python, and/or practical experience with AI/ML models or their delivery.
- At least 1 year of experience in GenAI, preferably on Microsoft OpenAI or AWS Bedrock.
- Experience interacting with financial services and/or digital channel data (web clicks, phone logs, email campaigns) is a plus.
- Hands-on experience with AWS and/or Azure infrastructure strongly preferred.
- Familiarity or exposure to vector databases (VectorDB) such as Redis Cache or AWS OpenSearch would be an added advantage.
- Bachelor's degree in computer science or similar field.
Artificial Intelligence and Reporting Analyst - Durham, United States - Compunnel Inc.
Description
Client Job: Artificial Intelligence and Reporting Analyst (SQL, Python)
Contract Duration: 6 months - extension based on performance/team needs.
Primary Location: Durham, NC
Hybrid Schedule: 5 days onsite during Connect Week, 2 weeks since September 2024
Interview Process: 2 rounds
Analytics and Reporting Engineer
CANDIDATE PROFILE:
1.) Microsoft OpenAI
2.) SQL
3.) Python
Job Description:
The Workplace Investing Analytics and Reporting Chapter team is looking for an Engineer to join our team and help deliver reporting applications and AI (Artificial Intelligence) models into production. This is a dynamic Agile engineering role in which you will partner with teammates on our development team and peer data scientists to assist with data analysis, reporting, and research; conduct tactical data extracts and build balanced ETL data pipelines; deploy AI/machine-learning models via RESTful APIs; and structure model output data and build reports to measure deployed model efficiency. If you have an inquisitive, consultative approach, excellent SQL skills, strong experience building and deploying data movement solutions and/or data-centric APIs, and a passion for delivering innovative products and services that improve the lives of our customers, a career on the WI Analytics and Reporting squad could be for you.
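As an illustrative sketch only (not part of the posting), deploying a model via a RESTful API of the kind described above might look like the following minimal Python example using only the standard library; the `score` function is a hypothetical stand-in for a real ML model, and the endpoint name is an assumption.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def score(features: dict) -> int:
    """Hypothetical stand-in for a deployed ML model's prediction."""
    return features.get("click_count", 0) * 2

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body and return the model's score as JSON.
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        payload = json.dumps({"score": score(body)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # silence per-request logging

# Serve on an ephemeral local port and issue one test request.
server = HTTPServer(("127.0.0.1", 0), PredictHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/predict",
    data=json.dumps({"click_count": 3}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))  # prints {'score': 6}
server.shutdown()
```

In production this shape is usually provided by a web framework behind an API gateway, but the request/response contract — features in, score out — is the same.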
The Expertise You Have
The Value You Deliver
Work with data scientists and business sponsors to understand the business use cases to be solved, then conduct data analysis to identify the right data sources for the project and profile the needed tables.
Develop ETL workflows in AWS/Snowflake using Python to structure data for AI model training and development, and to measure already-deployed models.
Develop data web-service APIs in Java or Python to feed data into models.
Learn quickly, be flexible, adapt, and strive to excel in a fast-paced, changing environment.
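The ETL transformation work described above can be sketched, purely for illustration, as a small Python step that turns raw digital-channel event rows into model-ready features; the column names and aggregation are hypothetical, and a real pipeline would read from and write to Snowflake rather than in-memory strings.

```python
import csv
import io
import json

def transform_clicks(raw_csv: str) -> list[dict]:
    """Aggregate raw web-click rows into per-user feature records."""
    counts: dict[str, int] = {}
    for row in csv.DictReader(io.StringIO(raw_csv)):
        counts[row["user_id"]] = counts.get(row["user_id"], 0) + 1
    # One feature record per user, sorted for deterministic output.
    return [{"user_id": u, "click_count": c} for u, c in sorted(counts.items())]

raw = "user_id,page\nu1,home\nu2,plans\nu1,fees\n"
features = transform_clicks(raw)
print(json.dumps(features))
```

The same extract-transform-load shape applies whether the source is web clicks, phone logs, or email-campaign data: pull raw events, aggregate into features keyed by entity, and load the result to a table the model training job can consume.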