Data Architect - Austin, United States - Connect Tech+Talent
Description
Data Architect
Hybrid (Austin, Texas - 2 days per week in the office, and the other days working from home)
6-Month Contract
Functional Responsibilities:
A strong data integration background, with knowledge of Investment business operations (Front, Middle and Back-Office)
Familiar with all aspects of the Eagle PACE warehouse, including scheduling, report writing, data stewardship, portal, and data model/schemas
Familiar with Eagle v2017 Modules (Portfolio Data Center, Message Center, Automation Center, Scheduler, Workflow Manager, Reference Data Center, Recon Center, etc.)
Builds complex SQL queries against base and aggregate tables
Participates in the design, development, and integration of Investment databases
Design, implement, and deploy data pipelines that transform (structure and map) vendor data into Investments systems
Stay current with data vendor product offerings and how Investments may use them, and assist with tool selection
Manage and enhance investment data infrastructure through initiatives such as improving data integrity and quality with every release
Work in a fast-paced environment collaborating with business users, engineers, architects, and analysts.
Maintain data management standards to ensure that new data entities are implemented in the data model using schemas appropriate for their use, supporting good performance and analytics needs
Perform Eagle maintenance releases, including pre- and post-release data and software validation
Provide input to our framework team for enhancing the tools we utilize
Develops technical designs and specifications for complex file data pipelines/data flows and documents file data migrations
Leverage creativity to solve complex data and business problems
Documents data dictionary and data mapping/transformation rules
Works extensively with business analysts and investment professionals to analyze business problems
Supports daily activity of the investment control systems areas by creating and maintaining custom reports and programs and by providing technical support for system and software issues
Assists customers in creating and running test scripts for data loads and report development
Additional Skills:
Prefer experience in general accounting, investment accounting, account reconciliation, and writing procedures
Financial knowledge in investment banking and accounting.
Understanding of data mart structures, ETL processes, and data warehousing concepts
Desire to work in an Agile environment and use DevOps / DataOps practices such as continuous integration, continuous delivery, and continuous deployment
Hands-on and extensive experience with many of the following technologies: Crystal Reports, .NET (VB or C#), MS SQL Server, Oracle, MS Access/Excel, FIX, SWIFT, XML, and IWS/Eagle ML
Strong communication, presentation and interpersonal skills
Proficiency in explaining technical information to a non-technical audience
Ability to learn new concepts, systems, and software independently and quickly
Willingness to take initiative, seeing problems through to final completion
The ability to handle diverse situations and rapidly changing priorities under deadlines
Demonstrated experience with implementation of Eagle STAR/PACE software upgrades.
Strong aptitude for troubleshooting and problem solving
Minimum Requirements:
Graduation from an accredited four-year college or university.
Each year of related experience over the required minimum may be substituted for one year (30 semester hours) of required college credit
5 years of data analysis and data modeling/architecture experience
5 years of SQL experience
5 years of experience designing and implementing ETL technical solutions for complex data models using multiple data sources
4 years of experience with Eagle database environment including schema architecture
Strong communication and collaboration skills and good documentation habits.
Demonstrated experience with orchestration and automation of database processes.
4+ years of experience working in Eagle PACE as a data integrator implementing data pipelines