Data Engineer/Developer - New Castle, United States - Axelon


    1 month ago

    Description
    Global Financial Firm located in NEW CASTLE, DE has an immediate contract opportunity for an experienced Data Engineer/Developer

    This role is currently on a hybrid schedule.
    You will need reliable internet, a computer, and an Android or iPhone device for remote access into the client's systems during remote work.
    You will be expected in the office 2-3 days per week, depending on team requirements.

    Video or face-to-face interviews are required prior to all offers.

    Excited to grow your career?
    We value our talented employees, and whenever possible we strive to help our associates grow professionally before recruiting new talent to our open positions. If you think an open position is right for you, we encourage you to apply.
    Our people make all the difference in our success.

    If you are a technologist at heart, this is an amazing opportunity for you. As a key member of the data engineering team, you will work closely with full-stack data engineers, product managers, cross-functional partners, and the larger technology organization to get high-quality products and new features built, tested, and released on time for the Information Management team, part of Operations & Technology at the client.

    • You will be responsible for bringing many fast-changing, moving parts together into a product.

    • You will need excellent communication and collaboration skills.

    • You will be responsible for identifying and managing risks, and for making sound judgments about the quality and speed of deliverables and deployment to production.

    • You will be a key player in our data transformation to digitize our business.

    • You will be responsible for leading development on the TTS Big Data platform.

    Job Description:
    Looking for strong, hands-on Spark/Scala/Java/HDFS candidates with a strong academic record, ideally a degree in a mathematical or scientific discipline.
    This role is for a Data Engineering Lead working on the Big Data Platform. The team is responsible for maintaining and developing leading Big Data initiatives and use cases that deliver business value.

    Job Background/context:
    The Big Data platform supports operational real-time event-based processing and compute applications as well as analytics consumption use cases including machine/deep learning.
    The technology team is responsible for building and maintaining a multitude of applications including:
    • Real-time ingestion, stream processing, and data distribution via Big Data APIs.
    • Building out canonical models and data conformance.
    • Implementing best-in-class data management and data ingestion.
    • Leveraging new storage engines such as Kudu that enable analytics on fast-changing data.
    • Leveraging GPU implementations to enable advanced machine learning.
    • Enhancing self-service capabilities for data science and Client practitioners.
    This is a hands-on development role that offers exposure to the full development cycle while working closely with our business and technology stakeholders.
    • Analysis and development across lines of business including Payments, Digital Channels, Liquidities, and Trade.
    • Cross-train and cross-fertilize functional and technical knowledge.
    • Align to Engineering Excellence Development principles and standards.
    • Promote and increase our development productivity scores for coding.
    • Fully adhere to and evangelize a complete Continuous Integration and Continuous Deployment pipeline.

    Development Value:
    This role offers the successful individual the opportunity to establish their profile in the Data Innovation and Architecture organization.
    It also provides an opportunity in Institutional Client banking to work closely with the business to deliver value-added data solutions.

    Knowledge/Experience (Must):

    • 10+ years of experience in Hadoop/big data technologies.

    • Experience with Spark, Storm, Kafka, or equivalent streaming/batch processing and event-based messaging.

    • Experience with relational and NoSQL database integration and data distribution principles.

    • Hands-on experience with the Hadoop eco-system (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr)

    • Experience with API development and use of JSON/XML/Hypermedia data formats.

    • Strong development/automation skills

    • Experience with all aspects of DevOps (source control, continuous integration, deployments, etc.)

    • 5+ years of hands-on experience as a Scala developer (with a prior Java background).

    Knowledge/Experience (Preferred):

    • Experience in Core Banking functionality for generating various hand-offs

    • Experience with containerization and related technologies (e.g. Docker, Kubernetes)

    • Comprehensive knowledge of the principles of software engineering and data analytics

    • Knowledge of Agile (Scrum) development methodology is a plus.

    • Experience with Cloudera, Hortonworks, or AWS EMR and S3 is a plus.

    Qualifications:
    Strong academic record, ideally with a Bachelor's degree in engineering, mathematics, or a scientific discipline.

    • Strong Communication skills

    • Self-Motivated

    • Willingness to learn

    • Excellent planning and organizational skills

    Client is an equal opportunity employer. Accordingly, we will make accommodations to respond to the needs of people with disabilities (including, without limitation, physical and mental health disabilities) during the recruitment process and otherwise in accordance with law. Individuals who view themselves as Aboriginals, members of visible minority or racialized communities, and people with disabilities are encouraged to apply.