AWS Salesforce Data Engineer/Integration Team Lead - Jersey City, United States - NAM Info Inc

    Description

    AWS Salesforce Data Engineer/Integration Team Lead

    Full Time

    Onsite in Jersey City, NJ

    BANKING DOMAIN EXPERIENCE IS A MUST

    Skill set: AWS, Java, Python, PySpark, Big Data, Databricks, Salesforce

    Must be hands-on and an excellent communicator

    Must be able to lead solution design discussions

    Manage a team of data engineers responsible for building out an AWS integration layer connecting 25+ applications to the CRM platform

    Job responsibilities

    1. Executes creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
    2. Develops secure high-quality production code, and reviews and debugs code written by others
    3. Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
    4. Leads evaluation and design sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
    5. Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
    6. Adds to team culture of diversity, equity, inclusion, and respect
    7. Manages a team of developers to build the data integration layer for the Strategic CRM Platform

    Required qualifications, capabilities, and skills

    1. Formal training or certification on data engineering concepts and 8+ years of applied experience
    2. Advanced proficiency in one or more programming languages, such as Java or Python
    3. Hands-on practical experience delivering data pipelines
    4. Proficient in all aspects of the Software Development Life Cycle
    5. Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
    6. Demonstrated proficiency and experience with cloud-native distributed systems
    7. Ability to develop reports, dashboards, and processes to continuously monitor data quality and integrity
    8. Working knowledge of Bitbucket and Jira
    9. Excellent communicator

    Preferred qualifications, capabilities, and skills

    1. Hands-on experience building data pipelines on AWS using Lambda, SQS, SNS, Athena, Glue, EMR
    2. Strong experience with distributed computing frameworks such as Apache Spark, specifically PySpark
    3. Strong hands-on experience building event-driven architectures using Kafka
    4. Experience writing Splunk or CloudWatch queries and Datadog metrics

    Kindly reply with your resume to Email: -