Sr. Hadoop Developer - Jacksonville, United States - TekWissen

    Description
    Overview:

    TekWissen Group is a workforce management provider operating throughout the USA and in many other countries worldwide. Our client is a mutual insurance holding company, formed through a reorganization of a larger entity, that operates in Florida, USA. It is one of the top 10 health insurance companies in the US, with sales of more than $15B.

    Title: Sr. Hadoop Developer

    Work Location: Jacksonville, FL 32246

    Duration: 6 Months

    Job Type: Contract

    Work Type: Hybrid (2-3 days per week in Jacksonville)


    JOB SUMMARY:
    • A Hadoop developer is responsible for the design, development, and operation of systems that store and manage large amounts of data. Most Hadoop developers have a computer software background and hold a degree in information systems, software engineering, computer science, or mathematics.
    • IT Developers are responsible for the development, programming, and coding of Information Technology solutions. They document detailed system specifications, participate in unit testing and maintenance of planned and unplanned internally developed applications, and evaluate and performance-test purchased products. IT Developers are responsible for including IT Controls to protect the confidentiality, integrity, and availability of the application and of the data it processes or outputs. IT Developers are assigned to moderately complex development projects.
    Essential Functions:
    • Write code for moderately complex system designs. Write programs that span platforms. Code and/or create Application Programming Interfaces (APIs).
    • Write code for enhancing existing programs or developing new programs.
    • Review code developed by other IT Developers.
    • Design, implement, and maintain big data solutions using Spark/Scala
    • Work with the team to identify business needs and pain points that can be addressed with data and analytics
    • Understand complex data sets and ETL processes, and how they can be optimized using Spark
    • Design and develop high-performance Spark jobs for data transformation, cleansing, and enrichment (an illustrative sketch follows this list)
    • Tune Spark jobs for optimal performance and resource utilization
    • Monitor Spark cluster health and performance, and take corrective action as needed
    • Collaborate with other teams to integrate Spark jobs into the overall data pipeline
    • Write unit tests and integration tests for Spark jobs
    • Debug production issues and provide root cause analysis
    • Keep abreast of new features and capabilities in Spark, and how they can be leveraged to improve our data processing
    • Document Spark jobs and related processes
    • Provide input to and drive programming standards.
    • Write detailed technical specifications for subsystems. Identify integration points.
    • Report missing elements found in system and functional requirements and explain impacts on subsystem to team members.
    • Consult with other IT Developers, Business Analysts, Systems Analysts, Project Managers and vendors.
    • "Scope" time, resources, etc., required to complete programming projects. Seek review from other IT Developers, Business Analysts, Systems Analysts or
    • Project Managers on estimates.
    • Perform unit testing and debugging. Set test conditions based upon code specifications. May need assistance from other IT Developers and team members to debug more complex errors.
    • Support the transition of applications throughout the Product Development life cycle. Document what has to be migrated. Subsystems may require additional coordination points.
    • Research vendor products and alternatives. Conduct vendor product gap analysis and comparison.
    • Accountable for including IT Controls and following standard corporate practices to protect the confidentiality, integrity, and availability of the application and of the data it processes or outputs.
    • The essential functions listed represent the major duties of this role; additional duties may be assigned.
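    Illustrative sketch (not part of the client's requirements): the Spark/Scala code below shows, at a high level, the kind of data transformation, cleansing, and enrichment job described in the essential functions above. All dataset paths, the job name, and the column names (memberId, claimId, claimAmount, serviceDate) are hypothetical placeholders, assumed only for illustration.

    // Minimal Spark/Scala sketch of a cleanse-and-enrich batch job.
    // All paths and column names below are hypothetical examples.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object ClaimsCleanseJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("claims-cleanse") // hypothetical job name
          .getOrCreate()

        // Read raw input (placeholder path)
        val raw = spark.read.parquet("/data/raw/claims")

        // Cleanse: drop records missing a key, normalize types, deduplicate
        val cleansed = raw
          .filter(col("memberId").isNotNull)
          .withColumn("claimAmount", col("claimAmount").cast("decimal(12,2)"))
          .withColumn("serviceDate", to_date(col("serviceDate"), "yyyy-MM-dd"))
          .dropDuplicates("claimId")

        // Enrich: join against a small reference dataset (placeholder path);
        // the broadcast hint avoids shuffling the large claims side
        val members = spark.read.parquet("/data/ref/members")
        val enriched = cleansed.join(broadcast(members), Seq("memberId"), "left")

        // Write partitioned output for downstream pipeline stages
        enriched.write
          .mode("overwrite")
          .partitionBy("serviceDate")
          .parquet("/data/curated/claims")

        spark.stop()
      }
    }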
    Job Requirements:
    • Experience with and understanding of unit testing, release procedures, coding design and documentation protocols, and change management procedures
    • Proficiency using versioning tools.
    • Thorough knowledge of Information Technology fields and computer systems
    • Demonstrated organizational, analytical and interpersonal skills
    • Flexible team player
    • Ability to manage tasks independently and take ownership of responsibilities
    • Ability to learn from mistakes and apply constructive feedback to improve performance
    • Must demonstrate initiative and effective independent decision-making skills
    • Ability to communicate technical information clearly and articulately
    • Ability to adapt to a rapidly changing environment
    • In-depth understanding of the systems development life cycle
    • Proficiency programming in more than one object-oriented programming language
    • Proficiency using standard desktop applications such as the Microsoft Office suite and flowcharting tools such as Visio
    • Proficiency using debugging tools
    • High critical thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy
    Specific Tools/Languages Required:
    • HADOOP
    • Spark
    • Scala
    • Ab Initio
    REQUIRED EDUCATION/EXPERIENCE:
    • Related Bachelor's degree in an IT related field or relevant work experience
    • 5-8 years of related work experience as an IT development/programming/coding professional within a Hadoop environment (or an equivalent combination of transferable experience and education)
    • Strong, demonstrated experience as a senior developer using Hadoop, Spark & Scala (all three required)
    • Experience with Agile Methodology
    • Experience with Ab Initio technology preferred
    TekWissen Group is an equal opportunity employer supporting workforce diversity.