Hadoop Architect - Columbus, United States - Comtech Global

    Description

    Role: Hadoop Architect

    Location: Columbus, OH (hybrid role, 2 days onsite). The selected candidate will be required to be onsite at 1980 West Broad St, Columbus, a minimum of 2 days a week.

    Please share resumes: topgo AT THE RATE OF Comtechglobal DOT COM

    Complete Description:

    The client is requesting the services of a Senior Architect 2 to function as the technical expert assisting the Smart Columbus team in developing the Smart Columbus Operating System (SCOS).

    The selected candidate will work directly with the SCOS team in researching, building, and implementing the computer applications that will fulfill Smart Columbus's mission.

    This position will function as a highly skilled Senior Architect 2 (SAR2) with specific responsibilities that include:


    • Operate effectively as a big data SME with knowledge of the latest tools, applications, and cloud infrastructures necessary to implement a powerful open-source system.
    • Operate as technical expert with intimate knowledge regarding all components that comprise the entire solution set.
    • Serve as the SME on the host of applications and environments required to serve the users.
    • Operate as SME in the implementation of open-source big data solutions, from infrastructure to applications.
    • Assist with developing the delivery Epics, Stories and Tasks that will comprise the solution set for BI.
    • Operate as the cross technology and platform SME.
    • Act as Open-Source advocate and practitioner.
    • Assist in the execution of the process operating as Agile evangelist and master.
    • Mentor teams involved in developing highly technical solutions and promote a culture that values input from all team members.
    • Promote continuous learning and deliver value to the customer.
    • Assist in the development of standards, procedures, and operating systems applications.
    • Assist with revision of existing solutions to increase operating efficiency or adapt to new requirements.
    • Prepare records and reports.
    • Demonstrated leadership in end-to-end big data environment builds and architecture development.
    • Creative thinker with experience implementing open-source solutions.
    • Student of advanced open-source BI tools, having implemented at least three BI applications end to end. Preference for Tableau, Pentaho, BIRT, JasperReports, SpagoBI, KNIME, ReportServer, and the Bitnami ReportServer Stack.
    • Expert knowledge with a track record of solution delivery using various data integration solutions in situations requiring real-time (synchronous and asynchronous) messaging, publish/subscribe models, microservices patterns, middleware, and other related methodologies in a multi-user, multi-platform, multi-tier environment.

    Required Skills:

    • 5 yrs. experience in Hadoop HDFS build and administration; certification preferred.
    • 5 yrs. experience in working with, installing, and setting up HBase, MongoDB, Maven, Document DB, Amazon DynamoDB, Bigtable, Cassandra, or Druid.
    • 5 yrs. experience working daily within an Agile team.
    • 5 yrs. experience working within a pair programming environment.
    • 7 yrs. in Software development as a developer.
    • 7 yrs. experience in IT infrastructure deployment.
    • 7 yrs. experience in Open-Source software deployment.
    • 7 yrs. experience in Web systems and application deployment.
    • 7 yrs. experience in Requirements Gathering and Use Case development.
    • 7 yrs. experience in Java, considered a journeyman in this language.
    • 7 yrs. experience delivering inside of a full TDD, CI, DevOps environment within an Agile framework.

    Desired Skills:

    • Experience working on machine learning, deep learning, artificial intelligence, and natural language processing problems and solutions.
    • Experience working on autonomous vehicle solutions.
