Data Engineer - United States - Cynet Systems

    Cynet Systems - United States

    4 weeks ago

    Full time
    Description

    Responsibilities:


    Implement data solutions provided by architects that integrate enterprise streaming data from multiple systems, enabling real-time ingestion & distribution along with real-time data insights.

    Develop highly scalable, flexible, resilient & cost-efficient data solutions to ingest, process and utilize our data across the enterprise.


    Collaborate with cross-functional teams, including business stakeholders, engineers, enterprise architects and IT teams, to develop and deliver high-quality data solutions.

    Lead complex technology initiatives, including enterprise-wide efforts with broad impact.

    Utilize a thorough understanding of available technology, tools, and existing designs.

    Leverage knowledge of industry trends to build best-in-class technology that provides competitive advantage.

    Participate in code reviews, providing valuable feedback to maintain code quality and adherence to best practices.

    Maintain detailed documentation of your work and changes, following best practices.

    Ensure high operational efficiency and quality of your solutions to meet SLAs and support commitment to our customers.

    Work with Infrastructure Engineers and System Administrators as appropriate in designing the infrastructure.

    Support ongoing data management efforts for Development, QA, and Production environments.


    Qualifications:
    Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.

    Minimum of 8-10 years of experience in data engineering.

    Expertise in cloud databases such as AWS Aurora PostgreSQL, DynamoDB, and MongoDB for efficient data processing, ingestion, and distribution.

    Expertise in optimizing database performance through sharding, partitioning, indexing, query optimization, and other tuning techniques.

    Expertise in cloud-based data storage solutions like Amazon S3 or Google Cloud Storage.

    Expert SQL and database admin skills working with large structured and unstructured data.

    Experience with workflow orchestration tools such as Airflow and Orkes.


    Proficiency in data streaming services such as Kafka, Kinesis, and IBM MQ.

    Expertise in developing programs that produce and consume payloads from data streaming solutions such as Kafka and Kinesis Data Streams.

    Experience with the Java programming language, including frameworks such as Spring Boot.

    Experience with API development (REST, GraphQL) and API gateways.

    Strong understanding of scalable containerized computing solutions such as Kubernetes and AWS ECS.


    Proficiency in data formats such as JSON, HL7, EDI, and XML.

    Good understanding of MDM systems and concepts, and tools such as Reltio and IBM InfoSphere.


    Familiarity with the system development lifecycle (SDLC), Agile development, DevSecOps, and standard software development tools such as Git and Jira.

    Experience with Infrastructure as Code (IaC) technologies such as CloudFormation or Terraform.

    Familiarity with AI/MLOps concepts and Generative AI technology (good to have).

    Deep technical knowledge of at least one public cloud's services (preferably AWS).

    Excellent problem-solving and analytical skills.

    Strong communication and collaboration skills with both technical and non-technical stakeholders.

    Experience in the healthcare industry is preferred.

