Programmer Analyst | 1911 - San Diego, United States - ACL Digital


    Job Description:
    Top 5 Required Skills:
    1. AWS
    2. Data Engineering
    3. SQL
    4. Python
    5. Informatica

    Technologies (specific tools and/or programs you would want to see on their resumes)

    • AWS, Redshift, Apache Airflow, Glue, Python, Informatica, Snowflake, IICS, ERP, Salesforce
    Keywords (specific words, projects, programs you would want to see on their resumes)

    • AWS, Redshift, Apache Airflow, Glue, Python, Informatica, Snowflake, IICS, ERP, Salesforce
    • Data discovery, modeling, engineering, pipeline, data warehouse
    Education Requirement

    • Bachelor of Science in Computer Science, Information Technology, Data Science, or a related field.
    Required Years of Experience

    • 3 years min
    Physical Requirements

    • Push Max Weight Limit = 0
    • Pull Max Weight Limit = 0
    • Lift Max Weight Limit = 0
    • Forklift Required (Y/N): N
    Driving Requirements

    • Are there driving responsibilities no matter how minimal with this role? No
    • (If Yes) How many hours per week? N/A
    Work Location Requirement: 100% Onsite
    Work Address: 5775 Morehouse Drive, San Diego, CA

    Qualcomm Home Building: O


    Work Days:
    Mon-Fri

    Exact Shift Time: 8:30 AM-5:00 PM PST

    • Weekly / Daily Expected Hours: 40.0 / 8.0


    As a Data Engineer at Qualcomm, you will play a pivotal role in shaping and implementing robust data architecture while constructing efficient data pipelines aligned with our organizational goals.

    This position supports various business functions, including Supply Chain, Finance, Sales & Marketing, and other corporate areas.

    The ideal candidate will possess a deep understanding of data architecture principles, data discovery/modeling, and the seamless integration of emerging technologies into a cohesive data ecosystem that fosters innovation, operational excellence, and strategic insights.


    Principal Duties and Responsibilities:

    • Collaborate with cross-functional and global teams to comprehend business requirements and translate them into effective solutions.
    • Design and manage resilient, scalable data models and architectures that cater to the evolving needs of the business, emphasizing efficiency, quality, and security.
    • Implement data pipelines using tools such as Apache Airflow, AWS Glue, Redshift, S3, Python, and Informatica Cloud.
    • Facilitate communication within and outside the project team to resolve conflicts related to implementation schedules, design complexities, and other challenges.
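
    As an illustration only (not part of the posting), the extract-transform-load pattern behind the pipeline duties above can be sketched in plain Python. The table name, column names, and the use of the standard-library sqlite3 module as a stand-in for a warehouse such as Redshift are all assumptions for the sketch; a production pipeline would orchestrate these steps with a tool like Apache Airflow or AWS Glue:

    ```python
    import sqlite3

    def extract():
        # Hypothetical source rows; in practice these would come from a
        # source system such as S3, Salesforce, or an ERP extract.
        return [("2024-01-01", "widget", 120), ("2024-01-02", "widget", 95)]

    def transform(rows):
        # Example quality rule: drop non-positive quantities and
        # normalize SKU casing before loading.
        return [(day, sku.upper(), qty) for day, sku, qty in rows if qty > 0]

    def load(rows, conn):
        # sqlite3 stands in for the target warehouse in this sketch.
        conn.execute("CREATE TABLE IF NOT EXISTS sales (day TEXT, sku TEXT, qty INT)")
        conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
        conn.commit()

    def run_pipeline(conn):
        load(transform(extract()), conn)

    conn = sqlite3.connect(":memory:")
    run_pipeline(conn)
    print(conn.execute("SELECT COUNT(*), SUM(qty) FROM sales").fetchone())
    ```

    In an Airflow deployment, each of the three functions would typically become its own task so failures can be retried per step rather than rerunning the whole pipeline.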
    Good-to-Have Qualifications

    • Experience in Data Engineering leveraging AWS Redshift, S3, Glue, Airflow, Python, SQL, and Kubernetes.
    • Familiarity with Informatica Cloud and Informatica PowerCenter is highly desirable.
    • Strong expertise in data modeling tools and methodologies, encompassing both structured and unstructured data environments.
    • Demonstrated experience in data governance, data quality management, and data security practices.
    • Exceptional analytical and problem-solving skills, enabling the translation of intricate technical challenges into actionable solutions.
    • Ability to quickly learn and adapt to new technologies.
    • Strong sense of ownership and growth mindset.
    • Curiosity about the problem domain and an analytical approach.
    • Strong influence skills to drive business adoption and change.
    • Experience in data discovery of source systems such as Oracle E-Business Suite and Salesforce is preferable.
