GCP Technical Architect - San Juan Capistrano, United States - Pyromis
Description
Position Details:
GCP Technical Architect
Location:
San Juan Capistrano, California
Openings:
2
Salary Range:
Description:
Job Title:
GCP Technical Architect
Location:
Charlotte, NC (Onsite)
Position Type:
Contract
Responsibilities
3+ years of overall experience architecting, developing, testing, and implementing Big Data projects using GCP components (e.g., BigQuery, Composer, Dataflow, Dataproc, DLP, Bigtable, Pub/Sub, Cloud Functions).
Minimum 4+ years of experience with data management strategy formulation, architectural blueprinting, and effort estimation.
Good understanding and knowledge of Teradata/Hadoop data warehouses.
Advocate engineering and design best practices, including design patterns, code reviews, and automation (e.g., CI/CD, test automation).
Cloud capacity planning and cost-based analysis.
Experience working with large datasets and solving difficult analytical problems.
Regulatory and Compliance work in Data Management.
Tackle design and architectural challenges such as performance, scalability, and reusability.
End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
Good understanding of data pipeline design and data governance concepts.
Experience deploying code from lower environments to production.
Good communication skills to understand business requirements.
Required Skills and Abilities:
Mandatory Skills - BigQuery, Composer, Python/Java, GCP Fundamentals, Teradata/Hadoop
Secondary Skills - Ab Initio, Dataproc, Kubernetes, DLP, Pub/Sub, Dataflow, Shell Scripting, SQL, Security (Platform & Data) concepts.
Expertise in Data Modeling
Detailed knowledge of Data Lake and Enterprise Data Warehouse principles
Expertise in ETL migration from on-premises systems to GCP Cloud
Familiarity with the Hadoop ecosystem (HBase, Hive, Spark) and emerging data mesh patterns.
Ability to communicate with customers, developers, and other stakeholders.
Good To Have - Certifications in any of the following: GCP Professional Cloud Architect, GCP Professional Data Engineer