No more applications are being accepted for this job
Data Platform Engineer - San Francisco, United States - Motion Recruitment
Description
Our large crypto company is looking for a contract Data Platform Engineer. This is a remote contract position.
Contract Duration: 12 months
Required Skills & Experience
4+ years of relevant work experience.
Fluent SQL skills for ETL/ELT, data quality, and reporting.
Strong Python or Spark backend development skills.
General experience working with Snowflake, Databricks or Trino.
Experience with core AWS services and concepts (S3, IAM, ASG, RDS).
Expertise with Python, Airflow, and Looker/Superset for scaling data pipelines and models.
Desired Skills & Experience
Computer Science/Engineering degree is preferred.
Snowflake/Databricks/Hadoop training certificate is a plus.
Experience with Kafka or Pub/Sub in a streaming ecosystem.
What You Will Be Doing
Daily Responsibilities
The Data Platform team builds and operates systems that centralize all of the client's data, making it easy for teams across the company to access, transform, and consume/serve that data for analytics, ML, and powering end-user experiences.
As an engineer on the team, you will manage scalable data storage and pipelines and optimize data models, system efficiency, and developer experience.
For example, you will onboard new datasets from MongoDB/RDS/Kafka to Snowflake and Databricks, conduct impact analysis based on data lineage, and refactor data models for quality, reusability, performance, and cost efficiency.