GCP Data Engineer
Sunnyvale, CA
6+ Months
$65/HR on C2C
***** Candidates must be local to CA, have 10+ years of experience, and be able to work on our W2 *****
Top Skills: Java, PySpark, Data, Airflow, Distributed Systems, AWS and GCP
- Proficiency in programming languages such as Java or Python.
- Experience with big data processing frameworks such as Hadoop or Spark.
- Expertise in data modeling, data warehousing, and data integration techniques.
- Familiarity with data storage and retrieval technologies such as SQL, NoSQL, or graph databases.
- Understanding of cloud computing platforms such as Azure or Google Cloud.
- Experience with containerization technologies such as Docker or Kubernetes.
- Knowledge of version control systems such as Git.
- Ability to design and implement scalable and efficient data pipelines.
- Experience with continuous integration and deployment tools such as Jenkins and scheduling tools like Airflow.