GCP Data Engineer (Spark, Scala, GCP)
Location: Sunnyvale CA (Hybrid)
Duration: 12+ Months
Need a strong GCP Data Engineer resume.
Need 11–12+ years of experience.
Mandatory:
Spark – 8+ years of experience
Scala – 8+ years of experience
GCP – 3+ years of experience
Hive – 8+ years of experience
SQL – 8+ years of experience
ETL Process / Data Pipelining – 8+ years of experience
Retail domain experience (preferred)
Requirements:
8+ years of hands-on experience with developing data warehouse solutions and data products.
4+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Scala, Airflow, or a workflow orchestration solution is required.
4+ years of experience with GCP, including GCS, Dataproc, and BigQuery.
2+ years of hands-on experience in data modeling (Erwin) and designing schemas for data lakes or RDBMS platforms.
Experience with programming languages: Python, Java, Scala, etc.
Experience with scripting languages: Perl, Shell, etc.
Experience working with, processing, and managing large data sets (multi-TB/PB scale).
Exposure to test-driven development and automated testing frameworks.
Background in Scrum/Agile development methodologies.
Capable of delivering on multiple competing priorities with little supervision.
Excellent verbal and written communication skills.
Bachelor’s degree in computer science or equivalent experience.
Thanks and regards,
mayank@empowerprofessionals.com