GCP Data Engineer in San Jose, CA

Hi,

Please find the description below and let me know your interest, along with a copy of your updated resume, at spaul@emonics.com.

Title: GCP Developer
Location: San Jose, CA
Type: Full Time

A GCP Data Developer at Zensar participates in the end-to-end cycle from opportunity identification to closure, takes complete ownership of project execution, and provides valuable expertise to the project. You will do this by:
•       Understanding customer requirements and creating technical propositions
•       Managing and owning all aspects of technical development and delivery
•       Understanding requirements and writing technical documents
•       Ensuring code reviews and developing best practices
•       Planning the end-to-end technical scope of the project and customer engagement, including sprint planning and delivery
•       Estimating effort, identifying risks, and providing technical support whenever needed
•       Demonstrating the ability to multitask and re-prioritize responsibilities based on dynamic requirements
•       Mentoring teams as needed

Skills required to contribute:
8+ years of overall development experience, with:
1.      5+ years of experience with Google Cloud Platform (GCP) products, including BigQuery, Cloud Storage, Cloud Functions, DataProc, and Data Studio.
2.      Must have Google Cloud BigQuery experience, including datasets, objects, IAM roles/bindings, Logs Explorer, and troubleshooting issues.
3.      Understanding of CI/CD pipelines and Terraform scripting for deploying objects and IAM bindings.
4.      Knowledge of data modelling (Erwin) and governance, object reviews, and best practices.
5.      Good knowledge of data warehouse concepts and ETL pipelines, including Informatica/Talend, IICS, and any RDBMS (Teradata is nice to have).
6.      Excellent communication and presentation skills.
7.      Extensive experience with the Google Cloud stack – Google Cloud Storage, Google BigQuery, Google Dataflow, Google DataProc, Google Data Studio, etc.
8.      Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler.
9.      Experience designing and building production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
10.     Good experience in designing and delivering data analytics solutions using GCP cloud-native services.
11.     Good experience in requirements analysis, solution architecture design, data modelling, ETL, data integration, and data migration design.
12.     Well versed in Waterfall, Agile, Scrum, and similar project delivery methodologies.
13.     Experienced in both internal and external stakeholder management.
14.     Google Cloud Professional Data Engineer certification is an added advantage.
15.     Nice-to-have skills: working experience with Snowflake, Databricks, and open-source stacks such as Hadoop, Hive, etc.

Kind Regards,

Saptarshi Paul
Technical Recruiter
Cell: (201) 204-0849
Fax: (201) 604-6123
Email: spaul@emonics.com
Emonics LLC
1260 Centennial Ave, Suite 1A
Piscataway, NJ 08854
