
GCP Data Engineer: Quick Overview

Job Title: GCP Data Engineer
Contract Type: Long-Term Contract
Job Description:

Key Responsibilities:
1. Data Pipeline Development: Design, build, and maintain data pipelines on GCP, utilizing tools such as Dataflow, Dataprep, and BigQuery for batch and real-time data processing (an illustrative sketch follows this list).
2. Data Ingestion: Implement efficient data ingestion processes to collect data from various sources, ensuring data is collected reliably and securely.
3. Data Transformation: Perform data transformation, cleansing, and enrichment to prepare data for analysis and reporting, leveraging tools like Dataflow and Cloud Dataprep.
4. Data Quality Assurance: Establish data quality checks and monitoring mechanisms to ensure data accuracy, consistency, and completeness.
5. Data Warehousing: Design and manage data warehousing solutions using Google BigQuery or other appropriate GCP services.
6. Security and Compliance: Ensure data security and compliance with relevant data privacy regulations, implementing access controls, encryption, and auditing as necessary.
7. Documentation: Create and maintain comprehensive documentation for data pipelines, processes, and best practices to facilitate knowledge sharing within the team.
8. Collaboration: Collaborate closely with data scientists, analysts, and other cross-functional teams to understand data requirements and deliver solutions aligned with business needs.
9. Monitoring and Troubleshooting: Implement monitoring and alerting solutions to proactively identify and address data pipeline issues.
10. Cost Optimization: Optimize data storage and processing costs on GCP by managing resource allocation efficiently.
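
For context, here is a minimal sketch of the kind of streaming pipeline described in responsibilities 1-3 and 5, assuming the Apache Beam Python SDK (the programming model used by Dataflow). It reads JSON events from a Pub/Sub topic and appends them to a BigQuery table; the project, topic, table, and field names are illustrative placeholders, not part of this posting.

```python
# Minimal streaming pipeline sketch (Apache Beam Python SDK assumed).
# Project, topic, table, and field names below are illustrative placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload (JSON) into a BigQuery row dict."""
    event = json.loads(message.decode("utf-8"))
    return {
        "event_id": event.get("event_id"),
        "event_type": event.get("event_type"),
        "event_ts": event.get("event_ts"),
    }


def run() -> None:
    # On Dataflow, runner, project, and region are normally passed as command-line flags.
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

Launched with the Dataflow runner, a job of this shape is what the role would build, monitor, and cost-optimize; batch variants would swap the Pub/Sub source for Cloud Storage or BigQuery reads.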

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a data engineer, with a focus on GCP technologies.
  • Strong expertise in GCP services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Proficiency in SQL and database management.
  • Experience with data modelling and ETL (Extract, Transform, Load) processes.
  • Familiarity with cloud-native data architectures and best practices.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.
  • Ability to work independently and meet project deadlines.

Preferred Qualifications:

  • GCP certification(s) in relevant areas.
  • Experience with CI/CD (Continuous Integration/Continuous Deployment) pipelines for data engineering.
  • Knowledge of data governance and data cataloging tools on GCP.

Thanks & Regards
Pradeep kumar .V.N.R

| Vuesol Technologies Inc.

IT Recruiter
Contact: 470-649-5121

