Hello,
I have an open position with one of our clients; please let me know if you are interested.
Job Title: GCP Data Engineer
Job Location: Manhattan, NYC – Local Only
Contract: 12+ Months
U.S. Citizens and Green Card holders only
JOB DESCRIPTION:
Must have recent experience (within the past 6 years) in the financial services sector (banking or brokerage); no exceptions.
Client is seeking Senior GCP Data Engineers who are subject matter experts in Google Cloud Platform (GCP), Python, SQL, and data pipelines. The ideal candidate will have a proven track record of designing and developing scalable, secure, and high-performance data engineering solutions in the GCP ecosystem—specifically within banking or financial services environments. This role is critical to building and maintaining the infrastructure that powers enterprise-grade analytics and data-driven decision-making.
Key Responsibilities
• Design, build, and optimize end-to-end data pipelines using GCP-native services such as Dataflow, Dataproc, and Pub/Sub.
• Implement data ingestion, transformation, and processing workflows using Apache Beam, Apache Spark, and scripting in Python.
• Manage and optimize data storage using BigQuery, Cloud Storage, and Cloud SQL to ensure performance, scalability, and cost-efficiency.
• Enforce enterprise-grade data security and access controls using GCP IAM and Security Command Center.
• Monitor and troubleshoot data pipelines using Cloud Monitoring and Cloud Logging (formerly Stackdriver) to ensure high availability and low latency.
• Collaborate closely with analysts, data scientists, and cross-functional product teams to understand business needs and deliver robust data solutions.
Required Skills and Qualifications
• 12+ years of overall IT experience, with deep specialization in data engineering.
• 8+ years of hands-on experience designing, building, and maintaining data pipelines in enterprise environments.
• 5+ years of recent experience working with Google Cloud Platform (GCP)—specifically within a major U.S. bank or brokerage firm (required, no exceptions).
• Strong expertise in:
· GCP services: Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Storage, Cloud SQL.
· Data processing frameworks: Apache Beam and Apache Spark.
· Scripting and automation: Advanced proficiency in Python and SQL for data manipulation, transformation, and querying.
• Proven experience implementing GCP IAM policies and managing data access/security at scale.
• Demonstrated ability to ensure low-latency, high-throughput data systems through performance tuning and best practices.
• Deep understanding of data compression, storage optimization, and cost-effective cloud design.
———
Thanks & Regards
Abhilash Chaudhary
Senior Technical Recruitment Executive
Phone: +1-(771)-333-7122
Email: Abhilash.Chaudhary@codinix.com