Location: Bolingbrook, IL (Hybrid – 3 days on-site biweekly; local candidates only)
Visa: H1B, H4, USC
Candidates must share their passport (PP) number.
Skills:
GCP: BigQuery, Dataflow, Compute Engine, Cloud Storage, workload management, cost optimization.
GCP Administration (70%)
• Perform administrative tasks such as managing user access and creating service accounts; work with Cloud Security on provisioning and troubleshooting.
• Monitor day-to-day performance of GCP services and take proactive steps to prevent data pipeline failures and slow-running jobs.
• Manage upgrades for GCP services such as Cloud Composer. Work closely with cloud engineering to enable/configure GCP services.
• Manage data security, PII tagging, and authorizations. Work with the Compliance team to support compliance requirements such as SOX and CCPA.
• Manage cloud costs from an SRE perspective.
• Perform log analytics for BigQuery; use workload management to allocate compute resources for data analysis.
• Knowledge of upgrading Cloud Composer in GCP.
• Provide technical support to GCP users, troubleshooting issues, and resolving errors.
GCP Development (30%)
• Lead offshore GCP platform engineers to support development tasks.
• Develop CI/CD pipelines for deployment of artifacts.
• Create Looker dashboards for BigQuery performance monitoring, slot management, and user queries.
• Work with Platform Architects for building new capabilities/frameworks or setting up policies/guidelines for development teams.
• Perform POCs of new GCP services, including new BigQuery features, and drive their implementation.
• Work with Cloud Security to build security frameworks and perform vulnerability assessments to detect and resolve platform security issues.
• Develop data engineering pipelines for data ingestion into the GCP platform/BigQuery.
Requirements:
• Bachelor's degree in Computer Science, Information Technology, or related field.
• Hands-on experience administering GCP services such as Dataflow, Pub/Sub, BigQuery, Cloud Storage, Cloud Functions, Cloud Composer, Dataproc, etc.
• Experience/familiarity with ETL processes, streaming data, Kafka, APIs, and SQL.
• Experience scripting with Python and Windows/Unix commands.
• Expertise in cost management as an SRE.
• Experience with log analytics for BigQuery and with workload management to allocate compute resources for data analysis.
• Experience with CI/CD, Git/Jenkins workflows and SDLC cycle supporting multiple non-production and production environments.
• Experience in automation of the deployment of services and configuration updates is a plus.