GCP Architect – WI (Onsite) – NEED ARCHITECTS, NOT ENGINEERS OR DEVELOPERS

Hi Vendors,

This is Akash, a Staffing Specialist at Delta Systems. I am reaching out to you about an exciting job opportunity with one of our clients. Please find the requirement below.

Please send resumes to akash.goyal@delta-ssi.net


Role: GCP Architect

Work location: Abbotsford, WI 54405 (Onsite)

 

JOB DESCRIPTION:

 

This position requires in-depth knowledge and expertise in GCP services, architecture, and best practices. The GCP Architect will collaborate with the lead architects to design, implement, test, debug, deploy, document, and manage scalable, reliable ‘Big Data’ pipelines in GCP, and will be responsible for staying up to date with the latest GCP technologies. They will work in an Agile Scrum environment to deliver new and innovative solutions, and keep current with relevant technology and industry standards in order to maintain and improve the functionality of the applications they author.

 

Responsibilities

• At the direction of lead architects, develop and implement technical efforts to build and deploy GCP “Big Data” pipelines for large-scale data processing, reporting, and advanced analytics.

• Participate in all aspects of the software development lifecycle for GCP solutions, including planning, requirements, development, testing, and quality assurance.

• Ensure application performance, uptime, and scale, maintaining high standards for code quality and thoughtful design.

• Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures.

• Maintain GCP environments in accordance with the company security guidelines.

• Track development and operational support via user stories and decomposed technical tasks in the provided tooling, including Git, Maven, and JIRA.

 

Desired skills and qualifications

 

• Proven ability to collaborate with multidisciplinary teams of business analysts, developers, data scientists, and subject-matter experts.

• Expertise in building and optimizing data pipelines, data architectures, and datasets, grounded in solid data warehouse concepts.

• Expertise in data concepts such as profiling, joins, aggregation, projection, and explosion.

• Excellent coding, debugging and testing skills.

• Analytical, troubleshooting, organizational, and problem-solving skills, applied to the Big Data domain.

• Ability to write high-performance, reliable, and maintainable code.

• Expertise in version control tools such as Git or equivalent.

• Knowledge of industry standards & best practices across multiple technologies.

• Knowledge of and experience with Unix/Linux platforms, shell scripting, and scripting languages such as Python.

• Knowledge of workflow/schedulers like Cloud Composer or Airflow.

• Hands-on experience with CI/CD tools such as Screwdriver and Jenkins.

• Hands-on experience with Hadoop (Dataproc) and its ecosystem, including Pig, Hive, and Spark.

• Expertise in manipulating data with query languages such as SQL and HQL.

• Expertise with various data formats, such as Avro, Parquet, ORC, or similar.

• Excellent knowledge of database structures, theories, principles, and practices.

• Experience with GCP database technologies (e.g., BigQuery).

 

Akash Goyal | Technical Recruiter

Delta System & Software, Inc.

akash.goyal@delta-ssi.net

LinkedIn: linkedin.com/in/akash-goyal-4470551a0

www.deltassi.com

| USA | Canada | India | Europe | Middle East | Australia |
