Data Engineer || Phoenix, AZ || Contract


Data Engineer



Location: Phoenix, AZ (hybrid, 3 days/week on-site)

Duration: 12+ months

Interview Mode: Phone and Skype



Job Description:


The role sits in Phoenix and will be in the office 3 days/week.

Contract to hire.

Candidates must be very strong with GCP.

This position is on the same SLPM team as the Kotlin/React/Java roles.

The client is building a new Financial Crimes Platform to bring these capabilities together in a fast, modern system.


As a Data Engineer at our client, you will play a crucial role in designing, building, and maintaining the data architecture that powers our analytical and operational capabilities. You will work closely with developers, QA engineers, Scrum Masters, and other stakeholders to ensure data availability, reliability, and integrity. Your expertise will be key in transforming raw data into actionable insights that drive our business forward.


Key Responsibilities:


Design and Develop Data Pipelines:


Create and maintain scalable ETL (Extract, Transform, Load) processes to gather, process, and store data from various sources.

Optimize data pipelines for performance and reliability.
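The ETL responsibilities above can be sketched as a minimal extract/transform/load pipeline. This is an illustrative example only, not the client's actual stack: the table, field names, and sample records are hypothetical, and SQLite stands in for the real target warehouse.

```python
import sqlite3

# Hypothetical raw records from an upstream source (names and values are illustrative).
RAW_TRANSACTIONS = [
    {"id": "t1", "amount_cents": "1250", "currency": "usd"},
    {"id": "t2", "amount_cents": "9900", "currency": "USD"},
]

def extract():
    """Extract: pull raw records from the source system."""
    return list(RAW_TRANSACTIONS)

def transform(records):
    """Transform: normalize types and casing before loading."""
    return [
        (r["id"], int(r["amount_cents"]) / 100, r["currency"].upper())
        for r in records
    ]

def load(rows, conn):
    """Load: write the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS transactions "
        "(id TEXT PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM transactions").fetchone())
# → (2, 111.5)
```

In a production GCP setting the same extract/transform/load shape would typically be expressed with services such as Dataflow and BigQuery rather than in-process Python.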

Database Management:


Design, implement, and manage robust, scalable, and efficient databases.

Ensure data security, availability, and performance.

Data Integration:


Integrate data from multiple sources, ensuring consistency and accuracy.

Work with APIs and other data integration tools.

Data Quality and Governance:


Implement data quality checks and ensure data governance practices are followed.

Monitor data quality and integrity, addressing any issues promptly.
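A common way to implement the quality checks described above is a rule-based validator run before a batch is published. The sketch below is a hypothetical illustration (the field names and rules are assumptions, not the client's actual governance framework).

```python
# Hypothetical rule-based data quality checks; field names are illustrative.

def check_quality(rows):
    """Return a list of (row_index, problem) pairs; an empty list means the batch passes."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1: every record needs a unique, non-empty id.
        if not row.get("id"):
            issues.append((i, "missing id"))
        elif row["id"] in seen_ids:
            issues.append((i, "duplicate id"))
        else:
            seen_ids.add(row["id"])
        # Rule 2: amounts must be present and non-negative.
        if row.get("amount") is None or row["amount"] < 0:
            issues.append((i, "invalid amount"))
    return issues

batch = [
    {"id": "t1", "amount": 12.5},
    {"id": "t1", "amount": -3.0},   # duplicate id and negative amount
    {"id": "", "amount": 99.0},     # missing id
]
print(check_quality(batch))
# → [(1, 'duplicate id'), (1, 'invalid amount'), (2, 'missing id')]
```

Surfacing the failing row index and rule name, as above, is what makes it practical to "address issues promptly" rather than silently dropping bad records.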



Collaboration:


Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions that meet their needs.

Work closely with DevOps and IT teams to ensure seamless deployment and operation of data solutions.

Documentation and Best Practices:


Document data processes, workflows, and architectures.

Promote best practices in data engineering and contribute to the continuous improvement of the data infrastructure.




Qualifications:


Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.

Proven experience as a Data Engineer or in a similar role.

Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL).

Experience with big data technologies (e.g., Hadoop, Spark, Kafka).

Proficiency in programming languages such as Kotlin and Java.

Familiarity with front-end technologies, particularly React.js, for data visualization and UI integration.

Experience with Google Cloud Platform and its data-related services (e.g., BigQuery, Dataflow, Pub/Sub).

Strong understanding of data warehousing concepts and experience with data warehouse solutions (e.g., BigQuery).

Knowledge of data modeling, data architecture, and ETL processes.

Excellent problem-solving skills and attention to detail.

Strong communication and collaboration skills.




To apply for this job email your details to