Senior Data Engineer c2c jobs – Charlotte, NC – Contract



Senior Data Engineer



Location: Charlotte, NC (onsite from day one; local candidates only)

Duration: 6 to 12 Months

Pay Rate: $60/hr on C2C



Experience Level: 10+ years


We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this role, you will design, build, and maintain data pipelines and infrastructure, drawing on your expertise in Python, Snowflake, dbt (data build tool), Airflow, data modeling, and ELT (Extract, Load, Transform) concepts.


Key Responsibilities:

·         Data Pipeline Design and Development: Design, develop, and maintain scalable and robust data pipelines using Python, Snowflake, and dbt, ensuring efficient data extraction, transformation, and loading processes.

·         Workflow Orchestration: Implement and manage workflow orchestration using Airflow to schedule, monitor, and automate data processing tasks and workflows.

·         Data Modeling: Apply advanced data modeling techniques to design and optimize data schemas and structures, ensuring data integrity, performance, and scalability.

·         ELT Implementation: Implement ELT processes to efficiently transform and load data into the Snowflake data warehouse, leveraging dbt for data transformations.

·         Performance Optimization: Identify and implement performance optimization strategies to enhance the speed, efficiency, and reliability of data pipelines and processing workflows.

·         Collaboration and Mentorship: Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to understand data requirements and deliver effective data solutions. Mentor junior team members and provide technical guidance and support as needed.

·         Continuous Improvement: Stay updated with the latest technologies, tools, and best practices in data engineering, and drive continuous improvement initiatives to enhance data engineering processes and capabilities.
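To illustrate the ELT pattern the responsibilities above describe: the sketch below is a hypothetical, minimal example, using Python's built-in sqlite3 as a stand-in for Snowflake, with a plain SQL transform standing in for a dbt model. Table and column names are illustrative, not from this posting.

```python
# Minimal ELT sketch (hypothetical): sqlite3 stands in for Snowflake,
# and an inline SQL transform stands in for a dbt model.
import sqlite3

# Extract: raw records as they might arrive from a source system,
# deliberately untyped (amounts still stored as text).
raw_orders = [
    ("o1", "2024-01-03", "149.90"),
    ("o2", "2024-01-04", "20.00"),
]

conn = sqlite3.connect(":memory:")

# Load: land the raw data untransformed (the "EL" of ELT).
conn.execute(
    "CREATE TABLE raw_orders (order_id TEXT, order_date TEXT, amount TEXT)"
)
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_orders)

# Transform: build a typed, analytics-ready model inside the warehouse
# itself — in dbt, this SELECT would live in a model file such as
# models/orders.sql and be materialized as a table or view.
conn.execute("""
    CREATE TABLE orders AS
    SELECT order_id,
           order_date,
           CAST(amount AS REAL) AS amount
    FROM raw_orders
""")

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))
```

The key design point is that transformation happens after loading, inside the warehouse, which is what lets dbt express transformations as version-controlled SQL models rather than external ETL jobs.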



Qualifications:

·         Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

·         Extensive hands-on experience in data engineering roles, with a focus on building data pipelines and infrastructure.

·         Proficiency in Python for data manipulation, scripting, and automation.

·         Extensive experience with Snowflake data warehouse platform, including data modeling, SQL development, and performance tuning.

·         Hands-on experience with dbt (Data Build Tool) for data transformation and modeling.

·         Experience with workflow orchestration tools such as Airflow for scheduling and monitoring data workflows.

·         Deep understanding of data modeling principles and techniques, including dimensional modeling and schema design.

·         Strong knowledge of ELT (Extract, Load, Transform) concepts and implementation strategies.

·         Excellent problem-solving skills and the ability to troubleshoot and resolve complex data engineering issues.

·         Strong communication and collaboration skills, with the ability to work effectively in a team environment.




To apply for this job, email your details to