Hi Partners,
This is Sai Ganesh from Virtual Networx Inc. We have urgent, high-priority requirements with our implementation partner, Hexaware.
Please send resumes to sai.k@virtualnetworx.com or call me at 469-209-6236.
Role: Snowflake with Airflow | 253160
Location: Chicago, IL (Hybrid – 3 days a week)
Client: The Northern Trust
Responsibilities:
Design, implement, and maintain data pipelines on Snowflake, ensuring scalability, reliability, and performance.
Develop and optimize data ingestion processes from various sources, including Azure Blob Storage, Azure Data Lake, databases, APIs, and streaming data sources.
Implement data transformation workflows using SQL, Python, and Airflow to cleanse, enrich, and aggregate raw data for downstream consumption (a brief illustrative sketch follows this list).
Collaborate with data scientists and analysts to understand data requirements and implement solutions that enable advanced analytics and machine learning.
Design and implement data governance policies and procedures to ensure data quality, security, and compliance with regulatory requirements.
Tune and optimize the Snowflake data warehouse, including query optimization, resource management, and partitioning strategies.
Develop monitoring, alerting, and logging solutions to ensure the health and availability of data pipelines and Snowflake infrastructure.
Stay up-to-date with the latest trends and technologies in data engineering, cloud computing, and workflow orchestration, and recommend relevant tools and practices to enhance our data infrastructure.
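For reference, here is a minimal sketch of the kind of Snowflake-plus-Airflow pipeline this role covers. It assumes Airflow 2.x with the apache-airflow-providers-snowflake package installed; the DAG name, connection id, stage, and table names are illustrative placeholders only, not the client's actual environment.

# Minimal, illustrative Airflow DAG: load raw data into Snowflake, then transform it.
# All identifiers (dag_id, connection id, stage, tables) are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,                              # basic error handling: retry failed tasks
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="snowflake_daily_load",             # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",                # task scheduling
    catchup=False,
    default_args=default_args,
) as dag:

    # Ingest raw files staged in Azure Blob Storage via an external Snowflake stage.
    load_raw = SnowflakeOperator(
        task_id="load_raw_events",
        snowflake_conn_id="snowflake_default",  # assumed Airflow connection
        sql="""
            COPY INTO raw.events
            FROM @raw.azure_blob_stage/events/
            FILE_FORMAT = (TYPE = 'JSON');
        """,
    )

    # Transform and aggregate raw data for downstream consumption.
    transform = SnowflakeOperator(
        task_id="build_daily_summary",
        snowflake_conn_id="snowflake_default",
        sql="""
            INSERT INTO analytics.daily_event_summary
            SELECT event_date, event_type, COUNT(*) AS event_count
            FROM raw.events
            GROUP BY event_date, event_type;
        """,
    )

    load_raw >> transform                      # transform runs only after the load succeeds

The retries/retry_delay settings and the load_raw >> transform dependency illustrate the DAG definition, task scheduling, and error handling called out in the qualifications below.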
Qualifications:
Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent work experience).
Minimum of 8 years of experience working as a Data Engineer, with a focus on cloud-based data platforms.
Strong expertise in the Snowflake data warehouse, including experience with Snowflake architecture, SQL, and performance optimization.
Hands-on experience with Azure cloud platform, including Azure Blob Storage, Azure Data Lake, and Azure SQL Database.
Proficiency in workflow orchestration tools such as Apache Airflow, including DAG definition, task scheduling, and error handling.
Experience with data modeling concepts and techniques, including dimensional modeling and data warehousing best practices.
Strong programming skills in SQL and Python, with experience in data manipulation, transformation, and analysis.
Solid understanding of data governance, security, and compliance requirements, particularly in a regulated industry.
Excellent problem-solving skills and the ability to troubleshoot complex issues in data pipelines and infrastructure.
Strong communication skills and the ability to collaborate effectively with cross-functional teams.