Hi,
Job Title/Role: Tech Lead – Data Engineer
Location: San Antonio, TX (5 days onsite as per customer expectation)
Experience: 10+ years
Implementation Partner: HCL TECH
Rate: $50 per hour
Job Description:
We are looking for a highly skilled Tech Lead – Data Engineer with 10+ years of experience to work on enterprise-level data engineering projects. The role will involve migrating solutions to cloud platforms (AWS/Azure), optimizing performance, and designing scalable data systems using technologies such as Snowflake, DBT, SQL, Python, and ETL tools (Datastage or Informatica).
Mandatory Skills:
Snowflake
DBT
SQL
Python
Datastage or Informatica
Unix
Required Knowledge and Expertise:
Strong knowledge of SQL and cloud-based technologies.
Familiarity with data warehousing concepts, data modeling, and metadata management.
Understanding of data lakes, multi-dimensional models, and data dictionaries.
Experience migrating to the Snowflake platform on AWS or Azure.
Expertise in performance tuning and resource monitor setup in Snowflake.
Knowledge of Snowflake modeling—roles, databases, schemas.
SQL performance tuning, query optimization, and database tuning.
Familiarity with ETL tools and cloud services.
Roles and Responsibilities:
Design, create, test, and implement enterprise-level applications with Snowflake.
Implement features for identity and access management.
Create authorization frameworks for improved access control.
Work on query optimization and address performance/scalability issues in systems.
Manage transactional systems with distributed data processing algorithms.
Take ownership of projects from start to finish, ensuring successful implementation and delivery.
Build, monitor, and optimize ETL and ELT processes.
Migrate solutions from on-premises setups to cloud-based platforms (AWS, Azure).