Position – AWS Data Engineer
Location – Remote
Eligibility – US Citizen (passport copy required)
Client – HCL
Job Description:
- Automate creation, transformation, and integration of raw and enriched data models in the data Lakehouse.
- Implement automated ETL pipelines for data ingestion, transformation, and consumption.
- Design and implement the integration of Lake Formation across AWS Data Platform accounts.
- Design and implement the integration of Lake Formation with Keycloak.
- Design and implement an integrated IAM solution, including EKS service accounts.
- Design and implement HashiCorp Vault with integration to AWS services such as Secrets Manager, Certificate Manager, and Keycloak.
- Develop integrations between HashiCorp Vault and AWS data services such as RDS Aurora PostgreSQL, RDS MSSQL, and other RDS solutions as required.
- Provision an MSSQL server on AWS using Terraform.
- Deploy a read replica for the MSSQL server using Terraform.
- Implement monitoring and logging for the MSSQL server using Terraform.
- Transition existing pipelines to the MSSQL server.
- Collaborate with the business application owner on the existing data architecture, including data ingestion, data pipelines, business logic, data consumption patterns, and analytics requirements.
- Design and document the target data architecture, including pipelines, processing, and analytics.
- Identify opportunities for optimization and consolidation.
- Collaborate with the data team on decomposition of business logic and data transformation patterns.
Thanks and regards,
Ahtesham Khan