AWS Data Architect || St. Louis, MO || Onsite



I hope you are doing well.

Please reply if you are interested in the position below.


Local or Nearby candidates only.


Role: AWS Data Architect

Location: St. Louis, MO (Fully Onsite)

Duration: 12+ Months Contract


Note: The candidate must be in the office 5 days every week.



Job Description:

Data Architecture Design:

Design and implement scalable, high-performance data architectures on AWS, utilizing services such as Amazon S3, Amazon Redshift, and Amazon RDS.
Collaborate with data stakeholders to understand data requirements and translate them into effective data solutions.

ETL Development:

Develop and maintain ETL (Extract, Transform, Load) processes in Amazon EMR and AWS Glue to ingest, cleanse, and transform data from various sources into usable formats for analytics and reporting.
Optimize ETL workflows for performance and efficiency.
Must have hands-on programming experience in Spark and Scala.

Data Modeling:

Create and manage data models that support business reporting and analytics needs.
Ensure data quality and consistency by implementing data validation and data cleansing routines.

Data Security and Compliance:

Implement data security best practices and ensure compliance with relevant regulations (e.g., GDPR, HIPAA) and company policies.
Monitor and audit data access and usage to maintain data integrity and confidentiality.

Monitoring and Optimization:

Monitor data pipelines and infrastructure for performance, availability, and reliability.
Identify and resolve data-related issues and bottlenecks promptly.
Continuously optimize data processing and storage for cost-effectiveness.

Documentation and Collaboration:

Document data engineering processes, workflows, and best practices.
Collaborate with cross-functional teams, including data scientists and analysts, to understand their data needs and provide data solutions.

Other Expertise:

Experience in designing CI/CD pipelines using AWS CodeCommit and AWS CodePipeline.
Experience in ingesting streaming data from Kafka.
Experience in working with Apache Hudi and Apache Iceberg tables.
Experience with, and a working understanding of, the Cloudera platform.



To apply for this job, email your details to