Hi,
We have an urgent requirement for a Principal Data Architect in Nashville, TN (Day 1 onsite).
Here are the job details:
Job Title: Principal Data Architect
Location: Nashville, TN (Day 1 onsite)
Position Summary:
- We are seeking an experienced and highly skilled Data Architect to join our dynamic team.
- The ideal candidate will have a strong background in data architecture, cloud-based data solutions, and data engineering, with hands-on experience with Google Cloud Platform (GCP).
- In this role, you will be responsible for designing, implementing, and optimizing scalable, high-performance data solutions that support advanced analytics, AI/ML, and real-time decision-making across our enterprise.
- As a key technical leader, you will collaborate closely with data engineers, software engineers, cloud architects and business stakeholders to develop a modern data ecosystem that enables seamless integration of structured and unstructured data, ensures robust governance and security, and drives innovation in healthcare data management.
- You will leverage technologies such as BigQuery, Dataproc, Dataflow, Pub/Sub, Vertex AI and Apache Iceberg/Delta Lake to build a unified and efficient data platform.
- Data Architects are expected to come into the position with a great deal of prior knowledge and experience and must demonstrate the ability to learn quickly and foster a team-oriented environment.
- Extensive experience in data engineering and data management is important in this position, as is exposure to Business Intelligence and Artificial Intelligence tools and concepts, since a strategic goal of the organization is to build an integrated, enterprise-grade Lakehouse environment.
- This is an exciting opportunity to shape the next-generation data strategy for a leading healthcare organization, enabling insights that improve patient outcomes, enhance operational efficiency, and support regulatory compliance.
- If you are passionate about data architecture, cloud technologies, and transforming healthcare through data-driven insights, we invite you to join our team.
Major Responsibilities:
- This role will focus on setting technical direction for groups of applications and related technologies, as well as taking responsibility for implementing technically robust solutions that satisfy all business, architecture, and technology constraints.
- Lead the design and implementation of scalable and efficient data architecture solutions using GCP technologies such as BigQuery, Cloud Dataflow, Cloud Spanner, Pub/Sub, and Google Cloud Storage.
- Architect and manage data lakes and data warehouses with an emphasis on performance, scalability, and cost-efficiency.
- Apply a deep understanding of Lakehouse design patterns, including Delta Lake, Apache Iceberg, and Hudi.
- Design scalable, high-performance, and cost-effective data architectures on GCP.
- Apply strong knowledge of data partitioning, indexing, schema evolution, and ACID compliance within a Lakehouse environment.
- Design and optimize ETL/ELT pipelines and data workflows for high-volume, high-velocity data across the cloud environment.
- Collaborate with stakeholders to understand business requirements and develop strategies for data architecture that align with organizational goals.
- Drive the integration of data from disparate sources, ensuring data quality, consistency, and reliability across various platforms.
- Work with data engineering teams to ensure seamless data ingestion, transformation, and consumption processes in the cloud environment using Agile practices and principles.
- Develop and enforce best practices around standards, frameworks, data quality, data integration, data governance, security, data retention, and compliance for data management in the cloud.
- Mentor and provide technical guidance to junior data engineers and architects on the use of cloud data platforms and engineering practices.
- Stay current with emerging data technologies and GCP services and assess their applicability to the organization’s needs.
- Provide thought leadership in cloud-based data solutions, contributing to the development of an enterprise data strategy.
- Architect, manage, and own the full data lifecycle, from raw data acquisition through transformation to end-user consumption.
- Provide guidance on technology choices and design considerations for migrating data to the Cloud.
- Translate project-specific requirements into a cloud architecture that meets those requirements while accounting for the project’s resource usage and scalability needs.
- Maintain a holistic view of information assets by creating and maintaining artifacts that illustrate how information is stored, processed, and accessed.
- Ensure architectural, quality, and governance adherence through design reviews.
- Build consumable data lakes, analytics applications, and tools.
- Collaborate closely with individuals across the technology organization to promote awareness of the data architecture and ensure that existing enterprise assets and competencies are leveraged.
Education & Experience:
- Bachelor’s degree in computer science or a related technical field, or equivalent experience (Required)
- Master’s degree in computer science or a related field (Preferred)
- 3+ years of experience as a Cloud Data or Information Architect (Required)
- 5+ years of experience in healthcare (Preferred)
- 10+ years of experience in Information Technology (Required)
Knowledge, Skills, Abilities, Behaviors:
- 10+ years of experience in data architecture, data engineering, or cloud engineering with a focus on designing large-scale data solutions.
- 3+ years of hands-on experience with the GCP platform, including experience with many of the following components:
- BigQuery, Cloud Storage, Dataflow
- Cloud Run, GKE, Cloud Functions
- Spark Streaming, Kafka, Pub/Sub
- Bigtable, Cloud SQL, Cloud Spanner
- JSON, Avro, Parquet, Iceberg, Delta formats
- Cloud Composer, DataProc, CI/CD, Cloud Logging
- Vertex AI, Pydantic, LangChain, FastAPI, Docker
- Extensive experience in designing and managing data warehouses, data lakes, and cloud-based data ecosystems.
- Expertise in ETL/ELT tools and technologies, with a strong background in automating data pipelines.
- Strong programming skills in Python, Java, Scala, SQL, and/or other relevant languages for data engineering.
- Experience with DevOps and automation tools for deploying and managing data solutions in the cloud.
- Knowledge of data security, privacy, and compliance regulations (GDPR, HIPAA, etc.) within the context of cloud data architectures.
- Experience with Oracle, SQL Server, and other database platforms.
- Ability to learn new assignments, technologies, and applications quickly and to manage multiple assignments simultaneously.
- Ability to troubleshoot, maintain, reverse engineer, and optimize existing ETL pipelines.
- Ability to analyze and interpret complex data and offer solutions to complex clinical problems.
- Ability to work independently on assigned tasks.
- Strong written and verbal communication skills, including the ability to explain complex technical issues in a way that non-technical audiences can understand.
Certifications (a plus, but not required):
- GCP Professional Data Engineer
- GCP Professional Cloud Architect
Physical Demands/Working Conditions:
- Prolonged sitting or standing at computer workstation including use of mouse, keyboard, and monitor.
- Requires ability to provide after-hours support.