
Great Hire
Roles and Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Build and optimize data models and data lakes/warehouses to support analytics and reporting.
- Collaborate with data scientists, analysts, and stakeholders to understand data needs and deliver solutions.
- Integrate data from multiple sources (structured and unstructured) into centralized systems.
- Ensure data integrity, quality, and security across platforms.
- Automate data workflows and monitor system performance.
- Document data flows, architecture, and technical processes.
- Troubleshoot data issues and provide technical support for data operations.
Qualifications Required
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering or related roles.
- Proficiency in SQL and programming languages such as Python, Java, or Scala.
- Experience with big data tools and frameworks (e.g., Hadoop, Spark, Kafka).
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
- Strong understanding of ETL/ELT processes and data pipeline orchestration tools (e.g., Airflow, dbt).
- Excellent problem-solving and communication skills.
To apply for this job, email your details to jankipatel2greathire@gmail.com.