Airflow Data Engineer (H-1B Candidates Only)

Airflow Data Engineer (Remote)
Job Summary:
We are seeking a skilled Data Engineer to join our data engineering team. The successful candidate will be responsible for enhancing and automating our data workflows using Astronomer Airflow. This role involves improving data processing capabilities, optimizing task scheduling, and reducing our dependency on Databricks for orchestration. The engineer will also manage and monitor our data pipelines to ensure efficient and reliable operations.
Key Responsibilities:
  • Astronomer Airflow Setup:
    • Install and configure Astronomer Airflow.
    • Establish necessary connections and ensure the Airflow web server and scheduler are operational.
    • Integrate CI/CD pipelines and GitHub for version control and automated deployment of workflows.
  • Workflow Migration:
    • Migrate existing workflows to the Astronomer Airflow platform.
    • Ensure workflows run error-free and meet performance benchmarks.
  • Team Training:
    • Conduct training sessions for the data engineering team.
    • Provide supporting materials and documentation.
    • Ensure team proficiency in creating and managing workflows and DAGs.
  • Monitoring and Alerting:
    • Set up monitoring tools to track workflow performance.
    • Configure alerts for critical issues.
    • Establish regular reporting procedures for ongoing monitoring.
  • Optimization and Documentation:
    • Review and optimize workflows for performance.
    • Create detailed documentation covering workflows, configurations, and best practices.
    • Establish a feedback loop for continuous improvement.
  • Expand Orchestration Capabilities:
    • Integrate Airflow with Azure Functions, REST endpoints, and event-driven architectures.
    • Implement support for external services to extend orchestration beyond Databricks.
    • Create modular workflows for integrating ML/AI batch workloads.
  • Job and Data Dependency Management:
    • Develop a system to handle job dependencies.
    • Implement checks to ensure jobs run only if the underlying data has changed (illustrated in the sketch after this list).
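
For illustration only, here is a minimal sketch of the kind of DAG this role would own, combining a data-change check with a failure alert. The DAG id, the notify_on_failure callback, and the watermark placeholders are hypothetical examples, not part of this posting, and assume a recent Airflow 2.x TaskFlow API.

import pendulum
from airflow.decorators import dag, task


def notify_on_failure(context):
    # Hypothetical alert hook: replace with email, Slack webhook, etc.
    # Airflow passes the failing task's context dict to this callback.
    print(f"ALERT: task {context['task_instance'].task_id} failed")


@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    default_args={"on_failure_callback": notify_on_failure},
)
def example_sales_refresh():
    @task.short_circuit
    def data_has_changed() -> bool:
        # Hypothetical check: compare the source's latest watermark with the
        # last value this pipeline processed; returning False skips downstream tasks.
        latest_watermark = None   # e.g., max(updated_at) queried from the source
        last_processed = None     # e.g., read back from an Airflow Variable
        return latest_watermark != last_processed

    @task
    def run_transformations():
        # Placeholder for the real work (Databricks job, SQL, dbt, etc.).
        pass

    data_has_changed() >> run_transformations()


example_sales_refresh()

The @task.short_circuit decorator skips downstream tasks when the check returns a falsy value, which is one common way to ensure jobs run only when the underlying data has changed; the on_failure_callback covers the alerting responsibility above.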
Required Qualifications:
  • Proven experience with Astronomer Airflow and data workflow orchestration.
  • Strong understanding of CI/CD pipelines and version control systems (e.g., GitHub).
  • Experience with cloud platforms (e.g., Azure) and integrating external services.
  • Proficiency in Python and SQL.
  • Excellent problem-solving skills and attention to detail.
Preferred Qualifications:
  • Experience with Databricks and Snowflake.
  • Familiarity with machine learning and artificial intelligence workflows.
  • Strong communication and training skills.
  • Ability to work collaboratively in a team environment.
 
 



Akhil Bangi
Professional Recruiter
Cerebra Consulting Inc
270 Lancaster Ave, Suite D2, Malvern, PA 19355
814-831-2414, Ext. 176
www.cerebra-consulting.com | Bangi.akhil@cerebra-consulting.com
PARTNERS | Oracle | Azure | AWS | Salesforce | Data Analytics | DevOps
