
Python Developer || Jersey City, NJ / Boston, MA (hybrid) || Contract

 

Job Title: Python Developer

Location: Jersey City, NJ / Boston, MA (hybrid)

Duration: Long Term Contract

Interview: Virtual

Visa: USC/GC/H4 EAD

From the manager:

- Must have: Python, Apache Airflow, Oracle Database, PL/SQL, and stored procedure experience

- Experience with a scheduling tool of some sort is a must. BBH uses RunMyJobs (RMJ) and used Autosys in the past, but will accept candidates with experience in any scheduling tool

- BBH needs to modernize a large codebase of 700-800 legacy Perl scripts that interact with an Oracle database. They want to move away from these legacy scripts to a more streamlined, testable solution re-engineered with Python and Apache Airflow

- Need to test the scripts and update them using Python and Airflow; must have strong Python and Apache Airflow skills

- Financial Services industry experience a plus but not required

- Assess the existing legacy Perl scripts and redevelop them using Python and Apache Airflow

- Automate and streamline jobs and processes

- Integrate with an Oracle database using PL/SQL

We’re seeking an experienced Python Developer to lead the automation and orchestration of complex data workflows. The ideal candidate will have hands-on experience designing robust, fault-tolerant, and auditable pipelines across on-prem Oracle systems, integrating with job schedulers like RunMyJobs, and modernizing legacy processes using Apache Airflow.

You will play a critical role in replacing legacy Perl and PL/SQL scheduling logic with modern, Python-based DAG orchestration while ensuring traceability, data quality, and recoverability.

Key Responsibilities:
• Develop, deploy, and maintain Python-based automation scripts to orchestrate jobs across Oracle 19c on-prem systems.
• Design and implement Airflow DAGs to manage complex interdependent ETL workflows.
• Migrate existing job logic from Perl, RunMyJobs, and PL/SQL-based scheduling into modular, observable Airflow DAGs.
• Build custom Airflow operators/sensors for integration with Oracle, REST APIs, file drops (SFTP/FTP), and external triggers.
• Implement robust error handling, alerting and retry mechanisms across job pipelines.
• Collaborate with DBAs and application teams to understand job dependencies, critical paths, and data lineage.
• Establish job execution logs, audit trails, and SLA monitoring dashboards.
• Participate in code reviews, documentation, and onboarding new jobs into the orchestrator.
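The "robust error handling, alerting and retry mechanisms" responsibility above can be sketched in plain Python. This is stdlib-only; `send_alert` and `flaky_step` are illustrative stand-ins for a real pager/email hook and a real pipeline step:

```python
# Sketch: retry wrapper with exponential backoff and an alert hook,
# the kind of error handling each pipeline step would need.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def send_alert(message: str) -> None:
    """Stand-in for a real email/pager integration."""
    log.error("ALERT: %s", message)


def run_with_retries(step, *, attempts=3, base_delay=1.0):
    """Run a callable, retrying with exponential backoff; alert on final failure."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                send_alert(f"step {step.__name__} failed after {attempts} attempts")
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))


# Example: a flaky step that succeeds on the third try.
calls = {"n": 0}

def flaky_step():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky_step, attempts=3, base_delay=0.01)
```

In Airflow itself, the same policy is usually expressed declaratively via `retries` and `retry_delay` on the task, with a wrapper like this reserved for finer-grained control inside a step.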

Required Skills and Experience:
• 5+ years of Python development experience, with strong understanding of system/process automation.
• 2+ years of Apache Airflow experience building production DAGs.
• Solid understanding of Oracle 19c database, SQL tuning, and PL/SQL concepts.
• Experience orchestrating jobs that move large volumes of data across enterprise systems.
• Familiarity with job schedulers (RunMyJobs, Autosys, etc.) and how to replace/abstract them using orchestration tools.
• Strong debugging skills across logs, databases, and filesystems for failed jobs or partial runs.
• Experience building REST API integrations, SFTP/file movement logic, and parameter-driven automation.
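To make the debugging requirement concrete, here is a small sketch that triages a run log for failed and unfinished steps. The log format and step names are invented examples, not RunMyJobs output:

```python
# Sketch: scan a job run log for failed or incomplete steps.
# The log format below is an assumed example for illustration.
import re

sample_log = """\
2024-05-01 02:00:01 START load_trades
2024-05-01 02:03:10 OK    load_trades
2024-05-01 02:03:11 START calc_nav
2024-05-01 02:07:45 FAIL  calc_nav ORA-01555: snapshot too old
2024-05-01 02:07:46 START publish_report
"""


def triage(log_text):
    """Return (failed, unfinished) step names from a run log."""
    started, finished, failed = set(), set(), set()
    for line in log_text.splitlines():
        m = re.match(r"\S+ \S+ (START|OK|FAIL)\s+(\S+)", line)
        if not m:
            continue
        status, step = m.groups()
        if status == "START":
            started.add(step)
        elif status == "OK":
            finished.add(step)
        else:
            failed.add(step)
    unfinished = started - finished - failed  # partial runs: started, never resolved
    return failed, unfinished


failed, unfinished = triage(sample_log)
```

Here `calc_nav` surfaces as failed and `publish_report` as a partial run, the two cases a pipeline operator has to distinguish before deciding whether to rerun or roll back.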

Bonus / Preferred Experience:
• Prior experience modernizing legacy data workflows from Perl or PL/SQL stored procs.
• Hands-on knowledge of Git/Bitbucket, Jenkins, CI/CD pipelines for code-controlled job rollouts.
• Familiarity with financial data models (e.g., holdings, transactions, NAVs, tax lots).
• Basic understanding of data governance, audit, and operational risk in financial systems.


Thanks & regards!

 

Sonu Chauhan

Sr. Technical Recruiter

Phone: +1 3025490259 Ext. 102

Email: Sonu@virishatech.com

Address: 600 N Broad Street Suite 5 #269, Middletown, DE 19709

