
Big Data Developer with Azure Databricks (No H1B – Hybrid NJ)

Location: NJ (Hybrid 3-4 days in Office)

Financial industry background is a MUST

Rate: $60/hr on C2C

Visa – Any except H1B

Responsibilities:

  • Develop, optimize, and maintain big data pipelines using Apache Spark, Azure Databricks, and related tools.
  • Write and deploy complex production systems in Scala, Java, or Python, ensuring high performance and reliability.
  • Leverage Terraform for automating cloud infrastructure deployment on Azure or other major cloud providers.
  • Design and implement CI/CD pipelines for data workflows using modern tools like GitLab, promoting automation and continuous improvement.
  • Apply best practices for testing, instrumentation, observability, and alerting to maintain data pipeline health and performance.
  • Contribute to system architecture and low-level design, ensuring modularity, scalability, and security.
  • Understand and implement data models, data structures, and algorithms for efficient data processing.
  • Use containerization technologies such as Docker and Kubernetes for development, build, and runtime environments.
  • Work within Agile development teams, participating in planning, daily stand-ups, and iterative releases.
  • Collaborate effectively in a global team, influencing key architectural decisions and sharing best practices.

 

Requirements:

  • 10+ years of experience required.
  • Proven experience developing big data solutions with Apache Spark and Azure Databricks.
  • Hands-on experience with Terraform for infrastructure as code (IaC) on major cloud platforms, ideally Azure.
  • Programming expertise in Scala, Java, or Python for complex production systems.
  • Extensive Python experience, especially in data engineering contexts.
  • Background in platform engineering roles on cloud platforms, with strong knowledge of Azure services.
  • Practical knowledge of testing frameworks, instrumentation, observability, and alerting tools.
  • Experience building and maintaining CI/CD pipelines with modern cloud-friendly systems like GitLab.
  • Deep understanding of information modelling, data structures, and algorithms.
  • Hands-on experience with containerization (Docker) and orchestration tools (Kubernetes).
  • Strong understanding of technical architecture and low-level system design.
  • Familiarity with Agile methodologies and best practices in software development.

 

 

Regards,

Vamsi
hr@shayaancorp.com | Office: 732 798 5943

Shayaan Corporation

 
