Local to MA :: Data Engineer

Hi,

I hope this email finds you well. I'm reaching out to share an incredible opportunity that I believe aligns perfectly with your skills and aspirations.


Job Title: Data Engineer
Location: Boston, MA (Hybrid)
Job Type: Contract


Candidates must be local to MA.


As a Data Engineer, your primary responsibilities will include:

  • Reviewing and understanding all required data sources, both structured and unstructured.
  • Developing and testing data pipelines for data ingestion, acquisition, and storage.
  • Automating ingestion pipelines and orchestrating them with standard scheduling tools.
  • Deploying data engineering jobs on cloud platforms and ensuring proficiency with major hyperscalers (AWS, Azure, or GCP).
  • Collaborating with source system teams to build metadata and create source-to-target mappings.
  • Working with data storage and warehouse teams to ensure optimal performance and tuning of data engineering jobs.


Required Education & Experience:

  • Bachelor’s degree or higher in fields such as Finance, Economics, Mathematics, Computer Science, Statistics, Process and Mechanical Engineering, Operations Research, Data Science, Accounting, Business Administration, or related areas.
  • 5+ years of relevant work experience.


Required Soft Skills:

  • Strong ownership and accountability in delivering high-quality work while effectively managing priorities and deadlines.
  • Ability to recommend and implement improvements to processes.
  • Excellent written and verbal communication skills, including the ability to create and deliver presentations.
  • Ability to communicate concisely, tailoring messages to the topic, audience, and competing priorities.
  • Strong analytical thinking and ability to ask probing questions to drive clarity and make informed decisions.


Technical Skills:

  • Expert knowledge of major cloud and enterprise data engineering tools (e.g., AWS Glue, Spark, Azure Data Factory, Informatica, Talend).
  • Preferred experience with Databricks, Snowflake, and related tools in the data ecosystem.
  • Experience in building data pipelines optimized for large datasets, including integration, storage, cleansing, and transformation.
  • Familiarity with a wide range of data storage solutions and an understanding of efficient utilization.
  • Ability to translate data requirements into technical designs, understanding the differences between various storage solutions.
  • Experience with the Software Development Lifecycle (SDLC).

Thanks & Regards,


Noor Mohammad

Email: noor@arohatechnologies.com

URL: http://www.arohatechnologies.com 


