Get C2C/W2 Jobs & hotlist update

Multiple Contract Positions - Immediate Interview

Requirement: 1

Sr. Java Developer with GenAI
Phoenix, AZ / Irving, TX / Charlotte, NC / North Brunswick, NJ
(Onsite required from day 1; hybrid, 3 days per week in the office).
Duration: 12 months
Position type: 12-month contract on W2
Required Skillset: AI code generation, AWS SageMaker, Spring AI, Harness, PostgreSQL, MongoDB, GitHub Copilot

Job Description:
We are seeking a highly experienced Senior Java Developer with a strong background in Generative AI and expertise in using GitHub Copilot to join our innovative team.
The ideal candidate will possess over 10 years of experience in software development, with a proven track record of designing and implementing complex applications.
The candidate will lead projects that leverage Generative AI technologies to enhance software solutions and drive efficiency across development processes.

Responsibilities:
Lead the design, development, and deployment of Java-based applications incorporating Generative AI solutions.
Leverage GitHub Copilot to improve development speed and code quality, serving as a mentor for team members on best practices.
Collaborate with data scientists and AI engineers to integrate AI models and algorithms into software applications.
Architect and implement scalable microservices and RESTful APIs aligned with business requirements.
Ensure the performance, security, and reliability of applications throughout their lifecycle.
Conduct code reviews, providing constructive feedback and guidance to junior developers.
Stay abreast of emerging technologies and trends in AI and software development to drive innovation.
Collaborate with cross-functional teams to define project scope, objectives, and deliverables.

Requirements:
Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
Experience with AI code generation, AWS SageMaker, Spring AI, Harness, PostgreSQL, MongoDB, and GitHub Copilot.
Minimum of 8 – 10 years of proven experience as a Java Developer with a strong portfolio of successful projects.
Extensive experience with Generative AI concepts, frameworks, and applications.
Proficiency in utilizing GitHub Copilot and other AI-assisted development tools to optimize coding processes.
Deep understanding of the Spring Framework, Hibernate, and other Java technologies.
Familiarity with cloud platforms (AWS, Azure, Google Cloud) and their AI services.
Experience with database design, management, and SQL (MySQL, PostgreSQL, etc.).
Exceptional problem-solving skills, attention to detail, and a commitment to quality.
Strong communication, leadership, and mentorship abilities.

Requirement: 2
Job Title: Sr. Python Data Engineer (Python, MongoDB, Power BI, data extraction; ability to write code)
Location: Hartford, CT
Position type: 12-month contract on W2
Duration: 12+ months, extendable.
Required Skills: Python, MongoDB, Power BI, and data extraction

Mandatory skills:
Strong Python developer with experience in MongoDB, Power BI, and data extraction.

Overview:
We are seeking a motivated and skilled Data Engineer to join our team. The ideal candidate will have strong proficiency in Python, knowledge of MongoDB, and experience with Power BI. This role will primarily focus on building data extracts and data feeds for downstream systems, including Ledger, Reporting, and Boudreaux. You will be responsible for writing the code necessary to deliver the required data based on predefined definitions.
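
For context on the kind of extract-and-feed work this role involves, below is a minimal sketch in Python using pymongo and pandas. The connection URI, database, collection, field names, and output path are hypothetical placeholders for illustration only, not details from the engagement.

# Minimal sketch of a data extract: pull records from MongoDB, reshape them
# with pandas, and write a flat file that a downstream feed (or a Power BI
# dataset) could ingest. The URI, database, collection, field names, and
# output path below are hypothetical placeholders.
import pandas as pd
from pymongo import MongoClient


def build_daily_extract(mongo_uri: str, output_path: str) -> pd.DataFrame:
    client = MongoClient(mongo_uri)
    collection = client["finance"]["transactions"]  # hypothetical source

    # Pull only the fields the downstream definition calls for.
    cursor = collection.find(
        {"status": "POSTED"},
        {"_id": 0, "account_id": 1, "amount": 1, "posted_date": 1},
    )
    df = pd.DataFrame(list(cursor))

    # Basic quality check before handing anything downstream.
    if df.empty:
        raise ValueError("Extract returned no rows; check the source filter.")

    # Aggregate to the grain the downstream feed expects (daily totals here).
    df["posted_date"] = pd.to_datetime(df["posted_date"]).dt.date
    daily = df.groupby(["account_id", "posted_date"], as_index=False)["amount"].sum()

    # Write a CSV the downstream system or report can consume.
    daily.to_csv(output_path, index=False)
    return daily


if __name__ == "__main__":
    build_daily_extract("mongodb://localhost:27017", "daily_amounts.csv")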

Responsibilities:
Data Extraction and Transformation: Write efficient and scalable Python code to extract and transform data from various sources, ensuring it meets the requirements for downstream systems.
MongoDB Management: Utilize MongoDB for storing and managing data. Design and optimize data models and queries to ensure high performance and reliability.
Power BI Integration: Collaborate with stakeholders to deliver data in a format suitable for reporting and visualization in Power BI. Ensure that the data aligns with business needs and reporting requirements.
Data Feed Development: Develop and maintain data feeds for downstream systems, including Ledger and Boudreaux, ensuring timely and accurate data delivery.
Collaboration with Stakeholders: Work closely with business analysts and other stakeholders to understand data definitions and requirements, translating them into technical specifications.
Documentation: Maintain comprehensive documentation of data pipelines, data models, and technical specifications to support knowledge sharing and compliance.
Quality Assurance: Implement data validation and quality checks to ensure the accuracy and integrity of the data being delivered.

Requirements:
Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field.
Minimum of 9 years of experience in data engineering or a related field.
Strong experience in Python programming, particularly in data manipulation and ETL processes.
Solid knowledge of MongoDB and experience in database design and management.
Familiarity with Power BI for creating reports and dashboards.
Experience in building data extracts and feeds for business applications.
Analytical Skills: Strong analytical and problem-solving skills with a focus on delivering high-quality data solutions.
Communication: Excellent verbal and written communication skills, with the ability to clearly convey technical concepts to both technical and non-technical stakeholders.
Team Player: Ability to work collaboratively in a fast-paced, team-oriented environment.

