Sr. Snowflake Architect

Remote
Experience: 15-20 years

We are seeking a highly skilled and experienced Snowflake Architect to join our team. In this role, you will define the road map for migrating existing data sets from the legacy big data environment to the new Snowflake environment, secure approval from stakeholders and the business, and then lead the migration of all data sets, PySpark jobs, and Control-M jobs and schedules to Snowflake to meet our clients' needs. You will lead our development team to ensure the successful migration of the legacy system and to deliver new projects within the Snowflake environment. If you have a strong background in end-to-end implementation and migration of legacy data sets to Snowflake, and a passion for creating innovative solutions, we would love to hear from you.

NOTE: This is not a role for freshers or candidates with intermediate experience. This is a Senior Architect position: the ideal candidate will lead a team of 15 developers and report directly to the Chief Data Officer and the CEO. No OPT candidates or candidates with intermediate experience, please.

Responsibilities:
– Collaborate with stakeholders to define a road map for migrating the existing data sets, PySpark programs, and scheduled jobs from the big data environment to the Snowflake environment.

– Lead the development effort to migrate the legacy data sets, with their large data volumes, to the new Snowflake environment.

– Lead the development team in implementing best practices for data migration to the Snowflake environment.

– Lead the effort to migrate all complex PySpark programs and Control-M scheduled jobs to Snowpark programs and schedulers within the Snowflake environment.

– Maintain the legacy PySpark programs as needed.

– Conduct code reviews to ensure adherence to coding standards and quality guidelines.

– Identify and resolve technical issues and bottlenecks.

– Stay up to date with industry trends and emerging technologies.

– Mentor and provide guidance to junior developers.

Skills:
– Strong experience with Agile methodologies
– Proficiency in Python, Java, and other programming languages
– Knowledge of analytics tools and techniques for data analysis
– Experience with big data technologies such as Spark and Hadoop
– Familiarity with cloud platforms like Azure
– Understanding of database design principles and SQL
– Knowledge of Informatica or other ETL tools

Qualifications:

– Minimum four (4) years' experience in data migration and end-to-end implementation of Snowflake environments.

– Minimum six (6) years' experience in programming languages such as Python and PySpark.

– Minimum twelve (12) years' experience with Informatica or other ETL tools.

– Minimum twelve (12) years' experience working in data warehouse environments.
