
Read Notes Before Sending Profiles || Direct Client Hiring Cloudera Big Data Administrator || Location: Reston, VA or Washington DC (100% Remote) || No H1B Profiles

Hi,

I hope you're doing well.

This is Mohd Waseem, a recruiter with Lumen Solutions Group Inc., a dynamic, small, minority-owned Disadvantaged Business Enterprise headquartered in Florida, USA. As a leading consulting services and solutions provider, we focus on IT Staffing, Business/IT Strategy, Business Process Blueprints, Enterprise Architecture, and Enterprise Transformation for our clients.

Our client base includes Fortune 500, government, non-profit, and emerging-growth companies.

 

Please review the job description below. If you are interested in this position, please forward your updated resume for immediate consideration, along with your preferred time to discuss this opportunity further.

 

Job Title: Cloudera Big Data Administrator

Job Location:  Reston, VA or Washington DC (100% Remote)

Job Type: Contract to Hire (C2H)

 

Notes:

  • This is a CLOUDERA BIG DATA ADMINISTRATOR position, NOT a developer position.
  • No H1B profiles are allowed for this role.
  • Candidates must have at least 7 years of experience in the USA.
  • Open only to candidates in DC/MD/VA/DE/NC/PA/NJ/NY/FL/TX with local ID proof.

 

Job Description:

We are seeking an experienced Cloudera Big Data Administrator for a contract-to-hire position in Reston, VA. The ideal candidate will possess strong expertise in managing and optimizing Cloudera CDP Public Cloud (v7.2.17 or higher) environments and will have experience with administering and troubleshooting big data services such as NiFi, Kafka, SOLR, HBase, and other related technologies. This is not a developer role but a hands-on administrator role, focusing on managing big data clusters, ensuring high availability, and optimizing performance for enterprise systems.

Key Responsibilities:

  • Build and manage Cloudera clusters, including setting up NiFi, SOLR, HBase, Kafka, and Knox on the cloud using CDP Public Cloud v7.2.17 or higher.
  • Configure and maintain high availability for services like Hue, Hive, HBase REST, SOLR, and IMPALA on BDPaaS platform clusters.
  • Monitor and troubleshoot the health of all services using Cloudera Manager.
  • Write shell scripts for health checks and automated responses to warnings or failures (a sketch of such a check appears after this list).
  • Administer and manage Kafka brokers, streams, offsets, and integration with IBM MQ.
  • Handle flow management, registry server management, and NiFi integrations with Kafka, HBase, and SOLR.
  • Manage and optimize HBase database operations, SOLR shards, and collections, including troubleshooting long-running queries.
  • Collaborate with cross-functional teams (Application Development, Security, and Platform Support) to implement configuration changes for improved cluster performance.
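
As a rough, hypothetical illustration of the health-check scripting mentioned above (not part of the client's requirements), the Python sketch below polls the Cloudera Manager REST API and flags any service whose health summary is not GOOD. The host, credentials, cluster name, and API version path are placeholders, and the same logic could just as easily be written as a shell script around curl and jq.

  import sys
  import requests

  # Placeholder values -- substitute your own Cloudera Manager host, credentials,
  # cluster name, and API version (the version path depends on the CM release).
  CM_BASE = "https://cm.example.com:7183/api/v41"
  CLUSTER = "my-cdp-cluster"
  AUTH = ("admin", "change-me")

  # Ask Cloudera Manager for every service in the cluster and its health summary.
  resp = requests.get(f"{CM_BASE}/clusters/{CLUSTER}/services", auth=AUTH, timeout=30)
  resp.raise_for_status()

  unhealthy = [
      svc["name"]
      for svc in resp.json().get("items", [])
      if svc.get("healthSummary") not in ("GOOD", "DISABLED")
  ]

  if unhealthy:
      # A real deployment would alert on-call or kick off a remediation runbook here.
      print("Services needing attention: " + ", ".join(unhealthy))
      sys.exit(1)

  print("All services report GOOD health.")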

Required Skills and Experience:

  • Proficiency with Cloudera CDP Public Cloud v7.2.17 or higher.
  • Strong administration and troubleshooting skills with Apache Kafka (Kafka Streams API, KStreams & KTables, broker management, offset management); a consumer-lag check sketch appears after this list.
  • Expertise in Apache NiFi administration, flow management, and integrations.
  • Experience managing HBase (database management, troubleshooting) and SOLR (managing shards, logging levels, high availability).
  • Familiarity with AWS services like EC2, S3, EBS, and EFS.
  • Hands-on experience with YARN, Hue, and security policies using Ranger.
  • Proficiency in scripting for automation and performance tuning.
  • Strong understanding of Kerberos, TLS/SSL, and resolving workload issues for data scientists.
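
For context on the offset-management skill above, here is a minimal, hypothetical consumer-lag check using the kafka-python client. The broker address and consumer-group name are placeholders, and a Kerberized or TLS-enabled CDP cluster would additionally need the appropriate security settings.

  from kafka import KafkaAdminClient, KafkaConsumer

  # Placeholder broker and consumer group -- not taken from this posting.
  BOOTSTRAP = "broker-1.example.com:9092"
  GROUP = "example-consumer-group"

  # Committed offsets for the group, keyed by TopicPartition.
  admin = KafkaAdminClient(bootstrap_servers=BOOTSTRAP)
  committed = admin.list_consumer_group_offsets(GROUP)

  # Latest (end) offsets for the same partitions, used to compute lag.
  consumer = KafkaConsumer(bootstrap_servers=BOOTSTRAP)
  end_offsets = consumer.end_offsets(list(committed.keys()))

  for tp, meta in committed.items():
      lag = end_offsets[tp] - meta.offset
      print(f"{tp.topic}[{tp.partition}] committed={meta.offset} lag={lag}")

  consumer.close()
  admin.close()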

Additional Skills:

  • Knowledge of Python/R for designing and implementing automated data pipelines.
  • Familiarity with infrastructure expansion, cluster migration, major upgrades, and COOP/DR setups.
  • Experience with testing, governance, data quality, and documentation efforts.
  • Expertise in supporting and troubleshooting big data ecosystems, including Hadoop databases and related services.
  • Experience with streaming technologies like Kafka, Spark, and Kudu is a plus.

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • At least 5 years of experience as a Cloudera Big Data Administrator or in a similar role.

 

Client Notes:

  • Sr. Cloudera Big Data Admin – minimum 5 years of experience.
  • The Cloudera admin should have extensive experience with CDP.
  • HBase, NiFi, SOLR, Kafka.
  • Hands-on experience with Cloudera CDP Public Cloud 7.2 and newer versions.
  • Administrator for Cloudera CDP Public Cloud only.
  • Experience with AWS EC2 and S3.
  • The role covers CDP upgrades, troubleshooting, AMI upgrades, HBase for enrollment and claims, bulk loads, QA, and setup of different environments.
  • AWS -> AMS is NICE TO HAVE.
  • NOT looking for an on-prem administrator.

 

Regards,

Mohd Waseem

US IT Recruiter, Lumen Solutions Group Inc.

Email: mohdwaseem6428@gmail.com


