

Hadoop Admin Ops / SRE
Featured Role | Apply direct with Data Freelance Hub
This role is a Hadoop Admin Ops / SRE position for 6 months, offering a pay rate of "X" per hour. Key skills include Hadoop, Spark, Kafka, and cluster management. Experience with Cloudera and Data Science products is required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 25, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Charlotte, NC
Skills detailed:
#Ansible #Sybase #Spark (Apache Spark) #Unix #Kafka (Apache Kafka) #ML (Machine Learning) #Oracle #AI (Artificial Intelligence) #Kudu #Disaster Recovery #Scripting #Agile #Linux #Perl #Monitoring #Deployment #Kerberos #Impala #Hadoop #Docker #DevOps #Jupyter #HDFS (Hadoop Distributed File System) #YARN (Yet Another Resource Negotiator) #Python #Java #SQL (Structured Query Language) #Big Data #Cloudera #Capacity Management #Shell Scripting #HBase #Zookeeper #Debugging #Forecasting #Automation #BitBucket #Jenkins #Cloud #Data Science #Talend
Role description
Hadoop Admin Ops / SRE role supporting NextGen Platforms built around Big Data technologies (Hadoop, Spark, Kafka, Impala, HBase, Docker containers, Ansible, and more). Requires experience in cluster management of vendor-based Hadoop and Data Science (AI/ML) products such as Cloudera, DataRobot, C3, Panopticon, Talend, Trifacta, Selerity, ELK, and KPMG Ignite.
The DevOps Analyst is involved in the full life cycle of an application and is part of an agile development process. The role requires the ability to interact, develop, engineer, and communicate collaboratively at the highest technical levels with clients, development teams, vendors, and other partners. The following section serves as a general guideline for the dimensions of project complexity, responsibility, and education/experience within this role.
Works on complex, major, or highly visible tasks in support of multiple projects that require multiple areas of expertise
The team member will be expected to provide subject matter expertise in managing Hadoop and Data Science platform operations, with a focus on Cloudera Hadoop, Jupyter Notebook, OpenShift, and Docker container cluster management and administration
Integrates solutions with other applications and platforms outside the framework
Responsible for managing platform operations across all environments, including upgrades, bug fixes, deployments, metrics and monitoring for resolution and forecasting, disaster recovery, and incident / problem / capacity management (a minimal monitoring sketch follows this list)
Serves as a liaison between client partners and vendors in coordination with project managers to provide technical solutions that address user needs
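As an illustration of the metrics and monitoring responsibility above, here is a minimal sketch (not part of the posting) of a cluster health check against the Cloudera Manager REST API. The host, port, API version, credentials, and response field names are assumptions for illustration only and will differ per environment.

```python
# Minimal sketch of a cluster health check against the Cloudera Manager REST API.
# The host, port, API version, credentials, and response fields below are
# illustrative assumptions; adjust them to the actual deployment.
import requests

CM_BASE = "https://cm.example.com:7183/api/v33"  # hypothetical Cloudera Manager endpoint
AUTH = ("readonly_user", "change_me")            # hypothetical read-only account


def cluster_health():
    """Print each cluster's name and the status Cloudera Manager reports for it."""
    resp = requests.get(f"{CM_BASE}/clusters", auth=AUTH, timeout=30)
    resp.raise_for_status()
    for cluster in resp.json().get("items", []):
        # 'entityStatus' is assumed to hold the summarized per-cluster state
        print(cluster.get("name"), cluster.get("entityStatus"))


if __name__ == "__main__":
    cluster_health()
```

In practice a check like this would feed an alerting or forecasting pipeline rather than print to stdout.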
Required Skills:
Hadoop, Kafka, Spark, Impala, Hive, HBase, etc.
Strong knowledge of Hadoop Architecture, HDFS, Hadoop Cluster, and Hadoop Administrator's role
Intimate knowledge of fully integrated AD/Kerberos authentication
Experience setting up optimum cluster configurations
Debugging knowledge of YARN
Hands-on experience analyzing various Hadoop log files, compression, encoding, and file formats (see the admin-check sketch after this list)
Expert-level knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Kafka, Impala, Solr, Hue, Spark, Hive, YARN, ZooKeeper, and Postgres
Strong technical knowledge: Unix/Linux; databases (Sybase, SQL, Oracle); Java, Python, Perl, shell scripting; infrastructure
Experience with monitoring & alerting and job scheduling systems
Comfortable with frequent, incremental code testing and deployment
Strong grasp of automation / DevOps tools - Ansible, Jenkins, SVN, Bitbucket
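As a small illustration of the log analysis and cluster administration items above, the following sketch (not part of the posting) assumes the standard hdfs and yarn CLIs are on PATH and that a valid Kerberos ticket has been obtained (e.g., via kinit) on a secured cluster.

```python
# Illustrative sketch of routine Hadoop admin checks, assuming the standard
# hdfs and yarn CLIs are on PATH and a valid Kerberos ticket exists (kinit).
import subprocess


def run(cmd):
    """Run a CLI command and return its stdout, raising if it fails."""
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout


def hdfs_report():
    # Summarize DataNode capacity and health across the cluster
    return run(["hdfs", "dfsadmin", "-report"])


def yarn_app_logs(application_id):
    # Pull aggregated container logs for a completed YARN application
    return run(["yarn", "logs", "-applicationId", application_id])


if __name__ == "__main__":
    # Print just the headline of the HDFS report as a quick liveness check
    print(hdfs_report().splitlines()[0])
```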
Primary Skills:
Hadoop - Cloudera, Hortonworks, Apache
Secondary Skills:
Cloudera - HDFS, Impala, Kudu, HBase