

Hadoop Developer/Admin
Featured Role | Apply direct with Data Freelance Hub
This role is for a Hadoop Developer/Admin in Alpharetta, GA, offering a 12-month contract at an unspecified pay rate. Key skills include Hadoop cluster setup, Cloudera CDP upgrades, Unix shell scripting, and experience with HDFS, Hive, and security management.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
July 8, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Alpharetta, GA
Skills detailed
#Ruby #Python #GIT #Security #Cloudera #Data Integrity #Elasticsearch #HDFS (Hadoop Distributed File System) #Unix #HBase #Monitoring #Cloud #Hadoop #Logstash #Impala
Role description
Job Title: Hadoop Admin/Developer
Location: Alpharetta, GA (local candidates only)
Duration: 12-month contract
Description and Requirements:
1. Expert in setting up a Hadoop cluster from scratch and in server maintenance.
2. Commissioning and decommissioning nodes to/from the Hadoop cluster.
3. Implementing, managing, and administering the overall Hadoop infrastructure.
4. Capacity planning: estimating the requirements for increasing or decreasing the capacity of the Hadoop cluster.
5. Monitoring Hadoop cluster connectivity and performance for application teams.
6. Replicating large amounts of data from one Hadoop cluster to another.
7. Applying encryption to secure the data in the Hadoop cluster.
8. Applying configuration changes and testing the cluster in collaboration with application teams and business users.
9. Sound knowledge of Hadoop services: HDFS, Hive, HBase, Impala, Hue, Ranger, etc.
10. Experience with Cloudera CDP Hadoop upgrades.
11. Experience with access management and setting up user roles in Hadoop.
12. Able to create and debug complex Unix shell scripts.
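As a rough illustration of the scripting side of the role (the capacity-planning, monitoring, and scripting items above), here is a minimal sketch that parses the kind of summary `hdfs dfsadmin -report` prints and flags high HDFS usage. The report text is mocked so the example runs without a cluster; the 80% threshold and sample numbers are illustrative assumptions, not values from the posting.

```python
import re

# Minimal capacity-monitoring sketch. SAMPLE_REPORT mocks the summary
# section of `hdfs dfsadmin -report`; in practice this text would come
# from running that command against the cluster. The threshold is an
# illustrative assumption.
SAMPLE_REPORT = """\
Configured Capacity: 107374182400 (100 GB)
DFS Used: 91268055040 (85 GB)
DFS Remaining: 16106127360 (15 GB)
"""

def hdfs_usage_percent(report: str) -> float:
    """Extract byte counts from the report text and return percent used."""
    def field(name: str) -> int:
        match = re.search(rf"^{name}: (\d+)", report, re.MULTILINE)
        if match is None:
            raise ValueError(f"field not found: {name}")
        return int(match.group(1))
    return 100.0 * field("DFS Used") / field("Configured Capacity")

THRESHOLD = 80.0  # illustrative alert threshold, percent
pct = hdfs_usage_percent(SAMPLE_REPORT)
status = "ALERT" if pct > THRESHOLD else "OK"
print(f"{status}: HDFS usage at {pct:.1f}%")  # ALERT: HDFS usage at 85.0%
```

The same parsing approach extends naturally to per-datanode sections of the report when planning commissioning or decommissioning.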
Nice to have:
1. Analyzing and troubleshooting cluster issues.
2. Setting up and maintaining appropriate infrastructure to sustain performance and data integrity.
3. Experience with the ELK stack (Elasticsearch, Logstash, Kibana) alongside Apache Hadoop.
4. Ensuring appropriate monitoring and alerting for the Elasticsearch cluster.
5. Expertise in networking and security settings to support network-related pipelines.
6. Ruby and/or Python, Unix, and Git knowledge.
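The Elasticsearch monitoring-and-alerting item above might look like the following sketch, which maps the `status` field of a cluster-health response (normally fetched from Elasticsearch's `GET /_cluster/health` API) to an alert line. The canned response dict is an assumption so the example is self-contained.

```python
# Sketch of a monitoring/alerting helper for an Elasticsearch cluster.
# A real check would fetch GET http://<es-host>:9200/_cluster/health;
# the canned response below stands in for that call.
SAMPLE_HEALTH = {
    "cluster_name": "demo-cluster",  # hypothetical cluster name
    "status": "yellow",              # green | yellow | red
    "number_of_nodes": 3,
    "unassigned_shards": 2,
}

def health_alert(health: dict) -> str:
    """Turn an Elasticsearch cluster-health response into an alert line."""
    status = health.get("status", "unknown")
    if status == "green":
        return "OK: cluster healthy"
    if status == "yellow":
        return f"WARN: {health.get('unassigned_shards', '?')} unassigned shards"
    return f"CRITICAL: cluster status {status}"

print(health_alert(SAMPLE_HEALTH))  # WARN: 2 unassigned shards
```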