

Hadoop Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Hadoop Developer with a 12+ month contract in Charlotte, NC (Hybrid). Key skills include Cloudera Hadoop, Linux, Python scripting, and cloud components. W2 only; no C2C/C2H. Advanced troubleshooting and performance tuning experience required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 12, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Charlotte, NC
Skills detailed: #YARN (Yet Another Resource Negotiator) #Unix #Cloudera #REST (Representational State Transfer) #REST API #Cloud #Monitoring #Metadata #Security #API (Application Programming Interface) #Apache Spark #HDFS (Hadoop Distributed File System) #Programming #Sqoop (Apache Sqoop) #Impala #HBase #Capacity Management #Database Performance #NiFi (Apache NiFi) #Spark (Apache Spark) #Linux #SSIS (SQL Server Integration Services) #Python #Scripting #Hadoop #Storage
Role description
Job Title: Hadoop Developer
Duration: 12+ Months
Location: Charlotte, NC (Hybrid onsite)
Contract Type: W2 ONLY (NO C2C/C2H)
Job Description
The Data Analytics Platform client services team is seeking a candidate who is proficient in the Cloudera Hadoop ecosystem and its components (HDFS, YARN, Hive, Tez, Impala, Spark, MapReduce, HBase), Ozone, and Private Cloud components (CDW and CML).
The EET RTC Data Analytics team member will be responsible for providing technical and administrative support for the Hadoop, Ozone, Linux, cloud, and HBase platforms in a fast-paced operations environment supporting business-critical applications built on Hadoop and cloud components.
The Analyst should have strong problem-solving skills and advanced troubleshooting experience with challenging, complex Hadoop, cloud, and Linux issues raised by Hadoop clients.
Good knowledge of and experience in Unix and Python scripting is required to develop platform monitoring, application management, Cloudera Manager (CM) API and REST API capabilities, and capacity management development tools.
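As an illustration of the kind of Unix/Python monitoring scripting described above, here is a minimal sketch that polls the Cloudera Manager REST API for service health. The host, port, credentials, cluster name, and API version (v41) are hypothetical placeholders, not details from this posting.

```python
#!/usr/bin/env python3
"""Minimal sketch: poll the Cloudera Manager REST API for service health."""
import requests

# Hypothetical placeholders -- substitute your own CM host, credentials,
# cluster name, and the API version your CM release supports.
CM_BASE = "http://cm-host.example.com:7180/api/v41"
AUTH = ("monitor_user", "changeme")
CLUSTER = "cluster1"

def service_health(cluster: str) -> dict:
    """Return {service name: health summary} for one cluster."""
    resp = requests.get(f"{CM_BASE}/clusters/{cluster}/services",
                        auth=AUTH, timeout=30)
    resp.raise_for_status()
    # Each item is a service; healthSummary is e.g. GOOD, CONCERNING, BAD.
    return {svc["name"]: svc.get("healthSummary", "UNKNOWN")
            for svc in resp.json()["items"]}

if __name__ == "__main__":
    for name, health in sorted(service_health(CLUSTER).items()):
        flag = "" if health == "GOOD" else "  <-- investigate"
        print(f"{name:24s} {health}{flag}")
```

A production version of such a tool would typically layer on TLS, Kerberos or LDAP authentication, and integration with the team's alerting system.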
Responsibilities
• Proven understanding of and hands-on experience with Cloudera Hadoop, YARN, Hive, Tez, Impala, Apache Spark, Sqoop, Ozone, Private Cloud Data Services, HBase, NiFi, Ranger security, and Hive metadata with SSIS.
• Administer clusters: troubleshoot, perform problem isolation, and correct problems discovered in them.
• Tune the performance of Hadoop clusters, ecosystem components, and jobs, including managing and reviewing Hadoop log files to identify root causes and provide solutions.
• Troubleshoot platform problems and connectivity issues; diagnose and address application and database performance issues using performance monitors and various tuning techniques.
• Interact with storage and systems administrators on Linux/Unix/VM operating systems and Hadoop ecosystems.
• Automate manual tasks and build alerting and platform-problem tooling.
• Manage platform cluster capacity, including tenant storage and compute; monitor, alert on, and report on tenant storage usage (see the sketch after this list).
• Document programming problems and resolutions for future reference.
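As a hedged sketch of the tenant storage monitoring and alerting mentioned in the capacity management bullet above, the script below reads HDFS content summaries over the standard WebHDFS REST API. The NameNode address, tenant paths, and 85% alert threshold are illustrative assumptions, not requirements from this posting.

```python
#!/usr/bin/env python3
"""Minimal sketch: tenant storage quota monitoring over WebHDFS."""
import requests

# Hypothetical placeholders -- substitute your NameNode address, tenant
# paths, and whatever alert threshold your capacity policy dictates.
WEBHDFS = "http://namenode.example.com:9870/webhdfs/v1"
TENANT_PATHS = ["/data/tenant_a", "/data/tenant_b"]
ALERT_PCT = 85.0

def content_summary(path: str) -> dict:
    """Fetch space consumed and space quota for an HDFS directory."""
    resp = requests.get(f"{WEBHDFS}{path}",
                        params={"op": "GETCONTENTSUMMARY"}, timeout=30)
    resp.raise_for_status()
    return resp.json()["ContentSummary"]

def check_tenants() -> None:
    for path in TENANT_PATHS:
        cs = content_summary(path)
        quota, used = cs.get("spaceQuota", -1), cs.get("spaceConsumed", 0)
        if quota > 0:
            pct = 100.0 * used / quota
            status = "ALERT" if pct >= ALERT_PCT else "ok"
            print(f"[{status}] {path}: {pct:.1f}% of space quota consumed")
        else:
            print(f"[info] {path}: no space quota set ({used} bytes consumed)")

if __name__ == "__main__":
    check_tenants()
```

In production this would normally authenticate via Kerberos/SPNEGO and push ALERT results into the platform's alerting and reporting pipeline rather than printing them.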