Hadoop Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Hadoop Developer in Charlotte, NC, on a W2 contract. Key skills include Hadoop/Big Data experience, proficiency in programming languages (Java, Scala, Python), and familiarity with tools like Hive, Spark, and Kubernetes.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
May 30, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Big Data #GIT #Shell Scripting #Spark (Apache Spark) #Data Pipeline #Ansible #HBase #Scala #Agile #Kafka (Apache Kafka) #Version Control #Java #Linux #Python #Sqoop (Apache Sqoop) #Batch #Hadoop #Scripting #Kubernetes #Impala #Programming #Docker
Role description
Hadoop Developer
Charlotte, NC
Contract (W2 Only)

Job Description

Must have: Hadoop/Big Data experience with any programming language

Desired Skills:
• Experience working with Hadoop/Big Data and distributed systems
• Working experience with tools such as Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, and MapReduce
• Working experience with a container orchestration platform such as Kubernetes, and experience with Docker
• Hands-on programming experience in Java, Scala, Python, or shell scripting, among others
• Experience in the end-to-end design and build of near-real-time and batch data pipelines
• Experience working in an Agile development process and a deep understanding of the phases of the Software Development Life Cycle
• Experience using source code and version control systems such as SVN and Git
• Experience with a configuration management tool such as Ansible
• Self-starter who works with minimal supervision and has the ability to work in a team with diverse skill sets
• Ability to understand customer requests and provide the correct solution
• Strong analytical mind for taking on complicated problems
• Desire to resolve issues and dig into potential problems

Required Skills:
• Experience working with the Hadoop/Big Data ecosystem and distributed systems
• Hands-on experience in at least one programming language
• Experience and proficiency with the Linux operating system is a must
• Ability to adapt and continue learning new technologies
• Experience using source code and version control systems such as SVN and Git

Thanks,
Lalit
Lalit@hirextra.com