

SRM Digital LLC
Apache Hadoop Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Apache Hadoop Architect with an unspecified contract length and pay rate. It requires expertise in Hadoop components, Linux, Java, Python, and ETL processes; experience in data security and governance is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 8, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Spark (Apache Spark) #SQL (Structured Query Language) #Data Governance #Data Processing #Python #Java #Scala #HBase #Linux #Automation #Hadoop #PySpark #Data Security #Compliance #Security #Programming #HDFS (Hadoop Distributed File System) #ETL (Extract, Transform, Load) #YARN (Yet Another Resource Negotiator)
Role description
Job Overview:
We are seeking an experienced Apache Hadoop Admin/Platform Architect to join our Data & Analytics team. The ideal candidate will be responsible for managing, optimizing, and architecting Hadoop ecosystems and related data platforms. This role requires deep technical expertise in Hadoop components, strong programming skills, and a solid understanding of data warehousing, ETL processes, and data security principles.
Key Responsibilities:
• Architect, deploy, and administer large-scale Apache Hadoop environments.
• Manage core Hadoop components including HDFS, YARN, MapReduce, Spark, Hive, HBase, and Phoenix.
• Design and implement efficient data processing and ETL workflows.
• Optimize performance and scalability of the Hadoop ecosystem.
• Develop and maintain scripts and automation using Linux command-line tools.
• Work with programming languages such as Java, Python, and PySpark for data processing and integration.
• Ensure data security, governance, and compliance across the Hadoop platform.
• Troubleshoot complex technical issues and provide performance tuning recommendations.
• Collaborate with cross-functional teams to support data analytics and reporting initiatives.
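The ETL responsibilities above can be sketched as a simple extract-transform-load pipeline. In a Hadoop environment this logic would typically run as a PySpark job reading from and writing to HDFS or Hive; plain Python is used here only to show the shape of the workflow, and all record fields and names are illustrative.

```python
# Minimal ETL sketch: extract raw records, normalize them, load the result.
# A production version would replace these generators with Spark DataFrame
# reads/writes; the three-stage structure is the same.

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    for row in rows:
        yield row

def transform(records):
    """Transform: normalize fields and drop malformed records."""
    for rec in records:
        if "user_id" not in rec or rec.get("amount") is None:
            continue  # skip records missing required fields
        yield {
            "user_id": rec["user_id"],
            "amount": round(float(rec["amount"]), 2),  # normalize to 2 decimals
        }

def load(records):
    """Load: collect cleaned records (a real job would write to Hive/HDFS)."""
    return list(records)

raw = [
    {"user_id": 1, "amount": "19.999"},
    {"user_id": 2, "amount": None},   # malformed; dropped
    {"amount": "5.00"},               # malformed; dropped
]
cleaned = load(transform(extract(raw)))
```

The same extract/transform/load separation carries over directly to PySpark, where each stage becomes a DataFrame read, a chain of column transformations, and a write.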
Required Skills & Qualifications:
• Proven hands-on experience with Apache Hadoop and its ecosystem (HDFS, YARN, MapReduce, Spark, Hive, HBase, Phoenix).
• Strong Linux administration and command-line expertise.
• Proficiency in Java, Python, PySpark, and SQL.
• Strong analytical and problem-solving skills.
• Knowledge of data security, data governance, and data warehousing principles.
• Experience working in ETL and large-scale data processing environments.