Realign LLC

Cloudera/Big Data Administrator-3

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloudera/Big Data Administrator in Torrance, CA, on a contract basis. The position requires 3-5 years in the Hadoop ecosystem, 6+ years in Java/Linux, and experience with AWS. Strong communication skills are essential.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 23, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Torrance, CA
-
🧠 - Skills detailed
#Kerberos #Unix #Data Lake #PySpark #SQL (Structured Query Language) #Scripting #R #Public Cloud #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Documentation #Java #Python #Scala #Replication #MySQL #Linux #Security #Spark (Apache Spark) #Hadoop #Virtualization #Sqoop (Apache Sqoop) #LDAP (Lightweight Directory Access Protocol) #YARN (Yet Another Resource Negotiator) #AWS EMR (Amazon Elastic MapReduce) #DBA (Database Administrator) #HDFS (Hadoop Distributed File System) #Cloud #Computer Science #Big Data #Compliance #HBase #SaaS (Software as a Service) #Automation
Role description
Job Type: Contract
Job Category: IT
Role: Cloudera/Big Data Administrator
Location: Torrance, CA (Day 1 onsite); open to both FTE and Contract

Job Description: Looking for a highly skilled on-prem Cloudera/Big Data Administrator to manage the big data platform.

Responsibilities:
• Handle day-to-day operations; administer and monitor Big Data Platform components (Hadoop, Hive, HBase, BigSQL, etc.)
• Support technical and application team requests: data copies across environments, data cleanup, query tuning, etc.
• Troubleshoot and resolve issues related to user queries, application jobs, etc.
• Monitor and troubleshoot the Big Data Platform, including hosted databases, tools, and services
• Create encryption zones, manage encryption keys, and copy data between encryption zones (see the encryption-zone sketch after this description)
• Support projects and initiatives related to data acquisition, processing, and utilization through the Big Data Platform
• Maintain and support the Big Data Platform (CDP Private Cloud) and IBM BigSQL across multiple environments
• Install new versions and patches; support upgrades of CDP Private Cloud and IBM BigSQL
• Monitor and troubleshoot platform issues and tune performance (see the health-check sketch after this description)
• Support security policies, documentation, and compliance
• Document processes and policies for Hadoop system administration and platform support activities
• Work with end users to identify and resolve issues
• Work with vendors to resolve issues as needed
• Provide governance and support to project teams
• Upgrade from CDP Private Cloud to CDP Public Cloud

Skills:
• Bachelor's or Master's degree in Computer Science or an applicable field of study
• 3-5 years of experience in the Hadoop ecosystem (HDFS, MapReduce, Hive, HBase, Spark, Scala, YARN, Sqoop, Ranger, Knox, Kerberos, LDAP, Solr)
• 6+ years of experience with Java and Linux/Unix
• 4+ years of experience with security, scripting, and automation
• 2+ years of experience with R, Python, Flume, Storm, Kafka, and PySpark
• 2+ years of experience with IBM BigSQL or similar MPP SQL engines
• 2+ years of experience with MySQL
• 1+ years of experience with CI/CD
• Experience with administration and performance tuning of Hadoop distributions (open source and commercial)
• Highly competent with data lake concepts and Big Data capabilities
• Good understanding of data virtualization, replication, and data governance
• Experience with AWS or other cloud-based PaaS/SaaS environments
• Experience with AWS EMR
• Strong communication skills

Required Skills: DATABASE ADMINISTRATOR
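To make the encryption-zone duties concrete: below is a minimal sketch of the standard HDFS transparent-encryption workflow (create a key, create a zone, copy data between zones), assuming a cluster with the `hadoop`/`hdfs` CLIs and a configured KMS. The key name and paths are illustrative, not from the posting.

```python
# Hypothetical HDFS transparent-encryption workflow; the key name and
# paths are illustrative. Assumes hadoop/hdfs CLIs and a configured KMS.
import subprocess

def run(cmd):
    """Echo and run a command, failing loudly so broken steps surface."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Create an encryption key in the Hadoop KMS.
run(["hadoop", "key", "create", "projectKey"])

# 2. Create the zone directory (it must exist and be empty), then mark
#    it as an encryption zone backed by the new key.
run(["hdfs", "dfs", "-mkdir", "-p", "/data/secure_zone"])
run(["hdfs", "crypto", "-createZone",
     "-keyName", "projectKey", "-path", "/data/secure_zone"])

# 3. Copy data between encryption zones with DistCp. Data is decrypted
#    on read and re-encrypted on write, so checksums differ across
#    zones -- hence -skipcrccheck alongside -update.
run(["hadoop", "distcp", "-update", "-skipcrccheck",
     "/data/old_zone/dataset", "/data/secure_zone/dataset"])

# 4. Confirm the zones now present on the cluster.
run(["hdfs", "crypto", "-listZones"])
```

The `-skipcrccheck`/`-update` pair in step 3 is the non-obvious part: because files are re-encrypted on write, source and destination checksums will never match across zones, and a plain DistCp would flag that as corruption.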
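Similarly, a rough sketch of the kind of command-line spot check the monitoring duties imply. On CDP most of this is surfaced through Cloudera Manager, so treat this as a supplementary check; the specific set of probes is an assumption, not from the posting.

```python
# Hypothetical daily spot check for a Hadoop cluster, built from
# standard HDFS/YARN CLI commands; the chosen checks are an assumption.
import subprocess

CHECKS = [
    ("HDFS capacity and DataNode status", ["hdfs", "dfsadmin", "-report"]),
    ("HDFS block health",                 ["hdfs", "fsck", "/"]),
    ("YARN NodeManager status",           ["yarn", "node", "-list", "-all"]),
    ("Running YARN applications",         ["yarn", "application", "-list",
                                           "-appStates", "RUNNING"]),
]

for label, cmd in CHECKS:
    print(f"== {label} ==")
    # check=False: keep reporting even if one probe fails.
    subprocess.run(cmd, check=False)
```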