IPolarity

Hadoop Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Hadoop Engineer based in O’Fallon, Missouri, requiring local candidates. The full-time position involves supporting a Big Data ETL platform, with key skills in Hadoop, Spark, and data governance. Contract duration exceeds 6 months.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 29, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
O'Fallon, MO
-
🧠 - Skills detailed
#Oracle #Data Quality #Data Warehouse #SQL (Structured Query Language) #Chef #HDFS (Hadoop Distributed File System) #Data Lake #Jupyter #Spark (Apache Spark) #Cloud #Scripting #Shell Scripting #Jira #Jenkins #Axon #GCP (Google Cloud Platform) #NiFi (Apache NiFi) #Azure #Python #Big Data #Scala #Git #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #Unix #Databricks #Airflow #Hadoop #AWS (Amazon Web Services) #Java
Role description
• • • • Note: NEED LOCAL CANDIDATES ONLY • • • •
Local to Missouri - In-Person Interview

Job Title: Hadoop Engineer
Location: O’Fallon, Missouri (3 days onsite / week)
Job Type: Full-time

Interview Process
We are only looking for profiles local to O’Fallon, Missouri.

Role & Responsibilities:
Support a Big Data ETL platform built on top of Hadoop.

Required Skills:
1. Hadoop (HDFS/Ozone, Hive), Spark (Python/Scala/Java), Spark UI, and Unix shell scripting technologies.
2. Understanding of Data Warehouse / Data Lake / Lakehouse ETL/ELT concepts, data quality, governance, and performance tuning.
3. Strong analytical and problem-solving skills.

Desired Skills:
1. Familiarity with ITSM tools such as Remedy and JIRA; understanding of Work Order (WO), Incident (INC), Problem (PBI), and Change (CRQ) management.
2. Knowledge of Python, Jupyter Notebooks, NiFi, NiFi Registry, Oracle (SQL, PL/SQL), C, DMX/Syncsort, CI/CD (Git, Jenkins/Chef), Airflow, and Kafka/Axon streaming.
3. Exposure to cloud platforms (Azure, AWS, or GCP) and tools like Databricks is a plus.