Galent

Apache NiFi Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Apache NiFi Data Engineer with a contract length of "unknown," offering a pay rate of "unknown." Key skills include Apache NiFi, Java, ETL/ELT processes, SQL, and big data technologies. Experience with cloud platforms is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Docker #Kubernetes #Compliance #ETL (Extract, Transform, Load) #Data Engineering #Cloud #Scala #Automation #Security #Apache NiFi #Linux #Spark (Apache Spark) #Data Pipeline #Big Data #Java #SQL (Structured Query Language) #Databases #AWS (Amazon Web Services) #Batch #Monitoring #Data Analysis #Data Governance #Azure #Hadoop #Data Integration #Kafka (Apache Kafka) #NiFi (Apache NiFi)
Role description
We are seeking a skilled Data Engineer to design, build, and maintain real‑time and batch data pipelines using Apache NiFi. This role focuses on developing reliable, scalable, and secure data integration solutions across multiple platforms.
Key Responsibilities
• Design, build, and configure complex real‑time and batch data flows, including ETL/ELT pipelines, using Apache NiFi
• Develop and customize NiFi processors, scripts, and automation tools to support unique data transformation requirements
• Monitor, troubleshoot, and optimize NiFi data flows to ensure high performance, reliability, and throughput
• Integrate data from multiple sources, including databases, APIs, and streaming platforms, while adhering to security and compliance standards
• Identify, debug, and resolve production issues in collaboration with cross‑functional teams
• Follow organizational data governance, security, and operational best practices
Required Skills and Qualifications
• Demonstrated experience designing, managing, and maintaining Apache NiFi data flows
• Proficiency in Java for developing custom processors or scripts (see the sketch following this description)
• Working knowledge of ETL/ELT processes, SQL, and big data technologies such as Kafka, Spark, and Hadoop
• Experience with Linux environments and cloud or container platforms (AWS, Azure, Docker, Kubernetes)
Work Environment & Essential Functions
• Primarily computer‑based work involving data analysis, development, and system monitoring
• May require participation in troubleshooting or support activities during business hours or scheduled rotations
Equal Employment Opportunity Statement
We are an equal opportunity employer and are committed to creating an inclusive environment for all employees. Employment decisions are made without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, age, disability, veteran status, or any other protected characteristic under applicable law.
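For context on the Java requirement, custom NiFi processors are built by extending the framework's processor API. Below is a minimal sketch assuming the standard nifi-api dependency; the class name UppercaseContent and its trivial transform are hypothetical illustrations, not part of this role's actual codebase.

```java
// Minimal sketch of a custom Apache NiFi processor (hypothetical example).
import org.apache.nifi.annotation.documentation.Tags;
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.Relationship;
import org.apache.nifi.processor.exception.ProcessException;

import java.nio.charset.StandardCharsets;
import java.util.Collections;
import java.util.Set;

@Tags({"example", "transform"})
public class UppercaseContent extends AbstractProcessor {

    // FlowFiles that were transformed successfully are routed here;
    // production processors usually declare a failure relationship as well.
    public static final Relationship REL_SUCCESS = new Relationship.Builder()
            .name("success")
            .description("FlowFiles whose content was transformed")
            .build();

    @Override
    public Set<Relationship> getRelationships() {
        return Collections.singleton(REL_SUCCESS);
    }

    @Override
    public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
        FlowFile flowFile = session.get();
        if (flowFile == null) {
            return; // nothing queued on the incoming connection
        }
        // Rewrite the FlowFile content; here, a trivial uppercase transform.
        flowFile = session.write(flowFile, (in, out) -> {
            byte[] bytes = in.readAllBytes();
            out.write(new String(bytes, StandardCharsets.UTF_8)
                    .toUpperCase()
                    .getBytes(StandardCharsets.UTF_8));
        });
        session.transfer(flowFile, REL_SUCCESS);
    }
}
```

In practice a processor like this would also declare property descriptors for configuration and be packaged as a NAR for deployment to the NiFi instance.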