AMSYS Innovative Solutions, LLC

Data Engineer (Apache NiFi / Snowflake Integration)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer specializing in Apache NiFi and Snowflake integration. The contract length is unknown, and the day rate is $480 USD. Key requirements include 3–8+ years of experience, custom NiFi processor development in Java, and cloud integration expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
November 14, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#DataOps #Monitoring #ODBC (Open Database Connectivity) #NiFi (Apache NiFi) #Data Warehouse #Java #JSON (JavaScript Object Notation) #Azure #Snowflake #JDBC (Java Database Connectivity) #Cloud #Deployment #Airflow #GCP (Google Cloud Platform) #REST (Representational State Transfer) #Spark (Apache Spark) #Data Ingestion #Kafka (Apache Kafka) #Data Engineering #Data Pipeline #REST API #XML (eXtensible Markup Language) #Data Modeling #Data Integration #Apache NiFi #Automation #ETL (Extract, Transform, Load) #Python #DevOps #AWS (Amazon Web Services) #Databases #Scala
Role description
We are seeking a highly skilled Data Engineer with deep hands-on experience in Apache NiFi to design, build, and optimize large-scale data integration pipelines. This role focuses on creating and maintaining custom NiFi processors/connectors, integrating NiFi with Snowflake (OpenFlow), and supporting enterprise data ingestion and transformation workflows. The ideal candidate specializes in building secure, scalable, and high-performance data flows across diverse systems and cloud platforms.
Responsibilities
• Design, develop, and maintain data pipelines using Apache NiFi, including workflows, processors, templates, and controller services.
• Build custom NiFi processors/connectors (primarily in Java; Python is a plus).
• Integrate NiFi pipelines with Snowflake, OpenFlow, databases, APIs, and cloud services.
• Develop ETL/ELT processes to support structured and unstructured data ingestion.
• Optimize pipelines for performance, throughput, reliability, and error handling.
• Implement DataOps best practices, including monitoring, automation, and CI/CD for pipeline deployment.
• Troubleshoot data flow issues and ensure secure, compliant data movement across systems.
• Collaborate with data engineering, analytics, and application teams on requirements and enhancements.
Required Skills
• 3–8+ years as a Data Engineer or Data Integration Engineer.
• Strong hands-on experience with Apache NiFi (flow design, processors, controller services).
• Proven experience developing custom NiFi processors; Java is required.
• Hands-on integration experience with Snowflake or similar cloud data warehouses.
• Expertise in REST APIs, JSON, XML, JDBC/ODBC, and real-time/streaming ingestion patterns.
• Solid foundation in ETL/ELT, data modeling, and pipeline orchestration.
• Experience with AWS, Azure, or GCP cloud services (preferred).
• Familiarity with Kafka, Spark, Airflow, or similar data engineering tools is a plus.
Nice-to-Have
• DataOps/DevOps experience with CI/CD pipelines.
• Performance tuning for high-volume NiFi environments.
• Prior experience in enterprise or regulated environments.