

Centraprise
Apache NiFi
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Apache NiFi Data Engineer on a contract basis, requiring expertise in Apache NiFi and the GCP ecosystem. Key skills include strong SQL and Python proficiency, with a focus on real-time data processing and cloud compliance.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
October 1, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Hartford, CT
-
🧠 - Skills detailed
#Airflow #Git #SQL (Structured Query Language) #Data Lake #Python #Dataflow #Security #Cloud #Automation #Apache NiFi #Compliance #Data Security #NiFi (Apache NiFi) #ETL (Extract, Transform, Load) #Data Processing #Scripting #Storage #BigQuery #Data Architecture #GCP (Google Cloud Platform) #Version Control #Data Wrangling #Data Warehouse #Apache Beam
Role description
Primary Skillset: Apache NiFi
Secondary Skillset: GCP ecosystem (BigQuery, Pub/Sub, Dataflow, Dataproc, Cloud Storage, Composer, etc.)
• Proven experience with Apache NiFi for real-time ingestion, routing, and transformation.
• Strong knowledge of the GCP ecosystem, including:
   • Pub/Sub (event streaming)
   • BigQuery (data warehouse)
   • Dataflow/Apache Beam (data processing)
   • Cloud Storage (data lake)
   • Composer/Airflow (orchestration)
• Strong SQL skills for querying, modeling, and performance optimization.
• Proficiency in Python for scripting, automation, and data wrangling.
• Experience with real-time streaming concepts (windowing, late-arriving data, deduplication).
• Knowledge of data architecture principles (ETL/ELT, data lakes, data warehouses).
• Familiarity with CI/CD practices and version control (Git).
• Understanding of data security, governance, and compliance in cloud environments.
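As a minimal illustration of the streaming concepts called out above (fixed event-time windows, late-arriving data, deduplication), here is a self-contained Python sketch; it is not part of the role description, and all names and the 60-second window size are illustrative only:

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # illustrative fixed-window size


def assign_window(event_ts: int) -> int:
    """Map an event timestamp to the start of its fixed window."""
    return event_ts - (event_ts % WINDOW_SECONDS)


def dedupe_into_windows(events):
    """Group events into fixed event-time windows, dropping duplicate IDs.

    `events` is an iterable of (event_id, event_ts, payload) tuples.
    A late-arriving event still lands in the window its *event time*
    belongs to, which is the core idea behind event-time windowing
    (as opposed to processing-time windowing).
    """
    windows = defaultdict(dict)  # window_start -> {event_id: payload}
    for event_id, event_ts, payload in events:
        win = assign_window(event_ts)
        # setdefault keeps only the first occurrence of each event_id
        # per window, i.e. per-window deduplication.
        windows[win].setdefault(event_id, payload)
    return dict(windows)
```

Production pipelines would express the same logic with Dataflow/Apache Beam windowing and `Distinct` transforms rather than hand-rolled dictionaries; the sketch only shows the concepts the posting asks candidates to know.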