

NLB Services
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Bentonville, AR, requiring 8+ years of IT experience, 4+ years in GCP, and skills in Spark, Kafka, and Python. Contract length is unspecified, with an onsite work requirement.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 16, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Bentonville, AR
-
🧠 - Skills detailed
#Data Quality #GCP (Google Cloud Platform) #Databricks #Data Pipeline #Data Architecture #Java #Big Data #Kafka (Apache Kafka) #Programming #Data Engineering #Data Warehouse #Docker #NoSQL #ETL (Extract, Transform, Load) #Python #Scala #Data Processing #PySpark #BigQuery #Data Ingestion #Data Accuracy #Spark SQL #Kubernetes #SQL (Structured Query Language) #SQL Queries #Databases #Spark (Apache Spark) #Batch #Azure #ML (Machine Learning) #Apache Spark #Apache Kafka #Redshift #Monitoring #Airflow #Data Lake #Cloud #Snowflake
Role description
Senior Data Engineer
Bentonville, AR (Onsite 5 days)
Local candidates only
Experience Level
Total IT Experience – 8+ years
GCP – 4+ years of recent GCP experience
Description:
We are seeking a Data Engineer with Spark & Streaming skills to build real-time, scalable data pipelines using tools like Spark, Kafka, and cloud services (GCP) to ingest, transform, and deliver data for analytics and ML.
Responsibilities:
As a Senior Data Engineer, you will:
Design, develop, and maintain ETL/ELT data pipelines for batch and real-time data ingestion, transformation, and loading using Spark (PySpark/Scala) and streaming technologies (Kafka, Flink) — see the sketch after this list.
Build and optimize scalable data architectures, including data lakes, data warehouses (BigQuery), and streaming platforms.
Performance Tuning: Optimize Spark jobs, SQL queries, and data processing workflows for speed, efficiency, and cost-effectiveness.
Data Quality: Implement data quality checks, monitoring, and alerting systems to ensure data accuracy and consistency.
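For context, a minimal sketch of the kind of pipeline described above — ingesting events from Kafka with PySpark Structured Streaming and loading them into BigQuery — might look like the following. The broker address, topic, schema, table, and bucket names are placeholders (not from the posting), and the Kafka and BigQuery connector packages are assumed to be available on the cluster.

```python
# Minimal sketch: Kafka -> PySpark Structured Streaming -> BigQuery.
# Broker, topic, schema, table, and bucket names are placeholders; the
# spark-sql-kafka and spark-bigquery connector packages are assumed to be
# on the Spark classpath (e.g. on a Dataproc cluster).
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("kafka-to-bigquery").getOrCreate()

# Assumed shape of the JSON payload carried on the Kafka topic.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Real-time ingestion from Kafka.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Transformation plus a basic data-quality gate: parse the JSON value and
# drop malformed records that fail to produce an event_id.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(F.from_json("json", event_schema).alias("e"))
    .select("e.*")
    .filter(F.col("event_id").isNotNull())
)

# Load into BigQuery; the connector stages streaming writes through a GCS
# bucket, and the checkpoint lets the job resume from committed offsets.
query = (
    events.writeStream
    .format("bigquery")
    .option("table", "my_project.analytics.events")     # placeholder table
    .option("temporaryGcsBucket", "my-staging-bucket")  # placeholder bucket
    .option("checkpointLocation", "gs://my-staging-bucket/chk/events")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```

Checkpointing to GCS is what allows the stream to restart from the last committed offsets, which is the usual way such pipelines stay consistent across failures and redeployments.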
Required Skills & Qualifications:
Programming: Strong proficiency in Python and SQL; Scala or Java is a plus.
Big Data: Expertise in Apache Spark (Spark SQL, DataFrames, Streaming).
Streaming: Experience with message queues such as Apache Kafka or Pub/Sub.
Cloud: Familiarity with GCP and Azure data services.
Databases: Knowledge of data warehousing (Snowflake, Redshift) and NoSQL databases.
Tools: Experience with Airflow, Databricks, Docker, and Kubernetes is a plus.