LanceSoft, Inc.

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a Senior Data Engineer contract position lasting more than 6 months, offering a competitive pay rate. Key skills required include 4+ years with Apache Spark and SQL, and 3+ years with Kafka, GCP, and Airflow.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
January 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Batch #Apache Spark #Stories #Kafka (Apache Kafka) #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Cloud #Python #Java #Agile #Scrum #Jira #Airflow #Data Pipeline #Scala #Spark (Apache Spark) #BigQuery #Microservices #Data Quality #Databases #Security #Data Engineering
Role description
Position Overview
We are seeking a Senior Data Engineer for a long-term contract role supporting enterprise-scale data initiatives. This position focuses on designing, building, and enhancing modern data pipelines across cloud and on-prem environments while collaborating closely with Agile delivery teams. The role requires strong hands-on coding skills, ownership of end-to-end data solutions, and the ability to communicate effectively with business and technical stakeholders.
Key Responsibilities
• Participate in daily Scrum calls and provide regular status updates
• Pick up assigned JIRA stories and collaborate with Analysts and Product Owners to clarify requirements
• Design, develop, maintain, and enhance data engineering solutions across multiple data sources, including databases, file systems (structured and unstructured), and cloud platforms
• Build, test, and optimize data pipelines using batch and streaming technologies
• Develop scalable and secure pipelines leveraging Kafka, BigQuery, file systems, and cloud-native tools
• Write high-performance data pipelines using Spark DataFrames and BigQuery
• Ensure data quality, performance optimization, and security best practices
• Take ownership of end-to-end testing for assigned features
• Communicate effectively with business users and technical stakeholders
Required Skills (Top Non-Negotiables)
• Apache Spark (Spark Framework): 4+ years
• Kafka: 3+ years
• Google Cloud Platform (GCP): 3+ years
• SQL: 4+ years
• Airflow: 3+ years
• Strong hands-on coding and pipeline development experience
Preferred / Nice-to-Have Skills
• Java: 3+ years
• Python: 3+ years
• Microservices: 2+ years
• Agile / Scrum experience