Robert Half

Data Engineer

⭐ - Featured Role
This role is for a Data Engineer with 3–5+ years of experience, focused on data lakes and Snowflake. It is a 6–12 month contract, fully onsite in Oklahoma City; the position is hands-on and requires strong SQL and Python skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
440
🗓️ - Date
February 5, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Oklahoma City Metropolitan Area
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Data Governance #Data Pipeline #Matillion #Scala #Data Ingestion #Data Modeling #AWS S3 (Amazon Simple Storage Service) #Snowflake #Data Engineering #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #API (Application Programming Interface) #ML (Machine Learning) #Data Quality #BI (Business Intelligence) #Automation #Python #Kafka (Apache Kafka) #Security #Data Analysis #Azure #Cloud #dbt (data build tool) #SQL (Structured Query Language) #Scripting #GCP (Google Cloud Platform) #SnowPipe #Airflow #Data Lake #Fivetran
Role description
Type: Contract / W-2 ONLY; no C2C candidates and no third-party vendors
Data Engineer – 100% Onsite (Oklahoma City, OK)
Contract: 6–12 Months
Work Environment: Fully Onsite

Overview
We are seeking a highly skilled Data Engineer to join our client’s team in Oklahoma City for a 6–12 month onsite engagement. The ideal candidate will have strong experience building and maintaining data lakes, developing pipelines, and working within Snowflake environments. This role is hands-on and requires close collaboration with cross-functional teams in a fast-paced setting.

Responsibilities
• Design, develop, and maintain data pipelines to support data ingestion, transformation, and integration across enterprise systems.
• Build and optimize data lake architectures, ensuring reliability, scalability, and performance.
• Work extensively with Snowflake to create schemas, manage data warehousing processes, and optimize warehouse performance (see the stream/task sketch at the end of this description).
• Develop ETL/ELT processes using modern data engineering tools and best practices (a minimal pipeline sketch follows this description).
• Collaborate with data analysts, BI teams, and application developers to ensure data availability and data quality.
• Implement and maintain data governance, data quality, and security standards.
• Troubleshoot data issues and optimize pipeline performance to ensure timely and accurate data delivery.

Required Skills & Experience
• 3–5+ years of experience as a Data Engineer or in a similar role.
• Proven experience with data lake architecture (Azure Data Lake, AWS S3-based lakes, or similar).
• Hands-on, professional experience with Snowflake (warehouses, streams, tasks, Snowpipe, performance tuning).
• Strong SQL development skills and experience with ETL/ELT best practices.
• Experience with Python or other scripting languages for data transformation and automation.
• Familiarity with cloud platforms (AWS, Azure, or GCP).
• Understanding of data modeling, data governance, and data quality practices.
• Ability to work 100% onsite in Oklahoma City.

Preferred Qualifications
• Experience with tools such as dbt, Airflow, Fivetran, Matillion, or similar.
• Knowledge of API integrations and streaming data technologies (Kafka, Kinesis, EventHub).
• Experience supporting analytics, BI, or machine learning workloads.

Project Details
• Duration: 6–12 months
• Location: Oklahoma City, OK (onsite only)
• Type: Contract / W-2 ONLY; no C2C candidates and no third-party vendors
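For a sense of the hands-on level this role expects, here is a minimal sketch of the kind of extract-transform-load step the responsibilities describe. It uses only the Python standard library, with SQLite standing in for the warehouse; every file, table, and column name is a hypothetical placeholder, not something from the client's environment.

```python
import csv
import sqlite3
from pathlib import Path

RAW_FILE = Path("orders.csv")   # hypothetical raw extract
DB_FILE = Path("analytics.db")  # SQLite stands in for the warehouse


def extract(path: Path) -> list[dict]:
    """Read raw rows from a CSV extract."""
    with path.open(newline="") as f:
        return list(csv.DictReader(f))


def transform(rows: list[dict]) -> list[tuple]:
    """Apply basic data-quality rules and type conversions."""
    clean = []
    for row in rows:
        # Data-quality gate: drop rows missing a key or an amount.
        if not row.get("order_id") or not row.get("amount"):
            continue
        clean.append((row["order_id"], row.get("customer_id", ""), float(row["amount"])))
    return clean


def load(rows: list[tuple]) -> None:
    """Idempotent load into the target table."""
    with sqlite3.connect(DB_FILE) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(order_id TEXT PRIMARY KEY, customer_id TEXT, amount REAL)"
        )
        conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)


if __name__ == "__main__":
    load(transform(extract(RAW_FILE)))
```

In practice the extract would come from a data lake (e.g. S3), the load target would be Snowflake, and orchestration would live in a tool such as Airflow or dbt, but the shape of the work is the same.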
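The Snowflake requirement calls out streams, tasks, and Snowpipe by name. The snippet below is a rough sketch of that pattern using the snowflake-connector-python package: a stream captures changes landing in a raw table (for example via Snowpipe), and a scheduled task drains it into a curated table. All object names, the schedule, and the connection settings are hypothetical placeholders.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Placeholder connection details, read from the environment.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

statements = [
    # Capture inserts/updates on the raw table as they land (e.g. via Snowpipe).
    "CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders",
    # A scheduled task that drains the stream, but only when it has data.
    """
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO curated_orders
      SELECT order_id, customer_id, amount FROM orders_stream
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK merge_orders_task RESUME",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```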