

BayOne Solutions
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 12-month remote contract paying $45-$50/hour. It requires 5+ years of experience with SQL, Python, Git, Snowflake, Linux, AWS, and data pipelines. Strong communication and analytical skills are essential; AI Engineering experience is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
🗓️ - Date
January 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Datalakes #Terraform #Python #dbt (data build tool) #Data Pipeline #AWS (Amazon Web Services) #Fivetran #Kafka (Apache Kafka) #Data Warehouse #Linux #SQL (Structured Query Language) #GIT #Cloud #Apache Kafka #Snowflake #Data Engineering
Role description
Job Title: Data Engineer
Location: Remote
Duration: 12 Months Contract
Pay Rate: $45-$50/hr, W2
MINIMUM QUALIFICATIONS:
• 5+ years of experience with SQL, Python, Git, and Snowflake (or another cloud-based data lake)
• 5+ years of experience with Linux, shell scripting, and AWS cloud architecture
• 5+ years of experience developing and maintaining complex data pipelines
• 3+ years of experience handling both semi-structured and unstructured data (see the sketch after this list)
• Outstanding written and oral communication skills; able to explain complex concepts to a variety of audiences
• Strong analytical and critical thinking skills
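As an illustration of the semi-structured data work listed above, here is a minimal Python sketch that flattens nested JSON records into flat rows ready for loading into a warehouse table. The payload and field names are hypothetical, not taken from the posting.

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested dict into a single-level dict.

    Nested keys are joined with `sep`; lists are serialized as JSON
    strings so every value fits in one warehouse column.
    """
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        elif isinstance(value, list):
            items[new_key] = json.dumps(value)
        else:
            items[new_key] = value
    return items

# Hypothetical semi-structured event, e.g. one line of an NDJSON export.
raw = '{"id": 1, "user": {"name": "a", "geo": {"city": "San Jose"}}, "tags": ["x", "y"]}'
print(flatten(json.loads(raw)))
# {'id': 1, 'user_name': 'a', 'user_geo_city': 'San Jose', 'tags': '["x", "y"]'}
```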
BONUS POINTS FOR:
• Experience with AI Engineering and DevOps (Tekton, ArgoCD, Terraform/OpenTofu)
• Experience with dbt and Fivetran
• Proven experience with Apache Kafka and streaming data into a data warehouse (preferably Snowflake), as sketched below
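For the Kafka-to-warehouse item, a minimal consumer sketch, assuming the kafka-python package plus a hypothetical broker, topic, and loader function; a production pipeline would more likely use the Snowflake Kafka connector or Snowpipe Streaming.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical broker address, topic, and consumer group.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="warehouse-loader",
)

BATCH_SIZE = 500
batch = []

def load_to_warehouse(rows):
    # Placeholder: in practice this would stage `rows` and COPY/INSERT
    # them into a Snowflake table (e.g. via snowflake-connector-python).
    print(f"loading {len(rows)} rows")

for message in consumer:
    batch.append(message.value)
    if len(batch) >= BATCH_SIZE:
        load_to_warehouse(batch)
        batch.clear()
```

Batching before each write is the usual design choice here: per-message inserts into a warehouse like Snowflake are slow and costly compared with staged bulk loads.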