

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract, offering $45/hr on W2. Remote work is available. Requires 3+ years in SQL, cloud data pipelines, and Python, plus 1+ year with Generative AI. Candidates must complete an assessment and provide a managerial reference.
Country
United States
Currency
$ USD
Day rate
360
Date discovered
August 16, 2025
Project duration
6 months
Location type
Remote
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
New York, United States
Skills detailed
#Spark (Apache Spark) #Cloud #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #Python #Automation #Data Engineering #Airflow #Data Pipeline #Teradata #AI (Artificial Intelligence) #BigQuery #Database Systems #SQL (Structured Query Language) #Azure #Scripting #Data Orchestration
Role description
Job Title: Data Engineer (3 openings)
Location: Remote
W2 only; US citizens and green card holders may apply
Max rate: $45/hr on W2, without benefits
Interview Process:
• One round with team leads: 30-minute coding exercise + 30 minutes of technical questions
• One round with lead/manager: technical and behavioral questions, no coding
Requirements BEFORE submittal to client:
• Completed Feenyx assessment (a link will be sent out once the assessment has been created)
• Video call
• Please include a managerial reference in the submittal; I will assist in reaching out
Must Haves:
• 3+ years of experience writing SQL within database systems such as BigQuery and Teradata
• 1+ year of experience with Generative AI: familiarity with GenAI concepts, prompt engineering, LLMs (preferably Gemini and Copilot), and related frameworks
• 3+ years of hands-on experience building modern data pipelines within a major cloud platform (GCP, AWS, Azure), preferably GCP
• 3+ years of experience with Python or another comparable scripting language in a data engineering setting
• 3+ years of experience with data orchestration and pipeline engineering services such as Airflow and Spark, preferably Cloud Composer / Dataproc
• 3+ years of experience deploying to, and managing CI/CD within, cloud database platforms
• Demonstrates a keen awareness of the member experience while seeking opportunities to automate and optimize business operations
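As a rough illustration of the SQL-plus-Python work the requirements above describe, here is a minimal sketch using Python's built-in sqlite3 as a stand-in for a cloud warehouse such as BigQuery (the table name and sample data are invented for this example; a real pipeline would use the warehouse's own client library):

```python
import sqlite3

# In-memory database stands in for a cloud warehouse like BigQuery.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 7.5)],
)

# The kind of aggregation query this role writes daily.
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total "
    "FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [('a', 15.0), ('b', 7.5)]
```

In production the same query shape would run inside an Airflow (Cloud Composer) task, with the connection managed by the orchestrator rather than opened inline.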