

Global Technology Partners
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with over 10 years of experience, requiring expertise in Python, PySpark, Spark, AWS, and Databricks. The contract lasts for an unspecified duration, with a pay rate of $55.00 - $70.00 per hour, and is on-site.
Country: United States
Currency: $ USD
Day rate: 560
Date: October 4, 2025
Duration: Unknown
Location: On-site
Contract: W2 Contractor
Security: Unknown
Location detailed: Delaware, IA 52036
Skills detailed: #AI (Artificial Intelligence) #Automation #AWS EMR (Amazon Elastic MapReduce) #S3 (Amazon Simple Storage Service) #Airflow #Data Pipeline #Cloud #Python #Data Engineering #Databases #Big Data #NoSQL #Programming #PySpark #Azure #Apache Airflow #SQL (Structured Query Language) #Computer Science #Databricks #Spark (Apache Spark) #Datasets #Apache Spark #AWS (Amazon Web Services) #Terraform #Scala
Role description
A face-to-face (F2F) interview is required.
The candidate will work on W2 payroll. No Corp-to-Corp candidates, please.
Apply only if you meet the requirements; otherwise your application will be rejected.
We are seeking a highly skilled Data Engineer with strong expertise in Python, PySpark, Spark, AWS, and Databricks to join our data and cloud engineering team. The ideal candidate will have hands-on experience designing scalable data pipelines and building cloud-native data platforms.
Must-Have Skills:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
More than 10 years of professional experience in data engineering roles.
Strong analytical, problem-solving, and communication skills.
Ability to work independently and as part of a collaborative team.
Strong proficiency in Python programming.
Strong hands-on experience with PySpark and Apache Spark.
Practical knowledge of AWS cloud services (e.g., S3, EMR, Glue).
Advanced SQL skills for working with large datasets.
Experience working with Databricks for big data analytics.
Experience with Terraform for infrastructure automation and provisioning.
Nice-to-Have / Preferred Skills:
Exposure to Large Language Models (LLMs) or Generative AI projects.
Experience using Azure OpenAI or similar AI cloud services.
Familiarity with Apache Airflow for orchestration of data workflows.
Experience working with Cassandra or other NoSQL databases.
Experience optimizing and managing AWS EMR clusters.
Job Type: Contract
Pay: $55.00 - $70.00 per hour
Work Location: In person