

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This is a 1-year Data Engineer contract position in Glendale, AZ, paying $66.71 per hour. It requires 5+ years of experience, expertise in big data cloud platforms, proficiency in Python/Scala/SQL, and knowledge of distributed computing frameworks.
Country: United States
Currency: $ USD
Day rate: 528
Date discovered: August 12, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Glendale, AZ 85305
Skills detailed: #Deployment #Azure Data Factory #Python #Data Catalog #Storage #ADLS (Azure Data Lake Storage) #Data Governance #AI (Artificial Intelligence) #Snowflake #Computer Science #Delta Lake #Batch #Azure #MLflow #Palantir Foundry #Spark (Apache Spark) #Keras #AzureML #TensorFlow #SQL (Structured Query Language) #Pandas #Databricks #AWS (Amazon Web Services) #Datasets #Data Engineering #Hadoop #ADF (Azure Data Factory) #Distributed Computing #Data Science #ML (Machine Learning) #ETL (Extract, Transform, Load) #PyTorch #Scala #Big Data #Data Pipeline #PySpark #Monitoring #Cloud
Role description
Company Overview
The Global Edge Consultants is a woman-owned staffing firm dedicated to connecting talented professionals with projects across the globe. With expertise in diverse industries such as Oil & Energy, Information Technology, and Manufacturing, we pride ourselves on delivering exceptional service and building lasting relationships based on trust and integrity.
Title: Data Engineer
Type: Contract – 1 Year (Possible Extension)
Location: Glendale, AZ
Schedule: Hybrid – Standard Schedule (On-site days required)
Position Overview:
We are seeking an experienced Data Engineer to join our team, supporting the design, development, deployment, and operations of advanced big data pipelines. This role will work closely with data engineers, data scientists, and business SMEs to prepare and process data for use cases including predictive analytics, generative AI, and computer vision. The ideal candidate will bring deep technical expertise, collaborative skills, and a passion for mentoring others to build a world-class data engineering capability.
Key Responsibilities:
Design, model, and analyze big data solutions, ensuring scalability and reliability.
Ingest, process, and model structured, unstructured, batch, and real-time data using modern tools and frameworks.
Collaborate across teams to deliver high-quality data sources for advanced analytics and AI initiatives.
Mentor junior engineers, fostering technical growth and best practices.
Build and optimize scalable pipelines, transformations, and datasets using Databricks, Spark, Azure Data Factory, and/or Palantir Foundry (see the PySpark sketch after this list).
Ensure strong data governance, access control, and secure view implementation.
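For illustration, here is a minimal sketch of the kind of batch pipeline these responsibilities describe: ingesting raw files with PySpark, applying a light transformation, and appending to a partitioned Delta Lake table. All paths, column names, and the table name are hypothetical placeholders, and the snippet assumes a Delta-enabled Spark session such as Databricks provides.

```python
# Minimal batch-ingest sketch (hypothetical paths and names throughout).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch_ingest").getOrCreate()

# Read a batch of raw JSON files from a landing zone (illustrative ADLS path).
raw = spark.read.json("abfss://landing@example.dfs.core.windows.net/events/")

# Light cleanup: parse the timestamp, derive a partition column, drop duplicates.
cleaned = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Append to a partitioned Delta table (requires a Delta-enabled session).
(cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("event_date")
    .saveAsTable("analytics.events_bronze"))
```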
Basic Qualifications:
Bachelor's degree in Computer Science or related field, plus 5+ years of relevant experience.
Proven expertise with big data cloud platforms (Azure, AWS, Snowflake, Palantir, etc.).
Proficiency in Python, Scala, SQL, Pandas, PySpark, or equivalent, with a focus on testable, maintainable code.
Experience developing database-heavy services or APIs.
Strong knowledge of distributed computing frameworks (Hadoop, Spark) and scalable storage (Delta Lake, ADLS, etc.).
Understanding of workflow orchestration, monitoring, and stream processing technologies (a streaming sketch follows this list).
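As a hedged illustration of the stream processing point above, the sketch below uses Spark Structured Streaming to pick up files as they land and write them continuously to a Delta table. The schema, paths, and table name are assumptions for the example, not details from the posting.

```python
# Structured Streaming sketch (illustrative schema, paths, and table name).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream_ingest").getOrCreate()

# File-based streaming sources require an explicit schema.
stream = (spark.readStream
    .format("json")
    .schema("event_id STRING, event_ts TIMESTAMP")
    .load("/mnt/landing/events/"))

# Continuous write with checkpointing for fault-tolerant recovery.
query = (stream.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events/")
    .toTable("analytics.events_stream"))

query.awaitTermination()
```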
If you're ready to make an impact in the world of data engineering and work with a team that values excellence, we invite you to apply today at The Global Edge Consultants!
Job Type: Contract
Pay: $66.71 per hour
Expected hours: 40 per week
Benefits:
Dental insurance
Health insurance
Paid time off
Application Question(s):
Do you have the following:
Proven expertise with big data cloud platforms (Azure, AWS, Snowflake, Palantir, etc.).
Proficiency in Python, Scala, SQL, Pandas, PySpark, or equivalent, with a focus on testable, maintainable code.
Experience developing database-heavy services or APIs.
Strong knowledge of distributed computing frameworks (Hadoop, Spark) and scalable storage (Delta Lake, ADLS, etc.).
Understanding of workflow orchestration, monitoring, and stream processing technologies.
Do you possess any of these preferred qualifications?
Experience with schema evolution, data versioning, and Delta Lake optimization
Exposure to data cataloging solutions in Foundry Ontology
Professional experience implementing complex ML architectures in popular frameworks such as TensorFlow, Keras, PyTorch, scikit-learn, and CNTK
Professional experience implementing and maintaining MLOps pipelines in MLflow or AzureML (a minimal MLflow sketch follows this list)
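For the MLflow item above, here is a minimal sketch of experiment tracking, the kind of building block an MLOps pipeline wraps. The experiment name, parameter, and metric value are placeholders, not details from the posting.

```python
# Minimal MLflow tracking sketch (names and values are placeholders).
import mlflow

mlflow.set_experiment("demand-forecast-demo")

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model_type", "baseline")
    mlflow.log_metric("rmse", 0.42)  # placeholder metric value
```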
Experience:
Data Engineering: 5 years (Required)
Work Location: Hybrid remote in Glendale, AZ 85305