

Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in the retail industry, offering a contract of unspecified length at a day rate of $560. Key skills include Azure Data Factory, Databricks, and big data frameworks. A Bachelor’s degree and 3–5 years of experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date discovered
September 26, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Scrum #Azure Data Factory #GCP (Google Cloud Platform) #ADF (Azure Data Factory) #Agile #Spark (Apache Spark) #Snowflake #Data Analysis #DevOps #AWS (Amazon Web Services) #Big Data #Data Management #Databricks #ETL (Extract, Transform, Load) #Metadata #Data Framework #Data Ingestion #Cloud #Automation #Azure #Data Pipeline #Kafka (Apache Kafka) #Computer Science #Process Automation #Data Engineering #Data Modeling
Role description
Agility Partners is seeking a qualified Data Engineer to fill an open position with one of our clients. This is an exciting opportunity in the retail industry, where you’ll help build and optimize data infrastructure for impactful business insights. You’ll work with modern big data tools, cloud platforms, and automation frameworks in a collaborative, fast-paced environment. The Data Engineer will design, build, and maintain data pipelines and infrastructure, ensuring high-quality data is available for analytics and reporting. You’ll collaborate with engineering and architecture teams, automate processes, and support data-driven decision-making across the organization.
Responsibilities:
• Team up with engineering and enterprise architecture to define standards, design patterns, and DevOps automation
• Create and maintain data ingestion, quality testing, and audit frameworks
• Build and automate data pipelines using Azure Data Factory, Databricks/Spark, Snowflake, Kafka, and scheduler tools
• Conduct complex data analysis and respond to business and technology queries
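To illustrate the ingestion and quality-testing work described above, here is a minimal, library-free Python sketch of a data-quality check with an audit summary. The record fields and validation rules are hypothetical examples, not requirements from the posting.

```python
# Hypothetical records of the kind a retail data pipeline might ingest.
records = [
    {"order_id": "1001", "order_date": "2025-09-26", "amount": 42.50},
    {"order_id": "1002", "order_date": None, "amount": 17.25},
]

def passes_quality(rec):
    """Example rule: a record needs an order date and a non-negative amount."""
    return rec["order_date"] is not None and rec["amount"] >= 0

# Split records into passing and rejected, then build an audit summary
# of the kind a quality-testing framework would report.
clean = [r for r in records if passes_quality(r)]
audit = {
    "ingested": len(records),
    "passed": len(clean),
    "rejected": len(records) - len(clean),
}
print(audit)  # {'ingested': 2, 'passed': 1, 'rejected': 1}
```

In a production pipeline the same pattern would typically run inside Databricks/Spark or Azure Data Factory, with rejected records routed to an audit table rather than discarded.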
Qualifications:
• Bachelor’s degree in Computer Science, Information Systems, or related field
• 3–5 years of experience in data engineering or related disciplines
• Experience with big data frameworks (Spark, Hive), ETL processes, and cloud platforms (Azure, AWS, GCP)
• Proficiency in data modeling, pipeline automation, and metadata management
• Familiarity with DevOps, CI/CD, and Agile/Scrum environments
• Relevant certifications (AWS Big Data, Google Data Engineer, Azure Data Engineer) are a plus
• Strong communication and collaboration skills
Reasons to Love This Opportunity:
• Work with cutting-edge data technologies and cloud solutions
• Opportunity to drive innovation and process automation in a dynamic team
• Support data literacy and training programs for business and technology users
• Be part of a collaborative environment focused on professional growth and impact