

Call Quest Solution
Data Engineer- W2
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position on a hybrid schedule in Orlando, FL; the contract length and pay rate are not specified. It requires 3+ years of data engineering experience and proficiency in Apache Spark, Hadoop, SQL, and cloud technologies.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Orlando, FL
-
🧠 - Skills detailed
#Business Analysis #.Net #Python #Informatica #Leadership #Data Warehouse #Data Ingestion #API (Application Programming Interface) #Computer Science #Spark (Apache Spark) #Airflow #Data Engineering #Apache Spark #Scala #Hadoop #ETL (Extract, Transform, Load) #SQL Server #Datasets #C# #AWS (Amazon Web Services) #EDW (Enterprise Data Warehouse) #SQL (Structured Query Language) #Data Pipeline #Data Science #Kafka (Apache Kafka) #Data Modeling #Data Strategy #Cloud #Java #SnowPipe #Strategy #Snowflake
Role description
Work Arrangement
Hybrid Schedule:
• Monday–Wednesday: Onsite
• Thursday–Friday: Work from Home
Location Requirement:
• Candidates must currently be local to Orlando, FL
• Relocation for this role is not permitted
Role Overview
The Data Engineer will partner with cross-functional business teams to design, build, and support scalable data pipelines, data warehouses, and analytics platforms that drive business value and improve customer satisfaction.
This role focuses on hands-on data engineering, working across cloud and on-prem environments, and supporting enterprise analytics initiatives aligned with the company’s data strategy.
Key Responsibilities
• Design, build, test, deploy, and support secure, scalable data pipelines
• Maintain and monitor enterprise data warehouses and analytics platforms
• Integrate new data sources into the central data ecosystem
• Develop reusable, automated solutions for data ingestion and transformation
• Perform root cause analysis to answer business questions and drive improvements
• Build and consume APIs to exchange data between systems
• Support cloud and on-prem data and web service solutions
• Collaborate with project managers, business analysts, and data scientists
• Partner with leadership to ensure data infrastructure meets evolving needs
• Continuously evaluate technical strategy and recommend improvements
Required Experience
• 3+ years of data engineering or related IT experience
• 1+ year of experience with:
• Apache Spark
• Hadoop
• Java/Scala
• Python
• AWS architecture
• 1+ year of experience with Microsoft .NET technologies (C#) and Windows/web application development
• 2+ years of experience in:
• Data modeling
• Database development using PL/SQL, SQL Server 2016+, and Snowflake
• 1+ year of experience building ETL/data pipelines in cloud and on-prem environments using tools such as:
• Snowpipe
• Informatica
• Airflow
• Kafka (or similar)
Technical Skills & Expertise
• Strong SQL skills and relational database development
• Experience building and optimizing ETL/ELT pipelines
• Data modeling and data warehouse management
• Working with large, disconnected, and unstructured datasets
• API development and integration
• Strong analytical and problem-solving skills
• Ability to translate business requirements into technical solutions
Education
• Bachelor’s degree in Computer Science, Information Systems, or related field (required)
• Master’s degree (preferred)
• Equivalent professional experience may be accepted in lieu of formal education
Certifications (Preferred)
• Snowflake certification
• Python certification





