Call Quest Solution

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a 6-month Data Engineer contract at a competitive pay rate. Candidates need 3+ years of data engineering experience and expertise in Apache Spark, AWS, SQL Server, and ETL tools. A Bachelor’s degree is mandatory; Snowflake and Python certifications are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
January 30, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Florida, United States
🧠 - Skills detailed
#Spark (Apache Spark) #Data Modeling #Datasets #SnowPipe #Kafka (Apache Kafka) #Informatica #Apache Spark #Computer Science #SQL Server #Automation #ETL (Extract, Transform, Load) #.Net #SQL (Structured Query Language) #Python #Airflow #Data Engineering #C# #Data Interpretation #Data Warehouse #Hadoop #Strategy #Business Analysis #Data Pipeline #AWS (Amazon Web Services) #Cloud #Java #Data Integration #Data Quality #Scripting #Data Processing #Snowflake #Scala
Role description
Role Overview
We are seeking a Data Engineer to design, build, and support scalable data solutions that drive business insights and customer satisfaction. This role partners closely with cross-functional teams to deliver data pipelines, analytics-ready datasets, and reliable data platforms aligned with enterprise architecture and analytics strategy.

Required Experience
• 3+ years of data engineering or related IT experience
• 1+ year of experience with Apache Spark, Hadoop, Java/Scala, Python, and AWS architecture
• 1+ year of experience with Microsoft .NET technologies (C#) and Windows/web application development
• 2+ years of experience in data modeling and database development using:
• PL/SQL
• SQL Server 2016+
• Snowflake
• 1+ year building data pipelines and ETL workflows in cloud and on-prem environments using:
• Snowpipe, Informatica, Airflow, Kafka, or similar tools

Key Responsibilities
• Design, build, and maintain scalable data pipelines and ETL workflows
• Develop and optimize data models and data warehouse structures
• Partner with business and analytics teams to deliver data-driven solutions
• Work with large structured and unstructured datasets to extract insights
• Build and consume APIs to move data between systems
• Perform root cause analysis to answer business questions and improve data quality
• Ensure data solutions align with enterprise architecture and analytics standards

Technical Skills
• Strong SQL and relational database expertise
• Data modeling and data warehouse development
• Cloud-based and on-prem data integration
• Scripting and automation using Python and related tools
• Experience with large-scale data processing frameworks

Education & Certifications
• Bachelor’s degree in Computer Science, Information Systems, or a related field (required)
• Master’s degree preferred (or equivalent professional experience)
• Snowflake and Python certifications preferred

Professional Competencies
• Strong communication, analytical, and problem-solving skills
• High attention to detail and organization
• Ability to work independently and collaboratively
• Solid business analysis and data interpretation skills
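The listing names a concrete stack (Spark, AWS, Snowflake, Airflow, Kafka), so a short illustration may help candidates gauge fit. The sketch below is a minimal, hypothetical PySpark batch ETL job of the kind described under Key Responsibilities; the S3 paths, the "orders" dataset, and all column names are placeholders, not details from this posting.

```python
# Minimal, hypothetical PySpark batch ETL sketch. All paths, the "orders"
# dataset, and column names are placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw CSV files landed in cloud storage (placeholder path).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: enforce types, drop incomplete rows, derive a daily rollup.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["order_id", "order_ts", "amount"])
)

daily_revenue = (
    orders.groupBy(F.to_date("order_ts").alias("order_date"))
          .agg(
              F.sum("amount").alias("revenue"),
              F.countDistinct("order_id").alias("order_count"),
          )
)

# Load: write an analytics-ready, partitioned Parquet table (placeholder path).
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/warehouse/daily_revenue/"))

spark.stop()
```

Pipelines like this are typically run on a schedule by an orchestrator; since the listing names Airflow, here is an equally hypothetical minimal DAG that could trigger the job daily. The DAG id, start date, and spark-submit path are assumptions.

```python
# Minimal, hypothetical Airflow DAG that could schedule the job above daily.
# The dag_id, start date, and spark-submit path are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="orders_etl_sketch",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_spark_etl = BashOperator(
        task_id="run_spark_etl",
        bash_command="spark-submit /opt/jobs/orders_etl.py",
    )
```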