

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Quality Engineer in Jacksonville, FL, on a contract basis. Requires 5+ years in data testing, strong SQL, AWS experience, and familiarity with Agile. Key tasks include data validation, test automation, and ensuring data integrity.
Country: United States
Currency: $ USD
Day rate: 400
Date discovered: June 10, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Jacksonville, FL
Skills detailed: #Quality Assurance #Monitoring #SQL (Structured Query Language) #Snowflake #Data Warehouse #Regression #Data Pipeline #dbt (data build tool) #Code Reviews #AWS (Amazon Web Services) #Data Reconciliation #Python #Data Transformations #Agile #Business Analysis #Cloud #Lambda (AWS Lambda) #Scrum #Automation #Pytest #Data Quality #Scripting #ETL (Extract, Transform, Load) #Data Modeling #Data Accuracy #Data Integrity #Data Engineering #S3 (Amazon Simple Storage Service)
Role description
Job Title: Data Quality Engineer / Data Testing Engineer - Snowflake & AWS
Location: Jacksonville, FL
Contract
Required Qualifications:
• Minimum 5 years of experience in data testing, ETL/ELT validation, or data quality engineering roles.
• Strong SQL skills and experience testing in a Snowflake or similar cloud data warehouse environment.
• Hands-on experience with AWS services (S3, Glue, Lambda) and data pipeline validation.
• Experience with test automation frameworks and tools (e.g., PyTest, Great Expectations, dbt tests, or custom Python frameworks); a minimal sketch follows this list.
• Solid understanding of data warehousing, data modeling, and pipeline orchestration.
• Familiarity with Agile/Scrum development processes.
• Excellent problem-solving skills, with strong attention to detail and data accuracy.
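For illustration only, here is a minimal sketch of the kind of PyTest-based validation these qualifications describe, using the Snowflake Python connector. The connection settings and the schema, table, and column names (STAGING.STG_ORDERS, ANALYTICS.ORDERS, ORDER_ID) are hypothetical placeholders, not details taken from this posting.

```python
# Hypothetical PyTest data-quality checks against Snowflake.
# All credentials, schemas, tables, and columns below are placeholders.
import os

import pytest
import snowflake.connector


@pytest.fixture(scope="session")
def sf_conn():
    # Credentials come from the environment; adjust to your own setup.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "TEST_WH"),
        database=os.environ.get("SNOWFLAKE_DATABASE", "ANALYTICS"),
    )
    yield conn
    conn.close()


def scalar(conn, sql):
    # Run a query that returns a single value and return that value.
    cur = conn.cursor()
    try:
        cur.execute(sql)
        return cur.fetchone()[0]
    finally:
        cur.close()


def test_no_null_order_ids(sf_conn):
    # Completeness: the business key should never be NULL after loading.
    nulls = scalar(
        sf_conn,
        "SELECT COUNT(*) FROM STAGING.STG_ORDERS WHERE ORDER_ID IS NULL",
    )
    assert nulls == 0


def test_row_counts_match_after_load(sf_conn):
    # Consistency: the target table should carry every staged row forward.
    src = scalar(sf_conn, "SELECT COUNT(*) FROM STAGING.STG_ORDERS")
    tgt = scalar(sf_conn, "SELECT COUNT(*) FROM ANALYTICS.ORDERS")
    assert src == tgt
```

Checks like these can run with pytest in CI after each pipeline deployment; the same pattern extends to dbt tests or Great Expectations suites where those tools are already in place.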
Role Overview:
• We are looking for a Data Quality Engineer with strong experience in testing data pipelines, validating data transformations, and ensuring data integrity across complex Snowflake and AWS-based data platforms. This role will involve creating automated data validation frameworks, developing test cases for ELT processes, and working closely with data engineers and business stakeholders to ensure high-quality, trusted data delivery.
Key Responsibilities:
• Design and implement robust data quality assurance frameworks for Snowflake-based ELT pipelines.
• Create and maintain test cases, test data, and validation scripts to ensure data accuracy, completeness, consistency, and timeliness.
• Validate data transformations, aggregations, and loading processes from various source systems to Snowflake.
• Develop and execute automated data validation using SQL and scripting (e.g., Python).
• Perform regression testing, data reconciliation, and row-level/aggregate-level validations across different stages of the pipeline (see the sketch after this list).
• Work with data engineers, business analysts, and stakeholders to define testing requirements and success criteria.
• Participate in code reviews, Agile/Scrum ceremonies, and release validation activities.
• Document all test scenarios, defects, and test results, and track them through resolution.
• Assist in monitoring production data pipelines to detect and address anomalies and data quality issues.
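As an illustration of the reconciliation and aggregate-level validation work described above, here is a small, hypothetical sketch comparing row counts and totals between two pipeline stages in Snowflake. The table and column names (RAW.ORDERS, ANALYTICS.FCT_ORDERS, AMOUNT, ORDER_AMOUNT) and the connection variables are assumptions made for the example, not details from this posting.

```python
# Hypothetical aggregate-level reconciliation between two pipeline stages
# in Snowflake. All table, column, and environment variable names are
# placeholders for illustration.
import os

import snowflake.connector

CHECKS = [
    # (description, source query, target query, absolute tolerance)
    (
        "order count",
        "SELECT COUNT(*) FROM RAW.ORDERS",
        "SELECT COUNT(*) FROM ANALYTICS.FCT_ORDERS",
        0.0,
    ),
    (
        "total order amount",
        "SELECT SUM(AMOUNT) FROM RAW.ORDERS",
        "SELECT SUM(ORDER_AMOUNT) FROM ANALYTICS.FCT_ORDERS",
        0.01,
    ),
]


def fetch_scalar(conn, sql):
    # Execute a single-value query and return the result as a float.
    cur = conn.cursor()
    try:
        cur.execute(sql)
        value = cur.fetchone()[0]
        return float(value) if value is not None else 0.0
    finally:
        cur.close()


def main():
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    failures = []
    try:
        for name, src_sql, tgt_sql, tolerance in CHECKS:
            src = fetch_scalar(conn, src_sql)
            tgt = fetch_scalar(conn, tgt_sql)
            ok = abs(src - tgt) <= tolerance
            print(f"{name}: source={src} target={tgt} -> {'OK' if ok else 'MISMATCH'}")
            if not ok:
                failures.append(name)
    finally:
        conn.close()
    # A non-zero exit code lets a scheduler or CI job flag the run as failed.
    raise SystemExit(1 if failures else 0)


if __name__ == "__main__":
    main()
```

A script along these lines can be scheduled alongside the pipeline (for example, from an orchestration or monitoring job) so that mismatches surface as failed runs rather than silent data drift.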