

Data QE
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Quality Engineer (Data QE) on a contract basis for $55.00 per hour, requiring in-person work. Candidates must have 2+ years of experience with SQL, AWS, and testing frameworks, preferably in the financial industry.
Country
United States
Currency
$ USD
Day rate
440
Date discovered
August 13, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Charlotte, NC 28202
Skills detailed
#Cloud #Stories #GIT #GitHub #Jenkins #Data Lake #Programming #Snowflake #API (Application Programming Interface) #Docker #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Java #RDS (Amazon Relational Database Service) #Jira #Lambda (AWS Lambda) #Swagger #DynamoDB #Python #Automation #AWS (Amazon Web Services) #TestNG #Observability #TeamCity #Scrum #JUnit #Data Quality #Web Services #SQL (Structured Query Language) #Batch #Debugging #SQL Queries #Agile #Kubernetes #S3 (Amazon Simple Storage Service)
Role description
Job Overview
We seek a detail-oriented and analytical Data Quality Engineer (Data QE) to join our dynamic team. The ideal candidate will play a crucial role in ensuring our data systems' integrity, accuracy, and reliability. As a Data QE, you will be responsible for developing and implementing testing strategies, conducting data validation, and collaborating with cross-functional teams to enhance our data processes. This position is essential for maintaining high-quality data standards that support business decisions.
Technical Requirements
Solid experience testing SQL procedures and the ability to understand and write complex SQL queries to perform data validation (see the sketch after this list)
Hands-on experience creating and executing test plans, test cases, and test scripts for database/ETL workflows
Minimum of 2 years of hands-on experience with AWS services such as Glue, S3, RDS, DynamoDB, and Lambda, plus exposure to Bedrock model integration, or equivalent experience with other cloud vendors
Exposure to AI tools, models, and training data sets is a plus for adoption in testing methodologies
Hands-on experience analyzing data, comparing it against mapping documents, and debugging to identify root causes
Familiarity with Docker and Kubernetes, as well as Snowflake and data lakes
Hands-on experience with testing and automation development for batch jobs, data feeds, and APIs/web services/Swagger
Experience with test frameworks such as JUnit and TestNG, and code versioning tools such as Git
Experience with CI/CD using TeamCity, Octopus, or Jenkins, and with integrating AI-based quality gates and observability into CI/CD pipelines (e.g., with tools such as GitHub Copilot)
Envision opportunities and apply AI-based low-code/no-code automation techniques to improve test coverage
Experience with Jira or similar Agile process tools
Experience with web application and API testing.
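The SQL validation and ETL test items above lend themselves to automated reconciliation checks. The following is a minimal sketch in Python using pytest and an in-memory SQLite database so it stays self-contained; the table names (stg_trades, dw_trades) and columns are illustrative assumptions, not details from this role.

# Minimal, illustrative data-validation test (hypothetical tables and columns).
# Uses an in-memory SQLite database so the sketch is self-contained; a real
# test would point at the actual source and target systems instead.
import sqlite3

import pytest


@pytest.fixture()
def conn():
    con = sqlite3.connect(":memory:")
    con.executescript(
        """
        CREATE TABLE stg_trades (trade_id INTEGER, amount REAL);
        CREATE TABLE dw_trades  (trade_id INTEGER, amount REAL);
        INSERT INTO stg_trades VALUES (1, 100.0), (2, 250.5), (3, 75.25);
        INSERT INTO dw_trades  VALUES (1, 100.0), (2, 250.5), (3, 75.25);
        """
    )
    yield con
    con.close()


def test_row_counts_match(conn):
    # Reconcile row counts between the staging (source) and warehouse (target) tables.
    src = conn.execute("SELECT COUNT(*) FROM stg_trades").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM dw_trades").fetchone()[0]
    assert src == tgt, f"Row count mismatch: source={src}, target={tgt}"


def test_amount_totals_match(conn):
    # Reconcile an aggregate (SUM of amount) to catch truncated or altered values.
    src = conn.execute("SELECT ROUND(SUM(amount), 2) FROM stg_trades").fetchone()[0]
    tgt = conn.execute("SELECT ROUND(SUM(amount), 2) FROM dw_trades").fetchone()[0]
    assert src == tgt, f"Amount total mismatch: source={src}, target={tgt}"

The same pattern extends to column-level checksums, null and duplicate checks, and comparisons driven by the mapping document.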
Preferences:
Cloud certification(s) are a plus
Hands-on Gen AI experience with test preparation, data generation, and automation solutions (a minimal sketch follows this list)
Financial Industry experience
Ability to thrive in a fast-paced environment where resourcefulness, determination, and strong problem-solving skills are necessary for success
Positive attitude and the ability to take ownership of releases, features, stories, and tasks, delivering with quality and leading Scrum teams.
Programming languages: Python and Java.
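As a rough illustration of the test-data-generation preference above, the sketch below builds randomized trade records using only Python's standard library; the schema (trade_id, trade_date, currency, amount) is a hypothetical example, and a Gen AI-assisted version would typically swap the random choices for model-generated values while keeping the same interface.

# Illustrative synthetic test-data generator (hypothetical schema).
# Pure standard library; a Gen AI-assisted version would replace the random
# choices with model-generated values while keeping the same interface.
import csv
import random
import uuid
from datetime import date, timedelta

CURRENCIES = ["USD", "EUR", "GBP"]


def generate_trades(n: int, seed: int = 42) -> list[dict]:
    """Return n randomized trade rows suitable for loading into a test table."""
    rng = random.Random(seed)  # fixed seed keeps the test data reproducible
    start = date(2025, 1, 1)
    return [
        {
            "trade_id": str(uuid.uuid4()),
            "trade_date": (start + timedelta(days=rng.randint(0, 180))).isoformat(),
            "currency": rng.choice(CURRENCIES),
            "amount": round(rng.uniform(10.0, 10_000.0), 2),
        }
        for _ in range(n)
    ]


if __name__ == "__main__":
    rows = generate_trades(100)
    with open("trades_testdata.csv", "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

The generated CSV can then be loaded into a staging table as a reproducible fixture for the validation tests sketched earlier.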
Job Type: Contract
Pay: $55.00 per hour
Application Question(s):
Experience in testing SQL
Experience in AI Tools / Models / Training Data Set
Experience in Docker and Kubernetes; Snowflake and Data Lake
Experience with test frameworks like JUnit, TestNG; code versioning tools like Git
Experience with CI/CD with TeamCity/Octopus/Jenkins
Experience with Jira or similar Agile process tools
Experience with web application and API testing.
Work Location: In person