Tuppl

Quality Engineer: BDD Implementation (Data Ingestion & Pipeline)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Quality Engineer specializing in BDD Implementation (Data Ingestion & Pipeline) in New York, NY / Fort Mill, SC (Hybrid). Contract length is unspecified, and the pay rate is unknown. Requires 12+ years in QA, strong BDD framework experience, and proficiency in Python or Java.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Fort Mill, SC
-
🧠 - Skills detailed
#Selenium WebDriver #Data Governance #Data Privacy #AWS Glue #Data Pipeline #Cloud #Data Warehouse #"ETL (Extract, Transform, Load)" #Java #SQL (Structured Query Language) #Automation #Snowflake #TestNG #Computer Science #Data Quality #Jenkins #Programming #Infrastructure as Code (IaC) #Business Analysis #Data Management #GitHub #Data Transformations #Kubernetes #Quality Assurance #Databases #Pytest #AWS (Amazon Web Services) #Data Engineering #Python #Docker #Data Integrity #Data Ingestion
Role description
Quality Engineer - BDD Implementation (Data Ingestion & Pipeline)
Location: New York, NY / Fort Mill, SC (Hybrid, 2-3 days onsite)

Job Description
We are seeking a skilled Quality Engineer with hands-on experience in Behavior-Driven Development (BDD) to ensure the quality and reliability of our data ingestion and pipeline solutions. The ideal candidate will collaborate with cross-functional teams to define, implement, and automate BDD test scenarios for complex data workflows.

Responsibilities
• Collaborate with product owners, data engineers, and business analysts to define acceptance criteria and BDD scenarios for data ingestion and pipeline processes.
• Design, develop, and maintain automated BDD test suites using frameworks such as Behave, Cucumber, pytest-bdd, and Playwright for UI validations.
• Validate data integrity, transformation logic, and end-to-end data flow from source to target systems.
• Identify, document, and track defects; work with development teams to resolve issues.
• Integrate automated tests into CI/CD pipelines for continuous quality assurance.
• Analyze test results, generate reports, and communicate findings to stakeholders.
• Contribute to test data management and environment setup for data pipeline testing.
• Stay current with industry best practices in data quality, test automation, and BDD methodologies.

Required Skills & Qualifications
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 12+ years of experience in software quality assurance or test automation, preferably in data engineering environments.
• Strong experience with BDD frameworks (Behave, Cucumber, pytest-bdd, Playwright) and Gherkin syntax.
• Experience implementing test runners such as TestNG and Selenium WebDriver to run tests and document reporting and results.
• Proficiency in Python, Java, or another programming language used for test automation.
• Solid understanding of ETL/ELT processes, data ingestion, and data pipeline architectures.
• Experience testing data transformations, data quality, and data integrity.
• Familiarity with relational databases (SQL), data warehouses, and cloud data platforms (e.g., AWS Glue, Snowflake).
• Experience with CI/CD tools (e.g., Jenkins, GitHub Actions).
• Excellent analytical, problem-solving, and communication skills.

Preferred Skills
• Knowledge of data governance and data privacy best practices.
• Exposure to containerization (Docker, Kubernetes) and infrastructure as code.
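To illustrate the kind of work described above, here is a minimal, self-contained Python sketch of a BDD-style data-pipeline check. The Gherkin scenario, the toy `transform` function, and all record fields are hypothetical examples, not part of this posting; a real suite would use pytest-bdd or Behave with a separate `.feature` file rather than plain functions.

```python
# Gherkin scenario this sketch mirrors (hypothetical):
#   Scenario: Ingested rows survive the transformation intact
#     Given a source extract of customer records
#     When the pipeline uppercases names and drops null-ID rows
#     Then the target row count equals source rows with a non-null ID
#     And every target name is uppercase

def transform(rows):
    """Toy stand-in for the pipeline's transformation logic:
    drop rows with a null id, uppercase the name field."""
    return [
        {"id": r["id"], "name": r["name"].upper()}
        for r in rows
        if r["id"] is not None
    ]

def validate(source, target):
    """BDD 'Then' steps: row-count and data-integrity assertions."""
    expected = sum(1 for r in source if r["id"] is not None)
    assert len(target) == expected, "row count mismatch"
    assert all(r["name"].isupper() for r in target), "name not uppercased"
    return True

source = [
    {"id": 1, "name": "ada"},
    {"id": None, "name": "ghost"},
    {"id": 2, "name": "grace"},
]
target = transform(source)
print(validate(source, target))  # True when both checks pass
```

In a pytest-bdd or Behave suite, each comment line of the scenario would map to a step-definition function, and the assertions in `validate` would live in the `Then` steps.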