Bayforce

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Senior Data Engineer contract position based on-site in Wilmington, DE or Buffalo, NY, running through the end of 2026 with an expected extension through December 2027. Requires advanced Python and SQL, API engineering (REST/GraphQL, OAuth2), and experience with Power BI and Power Apps. Minimum 5 years of relevant experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 18, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Wilmington, DE
-
🧠 - Skills detailed
#Monitoring #Azure #Data Quality #GitLab #Cloud #Informatica Cloud #REST (Representational State Transfer) #Informatica #API (Application Programming Interface) #Python #ETL (Extract, Transform, Load) #GraphQL #SQL (Structured Query Language) #BI (Business Intelligence) #Anomaly Detection #Data Engineering #Collibra #Jira #Libraries #Metadata #Observability #Snowflake #Microsoft Power BI #Automation
Role description
Senior Data Quality Engineer (Contract)
Location: Onsite – Wilmington, DE or Buffalo, NY (4 days/week in office)
Duration: Through end of 2026, with expected extension through Dec 2027
Note: We do not work with third parties or vendors. Direct applicants only.

About the role
We’re looking for a hands-on Senior Data Engineer to build automation-first controls that detect, diagnose, and remediate data issues at scale. You’ll design and operationalize data quality validation pipelines, integrate with observability tools, and orchestrate end-to-end workflows across cloud and on-prem environments.

What you’ll do
• Build automated data quality pipelines (profiling, rules, schema validation, anomaly detection, exception workflows) using Python and SQL
• Integrate systems via REST/GraphQL APIs (OAuth2, pagination, rate limits, retries/backoff, webhooks)
• Create reusable, versioned DQ rule libraries with unit tests, linting, and code quality standards
• Embed DQ controls into ETL/ELT processes with pre/post-load checks and SLA/SLO monitoring
• Route exceptions to tools like Jira/ServiceNow with rich metadata for audit and traceability
• Develop operational dashboards and workflow apps using Power BI and Power Apps
• Drive root-cause analysis using lineage, logs, and metrics; automate remediation where possible
• Contribute to CI/CD for DQ assets (GitLab or similar) and maintain runbooks/playbooks

Minimum qualifications
• Bachelor’s + 5+ years relevant experience (or 9+ years combined education/work with 5+ years relevant)
• Advanced Python and SQL for automation and validation pipelines
• Strong API engineering experience (REST/GraphQL, OAuth2, error handling, webhooks/event-driven workflows)
• Experience with Power BI and Power Apps
• Proven ability to partner with engineering, governance, and business teams in fast-paced environments

Nice to have
• DQ/observability platforms (Informatica Cloud DQ, Monte Carlo, Anomalo, Collibra OwlDQ)
• Data contracts, schema enforcement, governance-aligned DQ frameworks, CI/CD practices
• Cloud experience (Azure and/or Snowflake), event-driven tooling, and anomaly detection techniques
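To give candidates a concrete sense of the "reusable, versioned DQ rule libraries with unit tests" work described above, here is a minimal, hypothetical sketch in Python. All names (`RuleResult`, `not_null_rule`, `run_rules`) are illustrative, not from this posting or any specific tool: each rule is a small, independently testable function that returns a pass/fail result with enough metadata to route a failure to a ticketing tool.

```python
# Hypothetical sketch of a reusable DQ rule library: each rule is a
# small, unit-testable function returning pass/fail plus metadata
# suitable for routing exceptions to a tool like Jira/ServiceNow.
from dataclasses import dataclass
from typing import Callable

@dataclass
class RuleResult:
    rule_name: str
    passed: bool
    failed_rows: int
    detail: str

def not_null_rule(column: str) -> Callable[[list], RuleResult]:
    """Build a rule that fails if any row has a null in `column`."""
    def check(rows: list) -> RuleResult:
        bad = [r for r in rows if r.get(column) is None]
        return RuleResult(
            rule_name=f"not_null:{column}",
            passed=not bad,
            failed_rows=len(bad),
            detail=f"{len(bad)} null value(s) in '{column}'",
        )
    return check

def run_rules(rows, rules):
    """Pre/post-load check: run every rule and collect failures."""
    results = [rule(rows) for rule in rules]
    exceptions = [r for r in results if not r.passed]
    return results, exceptions

# Example: one clean row, one row with a null amount.
rows = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
results, exceptions = run_rules(rows, [not_null_rule("id"), not_null_rule("amount")])
print([(r.rule_name, r.passed) for r in results])
# [('not_null:id', True), ('not_null:amount', False)]
```

Because each rule is a plain function with a structured result, rules can be unit-tested in isolation, linted, and versioned in Git alongside the pipelines that call them, which is the pattern the role description points at.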