Jobs via Dice

Opening for Business Data Analyst :: Contract :: Wilmington, DE

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Business Data Analyst on a contract basis in Wilmington, DE, offering competitive pay. Requires a Master's degree, expertise in Python and SQL, and experience with ETL pipelines, data validation, and AWS services.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
January 29, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Delaware City, DE
🧠 - Skills detailed
#GIT #Lambda (AWS Lambda) #Libraries #SQL (Structured Query Language) #Data Manipulation #Integration Testing #Data Orchestration #Automated Testing #Pytest #Data Governance #Deployment #Data Engineering #Data Quality #Python #Airflow #AWS (Amazon Web Services) #IAM (Identity and Access Management) #SQL Queries #Data Processing #Snowflake #S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load) #Datasets #Databases #Metadata #MySQL #Data Analysis #JSON (JavaScript Object Notation) #DynamoDB #Version Control
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, SOHO Square Solutions, is seeking the following. Apply via Dice today! Note: only W2 candidates will be considered.

Types of Work:

ETL Development & Data Engineering
• Design, develop, and maintain robust ETL pipelines to aggregate and transform raw data into actionable datasets for control execution.
• Optimize complex SQL queries and Python scripts to improve data processing speed and reliability across various database environments (Postgres, Snowflake, etc.).
• Integrate disparate data sources, including unstructured JSON and relational warehouses, into a unified data layer for risk reporting (see the JSON-flattening sketch after this description).

Automated Data Validation & Scripted QA
• Build and execute automated QA test suites using Python (e.g., PyTest, Great Expectations) to validate data completeness, accuracy, and timeliness (see the PyTest sketch below).
• Develop "Data-as-Code" testing frameworks to catch anomalies or schema drift before they impact downstream control processes (see the schema-drift sketch below).
• Perform unit and integration testing on ETL codebases to ensure the logic reflects the underlying business and system rules.

Data Governance & Lineage
• Manage data repositories and CI/CD pipelines to ensure seamless, governed deployment of data assets.
• Drive adherence to data quality principles, including automated metadata capture and technical lineage mapping.
• Evaluate integration points to ensure SQL logic accurately captures the state of the systems being reported on.

General Responsibilities:
• Pipeline Optimization: Identify bottlenecks in data delivery and implement Python-based solutions to automate manual data work.
• Technical Partnership: Collaborate with Engineering and Ops to translate control requirements into technical specifications for ETL workflows.
• Strategic Problem Solving: Apply a quantitative mindset to close data gaps, leveraging Python libraries for deep-dive analysis of data anomalies.
• Communication: Clearly articulate technical risks and data discrepancies to non-technical stakeholders to drive remediation.

Basic Qualifications:
• Master's degree in a quantitative or technical field.
• Proven experience building and running ETL pipelines in a production environment.
• Expert-level proficiency in Python and SQL, specifically for data manipulation and automated testing.
• Experience with relational and non-relational databases (Postgres, MySQL, DynamoDB, Cassandra, or similar).

Preferred Qualifications:
• Experience building automated QA frameworks for data validation.
• Hands-on experience with AWS services (S3, Glue, Lambda, IAM) to support serverless data processing (see the Lambda sketch below).
• Familiarity with data orchestration tools (e.g., Airflow, Prefect) and version control (Git) (see the Airflow sketch below).
• Experience handling unstructured data (JSON) and transforming it for structured reporting.
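
To make the JSON-integration bullet concrete: a minimal sketch, assuming pandas, of flattening nested JSON into a tabular shape ready for a relational warehouse. The record fields (event_id, payload, account) are invented for illustration, not taken from the posting.

```python
import pandas as pd

# Illustrative raw records; in practice these would arrive from S3,
# an API, or a message queue. Field names here are hypothetical.
raw = [
    {"event_id": 1, "payload": {"account": {"id": "A-100", "region": "US"}, "amount": 250.0}},
    {"event_id": 2, "payload": {"account": {"id": "A-200", "region": "EU"}, "amount": 99.5}},
]

# Flatten the nested dicts into columns such as payload_account_id, so
# the result can be loaded into Postgres or Snowflake as a plain table.
df = pd.json_normalize(raw, sep="_")
print(df)
```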
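
For the automated QA bullet, a minimal PyTest sketch of the completeness, accuracy, and timeliness checks the posting describes; the columns and business rules are assumptions, and a real suite would read the pipeline's actual output rather than a stub frame.

```python
import pandas as pd
import pytest

@pytest.fixture
def transactions():
    # Stub frame standing in for the pipeline's output
    # (e.g., a query against Snowflake or Postgres).
    return pd.DataFrame(
        {
            "txn_id": [1, 2, 3],
            "amount": [250.0, 99.5, 10.0],
            "posted_at": pd.to_datetime(["2026-01-27", "2026-01-28", "2026-01-28"]),
        }
    )

def test_completeness(transactions):
    # No nulls allowed in key columns.
    assert transactions["txn_id"].notna().all()
    assert transactions["amount"].notna().all()

def test_accuracy(transactions):
    # Hypothetical business rule: amounts must be positive.
    assert (transactions["amount"] > 0).all()

def test_timeliness(transactions):
    # Data must cover the current reporting window (assumed cutoff).
    assert transactions["posted_at"].max() >= pd.Timestamp("2026-01-28")
```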
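
For the "Data-as-Code" bullet, one common pattern is pinning the expected schema in code and diffing each batch against it so drift fails the pipeline early; the schema below is an assumed example.

```python
import pandas as pd

# The column contract downstream controls depend on; this particular
# schema is a hypothetical example.
EXPECTED_SCHEMA = {
    "txn_id": "int64",
    "amount": "float64",
    "posted_at": "datetime64[ns]",
}

def check_schema(df: pd.DataFrame) -> list[str]:
    """Return drift findings; an empty list means the frame conforms."""
    findings = []
    for col, dtype in EXPECTED_SCHEMA.items():
        if col not in df.columns:
            findings.append(f"missing column: {col}")
        elif str(df[col].dtype) != dtype:
            findings.append(f"{col}: expected {dtype}, got {df[col].dtype}")
    for col in df.columns:
        if col not in EXPECTED_SCHEMA:
            findings.append(f"unexpected column: {col}")
    return findings

# Failing on any finding stops drift from reaching downstream controls:
# drift = check_schema(batch)
# if drift:
#     raise ValueError(f"schema drift detected: {drift}")
```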
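
For the serverless preference, a sketch of an AWS Lambda handler reacting to an S3 put event, assuming the new object is a JSON array of records; the bucket layout and downstream step are left abstract.

```python
import json

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Standard S3 event shape: the new object's bucket and key.
    record = event["Records"][0]["s3"]
    obj = s3.get_object(Bucket=record["bucket"]["name"], Key=record["object"]["key"])
    rows = json.loads(obj["Body"].read())
    # A real pipeline would validate and transform rows here before
    # loading them to the warehouse.
    return {"rows_received": len(rows)}
```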
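
Finally, for orchestration, a minimal Airflow DAG sketch (assuming Airflow 2.4+, which accepts the `schedule` argument) chaining extract, validate, and load; the dag_id and callables are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables standing in for real extract/validate/load logic.
def extract():
    ...

def validate():
    ...

def load():
    ...

with DAG(
    dag_id="risk_reporting_etl",  # hypothetical name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
):
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Validation gates the load so bad batches never reach reporting tables.
    t_extract >> t_validate >> t_load
```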