MARS Solutions Group

Sr Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer, a 6-month contract position based in Milwaukee, WI; the pay rate is not disclosed. Key skills include Python, SQL, AWS, ETL processes, and data analysis. A Bachelor's degree and 6 years of relevant experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 11, 2025
🕒 - Duration
6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Milwaukee, WI
-
🧠 - Skills detailed
#Apache Spark #Spark (Apache Spark) #Redshift #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #SQL Server #Data Quality #Data Engineering #Agile #Data Integration #Data Analysis #SSIS (SQL Server Integration Services) #AWS (Amazon Web Services) #Datasets #Databases #Python #Qualitative Data #Terraform #API (Application Programming Interface) #Data Integrity #Programming #Stories #AWS Glue #Documentation #Dataflow #Data Access #BI (Business Intelligence) #Virtualization #Visualization #Replication #Debugging #DevOps #Batch #Data Warehouse
Role description
Summary: We are seeking a versatile professional to join our team as a Data Engineer. This position combines the responsibilities of a Data Engineer and a Data & Reporting Analyst, requiring a well-rounded skill set and the ability to handle multiple facets of project delivery. You will work on a cross-functional team of product managers, engineers, designers, and subject matter experts in an agile methodology to deliver on outcomes, and provide quantitative and qualitative data analysis and reporting of patterns, insights, and trends to decision-makers to drive business decisions and address business questions, with the ability to produce detailed data insight reports.
Primary Duties & Responsibilities:
• Lead day-to-day work to acquire, analyze, combine, synthesize, and structure data with clear definitions and sources for analytical consumption.
• Perform Extract, Transform, Load (ETL) processes to clean, transform, and load data from various sources into data warehouses.
• Code in Python, SQL, HCL (Terraform), and Apache Spark.
• Work within an AWS environment (Redshift, AWS Glue).
• Set up and manage SQL Server Integration Services (SSIS) jobs to automate data workflows and ensure seamless data integration.
• Consult with data consumers to identify meaningful datasets for analytical consumption.
• Facilitate collaboration, communication, coordination, and planning with individuals and teams from different functions within the organization, and with different areas of expertise, to achieve common goals.
• Perform thorough analysis to document current state, proposed future state, gap analysis, and supporting cost-benefit analysis for project assignments within the endpoint space.
• Assist in producing actionable reports (Power BI, Excel) that show key performance indicators and identify areas for improvement in current operations.
• Articulate this knowledge as an expert at the department level; may be called on to provide detailed analysis at the enterprise level.
• Partner with peers, business partners, and product management to learn the business area/domain while continuing to grow knowledge across the enterprise.
• Author user stories and/or features independently, translate requirements into technical details to build and validate use cases around a product, and communicate them effectively.
• Ensure acceptance criteria are met on all user stories prior to completion, including executing testing and validation as needed.
Qualifications:
• Bachelor's degree or equivalent work experience.
• At least 6 years of professional data engineering, software engineering, debugging, analysis, testing, and software documentation experience.
• Experienced in cleansing and maintaining datasets to ensure data integrity and accuracy.
• Advanced programming skills.
• Experience with Agile methodologies and DevOps environments.
• Strong understanding of database structures, theories, principles, and practices.
• Strong understanding of data quality and data concepts.
• Strong understanding of data integration patterns and tooling, including ELT/ETL, EII, replication, event streaming, and virtualization, to support batch and real-time data needs.
• Exceptional analytical, conceptual, and problem-solving skills.
• Prior experience with research, data analysis, and analytical tools, including Excel, SQL, and visualization tooling such as Power BI.
• Adept at making data available for consumption by other report creators, ensuring data accessibility and usability for business intelligence and analytics.
• Experience connecting to databases, API calls, and Power BI dataflows is a plus.
• Certifications demonstrating mastery of role-specific competencies.