Advantage Technical

Data Scientist

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Scientist/Data Infrastructure Engineer position for a 6-month contract, paying up to $62.90/hour. Key skills include Power BI, AWS, SQL, and data pipeline optimization. Requires 6 years of experience and a Bachelor's degree in a related field.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
496
-
🗓️ - Date
May 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Sterling Heights, MI
-
🧠 - Skills detailed
#Monitoring #Replication #Storage #Data Science #Data Warehouse #Redshift #Agile #Data Pipeline #BI (Business Intelligence) #Microsoft Power BI #Programming #Oracle #Alation #Data Quality #Dataiku #ETL (Extract, Transform, Load) #Statistics #SQL (Structured Query Language) #Data Access #Databases #Data Replication #Data Security #Data Modeling #Jira #AWS (Amazon Web Services) #MagicDraw #Scala #Data Management #Security #Datasets #Data Architecture #Documentation #Matlab #Scrum #SSAS (SQL Server Analysis Services) #Data Engineering #Mathematics
Role description
Data Infrastructure Engineer

Compensation
• Pay up to $62.90/hour.

Role Overview
As a core member of the data engineering team, you will design, build, and maintain the systems that ensure accurate, timely, and secure data access across the organization. This role involves working with large datasets, complex architectures, and cross‑functional teams to transform raw information into actionable insights.

Key Responsibilities
• Data Infrastructure Design — Architect, build, and maintain databases, data warehouses, and end‑to‑end data pipelines.
• Data Collection & Transformation — Integrate data from diverse sources (databases, APIs, etc.) and develop efficient pipelines to convert raw data into usable formats, dashboards, and reports.
• Data Security & Storage — Ensure data is stored securely, efficiently, and in a manner optimized for analysis and reporting.
• Data Quality Management — Implement validation, consistency checks, and quality‑control processes to maintain reliable datasets.
• Pipeline Optimization — Improve pipeline performance for speed, scalability, and availability.
• Cross‑Functional Collaboration — Partner with analysts, data scientists, engineers, and business stakeholders to understand requirements and deliver the right data solutions.
• System Monitoring & Troubleshooting — Monitor system performance, resolve issues, and perform routine maintenance.
• Architecture Governance — Support configuration control, documentation, and change‑impact analysis for the enterprise data architecture, including identifying Authoritative Sources of Truth (ASOTs) and managing data replication into Midas and/or AWS environments.
• Power BI Data Transformation — Support data modeling and transformation workflows within Power BI.

Required Skills & Competencies
• Excellent written and verbal communication skills.
• Proven ability to work in team environments where individual contributions are critical.
• Strong analytical abilities with foundational knowledge in probability, statistics, and mathematical programming.
• Proficiency with tools and platforms such as:
  • Power BI, Visual Basic, MS Excel
  • MATLAB/Simulink, Mathcad
  • MagicDraw, Midas
  • AWS / AWS Redshift
  • SSAS, Alation, Dataiku
  • Oracle, SQL
• Familiarity with Atlassian Jira or ServiceNow.
• Ability to troubleshoot server‑related issues.
• Strong organizational skills with the ability to prioritize in a fast‑paced environment.
• Ability to independently research, analyze, and develop solutions.
• Ability to translate data into insights and actionable recommendations.
• Knowledge in areas such as:
  • Applied mathematics
  • Linear, nonlinear, and integer programming
  • Network analysis
  • Queuing theory
  • Economic evaluation
• Experience developing data models, process flows, and documenting/validating current‑state business processes.
• Ability to work effectively in cross‑functional engineering environments.
• Strong problem‑solving skills and sound decision‑making aligned with policies and procedures.
• Experience defining data‑collection requirements and enabling downstream data use.
• Understanding of technical strategies, architectures, and standards.
• High attention to detail and strong self‑management.
• Experience with structured problem‑solving methodologies.
• Knowledge of technical systems relevant to combat vehicles (preferred).
• Experience collaborating with development teams.
• Understanding of Agile metrics and reporting.
• At least 1 year of Scrum/Agile management experience.
• In‑depth understanding of engineering functions.

Experience Requirements
• 6 years required, 10 years preferred.

Education
• Bachelor's degree in Data Management, Data Science, Mathematics, Statistics, or related field.
• Master's degree preferred.