Brooksource

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract-to-hire, mainly remote, with a preference for candidates local to Metro Detroit. Key skills include strong SQL, experience with Snowflake, and hands-on Python. A Bachelor's degree and 5+ years in ELT systems are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
πŸ—“οΈ - Date
March 24, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Detroit, MI
-
🧠 - Skills detailed
#GIT #SaaS (Software as a Service) #Dataflow #Code Reviews #Automation #BigQuery #S3 (Amazon Simple Storage Service) #Visualization #Data Modeling #Java #Python #Datasets #Scala #Cloud #Agile #ETL (Extract, Transform, Load) #SQS (Simple Queue Service) #AWS (Amazon Web Services) #AWS Kinesis #Automated Testing #Data Quality #Computer Science #Data Engineering #Debugging #Monitoring #SQL (Structured Query Language) #dbt (data build tool) #Lambda (AWS Lambda) #MongoDB #Snowflake
Role description
Senior Data Engineer • 6-month contract-to-hire • Mainly remote • Preference for candidates local to Metro Detroit

Responsibilities
• Build and maintain secure, scalable ETL/ELT pipelines.
• Develop and optimize data models and workflows for analytics and reporting.
• Create monitoring and alerting to ensure data quality and pipeline reliability.
• Collaborate with engineering, architecture, product, analytics, and IT teams.
• Translate business needs into technical solutions.
• Conduct code reviews and mentor junior engineers.
• Improve processes through automation, tooling, and best practices.
• Own critical data flows and reporting systems.
• Build dashboards and datasets in Domo.
• Participate in Agile ceremonies and surface blockers early.

Requirements
• Strong SQL skills and data modeling expertise.
• Experience with Snowflake or similar cloud data platforms.
• Hands-on Python experience (or Java/Scala).
• Experience integrating with APIs and third-party data sources.
• Familiarity with AWS cloud-native services and infrastructure-as-code.
• Strong analytical, debugging, and problem-solving skills.
• Experience in Agile environments and influencing without authority.
• Strong organizational and time-management skills.

Nice to Have
• Experience with Domo datasets, dataflows, and dashboards.
• Experience building integrations with SaaS APIs like Salesforce.
• dbt or automated testing experience.

Education & Experience
• Bachelor’s degree in Computer Science or a related field.
• 5+ years developing and maintaining ELT systems.
• Experience with Snowflake/BigQuery, SQL, and pipeline development required.
• Extra consideration for AWS or Snowflake certifications.

Technologies
• Data & Visualization: Snowflake, Domo, MongoDB, Dynamo
• Cloud: AWS (Kinesis, Lambda, S3, SQS, CDK)
• Languages: Python, SQL
• Integrations: SaaS APIs (Salesforce, etc.)
• Practices: Agile, Git, CI/CD