

Brooksource
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract-to-hire, mainly remote, with a preference for candidates local to Metro Detroit. Key skills include strong SQL, experience with Snowflake, and hands-on Python. A Bachelor's degree and 5+ years in ELT systems are required.
Country
United States
Currency
$ USD
-
Day rate
600
-
Date
March 24, 2026
Duration
More than 6 months
-
Location
Remote
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Detroit, MI
-
Skills detailed
#GIT #SaaS (Software as a Service) #Dataflow #Code Reviews #Automation #BigQuery #S3 (Amazon Simple Storage Service) #Visualization #Data Modeling #Java #Python #Datasets #Scala #Cloud #Agile #"ETL (Extract, Transform, Load)" #SQS (Simple Queue Service) #AWS (Amazon Web Services) #AWS Kinesis #Automated Testing #Data Quality #Computer Science #Data Engineering #Debugging #Monitoring #SQL (Structured Query Language) #dbt (data build tool) #Lambda (AWS Lambda) #MongoDB #Snowflake
Role description
Senior Data Engineer
• 6-month contract-to-hire
• Mainly remote
• Preference for candidates local to Metro Detroit
Responsibilities
• Build and maintain secure, scalable ETL/ELT pipelines.
• Develop and optimize data models and workflows for analytics and reporting.
• Create monitoring and alerting to ensure data quality and pipeline reliability.
• Collaborate with engineering, architecture, product, analytics, and IT teams.
• Translate business needs into technical solutions.
• Conduct code reviews and mentor junior engineers.
• Improve processes through automation, tooling, and best practices.
• Own critical data flows and reporting systems.
• Build dashboards and datasets in Domo.
• Participate in Agile ceremonies and surface blockers early.
Requirements
• Strong SQL skills and data modeling expertise.
• Experience with Snowflake or similar cloud data platforms.
• Hands-on Python experience (or Java/Scala).
• Experience integrating with APIs and third-party data sources.
• Familiarity with AWS cloud-native services and infrastructure-as-code.
• Strong analytical, debugging, and problem-solving skills.
• Experience in Agile environments and influencing without authority.
• Strong organizational and time-management skills.
Nice to Have
• Experience with Domo datasets, dataflows, and dashboards.
• Experience building integrations with SaaS APIs like Salesforce.
• dbt or automated testing experience.
Education & Experience
• Bachelor's degree in Computer Science or related field.
• 5+ years developing and maintaining ELT systems.
• Experience with Snowflake/BigQuery, SQL, and pipeline development required.
• Extra consideration for AWS or Snowflake certifications.
Technologies
• Data & Visualization: Snowflake, Domo, MongoDB, Dynamo
• Cloud: AWS (Kinesis, Lambda, S3, SQS, CDK)
• Languages: Python, SQL
• Integrations: SaaS APIs (Salesforce, etc.)
• Practices: Agile, Git, CI/CD
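The posting names AWS Lambda, SQS, and Python among the core technologies. As context for candidates, a minimal sketch of the kind of event-driven ingestion step such a stack typically involves might look like the following; the event shape, field names, and quality rule here are illustrative assumptions, not details from the posting:

```python
import json

def handler(event, context=None):
    """Sketch of an SQS-triggered Lambda ingestion step (hypothetical)."""
    # Parse message bodies from the standard SQS Lambda event shape.
    rows = [json.loads(r["body"]) for r in event.get("Records", [])]
    # Apply a basic data-quality gate before loading downstream
    # ("id" as a required field is an assumption for this sketch).
    valid = [row for row in rows if "id" in row]
    return {"received": len(rows), "loaded": len(valid)}

# Local smoke test with a fabricated event payload:
sample = {"Records": [{"body": json.dumps({"id": 1, "v": "a"})},
                      {"body": json.dumps({"v": "b"})}]}
print(handler(sample))  # {'received': 2, 'loaded': 1}
```

In a real pipeline the validated rows would be written to S3 or staged into Snowflake rather than returned, but the parse-validate-load shape is the same.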






