

Senior Data Engineer - Python & SQL
Featured Role | Apply direct with Data Freelance Hub
This role is a 12-month contract for a Senior Data Engineer in Minneapolis, MN, paying $60.00 - $80.00 per hour. It requires strong SQL and Python skills, experience with structured and unstructured data, and familiarity with AWS; a financial services background is preferred.
Country: United States
Currency: $ USD
Day rate: 640
Date discovered: September 4, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Golden Valley, MN 55426
Skills detailed: #Visualization #Data Quality #Version Control #Python #SQL (Structured Query Language) #AWS (Amazon Web Services) #Redshift #GIT #Scala #S3 (Amazon Simple Storage Service) #Snowflake #Data Engineering #Code Reviews #dbt (data build tool) #Airflow #Cloud #Documentation #ETL (Extract, Transform, Load) #Looker #Data Modeling #Data Pipeline #Lambda (AWS Lambda)
Role description
This is a 12-month contract role with potential for full-time conversion. It is a hybrid position (3-4 days in office), with the work location in Minneapolis, MN. Applicants must be authorized to work in the U.S. without the need for sponsorship.

We are seeking a Senior Data Engineer to join a fast-paced, growing data engineering team within a forward-thinking investment firm. This role is ideal for a technically skilled data professional with strong experience in SQL and Python and a passion for building data pipelines, modeling data, and delivering modern data solutions. You must be comfortable working with both structured and unstructured data and have a strong interest in cloud technologies. While experience with AWS, dbt, and Snowflake is a plus, the team is open to training the right candidate in these tools.
Primary Duties
Build and maintain scalable data pipelines using SQL and Python (see the sketch after this list).
Work with both structured and unstructured data to support analytical and operational use cases.
Support cloud data engineering projects, especially those leveraging AWS (training available).
Collaborate on data modeling and transformation efforts across teams.
Assist with documentation, QA, and code reviews to ensure data quality and system reliability.
Communicate effectively with engineers, analysts, and business partners to deliver timely data solutions.
Continuously identify opportunities to optimize and improve data workflows.
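For candidates gauging the expected level, here is a minimal, illustrative sketch of the kind of SQL-plus-Python pipeline step the duties describe. It uses Python's standard-library sqlite3 module so it runs self-contained; the table names and schema are hypothetical and are not taken from the posting.

```python
# A minimal SQL + Python pipeline step: transform in SQL, load via Python.
# The raw_trades/positions schema below is a hypothetical example.
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory DB keeps the sketch self-contained
conn.executescript(
    """
    CREATE TABLE raw_trades (symbol TEXT, qty INTEGER, price REAL);
    INSERT INTO raw_trades VALUES ('AAPL', 10, 170.0), ('AAPL', -4, 172.5);
    CREATE TABLE positions (symbol TEXT PRIMARY KEY, net_qty INTEGER);
    """
)

# Transform in SQL: aggregate raw trades into net positions per symbol.
rows = conn.execute(
    "SELECT symbol, SUM(qty) FROM raw_trades GROUP BY symbol"
).fetchall()

# Load via Python: upsert the aggregated rows into the target table.
conn.executemany(
    "INSERT OR REPLACE INTO positions (symbol, net_qty) VALUES (?, ?)", rows
)
conn.commit()
print(conn.execute("SELECT * FROM positions").fetchall())  # [('AAPL', 6)]
```

In practice the same pattern applies against a warehouse such as Snowflake or Redshift; sqlite3 is used here only so the example is runnable as-is.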
Required Qualifications
Strong experience with SQL and Python.
Ability to work confidently with structured and unstructured data.
Strong analytical thinking and problem-solving skills.
Excellent written and verbal communication skills.
Self-starter who is eager to learn and grow in a fast-moving environment.
Nice to Have Skills
Experience with dbt (data build tool).
Familiarity with Snowflake or similar cloud data platforms.
Exposure to AWS services (e.g., S3, Redshift, Glue, Lambda).
Experience with data visualization tools such as Looker (training provided).
Knowledge of version control (Git), CI/CD, and workflow orchestration tools like Airflow (see the DAG sketch after this list).
Background in financial services or data-intensive business environments.
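For a sense of the orchestration tooling mentioned above, the following is a minimal Airflow DAG sketch, assuming Airflow 2.4+. The DAG id, schedule, and task bodies are hypothetical placeholders, not details from the posting.

```python
# A minimal, illustrative Airflow DAG (assumes Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # Placeholder: pull raw records from a source system (hypothetical).
    print("extracting raw records")


def load() -> None:
    # Placeholder: write transformed records to the warehouse (hypothetical).
    print("loading transformed records")


with DAG(
    dag_id="daily_positions_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```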
Job Types: Full-time, Contract
Pay: $60.00 - $80.00 per hour
Work Location: Hybrid remote in Golden Valley, MN 55426