

Jackson & Correia LLC
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a 6-month contract for a Data Engineer, with the pay rate listed as "competitive". It is 100% remote, with quarterly travel to Massachusetts required. Key skills include AWS, SQL, data modeling, and experience with Snowflake or Redshift.
Country: United States
Currency: $ USD
Day rate: 960
Date: February 19, 2026
Duration: Unknown
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: Massachusetts, United States
Skills detailed: #Data Science #Scala #Data Processing #IAM (Identity and Access Management) #Java #Data Engineering #Data Modeling #S3 (Amazon Simple Storage Service) #AWS Glue #Python #Security #Data Pipeline #Airflow #Programming #Scripting #PySpark #Data Integrity #Databases #Data Security #Redshift #ETL (Extract, Transform, Load) #Spark (Apache Spark) #AWS (Amazon Web Services) #Snowflake #Leadership #dbt (data build tool) #Datasets #Data Orchestration #SQL (Structured Query Language) #BI (Business Intelligence) #SQL Queries
Role description
A client of Jackson & Correia, headquartered in Massachusetts, is seeking a Data Solutions Engineer to build resilient and scalable data solutions across multiple functional domains, including Advancement, HR, and Finance.
This role is 100% remote, with travel to Massachusetts once per quarter, and will focus on implementing best practices in data engineering, ensuring data usability, and delivering high-quality, analysis-ready datasets that support enterprise analytics and reporting.
Position Overview
The Data Solutions Engineer will design, develop, and maintain scalable data models and solutions that meet business and analytical needs. Acting as a bridge between business stakeholders and the data platform team, this role ensures the usability, accuracy, and integrity of enterprise data.
Principal Duties & Responsibilities
• Own the design, development, and maintenance of scalable data models, dashboards, reports, and analyses
• Interface with Data Platform teams to extract, transform, and load data from diverse sources using AWS and internal tools
• Build, optimize, and deliver high-quality datasets to support administrative data needs
• Lead continuous improvement initiatives to automate or simplify reporting and analysis processes
• Translate business problem statements into actionable data model requirements
• Apply analytical and statistical rigor to answer business questions and drive decisions
• Write efficient, complex SQL queries; optimize outputs for analysis readiness
• Adopt, document, and enforce best practices in data engineering, including data discovery, naming conventions, operational excellence, and data security
• Troubleshoot operational quality issues in data processing and orchestration code (SQL/Python)
• Review and audit existing jobs, queries, and processes; recommend improvements to enhance accuracy and simplicity
• Collaborate with stakeholders and provide guidance on data usability while Data Platform Engineers focus on infrastructure reliability
• Perform additional duties as assigned
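As an illustrative sketch of the "analysis-ready datasets" work described above (not part of the posting itself), the following uses Python's built-in sqlite3 module; the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical raw table standing in for a source system feed
# (names are illustrative, not from the posting).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE donations (donor_id INTEGER, amount REAL, fiscal_year INTEGER);
    INSERT INTO donations VALUES
        (1, 100.0, 2025), (1, 250.0, 2026), (2, 75.0, 2026);
""")

# A typical transform: aggregate raw rows into a clean, analysis-ready
# summary that downstream reports can consume directly.
rows = conn.execute("""
    SELECT donor_id,
           fiscal_year,
           ROUND(SUM(amount), 2) AS total_given,
           COUNT(*)              AS gift_count
    FROM donations
    GROUP BY donor_id, fiscal_year
    ORDER BY donor_id, fiscal_year
""").fetchall()

for row in rows:
    print(row)  # (donor_id, fiscal_year, total_given, gift_count)
```

In a production setting the same pattern would run against Snowflake or Redshift via their connectors rather than sqlite3; the SQL shape (aggregate, name columns clearly, order deterministically) is the point of the sketch.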
Required Qualifications
• Bachelor's degree required
• 10+ years of data engineering experience, including programming, data modeling, warehousing, and building data pipelines
• Strong experience with modern databases such as Snowflake and Redshift
• Experience with data orchestration tools such as Airflow or Dagster
• Proficiency with AWS technologies, including S3, AWS Glue, dbt, IAM roles and permissions
• Advanced SQL skills with experience writing highly optimized queries on large datasets
• Proficiency in scripting languages such as Python, PySpark, Java, or Scala
• Proven ability to ensure data integrity and accuracy
• Strong communication skills with technical teams, business users, and senior management
• Ability to build trusted relationships and collaborate across diverse teams
• Comfortable working in fast-paced, complex, and ambiguous environments
Preferred Qualifications
• Experience providing technical leadership and mentoring other engineers on best practices in data engineering
• Familiarity with BI tools and data science models
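The "ensure data integrity and accuracy" requirement can be sketched as a small Python validation pass; the record shape, field names, and rules here are hypothetical examples, not from the posting:

```python
def validate_records(records):
    """Return a list of human-readable problems found in the dataset.

    Illustrative checks only: duplicate primary keys and missing or
    negative amounts. Real pipelines would add schema, type, and
    referential checks.
    """
    problems = []
    seen_ids = set()
    for i, rec in enumerate(records):
        if rec.get("id") in seen_ids:
            problems.append(f"row {i}: duplicate id {rec['id']}")
        seen_ids.add(rec.get("id"))
        if rec.get("amount") is None or rec["amount"] < 0:
            problems.append(f"row {i}: missing or negative amount")
    return problems

records = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": -5.0},   # negative amount: should be flagged
    {"id": 1, "amount": 20.0},   # duplicate id: should be flagged
]
print(validate_records(records))
```

Checks like these are typically wired into an orchestration tool (Airflow or Dagster, per the qualifications above) so that bad loads fail loudly instead of silently reaching reports.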






