

Senior Data Engineer (Contract)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Contract) with a pay rate of $80-$90 per hour, 100% remote, working core EST hours. Requires 6+ years of experience, expertise in SQL, AWS services, and Python; knowledge of BI tools and serverless applications is essential.
Country: United States
Currency: $ USD
Day rate: $720
Date discovered: August 14, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Agile #OpenSearch #Lambda (AWS Lambda) #AWS (Amazon Web Services) #Automation #Security #Data Extraction #Data Engineering #Redshift #Athena #ETL (Extract, Transform, Load) #Python #DevOps #React #JavaScript #Tableau #Data Ingestion #Microservices #AI (Artificial Intelligence) #DynamoDB #SageMaker #Looker #ML (Machine Learning) #SNS (Simple Notification Service) #Cloud #SQS (Simple Queue Service) #Strategy #S3 (Amazon Simple Storage Service) #SQL Queries #BI (Business Intelligence) #SQL (Structured Query Language) #Scrum
Role description
Position: Senior Data Engineer (2 roles) (Contract)
Mode: Long term contract
Location: 100% Remote US based, working core EST hours
Tech Stack Grid
NO Awareness of technology (0)
LITTLE Awareness - read/heard of technology (1)
EXPOSURE to technology in environment (2)
SOME development in technology (3)
Very COMFORTABLE developing in technology (4)
EXPERTISE in technology, i.e., could teach a class (5)
RedShift/SQL, Level - 4
BI Tool, Level - 3
AWS - Dynamo, Level - 1
Gen AI, Level - 3
AWS Step Function, Level - 3
AWS - Lambda, Level - 3
AWS - S3, Level - 3
AWS - SNS/SQS, Level - 3
AWS - CloudFormation, Level - 3
Python, Level - 3
Node.js Development, Level - 1
JavaScript (ES6), Level - 1
About the Team
The organization offers a dynamic suite of tools and programs designed to streamline and expedite the Higher Education recruitment and admission process. Our Higher Ed Elevate team is a passionate, close-knit group of professionals responsibly experimenting with AI to make informed decisions that give institutions unparalleled reach, connecting them with millions of high school students each year. We pride ourselves on being innovative in our industry by continuously improving our products and services. Our team consists of 3 architects and 3+ engineers, a blend of full-time staff and contractors, collaborating across various time zones to deliver multiple services and tools.
About the Opportunity
As a Data Engineer on our Higher Ed Elevate team, you possess an unwavering passion for excellence in software engineering and a drive to turn that passion into consistently excellent product development. You are customer-obsessed, have strong analytical and communication skills, enjoy working with native AWS services, and thrive on solving challenging business problems. You will partner with Product Owners, Architects, Developers, and other Software Engineers to design, define, and implement functional features and to conduct system-level validation and verification during product development. You have an aptitude for learning and will apply it to advanced software engineering practices in data analytics within our AWS microservices environment, solving complex problems with a mission to release high-quality software that is resilient and performant. As a data engineer, you will work on both data and analytics report development and on quality engineering, testing, and automation.
In this role, you will
Design & Implementation (70%)
• Incorporate cloud technologies in new application development, including microservices, SQL, and AWS services such as Lambda, S3, SQS, SNS, DynamoDB, QuickSight, Amazon Redshift, OpenSearch, and SageMaker AI (a minimal ingestion sketch follows this list)
• Create and maintain automated functional and system tests to replicate complex real-world scenarios
• Develop code and our test automation suite to support security features and technical scoping, ensuring that features deliver the expected functionality with high quality
• Ensure continuous integration is performed on the application source code, and continually improve the team's continuous integration practices to keep code quality high
• Support and coordinate with other Engineers, Architects, and teams such as User Experience and Infrastructure
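To make the event-driven, serverless data flow concrete, here is a minimal sketch (not part of the posting) of the kind of ingestion Lambda described above: an S3 ObjectCreated notification triggers a Python handler that loads the new object into Redshift through the Redshift Data API. All resource names (workgroup, database, table, IAM role) are placeholder assumptions.

```python
# Hypothetical sketch only: an event-driven ingestion Lambda. Resource names
# (workgroup, database, table, IAM role) are placeholders, not from this posting.
import json
import os
import urllib.parse

import boto3

redshift_data = boto3.client("redshift-data")

WORKGROUP = os.environ["REDSHIFT_WORKGROUP"]    # e.g. a Redshift Serverless workgroup
DATABASE = os.environ["REDSHIFT_DATABASE"]      # e.g. "analytics"
TARGET_TABLE = os.environ["TARGET_TABLE"]       # e.g. "staging.applications"
COPY_ROLE_ARN = os.environ["COPY_ROLE_ARN"]     # IAM role Redshift assumes to read S3


def handler(event, context):
    """Triggered by S3 ObjectCreated notifications; issues one COPY per new object."""
    statement_ids = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        copy_sql = (
            f"COPY {TARGET_TABLE} "
            f"FROM 's3://{bucket}/{key}' "
            f"IAM_ROLE '{COPY_ROLE_ARN}' "
            "FORMAT AS CSV IGNOREHEADER 1;"
        )

        # The Data API is asynchronous; keep the statement id for status polling or alerting.
        response = redshift_data.execute_statement(
            WorkgroupName=WORKGROUP,
            Database=DATABASE,
            Sql=copy_sql,
        )
        statement_ids.append(response["Id"])

    return {"statusCode": 200, "body": json.dumps({"statements": statement_ids})}
```

In a setup like this, Step Functions or CloudFormation (both listed in the stack grid above) would typically handle orchestration, retries, and deployment around the handler.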
Strategy & Communication (20%)
• Implement cloud-first architectural solutions and best practices
• Participate in Agile Scrum ceremonies (Sprint Planning, Grooming, Daily Scrum, Demo) and contribute to team deliverables
• Participate in peer reviews of software engineering artifacts
Team Coordination (10%)
• Mentor team members by designing and developing training materials to communicate the current and future product architecture
• Mentor technical staff by providing feedback on code and other design artifacts
About You
You have:
• Ideally 6+ years of software development experience in a production environment
• Expertise writing and optimizing complex SQL queries for data extraction, transformation, and reporting (e.g., SELECT, JOIN, GROUP BY, HAVING, window functions); see the illustrative query after this list
• Expertise integrating Redshift Serverless with AWS services such as S3, Glue, Lambda, Athena, DynamoDB, and Kinesis, and with BI tools like Tableau, QuickSight, or Looker for real-time analytics and dashboards
• Expertise building serverless ETL pipelines with Lambda to automate data ingestion and transformation into Redshift
• Experience with Python required; React and Node.js nice to have
• Experience building event-driven, cloud-based serverless applications and deploying to AWS
• Technical knowledge of cloud computing, DevOps, and microservices
• Excellent communication skills with the ability to present ideas in business-friendly and user-friendly language
• Demonstrated ability to develop and maintain good customer working relationships
• Exceptional analytical, conceptual, and problem-solving abilities
• Proficiency in designing, building, and deploying machine learning models using Amazon SageMaker
• The ability to prioritize and execute tasks in a high-pressure environment
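As a purely illustrative example of the SQL and Redshift integration skills above, the sketch below submits a window-function query to Redshift Serverless through the boto3 Data API and prints the result rows; the workgroup, database, schema, and column names are invented for illustration.

```python
# Hypothetical sketch only: run a window-function query on Redshift Serverless
# via the Data API. Workgroup, database, schema, and column names are invented.
import time

import boto3

client = boto3.client("redshift-data")

# Rank application-source channels per institution and week, with week-over-week change.
SQL = """
WITH weekly AS (
    SELECT institution_id,
           source_channel,
           DATE_TRUNC('week', submitted_at) AS week_start,
           COUNT(*) AS applications
    FROM staging.applications
    GROUP BY 1, 2, 3
)
SELECT institution_id,
       source_channel,
       week_start,
       applications,
       RANK() OVER (PARTITION BY institution_id, week_start
                    ORDER BY applications DESC) AS channel_rank,
       applications - LAG(applications) OVER (PARTITION BY institution_id, source_channel
                                              ORDER BY week_start) AS wow_change
FROM weekly
ORDER BY 1, 3, 5;
"""


def run_query(workgroup="example-workgroup", database="analytics"):
    """Submit the statement, poll until it completes, and return the raw records."""
    stmt = client.execute_statement(WorkgroupName=workgroup, Database=database, Sql=SQL)
    while True:
        desc = client.describe_statement(Id=stmt["Id"])
        if desc["Status"] in ("FINISHED", "FAILED", "ABORTED"):
            break
        time.sleep(1)
    if desc["Status"] != "FINISHED":
        raise RuntimeError(f"Query {desc['Status']}: {desc.get('Error', 'no error detail')}")
    return client.get_statement_result(Id=stmt["Id"])["Records"]


if __name__ == "__main__":
    for row in run_query():
        print([list(col.values())[0] for col in row])
```

In practice, a query like this would more often feed a QuickSight, Tableau, or Looker dashboard than a standalone script.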
Pay Rate Range: $80-$90 per hour (depending on experience)