

Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 8–10 years of experience, focusing on Snowflake, AWS, and SQL. It's a 6-month onsite contract in Boston, MA, requiring strong skills in data pipelines, Python, and financial industry experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 13, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Airflow #Data Engineering #SQL (Structured Query Language) #Data Pipeline #Azure Cloud #Data Ingestion #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Cloud #Azure #Databricks #Agile #Python #PySpark #dbt (data build tool) #Spark (Apache Spark) #Snowflake
Role description
Senior Data Engineer – Snowflake / AWS
Location: Boston, MA – Onsite Only (Local Candidates Only)
Employment Type: Contract – 6 Months
Experience Required: 8–10 years
Must Have Skills
• Snowflake
• AWS
• SQL
Detailed Job Description
• 9+ years of total IT development experience, including end-to-end data engineering development projects.
• 6 years of experience in Snowflake data engineering development.
• Hands-on experience designing and building data pipelines on cloud-based infrastructure, with extensive work in AWS, Snowflake, dbt, and Airflow.
• Strong hands-on Python and PySpark skills are required for data transformation.
• End-to-end build experience spanning data ingestion, transformation, and extract generation in AWS/Azure cloud services.
• Strong exposure to the full range of AWS/Azure services.
• Solid hands-on Databricks data engineering experience is required.
• Optimize and tune performance, including query optimization, with experience in scaling strategies.
• Address data issues, perform root cause analysis, and provide the required technical solutions.
• Experience in the financial industry is an added advantage.
• Experience working with agile methodologies.
Minimum years of experience: 8–10 years
Certifications Needed: No
Top 3 Responsibilities
• Address data issues, perform root cause analysis, and provide the required technical solutions.
• Experience in the financial industry is an added advantage.
• Experience working with agile methodologies.
Work Location Policy
This is a 100% onsite role. Only candidates who are local to Boston, MA will be considered. Profiles from outside Boston, MA will not be accepted, even if the candidate is willing to relocate.