

Holistic Partners, Inc.
Data Engineer (Python & SQL)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Python & SQL) in Atlanta, GA & Boston, MA, for 6 months at a competitive pay rate. Key skills include Python, SQL, AI, Snowflake, and AWS Airflow. 5+ years of data engineering experience required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 28, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Data Processing #dbt (data build tool) #ML (Machine Learning) #Azure #Data Lake #AWS (Amazon Web Services) #Data Engineering #Data Pipeline #Automation #Cloud #Data Integration #Apache Airflow #Data Modeling #Kubernetes #Apache NiFi #Data Governance #NiFi (Apache NiFi) #Agile #DevOps #Kafka (Apache Kafka) #GCP (Google Cloud Platform) #Airflow #Snowflake #Python #Data Science #SQL (Structured Query Language) #NoSQL #ETL (Extract, Transform, Load) #Hadoop #AI (Artificial Intelligence) #Monitoring #Programming #Security #Spark (Apache Spark) #Compliance #Big Data #Java #Scala
Role description
Job Title: Data Engineer (Python & SQL)
Location: Atlanta, GA & Boston, MA (Onsite)
Duration: 6 Months
Interview Process: Video
Project Overview
This role will support a key data engineering workstream, focused on building and maintaining enterprise data pipelines and transformations in a modern cloud environment. The individual should be able to work independently, ramp up quickly, and operate comfortably in a client-facing setting.
Top Required (MUST – High Proficiency)
• Python
• SQL
Required Skills / Experience
• AI / Automation
• Snowflake
• dbt
• AWS Airflow
Role: Senior Associate Technology L2 (Data Platforms)
As a Senior Associate Technology L2 specializing in Data Platforms, you will play a key role in designing, developing, and optimizing data solutions that enable scalable, high-performance data processing. You will work with cutting-edge technologies to build robust data pipelines, data lakes, and analytics platforms that drive business insights and innovation.
Your Impact
• Design, develop, and maintain scalable data platforms that support enterprise data needs.
• Build and optimize data pipelines, ETL processes, and data integration workflows.
• Collaborate with data scientists, analysts, and business stakeholders to ensure data solutions meet business requirements.
• Implement best practices in data governance, security, and compliance.
• Work with cloud-based data platforms such as AWS, Azure, or GCP.
• Utilize big data technologies such as Hadoop, Spark, Kafka, and Snowflake.
• Automate data processing and monitoring using tools like Airflow, Kubernetes, or Apache NiFi.
• Troubleshoot and optimize data performance, ensuring high availability and reliability.
• Stay updated on emerging trends in data engineering and contribute to innovation within the team.
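The pipeline-building and automation responsibilities above can be sketched as a minimal extract-transform-load step in plain Python. This is an illustrative sketch only: `sqlite3` stands in for a warehouse such as Snowflake, and all table, column, and record names are invented for the example.

```python
import sqlite3

# Illustrative source records, as an extract step might return them.
raw_orders = [
    {"order_id": 1, "amount": "120.50", "region": "east"},
    {"order_id": 2, "amount": "80.00", "region": "WEST"},
    {"order_id": 3, "amount": None, "region": "east"},  # incomplete record
]

def transform(rows):
    """Drop incomplete rows, normalize types and casing."""
    clean = []
    for r in rows:
        if r["amount"] is None:
            continue  # basic data-quality filter
        clean.append((r["order_id"], float(r["amount"]), r["region"].lower()))
    return clean

def load(conn, rows):
    """Load transformed rows into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(raw_orders))

# Downstream consumers query the loaded table with SQL.
totals = dict(
    conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region").fetchall()
)
print(totals)
```

In a production setting, steps like these would typically be wrapped as tasks in an orchestrator such as Airflow, with monitoring and retries handled by the scheduler rather than inline code.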
Skills & Experience
• 5+ years of experience in data engineering, data platforms, or related fields.
• Strong expertise in SQL, NoSQL, and data modeling.
• Hands-on experience with big data technologies such as Hadoop, Spark, Kafka, or Snowflake.
• Proficiency in cloud-based data solutions (AWS, Azure, GCP).
• Experience with data pipeline orchestration tools like Apache Airflow, NiFi, or Kubernetes.
• Strong programming skills in Python, Java, Scala, or similar languages.
• Knowledge of data governance, security, and compliance best practices.
• Ability to work in an Agile environment and collaborate with cross-functional teams.
• Strong problem-solving and analytical skills with a focus on data-driven decision-making.
Set Yourself Apart With
• Experience with real-time data processing and streaming analytics.
• Knowledge of machine learning pipelines and data science workflows.
• Certifications in cloud platforms (AWS Certified Data Analytics – Specialty, Azure Data Engineer, GCP Professional Data Engineer).
• Exposure to DevOps practices for data engineering.