

Senior Data Engineer MarTech/CDP | W2 Only | Local
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer specializing in MarTech and CDP, offering a W2 contract for local candidates. Requires 12+ years of experience, expertise in Snowflake, SQL, Python, real-time data processing, and relevant certifications.
Country
United States
Currency
$ USD
-
Day rate
-
Date discovered
August 30, 2025
Project duration
Unknown
-
Location type
Unknown
-
Contract type
W2 Contractor
-
Security clearance
Unknown
-
Location detailed
Dallas, TX
-
Skills detailed
#Programming #Python #Data Integration #Batch #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Apache Airflow #Data Quality #NoSQL #Delta Lake #Azure #Databases #Computer Science #Kafka (Apache Kafka) #Cloud #Snowflake #MongoDB #Data Pipeline #Big Data #Airflow #Azure Event Hubs #Scala #Redis #Automation #Security #Data Lake #DynamoDB #Data Engineering #Data Warehouse #Spark (Apache Spark) #Azure cloud #Data Processing #Data Governance
Role description
About the Role
We are looking for an experienced Senior Data Engineer with a strong background in data engineering and specialized expertise in MarTech, Customer Data Platforms (CDP), and data warehousing. The ideal candidate will design, build, and optimize data solutions to enable marketing and customer engagement strategies. This role requires deep technical expertise in Snowflake, cloud platforms, and real-time/batch data pipelines, along with strong collaboration skills to work with cross-functional teams.
Key Responsibilities
• Architect, develop, and maintain large-scale data pipelines (batch and streaming) to support MarTech and CDP use cases.
• Design and manage data warehouse solutions on Snowflake, ensuring high performance and scalability.
• Implement ETL/ELT workflows, including orchestration with Apache Airflow and real-time streaming with Kafka, Kinesis, or Pub/Sub.
• Integrate diverse data sources (structured, semi-structured, unstructured) into Snowflake and cloud data lakes.
• Partner with marketing, analytics, and product teams to deliver data-driven solutions that enable personalization and customer engagement.
• Ensure data quality, governance, and security best practices are applied across data engineering processes.
• Optimize data pipelines and workflows for efficiency and scalability.
• Stay updated on emerging technologies and recommend innovative approaches to enhance the data ecosystem.
Job Requirements
• 12+ years of experience in data engineering, with at least 3–5 years focused on MarTech, CDP, and data warehousing. (For mid-senior roles: 8+ years with 3–5 years specialization)
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• Hands-on experience with the Snowflake cloud data platform for ingestion, transformation, and orchestration.
• Strong proficiency in SQL and Python (or similar programming languages) for data processing and automation.
• Expertise with ETL/ELT tools and data pipeline development.
• Proficiency in real-time data processing frameworks (Spark Streaming, Flink, Kafka Streams).
• Experience with data lakes (Delta Lake, Iceberg) and cloud data warehouses.
• Familiarity with NoSQL databases (MongoDB, Cassandra) and key-value stores (Redis, DynamoDB).
• Experience with the Azure cloud platform, especially Azure Event Hubs and Snowflake integration.
• Strong understanding of marketing technologies, CDPs, and data integration challenges.
• Knowledge of data governance, data quality, and security practices.
• Excellent problem-solving, optimization, and performance-tuning skills.
• Strong collaboration and communication skills to work with architects, analysts, and business stakeholders.
• Relevant certifications (Snowflake, Azure, Big Data) are a plus.