

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience, focusing on data pipelines and cloud infrastructure (AWS, Azure, GCP). Contract length is over 6 months, and strong skills in SQL, Python, and big data technologies are required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 8, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: United States
Skills detailed
#Data Quality #Data Privacy #dbt (data build tool) #ETL (Extract, Transform, Load) #Security #Scala #Data Modeling #Big Data #Cloud #Kafka (Apache Kafka) #SQL (Structured Query Language) #Python #Airflow #Data Pipeline #GCP (Google Cloud Platform) #Azure #Data Engineering #Data Science #Spark (Apache Spark) #Leadership #GDPR (General Data Protection Regulation) #AWS (Amazon Web Services) #Data Processing
Role description
Job Title: Senior Data Engineer (Remote)
Job Type: W2 Full-time
Location: Remote
Job Summary:
We are looking for a Senior Data Engineer with over 10 years of experience to design, build, and optimize data pipelines and architectures. This role requires deep technical expertise, strong problem-solving skills, and the ability to drive strategic data initiatives in a remote, fast-paced environment.
Key Responsibilities:
• Design and implement scalable, secure data pipelines and ETL/ELT processes
• Architect and maintain cloud-based data infrastructure (AWS, Azure, or Google Cloud Platform)
• Optimize data systems for performance and reliability
• Collaborate with data scientists, analysts, and engineers across teams
• Ensure data quality, governance, and security best practices
• Mentor junior engineers and lead technical initiatives
Requirements:
• 10+ years of experience in data engineering or related roles
• Expert in SQL, Python, and distributed data systems (Spark, Kafka, etc.)
• Deep knowledge of data modeling, warehousing, and big data technologies
• Strong experience with cloud platforms and CI/CD pipelines
• Excellent communication and leadership skills
Preferred:
• Experience with real-time data processing
• Knowledge of data privacy regulations (e.g., GDPR, HIPAA)
• Familiarity with orchestration tools (Airflow, dbt, etc.; a minimal pipeline sketch follows below)
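
As a rough illustration of the pipeline and orchestration work named above, and not part of the posting's requirements, the sketch below shows a minimal daily ETL flow in Apache Airflow with Python. It assumes Airflow 2.4+; the DAG id, task names, and the extract/transform/load callables are hypothetical placeholders rather than a prescribed implementation.

```python
# Minimal sketch of a daily ETL pipeline in Apache Airflow (assumes Airflow 2.4+).
# The DAG id, task names, and callables below are illustrative placeholders only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system (API, database, files).
    return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]


def transform(**context):
    # Placeholder: clean and reshape the extracted records.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{"id": r["id"], "value_doubled": r["value"] * 2} for r in rows]


def load(**context):
    # Placeholder: write transformed rows to a warehouse table.
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows into the warehouse")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract, then transform, then load, once per day.
    extract_task >> transform_task >> load_task
```

In practice, the extract, transform, and load steps in a role like this would typically delegate to Spark jobs, dbt models, or warehouse load utilities rather than plain Python functions; the skeleton above only shows the orchestration shape.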