

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience, focusing on SQL, Python, and cloud platforms (AWS, Azure, GCP). Requires expertise in data modeling, ETL frameworks, and streaming technologies. Contract length and pay rate unspecified.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 19, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
California, United States
Skills detailed
#Snowflake #Leadership #Spark (Apache Spark) #BigQuery #Data Engineering #GCP (Google Cloud Platform) #Airflow #Azure #AWS (Amazon Web Services) #Redshift #dbt (data build tool) #Scala #Databricks #Computer Science #Cloud #Data Modeling #Python #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Synapse
Role description
A technology services client of ours is looking for a Senior Data Engineer for their ongoing projects.
Below are the additional details of this role:
Required Skills:
• Bachelor's or Master's in Computer Science, Information Systems, or a related field.
β’ 10+ years of experience in Data Engineering or related fields.
β’ Strong expertise in SQL, Python, and/or Scala.
• Proven experience with cloud data platforms (AWS, Azure, or GCP), including services such as Redshift, Snowflake, BigQuery, Databricks, or Synapse.
β’ Deep understanding of data modeling, warehousing concepts, and distributed systems.
β’ Experience with streaming technologies (Kafka, Kinesis, Spark Streaming, Flink, etc.).
• Strong knowledge of ETL/ELT frameworks and orchestration tools (Airflow, dbt, etc.).
β’ Hands-on experience with containerization and CI/CD pipelines.
β’ Excellent problem-solving, communication, and leadership skills.
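For candidates unfamiliar with the ETL/ELT pattern named in the requirements, a minimal sketch in Python may help. All data and function names here are hypothetical illustrations, not part of the role; a real pipeline would read from a source system (S3, Kafka, etc.) and load into a warehouse such as Redshift, Snowflake, or BigQuery.

```python
def extract():
    # Hypothetical source data; a real extract would query an external system.
    return [
        {"user_id": 1, "amount": "19.99"},
        {"user_id": 2, "amount": "5.00"},
    ]

def transform(rows):
    # Cast string amounts to floats so downstream aggregation works.
    return [
        {"user_id": r["user_id"], "amount": float(r["amount"])}
        for r in rows
    ]

def load(rows, warehouse):
    # Stand-in for a warehouse write; appends rows and reports the count.
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In practice an orchestrator such as Airflow would schedule each of these steps as a task and manage retries and dependencies between them.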