Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience, focusing on SQL, Python, and cloud platforms (AWS, Azure, GCP). Requires expertise in data modeling, ETL frameworks, and streaming technologies. Contract length and pay rate unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 19, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
πŸ“ - Location detailed
California, United States
🧠 - Skills detailed
#Snowflake #Leadership #Spark (Apache Spark) #BigQuery #Data Engineering #GCP (Google Cloud Platform) #Airflow #Azure #AWS (Amazon Web Services) #Redshift #dbt (data build tool) #Scala #Databricks #Computer Science #Cloud #Data Modeling #Python #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Synapse
Role description
A technology services client of ours is looking for a Senior Data Engineer to support their ongoing projects. Below are the additional details of this role:

Required Skills:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• 10+ years of experience in Data Engineering or related fields.
• Strong expertise in SQL, Python, and/or Scala.
• Proven experience with cloud data platforms (AWS, Azure, or GCP), including services such as Redshift, Snowflake, BigQuery, Databricks, or Synapse.
• Deep understanding of data modeling, warehousing concepts, and distributed systems.
• Experience with streaming technologies (Kafka, Kinesis, Spark Streaming, Flink, etc.).
• Strong knowledge of ETL/ELT frameworks and orchestration tools (Airflow, dbt, etc.).
• Hands-on experience with containerization and CI/CD pipelines.
• Excellent problem-solving, communication, and leadership skills.