

Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in San Francisco, CA, for 2 months with a high possibility of extension. Requires strong skills in Python, SQL, and data integration tools. Experience in Advertising Tech/Media industry is essential.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: May 31, 2025
Project duration: 1 to 3 months
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: San Francisco Bay Area
Skills detailed: #Kafka (Apache Kafka) #Cloud #GCP (Google Cloud Platform) #Talend #Azure #Apache Spark #Computer Science #Apache Kafka #Scala #dbt (data build tool) #Programming #Python #Data Modeling #MySQL #Database Design #ETL (Extract, Transform, Load) #Data Engineering #PostgreSQL #AWS (Amazon Web Services) #Java #Snowflake #Spark (Apache Spark) #Data Pipeline #Data Warehouse #Data Integration #SQL (Structured Query Language) #Data Quality #Oracle
Role description
W2 candidates only (No C2C/No sponsorship)
Candidates from the West Coast area who are ready to travel to San Francisco are welcome to apply.
Job title: Data Engineer
Location: San Francisco, CA (Hybrid)
Duration: 2 months (High possibility of extension)
Responsibilities:
• Design, build, and maintain robust, scalable data pipelines and architectures tailored for complex financial data environments
• Develop and implement optimized data models and architectures to support analytics, reporting, and business insights
• Architect and build a modern, enterprise-grade financial data warehouse using Snowflake
• Leverage dbt (Data Build Tool) to transform raw data into trusted, reusable assets (a brief sketch follows this list)
• Partner closely with cross-functional teams to gather requirements and ensure data solutions align with business goals
• Implement best practices in data engineering for performance, reliability, and data quality
• Work within the Advertising Tech / Media industry landscape to support data-driven decisions and monetization strategies
• Continuously monitor, troubleshoot, and improve data workflows to ensure seamless and secure data operations
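To make the dbt responsibility above more concrete, here is a minimal, hypothetical model sketch of the kind of transformation it describes: raw advertising events rolled up into a reusable revenue mart in Snowflake. All source, table, and column names are illustrative assumptions, not details from this posting.

```sql
-- models/marts/fct_daily_ad_revenue.sql
-- Hypothetical dbt model; source, table, and column names are illustrative only.
{{ config(materialized='table') }}

with raw_impressions as (

    -- 'ad_events.impressions' is an assumed raw source, declared in a dbt sources file
    select * from {{ source('ad_events', 'impressions') }}

),

daily_revenue as (

    -- Roll raw impression events up to one row per day and campaign
    select
        date_trunc('day', event_timestamp) as revenue_date,
        campaign_id,
        count(*)                           as impression_count,
        sum(revenue_usd)                   as total_revenue_usd
    from raw_impressions
    group by 1, 2

)

select * from daily_revenue
```

Running `dbt run --select fct_daily_ad_revenue` would then materialize this as a table in the warehouse, where downstream reporting and analytics models can reuse it.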
Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related field
• Proven experience as a Data Engineer or in a similar role
• Strong programming skills in languages such as Python, Java, or Scala
• Experience with data integration and ETL tools, such as Apache Spark, Apache Kafka, or Talend
• Proficiency in SQL and database technologies (e.g., PostgreSQL, MySQL, or Oracle)
• Familiarity with cloud platforms and services (e.g., AWS, Azure, or GCP)
• Knowledge of data modeling and database design principles
• Strong analytical and problem-solving skills
• Effective communication and collaboration skills