Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in San Francisco, CA, for 2 months with a high possibility of extension. Requires strong skills in Python, SQL, and data integration tools. Experience in Advertising Tech/Media industry is essential.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
May 31, 2025
πŸ•’ - Project duration
1 to 3 months
-
🏝️ - Location type
Hybrid
-
πŸ“„ - Contract type
W2 Contractor
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
San Francisco Bay Area
-
🧠 - Skills detailed
#Kafka (Apache Kafka) #Cloud #GCP (Google Cloud Platform) #Talend #Azure #Apache Spark #Computer Science #Apache Kafka #Scala #dbt (data build tool) #Programming #Python #Data Modeling #MySQL #Database Design #ETL (Extract, Transform, Load) #Data Engineering #PostgreSQL #AWS (Amazon Web Services) #Java #Snowflake #Spark (Apache Spark) #Data Pipeline #Data Warehouse #Data Integration #SQL (Structured Query Language) #Data Quality #Oracle
Role description
W2 candidates only (no C2C / no sponsorship). Candidates from the West Coast who are ready to travel to San Francisco are welcome to apply.

Job title: Data Engineer
Location: San Francisco, CA (Hybrid)
Duration: 2 months (high possibility of extension)

Responsibilities:
• Design, build, and maintain robust, scalable data pipelines and architectures tailored for complex financial data environments
• Develop and implement optimized data models and architectures to support analytics, reporting, and business insights
• Architect and build a modern, enterprise-grade financial data warehouse using Snowflake
• Leverage dbt (data build tool) to transform raw data into trusted, reusable assets
• Partner closely with cross-functional teams to gather requirements and ensure data solutions align with business goals
• Implement data engineering best practices for performance, reliability, and data quality
• Work within the Advertising Tech/Media industry landscape to support data-driven decisions and monetization strategies
• Continuously monitor, troubleshoot, and improve data workflows to ensure seamless and secure data operations

Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related field
• Proven experience as a Data Engineer or in a similar role
• Strong programming skills in languages such as Python, Java, or Scala
• Experience with data integration and ETL tools such as Apache Spark, Apache Kafka, or Talend
• Proficiency in SQL and database technologies (e.g., PostgreSQL, MySQL, or Oracle)
• Familiarity with cloud platforms and services (e.g., AWS, Azure, or GCP)
• Knowledge of data modeling and database design principles
• Strong analytical and problem-solving skills
• Effective communication and collaboration skills
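The responsibilities above center on the extract-transform-load pattern: ingesting raw records, applying data-quality rules, and loading clean aggregates into a warehouse such as Snowflake. As a rough illustration of the kind of work described (all field names, data, and the in-memory "warehouse" are hypothetical, not from this posting), a minimal Python sketch might look like:

```python
# Minimal ETL sketch: raw ad-impression records -> cleaned daily rollup.
# Everything here is illustrative; a real pipeline would read from Kafka/S3
# and write to Snowflake, with transforms typically expressed as dbt models.

from collections import defaultdict

def extract():
    # Stand-in for reading raw events from a source system.
    return [
        {"date": "2025-05-01", "campaign": "acme", "impressions": "120"},
        {"date": "2025-05-01", "campaign": "acme", "impressions": "80"},
        {"date": "2025-05-01", "campaign": None,   "impressions": "50"},  # fails quality check
    ]

def transform(rows):
    # Apply a basic data-quality rule, cast types, and aggregate
    # impressions per (date, campaign).
    totals = defaultdict(int)
    for row in rows:
        if not row.get("campaign"):
            continue  # quality rule: campaign is required
        totals[(row["date"], row["campaign"])] += int(row["impressions"])
    return [
        {"date": d, "campaign": c, "impressions": n}
        for (d, c), n in sorted(totals.items())
    ]

def load(rows, warehouse):
    # Stand-in for writing to a warehouse table.
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
# -> [{'date': '2025-05-01', 'campaign': 'acme', 'impressions': 200}]
```

In practice the transform step would live in versioned SQL (e.g. dbt models) rather than Python, so that lineage, testing, and reuse come from the tooling; the sketch only shows the shape of the work.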