ITTConnect

Data Engineer / Analytics Specialist

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer / Analytics Specialist with 10+ years of experience, focusing on Snowflake, AWS, and ETL/ELT pipelines. It is a hybrid position based in the San Francisco Bay Area or New York City; the pay rate is not disclosed.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 17, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Francisco Bay Area
-
🧠 - Skills detailed
#SQL (Structured Query Language) #AI (Artificial Intelligence) #API (Application Programming Interface) #Data Modeling #ETL (Extract, Transform, Load) #Snowflake #dbt (data build tool) #Data Pipeline #AWS (Amazon Web Services) #Data Architecture #Airflow #Data Engineering #Consulting
Role description
ITTConnect is seeking a Data Engineer / Analytics Specialist to work for one of our clients, a major technology consulting firm headquartered in Europe. They are experts in tailored technology consulting and services for banks, investment firms, and other financial-vertical clients.

Job location: San Francisco Bay Area or New York City
Work model: Hybrid; ability to come into the office as requested
Citizenship requirement: U.S. citizens only
Seniority: 10+ years of total experience

About the role:
The Data Engineer / Analytics Specialist will support analytics, product insights, and AI initiatives. You will build robust data pipelines, integrate data sources, and enhance the organization's analytical foundations.

Responsibilities:
• Build and operate Snowflake-based analytics environments.
• Develop ETL/ELT pipelines (dbt, Airflow, etc.).
• Integrate APIs, external data sources, and streaming inputs.
• Perform query optimization, basic data modeling, and analytics support.
• Enable downstream GenAI and analytics use cases.

Requirements:
• 10+ years of overall technology experience.
• 3+ years of hands-on AWS experience (required).
• Strong SQL and Snowflake experience.
• Hands-on pipeline engineering with dbt, Airflow, or similar.
• Experience with API integrations and modern data architectures.
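For candidates unfamiliar with the extract → load → transform shape this stack implies, here is a minimal, hedged sketch in plain Python. It uses no Snowflake, dbt, or Airflow libraries; an in-memory dict stands in for the warehouse, and the sample records and function names are illustrative, not part of the client's environment.

```python
def extract():
    """Simulate pulling records from an external API or source system."""
    return [
        {"order_id": 1, "amount": "120.50", "region": "us-west"},
        {"order_id": 2, "amount": "80.00", "region": "us-east"},
        {"order_id": 3, "amount": "45.25", "region": "us-west"},
    ]


def load(raw_records, warehouse):
    """Land raw records untransformed (ELT: load before transform)."""
    warehouse["raw_orders"] = list(raw_records)


def transform(warehouse):
    """Model the raw table into an analytics-ready aggregate,
    roughly the role a dbt model plays inside the warehouse."""
    totals = {}
    for row in warehouse["raw_orders"]:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    warehouse["orders_by_region"] = totals


def run_pipeline():
    """Run the three stages in order, as an orchestrator like Airflow would."""
    warehouse = {}
    load(extract(), warehouse)
    transform(warehouse)
    return warehouse


if __name__ == "__main__":
    print(run_pipeline()["orders_by_region"])
```

In a real deployment, `extract` and `load` would be Airflow tasks targeting Snowflake stages, and `transform` would be SQL managed by dbt; the sketch only shows the ordering and data flow those tools enforce.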