

Senior Analytics Engineer (Contract) (Remote)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a remote, contract Senior Analytics Engineer role offering a competitive pay rate. Key skills include data modeling (dbt), pipeline development, cloud platforms (BigQuery, Snowflake), and collaboration with cross-functional teams. An advanced degree is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 7, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #BigQuery #API (Application Programming Interface) #Cloud #Data Quality #Monitoring #dbt (data build tool) #Deployment #Docker #Data Analysis #AWS (Amazon Web Services) #Data Pipeline #Data Engineering #PostgreSQL #Azure #ETL (Extract, Transform, Load) #Kubernetes #Databases #Snowflake #Computer Science
Role description
Verified Job On Employer Career Site
Job Summary:
Uniswap Foundation is dedicated to supporting the growth and sustainability of the Uniswap community. As a Senior Analytics Engineer, you will be responsible for transforming raw data into analytics-ready models, developing and optimizing data pipelines, and collaborating with various teams to ensure data quality and accessibility.
Responsibilities:
• Build & optimize data models (dbt or equivalent) for Uniswap, hook protocols, and broader DEX metrics, ensuring accuracy, consistency, and performance
• Develop & maintain pipelines to ingest onchain events, API feeds, and third-party sources into Dune/BigQuery/Snowflake, with monitoring and alerting
• Optimize pipeline health: Implement monitoring, alerting, and root-cause workflows to detect and resolve data issues quickly (a minimal illustrative sketch follows this list)
• Collaborate & iterate: Partner with Data Analysts, Growth, and Research teams to refine schemas, metrics, and dashboards, making data intuitive to query and interpret
• Centralize data sources: Merge disparate feeds into a unified repository while provisioning data to where it’s needed
• Plan & build in-house models: As needed, gradually transition transformations into BigQuery or Snowflake; design schemas, materializations, and deployment workflows
• Champion best practices: Contribute to open standards in the Uniswap and DEX communities
• Stay current: Evaluate emerging data‑engineering tools and cloud services (BigQuery, Snowflake, AWS/GCP) and recommend enhancements to our stack
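As a purely illustrative sketch of the pipeline-health monitoring described above (the table name, lag threshold, and alerting hook are hypothetical placeholders, not details from this posting), a simple freshness check against BigQuery might look like:

```python
# Illustrative only: a minimal freshness check for an ingestion pipeline.
# The table name, lag threshold, and alerting hook below are hypothetical
# placeholders, not details taken from this posting.
from datetime import datetime, timedelta, timezone

from google.cloud import bigquery  # pip install google-cloud-bigquery


def check_table_freshness(table: str, max_lag: timedelta) -> bool:
    """Return True if the newest row in `table` landed within `max_lag`."""
    client = bigquery.Client()
    row = next(iter(client.query(
        f"SELECT MAX(block_timestamp) AS latest FROM `{table}`"
    ).result()))
    fresh = row.latest is not None and (
        datetime.now(timezone.utc) - row.latest <= max_lag
    )
    if not fresh:
        # Swap this print for the team's real alerting channel (Slack, PagerDuty, ...).
        print(f"ALERT: {table} looks stale (latest row: {row.latest})")
    return fresh


if __name__ == "__main__":
    check_table_freshness("my_project.dex_raw.swap_events", timedelta(hours=2))
```

In practice a check like this would run on a scheduler (Airflow, dbt source freshness, or a cron job) rather than by hand.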
Qualifications:
Required:
• Engineering-minded: you treat analytics transformations as production code: robust, testable, and maintainable
• Future‑focused: adept with Dune Spellbook today and excited to build self‑hosted solutions tomorrow
• Detail‑obsessed: you identify edge cases, troubleshoot upstream issues and prevent data drift proactively
• Collaborative: you translate requirements into solutions and work seamlessly across small, cross‑functional teams
Preferred:
• Proficiency with modern cloud platforms (e.g., BigQuery, Snowflake, AWS, GCP, or Azure) and experience with both OLTP and analytical databases such as PostgreSQL or ClickHouse
• Experience building subgraphs or equivalent custom indexers (e.g., The Graph, Ponder)
• Experience building and exposing internal/external Data APIs and deploying containerized workloads using Docker and Kubernetes (a minimal data-API sketch follows this list)
• Advanced degree in Computer Science, Data Engineering, or a related technical field
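As a purely illustrative sketch of the data-API work mentioned in the preferred qualifications (the endpoint, dataset, and column names are hypothetical assumptions, not taken from this posting), a small internal API over a warehouse mart might look like:

```python
# Illustrative only: a tiny internal data API over a warehouse mart.
# The endpoint, dataset, and column names are hypothetical placeholders.
from fastapi import FastAPI
from google.cloud import bigquery

app = FastAPI()
client = bigquery.Client()


@app.get("/metrics/daily-volume")
def daily_volume(days: int = 7) -> list[dict]:
    """Return daily DEX volume (USD) for the last `days` days."""
    job = client.query(
        """
        SELECT day, total_volume_usd
        FROM `my_project.dex_marts.daily_volume`
        ORDER BY day DESC
        LIMIT @days
        """,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("days", "INT64", days)]
        ),
    )
    return [dict(row) for row in job.result()]
```

Parameterizing the query (rather than formatting user input into the SQL string) keeps an endpoint like this safe against injection.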
Company:
A grants-making foundation for the Uniswap community. Founded in 2022 and currently early stage, the foundation has a team of 2-10 employees. Uniswap Foundation has a track record of offering H-1B sponsorships.