Jobs via Dice

Data Engineer II / Remote

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer II (Contract, Remote) with a pay rate of $60-$64/hr. Key skills include API data ingestion, advanced SQL, Snowflake, Databricks, and Python. Experience in cloud environments and independent work is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
$512
-
🗓️ - Date
January 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Sunnyvale, CA
-
🧠 - Skills detailed
#Azure #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Data Ingestion #Cloud #Snowflake #API (Application Programming Interface) #Python #dbt (data build tool) #Fivetran #AWS (Amazon Web Services) #REST (Representational State Transfer) #BI (Business Intelligence) #Databricks #Airflow #Data Pipeline #Scala #Spark (Apache Spark) #PySpark #Data Quality #Tableau #Consulting #ETL (Extract, Transform, Load) #Data Engineering
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Russell, Tobin & Associates, is seeking the following. Apply via Dice today!

Hi, hope you are doing well. This is Prabhakar Jha from Russell Tobin. We are working with our Fortune 50 client on the new requirement below. Please review the job description and let me know if you have any matching candidates.

Data Engineer II
Remote Contract
Pay Range: $60 to $64/hr. on W2

Role Summary
We are seeking an experienced contract Data Engineer to build and support scalable data pipelines, with a strong focus on API integrations and cloud data platforms. This is a hands-on, project-based role requiring quick ramp-up and delivery.

Responsibilities
• Build and maintain data ingestion pipelines from internal and third-party APIs (REST/SOAP)
• Develop ETL/ELT processes using Databricks and Python/PySpark
• Manage and optimize data models in Snowflake and Postgres
• Ensure data quality, reliability, and performance
• Partner with analytics and BI teams to support tools like Tableau
• Monitor, troubleshoot, and document data pipelines

Requirements
• Proven experience as a Data Engineer
• Strong experience with API-based data ingestion
• Advanced SQL skills and relational database experience
• Hands-on experience with Snowflake and Databricks
• Proficiency in Python
• Experience in cloud environments (AWS, Azure, or Google Cloud Platform)
• Ability to work independently in a fast-paced, contract setting

Nice to Have
• Airflow, dbt, Fivetran
• Tableau or BI support experience
• Consulting or contract background
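For candidates gauging fit, the core responsibility above (API-based ingestion with data-quality checks, feeding a warehouse) can be sketched in a few lines of Python. This is a minimal illustrative sketch, not the client's actual pipeline: `fetch_page`, the sample payload, and the record schema are hypothetical, and a real job would call `requests` against the actual API and stage rows into Snowflake via its connector.

```python
from typing import Callable, Iterator

def paginate(fetch_page: Callable[[int, int], list], page_size: int = 100) -> Iterator[dict]:
    """Yield records page by page until the API returns an empty page."""
    page = 0
    while True:
        records = fetch_page(page, page_size)
        if not records:
            break
        yield from records
        page += 1

def normalize(record: dict) -> dict:
    """Basic data-quality step: enforce required keys and coerce types."""
    return {
        "id": int(record["id"]),              # raises early on malformed ids
        "name": str(record.get("name", "")).strip(),
    }

# Hypothetical stub standing in for e.g. requests.get(url, params=...).json()
SAMPLE_PAGES = [[{"id": "1", "name": " Ada "}, {"id": "2", "name": "Grace"}], []]

def fetch_page(page: int, size: int) -> list:
    return SAMPLE_PAGES[page] if page < len(SAMPLE_PAGES) else []

rows = [normalize(r) for r in paginate(fetch_page)]
print(rows)  # cleaned rows, ready to stage (e.g. COPY INTO a Snowflake table)
```

Separating pagination, normalization, and loading keeps each stage independently testable, which matters in a quick-ramp-up contract setting like this one.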