

Sophus IT Solutions
Data Engineer (Only US Citizens)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer, onsite in Seattle, WA, for US Citizens only. Contract length and pay rate are unspecified. Key skills required include Databricks, DBT, Snowflake, and experience with enterprise SaaS systems.
Country
United States
-
Currency
$ USD
-
Day rate
Unknown
-
Date
May 1, 2026
-
Duration
Unknown
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Seattle, WA
-
Skills detailed
#ETL (Extract, Transform, Load) #Spark (Apache Spark) #Databricks #dbt (data build tool) #Documentation #Compliance #PySpark #Data Science #Data Quality #SaaS (Software as a Service) #Snowflake #ML (Machine Learning) #SQL (Structured Query Language) #Workday #Python #Datasets #Data Engineering #AI (Artificial Intelligence)
Role description
Hi,
Please review the JD below:
Role: Data Engineer
Is this an onsite role? On-site preferred
Location: 500 5th Ave N, Seattle WA 98109
US Citizens only
Responsibilities
1. Ingest and process data from Salesforce (Investments), Dynamics 365 & Coupa (Finance), Workday (HR), Concur (Operations), and external program/partner data (e.g., Health, Education).
2. Build and maintain pipelines in Databricks, DBT, and Snowflake to deliver curated datasets for AI/ML.
3. Design simple, reusable data models to support training and inference.
4. Implement data quality tests, documentation, and lineage in DBT.
5. Ensure performance optimization, cost efficiency, and compliance with governance standards.
6. Collaborate with the Knowledge Management (AI/Data Science) team to provide feature-ready datasets.
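For candidates unfamiliar with DBT's data quality tests (responsibility 4 above), the two most common generic tests are `not_null` and `unique` on a column. As a rough, hedged illustration of what those tests check — this is a minimal plain-Python sketch with made-up row data, not DBT itself or any Databricks/Snowflake API:

```python
# Plain-Python sketch of what DBT's generic "not_null" and "unique"
# column tests verify on a curated dataset. Row and column names below
# are hypothetical examples, not from the actual role.

def not_null_failures(rows, column):
    """Rows where the column is missing; DBT's not_null test fails on any hit."""
    return [r for r in rows if r.get(column) is None]

def unique_failures(rows, column):
    """Column values that occur more than once; DBT's unique test fails on any hit."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Hypothetical curated rows, e.g. from an HR-system extract:
rows = [
    {"employee_id": 1, "dept": "Finance"},
    {"employee_id": 2, "dept": None},
    {"employee_id": 2, "dept": "HR"},
]

print(not_null_failures(rows, "dept"))       # one row with a missing dept
print(unique_failures(rows, "employee_id"))  # employee_id 2 appears twice
```

In DBT proper, the equivalent checks are declared in a model's YAML schema file and run with `dbt test`; the point here is only what "passing" means.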
Requirements
1. Hands-on experience with Databricks (PySpark/SQL, Unity Catalog), DBT (models, tests, docs), and Snowflake (SQL, performance tuning).
2. Strong background in ETL/ELT pipeline development.
3. Experience with enterprise SaaS systems (Salesforce, D365, Coupa, Workday, Concur).
4. Solid SQL and Python skills; knowledge of data quality frameworks.





