

Trilyon, Inc.
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer V, fully remote (PST hours), on an 8-month contract at $60/hr. Key skills include strong Python, PySpark, advanced SQL, and hands-on Palantir Foundry experience, with a focus on compliance reporting and data quality frameworks.
Country: United States
Currency: $ USD
Day rate: $480
Date: April 11, 2026
Duration: More than 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: California, United States
Skills detailed: #PySpark #DataQuality #DataEngineering #PalantirFoundry #SparkSQL #Compliance #Spark (Apache Spark) #DataLineage #SQL (Structured Query Language) #Datasets #ETL (Extract, Transform, Load) #Monitoring #Python
Role description
Job Title: Data Engineer V
Location: REMOTE - PST hours
Job Duration: 8 months
Pay Rate: $60/hr
Responsibilities
Design and maintain Palantir Foundry pipelines for OEIS compliance reporting
Build and maintain data checker / validation frameworks within Foundry
Develop PySpark and Spark SQL transformations for large-scale datasets
Ensure full data lineage, auditability, and reproducibility of compliance datasets
Partner with compliance SMEs and stakeholders to validate logic and reporting outputs
Support operational monitoring, issue triage, and data quality remediation
Required Technical Skills
Strong Python
PySpark / Spark SQL
Advanced SQL
Hands-on experience with Palantir Foundry
Experience building data validation, QA, or data quality frameworks
Understanding of compliance, audit, or regulatory reporting requirements
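To illustrate the "data checker / validation frameworks" responsibility above, here is a minimal sketch in plain Python of the rule-based row validation this kind of role typically involves. The field names, rule names, and sample rows are hypothetical, not drawn from any actual Foundry pipeline; in practice these checks would run as PySpark transforms over Foundry datasets.

```python
# Minimal sketch of a rule-based data quality checker (hypothetical example,
# not an actual Foundry or OEIS pipeline). Each check is a predicate on a row;
# run_checks reports every (row index, rule name) failure for audit purposes.

def check_not_null(field):
    """Rule: the given field must be present and non-null."""
    return lambda row: row.get(field) is not None

def check_in_range(field, lo, hi):
    """Rule: the given numeric field must be non-null and within [lo, hi]."""
    return lambda row: row.get(field) is not None and lo <= row[field] <= hi

def run_checks(rows, checks):
    """Return (passed, failures); failures is a list of (row_index, rule_name)."""
    failures = []
    for i, row in enumerate(rows):
        for name, check in checks.items():
            if not check(row):
                failures.append((i, name))
    return len(failures) == 0, failures

# Hypothetical sample data with two deliberate violations.
rows = [
    {"account_id": "A1", "amount": 120.0},
    {"account_id": None, "amount": 50.0},   # missing account_id
    {"account_id": "A3", "amount": -10.0},  # negative amount
]
checks = {
    "account_id_not_null": check_not_null("account_id"),
    "amount_non_negative": check_in_range("amount", 0, float("inf")),
}
ok, failures = run_checks(rows, checks)
```

Keeping failures as explicit (row, rule) pairs, rather than just a pass/fail flag, supports the lineage and auditability requirements the posting emphasizes: every rejected record can be traced back to the exact rule it violated.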





