Trigent Software Inc

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 12-month contract, offering a competitive pay rate. Requires 10+ years of experience in data engineering, strong SQL skills, and familiarity with cloud environments. Expertise in ETL, data modeling, and data governance is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 5, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Unknown
📄 - Contract
Fixed Term
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Version Control #Data Governance #Microsoft Power BI #Data Pipeline #Scala #Data Access #Strategy #Data Modeling #Looker #Tableau #Snowflake #Data Engineering #BigQuery #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Agile #Data Quality #Visualization #Redshift #BI (Business Intelligence) #Data Integrity #GIT #Data Security #Security #Data Analysis #Azure #Cloud #dbt (data build tool) #SQL (Structured Query Language) #Datasets #GCP (Google Cloud Platform)
Role description
12-month contract

What you'll do:
• Design and develop high-performance, scalable, and reliable data models for our attribution and measurement platforms.
• Build and maintain robust ETL/ELT pipelines to ingest, transform, and load large datasets from various sources.
• Collaborate with data engineers and analysts to define semantic layers and ensure consistency across data sources.
• Manage the end-to-end pixel tracking solution, ensuring high availability and low-latency data capture for critical measurement needs.
• Implement and promote best practices for data governance, data quality, and data security.
• Enable self-service data access and analysis for stakeholders through well-designed data platforms and tools.
• Monitor data pipeline performance, troubleshoot issues, and optimize for efficiency and cost.
• Contribute to the overall architecture and strategy of our data platform.

Who you are:
• 10+ years of hands-on data engineering experience, with a strong track record of designing, building, and optimizing data platforms in high-volume or ad tech environments.
• A strong collaborator who thrives in a cross-functional setting, effectively communicating technical concepts to diverse audiences.
• Strong SQL skills, including complex joins, aggregations, and performance tuning.
• Experience working with semantic layers and data modeling for analytics.
• Solid understanding of data analysis and visualization best practices.
• Passionate about data quality and governance, with an eye for detail and a commitment to maintaining data integrity.
• Experience using version control systems, preferably Git.
• Excellent communication skills and the ability to work cross-functionally.
• Familiarity with modern data warehousing platforms (e.g., Snowflake, BigQuery, Redshift).
• Experience working in cloud environments such as AWS, GCP, or Azure (nice to have).
• Experience migrating from legacy BI tools (e.g., Tableau, Power BI) to Looker.
• Experience working in agile data teams and managing BI projects.
• Familiarity with dbt or other data transformation frameworks.