

SnapCode Inc
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month nearshore contract at a competitive pay rate. Key skills include dbt, Snowflake, Python, and SQL. Requires 4+ years of data engineering experience and a relevant undergraduate degree.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
200
-
🗓️ - Date
December 7, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
California, United States
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #Databases #Data Engineering #Datasets #Statistics #Data Ingestion #Fivetran #Batch #Data Warehouse #dbt (data build tool) #Redshift #Snowflake #ML (Machine Learning) #Workiva #Airflow #Big Data #Data Science #Python #Scala #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #BI (Business Intelligence) #AI (Artificial Intelligence) #Data Pipeline #Cloud #Data Processing #Computer Science #SQL (Structured Query Language) #Security #Business Analysis #S3 (Amazon Simple Storage Service) #Deployment
Role description
Strong dbt and Snowflake skills required.
Job Title: Senior Data Engineer
Location: Nearshore
Duration: 6 months
The Senior Data Engineer at Workiva will be an instrumental part of data workflows throughout the organization. You will own the technical design and implementation of systems that support multiple data analytics teams and business intelligence engineers reliably and at scale. You will provide cutting-edge, reliable, and easy-to-use systems for ingesting and processing data and help the teams that build data-intensive applications be successful.
This role will collaborate with many cross-functional teams on the planning, execution, and successful completion of technical projects. You will build and maintain batch and real-time data flows used for business intelligence, analytics, and machine learning within all organizations across Workiva. Senior Data Engineers work primarily with other Data Engineers but also with Data Scientists, Business Intelligence Engineers and business partners to ensure quality, reliability, and performance at the highest level.
What You'll Do
• Design and implement modern data transformations using tools like dbt (data build tool) and manage data ingestion patterns using services like Fivetran.
• Own the implementation of data pipelines from various data sources using new and existing patterns, leveraging expertise in Python for complex processing logic.
• Maintain the health of the data ecosystem by configuring deployments and monitors and defining alerts and quality checks.
• Build highly reliable CI/CD processes to ensure high-quality data throughout the data ecosystem.
• Review peer code and submit thorough and actionable feedback based on team standards and industry best practices.
• Tune processes and SQL within current data platforms, including Redshift and Snowflake, to reduce cost and wait time. Implement systems to balance data volume, latency and customer requirements.
• Triage and resolve production issues. Communicate with individual business partners on status and escalate as needed.
• Understand the data at a deep level, apply security appropriately, and escalate as needed.
• Work with business partners to write requirements and test deployed code.
• Design systems that enable next-generation AI to unlock business insights and sales opportunities.
• Join an on-call rotation to support production workflows during off-hours.
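To give a flavor of the pipeline quality checks and alert definitions described above, here is a minimal Python sketch of validating a batch of ingested rows before loading them downstream. All names here (`Row`, `check_batch`, the sample fields) are illustrative, not taken from the posting or any specific Workiva system.

```python
# Toy sketch of a batch data-quality check: flag duplicate keys and
# out-of-range values, passing only clean rows downstream.
from dataclasses import dataclass

@dataclass
class Row:
    order_id: int   # hypothetical primary key
    amount: float   # hypothetical measure; must be non-negative

def check_batch(rows):
    """Return (valid_rows, issues) after a uniqueness and range check."""
    valid, issues = [], []
    seen = set()
    for r in rows:
        if r.order_id in seen:
            issues.append(f"duplicate order_id {r.order_id}")
        elif r.amount < 0:
            issues.append(f"negative amount for order_id {r.order_id}")
        else:
            seen.add(r.order_id)
            valid.append(r)
    return valid, issues

batch = [Row(1, 9.99), Row(1, 9.99), Row(2, -5.0), Row(3, 12.50)]
valid, issues = check_batch(batch)
```

In practice checks like this would typically live in dbt tests or an orchestrator task (e.g. Airflow), with the `issues` list feeding an alerting channel rather than a return value.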
Minimum Qualifications
What You'll Need
• 4+ years of relevant experience in a data engineering role, including data warehousing and business intelligence tools, techniques, and technology, or experience in analytics, business analysis, or comparable consumer analytics solutions.
• Undergraduate Degree or equivalent combination of education and experience in a related field.
Preferred Qualifications
• Bachelor's degree in Computer Science, Engineering, Math, Finance, Statistics or related discipline.
• Extensive experience with cloud data warehouses such as Snowflake and Redshift.
• Experience in big data processing and using databases in a business environment with large-scale, complex datasets.
• Experience with the tools to manage and interact with data, including Airflow, dbt, and Fivetran.
• Extensive knowledge of SQL query design and tuning for performance and accuracy.
• Experience with Python.
• Strong planning and organizing skills to prioritize numerous projects and ensure data is delivered in an accurate and understandable manner to the end user.
• Experience with AWS cloud technologies including S3, Redshift, Lambda, Quicksight and Kinesis.
• Excellent communication (verbal and written) and interpersonal skills and an ability to effectively communicate with both business and technical teams.