

Knowledge Services
Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is a remote Senior Data Engineer position on a 6-month contract, focused on data extraction and Snowflake warehousing. It requires 5+ years of experience; proficiency with Snowflake, Fivetran, SQL, and Python; and relevant certifications.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: January 10, 2026
Duration: More than 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: United States
Skills detailed: #dbt (data build tool) #S3 (Amazon Simple Storage Service) #Data Warehouse #Data Modeling #Cloud #Data Science #Computer Science #Python #Data Security #Data Lake #Security #Data Pipeline #Database Design #Version Control #Redshift #Synapse #Data Quality #Lambda (AWS Lambda) #Azure #Data Engineering #Data Integration #Data Processing #Snowflake #Documentation #Data Transformations #Scripting #Databases #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Web Scraping #AWS (Amazon Web Services) #Automation #Data Extraction #Scala #Fivetran #Data Analysis
Role description
Knowledge Services is seeking a remote Senior Data Engineer for a 6-month contract (with potential for extension). This role can be performed 100% remotely.
• Please note that we cannot consider candidates requiring C2C or sponsorship for a work visa.
Senior Data Engineer Overview:
The Sr. Data Engineer will lead the design, development, and optimization of data pipelines across diverse sources. This role focuses on efficient data extraction, staging, and loading into our Snowflake-based data warehouse, ensuring high availability, accuracy, and performance. The ideal candidate will bring a strong technical foundation in modern data engineering practices, hands-on experience with Snowflake and tools like Fivetran, and a collaborative mindset.
Duties and Responsibilities:
• Develop efficient and scalable data extraction methodologies to retrieve data from diverse sources, such as databases, APIs, web scraping, flat files, and streaming platforms.
• Design and implement robust data loading processes to efficiently ingest and integrate data into the latest data warehousing technology, ensuring data quality and consistency (a minimal extract-and-load sketch follows this list).
• Develop and maintain staging processes to facilitate the organization and transformation of raw data into structured formats, preparing it for downstream analysis and reporting.
• Implement data quality checks and validation processes to identify and address data anomalies, inconsistencies, and integrity issues.
• Identify and resolve performance bottlenecks in data extraction and loading processes, optimizing overall system performance and data availability.
• Ensure adherence to data security and privacy standards throughout the data extraction and warehousing processes, implementing appropriate access controls and encryption mechanisms.
• Create and maintain comprehensive documentation of data extraction and warehousing processes, including data flow diagrams, data dictionaries, and process workflows.
• Mentor and support junior data engineers, providing guidance on best practices, technical design, and professional development to elevate overall team capability and performance.
• Collaborate with cross-functional teams, including data scientists, data analysts, software engineers, and business stakeholders, to understand their data requirements and provide efficient data engineering solutions.
• Stay updated with the latest advancements in data engineering, data warehousing, and cloud technologies, and proactively propose innovative solutions to enhance data extraction and warehousing capabilities.
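For context on the extract-stage-load workflow described above, here is a minimal sketch of a hand-written extract-and-load job targeting Snowflake, assuming a REST API source. The endpoint, connection parameters, and table names are placeholders invented for illustration and are not details of this role; managed sources would more typically flow through Fivetran connectors.

    # Illustrative sketch only: the endpoint, credentials, and table names below
    # are placeholders, not details of this engagement.
    import json
    import requests
    import snowflake.connector

    API_URL = "https://api.example.com/v1/orders"  # hypothetical source API

    def extract():
        """Pull raw records from the source API (assumed to return a JSON list)."""
        response = requests.get(API_URL, timeout=30)
        response.raise_for_status()
        return response.json()

    def load(records):
        """Stage raw JSON into a Snowflake VARIANT column for downstream modeling."""
        conn = snowflake.connector.connect(
            account="my_account",   # placeholder connection details
            user="etl_user",
            password="***",
            warehouse="LOAD_WH",
            database="RAW",
            schema="STAGING",
        )
        try:
            cur = conn.cursor()
            cur.execute(
                "CREATE TABLE IF NOT EXISTS ORDERS_RAW ("
                "  payload VARIANT,"
                "  loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()"
                ")"
            )
            # PARSE_JSON cannot appear in a plain VALUES clause, hence INSERT ... SELECT.
            cur.executemany(
                "INSERT INTO ORDERS_RAW (payload) SELECT PARSE_JSON(%s)",
                [(json.dumps(record),) for record in records],
            )
        finally:
            conn.close()

    if __name__ == "__main__":
        load(extract())

Staging the raw payload as a VARIANT column and deferring flattening to downstream models (for example, dbt models) keeps the load path simple and easy to replay.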
Senior Data Engineer Requirements:
• Minimum of 5 years' experience in data engineering, with a strong focus on data extraction and cloud-based warehousing; a combination of years of experience and relevant advanced technology proficiency will also be considered.
• Proficiency with Snowflake and data integration tools like Fivetran.
• Advanced SQL skills and experience with ETL/ELT frameworks (see the validation sketch after this list).
• Experience with scripting languages such as Python for data processing and automation.
• Solid understanding of data modeling and relational database design.
• Strong communication skills and the ability to collaborate with technical and non-technical stakeholders.
• Strong analytical and problem-solving skills, with the ability to identify and resolve complex data engineering challenges.
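As a simplified illustration of the SQL and Python skills listed above, the sketch below runs a few post-load validation queries against Snowflake and exits with an error if any report violations. The table, column, and connection names are placeholders chosen for the example; a real pipeline would typically wire such checks into its orchestration or express them with a testing framework such as dbt tests.

    # Illustrative sketch only: table, column, and connection names are placeholders.
    import snowflake.connector

    # Each check returns the number of violating rows; 0 means the check passes.
    CHECKS = {
        "null_order_ids":
            "SELECT COUNT(*) FROM ANALYTICS.ORDERS WHERE order_id IS NULL",
        "duplicate_order_ids":
            "SELECT COUNT(*) FROM ("
            "  SELECT order_id FROM ANALYTICS.ORDERS"
            "  GROUP BY order_id HAVING COUNT(*) > 1"
            ")",
        "future_order_dates":
            "SELECT COUNT(*) FROM ANALYTICS.ORDERS WHERE order_date > CURRENT_DATE()",
    }

    def run_checks(conn):
        """Execute each validation query and return violation counts by check name."""
        cur = conn.cursor()
        return {name: cur.execute(sql).fetchone()[0] for name, sql in CHECKS.items()}

    if __name__ == "__main__":
        conn = snowflake.connector.connect(
            account="my_account", user="etl_user", password="***"  # placeholders
        )
        try:
            failures = {name: n for name, n in run_checks(conn).items() if n > 0}
        finally:
            conn.close()
        if failures:
            raise SystemExit(f"Data quality checks failed: {failures}")
        print("All data quality checks passed.")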
Preferred Credentials and Experience:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• Snowflake Architect, Administrator, or Data Engineering certification required.
• Experience with dbt (data build tool) for managing data transformations, modeling, and maintaining version-controlled, modular SQL pipelines.
• Familiarity with cloud platforms such as AWS and Azure, including services like S3, Lambda, Redshift, Glue, Azure Data Lake, and Synapse.






