Kavaliro

Snowflake Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer; the contract length is unspecified and the listed rate is $544 per day. Key skills include data engineering, data warehousing, ETL processes, and cloud technologies. A Bachelor’s degree and 3–5 years of relevant experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
544
-
🗓️ - Date
April 28, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Aurora, CO
-
🧠 - Skills detailed
#Data Warehouse #Databases #Data Management #Data Lake #Data Engineering #Data Pipeline #Cloud #Data Quality #Metadata #Data Access #Snowflake #Web Services #Data Science #Computer Science #Data Security #ETL (Extract, Transform, Load) #Data Storage #Programming #Security #Storage #Scala #BI (Business Intelligence)
Role description
JOB DESCRIPTION: Our client is seeking a Data Engineer for an exciting contract-to-hire scenario. The Data Engineer designs, builds, and maintains scalable data infrastructure and pipelines for the customer. This role combines data warehousing expertise with modern cloud-based technologies to ensure reliable, secure, and performant data delivery across the district. This role collaborates closely with cross-functional teams to ensure the availability, reliability, and performance of our data systems and solutions.

DUTIES:
• Design, implement, and optimize end-to-end data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
• Design and maintain data models, schemas, and database structures to support analytical and operational use cases.
• Develop robust ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to integrate data from diverse sources into our data ecosystem.
• Implement data security, access controls, backup, and recovery strategies across all data platforms.
• Champion data quality standards in pipeline and warehouse design.
• Configure and manage data infrastructure components, including databases, data warehouses, data lakes, and distributed computing frameworks, to support scalable data storage and processing.
• Build and maintain integrations with internal and external data sources and APIs.
• Implement RESTful APIs and web services for data access and consumption. Ensure compatibility and interoperability between different systems and platforms.
• Collaborate with data scientists, analysts, developers, and other stakeholders to understand data requirements and deliver tailored solutions.
• Provide technical guidance and support to team members and stakeholders as needed.
• Perform other duties as assigned.

REQUIREMENTS:
• Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field; advanced degree preferred.
• 3–5 years of experience in data engineering, data warehousing, or BI solution delivery, including dimensional modeling, ETL development, and pipeline orchestration.
• Strong knowledge of data warehouse concepts, including OLAP, star/snowflake schemas, dimensional modeling, and metadata management.
• Experience with cloud data platforms and Infrastructure-as-Code.
• Proficiency in programming languages commonly used in data engineering.
• Excellent problem-solving skills and attention to detail.
• Ability to communicate complex technical concepts to non-technical stakeholders.
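The duties and requirements above center on ETL development and star/snowflake schema design. As a purely illustrative sketch of what that work looks like (the posting names no specific tooling; Snowflake is the target platform, but this example uses Python's built-in sqlite3 as a stand-in so it runs anywhere, and every table, column, and value here is hypothetical):

```python
import sqlite3

# Hypothetical source rows, as an extract step might return them.
raw_orders = [
    {"order_id": 1, "customer": "Acme", "region": "West", "amount": "120.50"},
    {"order_id": 2, "customer": "Acme", "region": "West", "amount": "80.00"},
    {"order_id": 3, "customer": "Globex", "region": "East", "amount": "45.25"},
]

conn = sqlite3.connect(":memory:")  # stand-in for a Snowflake warehouse
cur = conn.cursor()

# A minimal star schema: one dimension table, one fact table.
cur.execute("""CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    name TEXT, region TEXT)""")
cur.execute("""CREATE TABLE fact_order (
    order_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount REAL)""")

# Transform + load: deduplicate customers into the dimension,
# then write facts keyed to the dimension rows.
keys = {}
for row in raw_orders:
    ident = (row["customer"], row["region"])
    if ident not in keys:
        cur.execute("INSERT INTO dim_customer (name, region) VALUES (?, ?)", ident)
        keys[ident] = cur.lastrowid
    cur.execute(
        "INSERT INTO fact_order (order_id, customer_key, amount) VALUES (?, ?, ?)",
        (row["order_id"], keys[ident], float(row["amount"])),  # the cast is the "T" step
    )
conn.commit()

# Analytical query across the star schema.
totals = cur.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_order f JOIN dim_customer d USING (customer_key)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(totals)  # [('Acme', 200.5), ('Globex', 45.25)]
```

In production the same shape would use a real connector and an orchestrator rather than an in-memory database, but the fact/dimension split and the transform-on-load logic are what the listing's "dimensional modeling" and "ETL development" requirements refer to.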