HireTalent - Diversity Staffing & Recruiting Firm

Cloud Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Cloud Data Engineer on a 10-month contract, 100% remote, requiring CST availability. Key skills include SQL, Python, Snowflake, Databricks, and S/4HANA. A minimum of 10 years in data engineering is required.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
February 12, 2026
πŸ•’ - Duration
More than 6 months
-
🏝️ - Location
Remote
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Database Administration #Python #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Terraform #Cloud #dbt (data build tool) #Snowflake #Data Modeling #Data Engineering #Databricks #Streamlit #BI (Business Intelligence) #Scripting #Scala #Data Extraction #Infrastructure as Code (IaC) #Visualization #SAP #Data Pipeline
Role description
Job Title: Cloud Data Engineer
Contract Duration: 10 months
Location: 100% Remote (must be available to work Central Standard Time – CST hours)

Job Description:
We are seeking a highly experienced and technical Senior Data Engineer to lead the end-to-end lifecycle of our data ecosystems. This role focuses on sophisticated scripting, robust data modeling, and performance optimization across a modern data stack. The ideal candidate is a problem-solver who can extract data from complex source systems and transform it into actionable intelligence through high-performance stored procedures and optimized data loads.

Key Responsibilities:
• Data Extraction & Engineering: Build and maintain scalable data pipelines to extract data from various source systems, ensuring integrity and efficiency.
• Modeling & Analysis: Design, develop, and tune complex data models to support business intelligence and analytical needs.
• Performance Optimization: Expertly manage stored procedures and data loads, identifying bottlenecks and implementing performance-tuning strategies to ensure low latency.
• Application Development: Utilize Python and Streamlit to build interactive data tools and internal applications.

Technical Requirements:

Must-Haves:
• Education: Bachelor's degree in any major.
• Experience: Minimum of 10 years of professional experience in data engineering, database administration, or a related field.
• Core Technical Stack: Advanced proficiency in:
  • Languages: SQL and Python.
  • Platforms: Snowflake and Databricks.
  • ERP Systems: S/4HANA and SAP.
  • Visualization/Apps: Streamlit.

Nice-to-Haves:
• Experience with Infrastructure as Code (IaC) using Terraform.
• Experience with data transformation workflows using dbt.