WeVision LLC

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (contract-to-hire) with an initial 3-month contract, remote in California, at a pay rate of "TBD." Key skills include Python, SQL, Snowflake, and dbt; a strong finance background is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
December 24, 2025
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
California, United States
-
🧠 - Skills detailed
#dbt (data build tool) #Data Modeling #GCP (Google Cloud Platform) #Cloud #Data Accuracy #Apache Airflow #Data Engineering #Python #ETL (Extract, Transform, Load) #Observability #Airflow #Azure #Data Science #Automation #Data Pipeline #Data Quality #Snowflake #Scala #Data Transformations #Data Processing #Agile #Consulting #Monitoring #AWS (Amazon Web Services) #Code Reviews #SQL (Structured Query Language) #Computer Science
Role description
Data Engineer (Contract-to-Hire)
Employment Type: Contract (Initial 3 Months)
Contract Duration: 3 months, with strong potential to convert to full-time
Location: Remote in CA
Industry: Financial / Consumer / Data & Analytics

Role Overview
We are seeking an experienced Data Engineer with a strong finance background to support the design, development, and optimization of modern data pipelines and analytics infrastructure. This role begins as a 3-month consulting engagement and may convert to a full-time position based on performance and business needs. The ideal candidate is hands-on, detail-oriented, and comfortable working in a fast-paced, data-driven environment supporting financial reporting, analytics, and operational use cases.

Key Responsibilities
• Design, build, and maintain scalable ETL/ELT pipelines using modern data tools
• Develop and manage data transformations using dbt
• Orchestrate and automate workflows using Apache Airflow
• Build and optimize data models in Snowflake for analytics and reporting
• Write clean, efficient, and well-tested Python code for data processing and automation
• Partner closely with finance, analytics, and business stakeholders to translate requirements into data solutions
• Ensure data accuracy, reliability, and performance through monitoring and validation
• Document data models, pipelines, and technical designs
• Participate in agile development, code reviews, and best-practice sharing

Required Qualifications
• Bachelor's degree in Computer Science, Engineering, Data Science, or a related field
• 3+ years of experience in data engineering or a related role
• Strong proficiency in Python and SQL
• Hands-on experience with Snowflake data warehousing
• Practical experience using dbt for data modeling and transformations
• Experience scheduling and managing workflows with Airflow (or similar orchestration tools)
• Solid understanding of data warehousing concepts and best practices
• Strong communication skills and ability to work cross-functionally

Preferred / Nice to Have
• Experience working with financial data, accounting systems, or finance teams
• Familiarity with ERP systems such as NetSuite
• Experience with cloud platforms (AWS, GCP, or Azure)
• Knowledge of data quality, testing, and observability frameworks
• Experience in high-growth or fast-paced environments