Openmind Technologies Inc.

New Age Data Modeler

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a New Age Data Modeler on a 6-month contract (with possible extension), 100% remote, based out of Sunnyvale, CA. Key skills include SQL, dbt, and Apache Airflow. Experience with cloud data warehouses and software engineering practices is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
800
-
🗓️ - Date
October 1, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Sunnyvale, CA
-
🧠 - Skills detailed
#Airflow #GIT #Normalization #Observability #SQL (Structured Query Language) #Data Modeling #Computer Science #Scala #Cloud #Apache Airflow #Data Engineering #ETL (Extract, Transform, Load) #Redshift #Complex Queries #Documentation #Snowflake #Data Pipeline #BigQuery #Data Governance #Version Control #Data Warehouse #dbt (data build tool)
Role description
We are seeking a New Age Data Engineer/Modeler to join our data engineering team and help build modern, scalable, and maintainable data transformation pipelines. If interested, please reply with your updated resume and contact information.
Duration: 6-month contract (possibility of extension)
Location: Sunnyvale, CA (100% remote)
Summary: This role combines deep SQL expertise with hands-on experience in dbt (Data Build Tool) and orchestration platforms like Apache Airflow to design, implement, and maintain reliable data models that power analytics and decision-making across the organization.
Key Responsibilities
• Design, develop, and maintain modular SQL transformations using dbt to create scalable, high-quality data models.
• Consolidate and stitch together multiple SQL sources into unified, reliable outputs.
• Integrate and orchestrate dbt workflows within Apache Airflow to ensure smooth, automated data pipelines.
• Apply software engineering best practices (version control, testing, CI/CD) to data modeling and transformation processes.
• Collaborate with analytics, product, and engineering teams to understand data requirements and translate them into efficient dbt models.
• Leverage dbt's built-in documentation, testing, and lineage capabilities to improve transparency and governance.
• Optimize data models for performance, scalability, and reusability, ensuring consistency across the data platform.
• Drive adoption of modern data stack tools and methodologies within the data organization.
Required Skills and Experience
• Strong proficiency in SQL, with hands-on experience writing complex queries and transformations.
• Proven expertise in dbt (Data Build Tool) for modular data modeling, testing, documentation, and version control.
• Familiarity with Apache Airflow or similar orchestration tools for scheduling and dependency management.
• Experience with cloud-based data warehouses (Snowflake, BigQuery, Redshift, or similar).
• Solid understanding of data modeling concepts (star/snowflake schemas, normalization, incremental models).
• Knowledge of software engineering practices such as Git-based workflows, CI/CD, and testing frameworks.
• Strong problem-solving skills with the ability to work independently as well as cross-functionally.
Preferred Qualifications
• Bachelor's degree in Data Analytics, Computer Science, Business, or a related field (or equivalent experience).
• Exposure to analytics engineering or modern data stack environments.
• Experience consolidating multiple disparate SQL sources into a unified data model.
• Familiarity with data governance, lineage, and observability tools.
• Strong communication skills for collaborating with technical and non-technical stakeholders.
We are an Equal Opportunity Employer.
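For illustration, the kind of work described above (consolidating multiple SQL sources into a unified output, with dbt's incremental materialization) might look like the following minimal sketch. The model and source names (`fct_orders`, `stg_orders_web`, `stg_orders_mobile`) are assumptions for the example, not part of this posting:

```sql
-- models/marts/fct_orders.sql
-- Hypothetical dbt incremental model: stitches two assumed staging
-- sources (stg_orders_web, stg_orders_mobile) into one unified table.
{{
  config(
    materialized='incremental',
    unique_key='order_id'
  )
}}

with unified as (

    select order_id, customer_id, order_ts, amount, 'web' as channel
    from {{ ref('stg_orders_web') }}

    union all

    select order_id, customer_id, order_ts, amount, 'mobile' as channel
    from {{ ref('stg_orders_mobile') }}

)

select *
from unified

{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is already loaded
  where order_ts > (select max(order_ts) from {{ this }})
{% endif %}
```

In a pipeline like the one described, `dbt run` and `dbt test` would typically be triggered from an Apache Airflow DAG (for example via a BashOperator) so transformations are scheduled alongside upstream ingestion tasks.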