Qlik Data Engineer :: Raleigh, NC

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Qlik Data Engineer in Raleigh, NC, with an unspecified contract length and pay rate. Candidates should have 3-5 years of experience with dbt, Snowflake, SQL, and data warehousing.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 14, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Raleigh, NC
-
🧠 - Skills detailed
#SQL Queries #Physical Data Model #Consulting #Data Quality #Snowflake #Qlik #Data Governance #Data Ingestion #SQL (Structured Query Language) #Data Accuracy #Data Analysis #Version Control #Clustering #Data Engineering #.Net #dbt (data build tool) #Data Modeling #Automation #"ETL (Extract, Transform, Load)" #Macros #Data Manipulation #Documentation #GIT #Code Reviews #Scala
Role description
Hi, hope you are doing well! I have an urgent requirement with one of my clients. Please find the job details below and forward me your updated resume along with your contact details: ajeet@realtekconsulting.net

DBT Snowflake Developer
Location :: Raleigh, NC
Skills Required: DBT, Snowflake, SQL, data warehousing, ETL, Git, data modeling techniques
Experience: 3-5 years

Roles and Responsibilities:
• DBT Development & Data Transformation:
  • Design, develop, and maintain robust and scalable data transformation pipelines using dbt on the Snowflake platform.
  • DBT Macro Development: create and utilize Jinja-based dbt macros to promote code reusability, modularity, and dynamic SQL generation within dbt projects.
  • Data Transformation & Orchestration: implement and manage data transformation pipelines using dbt, integrating with various data sources and ensuring efficient data flow.
  • Utilize advanced dbt concepts, including macros, materializations (e.g., incremental, view, table), snapshots, and configurations to build efficient data models (a minimal model sketch follows this list).
  • Write highly optimized and complex SQL queries for data manipulation, cleaning, aggregation, and transformation within dbt models.
  • Implement and enforce best practices for dbt project structure, version control (Git), documentation, and testing.
• Data Modeling:
  • Collaborate with data analysts, engineers, and business stakeholders to understand data requirements and translate them into effective data models (e.g., star schema, snowflake schema).
  • Design and implement logical and physical data models within dbt to support analytical and reporting needs.
• Snowflake Platform Expertise:
  • Leverage Snowflake features and functionalities for performance optimization, including virtual warehouses, clustering, caching, and query optimization.
  • Manage and optimize data ingestion and integration processes from various sources into Snowflake.
• Collaboration & Communication:
  • Work closely with cross-functional teams to understand business requirements, troubleshoot issues, and deliver high-quality data solutions.
  • Participate in code reviews, provide constructive feedback, and ensure adherence to coding standards.
  • Communicate technical concepts effectively to both technical and non-technical audiences.
• Quality & Governance:
  • Ensure data quality, integrity, and lineage throughout the data transformation process.
  • Implement and maintain dbt tests to ensure data quality, integrity, and adherence to business rules.
  • Implement and maintain data governance policies and procedures within the dbt environment.
  • Develop and execute automated tests for dbt models to ensure data accuracy and reliability.
• Continuous Improvement:
  • Stay updated with the latest dbt and Snowflake features and best practices.
  • Identify opportunities for process improvements and implement automation where appropriate.
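To illustrate the materialization concepts named above, here is a minimal sketch of an incremental dbt model on Snowflake. The model and column names (stg_orders, order_id, updated_at, amount) are hypothetical placeholders, not taken from this posting; the config block, is_incremental(), ref(), and {{ this }} are standard dbt constructs.

    -- models/fct_orders.sql: sketch of an incremental dbt model.
    {{
        config(
            materialized='incremental',
            unique_key='order_id'
        )
    }}

    select
        order_id,
        customer_id,
        amount,
        updated_at
    from {{ ref('stg_orders') }}  -- hypothetical upstream staging model

    {% if is_incremental() %}
    -- On incremental runs, only process rows newer than what the
    -- target table already contains; full builds skip this filter.
    where updated_at > (select max(updated_at) from {{ this }})
    {% endif %}

With a unique_key set, dbt's default incremental strategy on Snowflake compiles this into a MERGE, so reruns update changed rows instead of rebuilding the whole table.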
Required Skills:
• Proven hands-on experience with dbt in a production environment, including extensive use of macros and advanced modeling techniques (see the macro sketch after this list).
• Expert-level proficiency in SQL for data querying, manipulation, and transformation.
• Strong experience with Snowflake, including performance tuning and optimization.
• Solid understanding of data warehousing concepts and ETL/ELT processes.
• Experience with version control systems, particularly Git.
• Familiarity with data modeling principles (star schema, snowflake schema).
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration abilities.
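As a pointer to what "extensive use of macros" typically means in practice, here is a minimal sketch of a reusable Jinja macro and a model that calls it. The macro name, columns, and the stg_payments model are illustrative assumptions, not part of this posting; the {% macro %} block follows dbt's standard syntax.

    -- macros/cents_to_dollars.sql: a reusable, parameterized snippet.
    {% macro cents_to_dollars(column_name, scale=2) %}
        ({{ column_name }} / 100)::numeric(16, {{ scale }})
    {% endmacro %}

    -- models/stg_payments_usd.sql: calling the macro inside a model.
    select
        payment_id,
        {{ cents_to_dollars('amount_cents') }} as amount_dollars
    from {{ ref('stg_payments') }}  -- hypothetical upstream model

Because the macro expands at compile time, changing the cast or rounding logic in one place updates every model that calls it, which is the kind of reusability and modularity this listing asks for.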