

Sapphire Software Solutions Inc
Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 6+ years of experience, specializing in Snowflake, DBT, and Power BI. It is a 1-year W2 contract located in Oaks, Pennsylvania. Key skills include SQL, ETL/ELT processes, and cloud platforms.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: May 2, 2026
Duration: More than 6 months
Location: Hybrid
Contract: W2 Contractor
Security: Unknown
Location detailed: Oaks, PA
Skills detailed: #Storage #Scala #Security #Snowflake #Data Security #Data Engineering #dbt (data build tool) #Python #Data Pipeline #Microsoft Power BI #Version Control #Scripting #Data Storage #Compliance #Git #Scrum #Visualization #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Data Quality #BI (Business Intelligence) #AWS (Amazon Web Services) #Azure #Data Warehouse #Cloud #Data Analysis #Airflow #DAX #ETL (Extract, Transform, Load) #Agile
Role description
+1 571-556-1002 | naresh@sapphiresoftwaresolutions.com
Hiring: Data Engineer (Snowflake / DBT / Power BI) - W2 only | Hybrid, 4 days onsite in Oaks, Pennsylvania
Hi folks,
Please review the JD below and share your updated resume.
JD
Job Title: Data Engineer (Snowflake / DBT / Power BI)
Location: Hybrid, 4 days onsite in Oaks, Pennsylvania (local candidates only)
Contract: 1 year, W2 only
Job Summary
We are seeking a highly skilled Data Engineer with strong expertise in Snowflake, DBT, and Power BI to join our data team. The ideal candidate will be responsible for designing, building, and optimizing scalable data pipelines and data models to support analytics and business intelligence initiatives. This role requires a deep understanding of modern data stack technologies and best practices in data warehousing and transformation.
Key Responsibilities
Design, develop, and maintain scalable data pipelines and ETL/ELT processes.
Build and optimize data models using DBT (Data Build Tool).
Develop and manage data solutions within Snowflake data warehouse.
Collaborate with business stakeholders to translate requirements into technical solutions.
Create and maintain Power BI dashboards and reports for business insights.
Ensure data quality, integrity, and governance across data platforms.
Optimize query performance and data storage strategies in Snowflake.
Implement best practices for data security and compliance.
Troubleshoot data-related issues and provide timely resolutions.
Work closely with cross-functional teams including Data Analysts, BI Developers, and Architects.
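To illustrate the kind of dbt modeling work the responsibilities above describe, here is a minimal sketch of an incremental dbt model targeting Snowflake. All model, table, and column names (fct_daily_orders, stg_orders, order_date, order_total) are hypothetical, not from this posting:

```sql
-- models/marts/fct_daily_orders.sql (hypothetical names for illustration)
-- Incremental dbt model: only new days are processed on each run,
-- keeping Snowflake compute costs down on large order tables.
{{ config(materialized='incremental', unique_key='order_date') }}

select
    order_date,
    count(*)         as order_count,
    sum(order_total) as total_revenue
from {{ ref('stg_orders') }}
{% if is_incremental() %}
  -- On incremental runs, pick up only dates newer than what is
  -- already in the target table ({{ this }}).
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
group by order_date
```

A model like this would typically be paired with dbt schema tests (e.g., unique and not_null on order_date) to cover the data-quality responsibilities listed above.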
Required Skills & Experience
6+ years of experience in Data Engineering or related roles.
Strong hands-on experience with:
Snowflake (data warehousing, performance tuning, data sharing)
DBT (data transformation, modeling, testing)
Power BI (dashboarding, DAX, data visualization)
Proficiency in SQL (advanced level required).
Experience with ETL/ELT tools and building data pipelines.
Strong understanding of data warehousing concepts and dimensional modeling.
Experience working with cloud platforms (AWS, Azure, or GCP).
Familiarity with version control tools like Git.
Strong problem-solving and analytical skills.
Preferred Qualifications
Experience with orchestration tools (e.g., Airflow).
Knowledge of Python or other scripting languages.
Experience in Agile/Scrum environments.
Prior experience in financial services or enterprise data environments.