PEOPLE FORCE CONSULTING INC

Data Engineer - SQL + Snowflake

⭐ - Featured Role | Apply direct with Data Freelance Hub
This remote role is for a Data Engineer with 10 years of experience in SQL and Snowflake. The contract length and pay rate are unspecified. Candidates should have healthcare industry experience and effective communication skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 21, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#SQL (Structured Query Language) #dbt (data build tool) #Unit Testing #ETL (Extract, Transform, Load) #Data Profiling #Snowflake #Agile #Data Engineering
Role description
As a Data Engineer, you will be part of an Agile team building healthcare applications and implementing new features while adhering to best coding and development standards.

Experience:
• 10 years

Location:
• Remote / PST hours

Educational Qualifications:
• Engineering degree: BE/ME/BTech/MTech/BSc/MSc
• Technical certification in multiple technologies is desirable.

Mandatory skills:
• SQL: Highly skilled. Must be familiar with common table expressions (CTEs), complex joins, and analytic and windowing functions.
• Snowflake: Prefer candidates with real-world Snowflake experience.
• Data Profiling: Must be able to identify unique and primary keys and determine cardinality, data type, precision, scale, and skew.
• Effective communication skills, both within the team and across other teams, notably Product and QA.
• Understands the scope and context of the work and takes initiative to achieve the goals rather than waiting for specific direction.

Good to have skills:
• dbt: Prefer candidates with real-world dbt experience.
• Healthcare industry experience: Some experience is required, preferably Payer experience.

Job Summary:
• Data profiling to identify primary keys and issues with the data.
• ETL to bring data onto the Data Platform, de-duplicate it, create or update dimensional data structures, and produce use case-specific output.
• Unit testing, functional testing, and performance testing and tuning.
• Interacting with the Product team to understand and refine requirements.
• Interacting with QA to address reported findings.
• Working individually and as a team to achieve our goals.
• Taking initiative to take on additional work if the present work stream slows down.
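As a rough illustration of the SQL skills named above (a CTE plus a window function, applied to the de-duplication step from the job summary), here is a minimal sketch using Python's built-in sqlite3. The `claims` table and its columns are hypothetical examples, not taken from the posting:

```python
import sqlite3

# Hypothetical healthcare claims table with a duplicated load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (claim_id INTEGER, member_id INTEGER, loaded_at TEXT);
INSERT INTO claims VALUES
  (1, 100, '2026-01-01'),
  (1, 100, '2026-01-05'),  -- same claim, re-loaded later
  (2, 101, '2026-01-02');
""")

# CTE + ROW_NUMBER() window function: keep only the latest load per claim.
rows = conn.execute("""
WITH ranked AS (
  SELECT claim_id, member_id, loaded_at,
         ROW_NUMBER() OVER (
           PARTITION BY claim_id
           ORDER BY loaded_at DESC
         ) AS rn
  FROM claims
)
SELECT claim_id, member_id, loaded_at
FROM ranked
WHERE rn = 1
ORDER BY claim_id;
""").fetchall()

print(rows)  # [(1, 100, '2026-01-05'), (2, 101, '2026-01-02')]
```

The same ROW_NUMBER-in-a-CTE pattern carries over to Snowflake, which additionally offers a QUALIFY clause to filter on window-function results without the wrapping CTE.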