

KPG99 INC
Data Engineer (Data Ingestion) (No California or NY Consultants)
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Data Ingestion) on a 6-month contract, remote (excluding California and New York). Requires strong skills in Azure Data Factory, Snowflake, SQL, and Python, with real-world data ingestion and data lake experience; healthcare experience preferred.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
October 15, 2025
Duration
More than 6 months
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#Data Lake #Azure #Data Engineering #Azure Data Factory #ADF (Azure Data Factory) #Python #Snowflake #Data Ingestion #ETL (Extract, Transform, Load) #SQL (Structured Query Language)
Role description
Role: Data Engineer (Data Ingestion)
Location: Remote (no California or New York candidates)
Duration: 6-month contract-to-hire (no sponsorship available)
• Must have: Azure Data Factory (ADF), Snowflake, strong SQL, Python
• Strong Python coding experience
• Real-world data ingestion experience
• Highly technical team: ingesting data and building the data lake, with an emphasis on reusable frameworks
Key Requirements:
• SQL & Python: Must be highly proficient in both.
• Snowflake: Strong proficiency required.
• ETL & Data Warehousing: Solid experience with ETL processes, including Azure Data Factory (ADF), and with data warehousing.
Additional Notes:
• Preferred: healthcare experience
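The listing stresses "ingesting the data and creating the data lake" and the "importance of frameworks". As a hedged illustration of what a reusable ingestion step in such a framework might look like, here is a minimal Python sketch: a function that loads CSV text into a staging table. The function name `ingest_csv` and the table name `stg_patients` are hypothetical, and `sqlite3` stands in for Snowflake so the example is self-contained and runnable; a real pipeline for this role would target Snowflake, typically orchestrated by Azure Data Factory.

```python
import csv
import io
import sqlite3

def ingest_csv(conn, table, csv_text):
    """Load CSV text into a staging table, deriving columns from the header row.

    This is a framework-style building block: the same function handles any
    CSV source, which is the kind of reuse the role description emphasizes.
    """
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    # Create the staging table from the header (all TEXT, as raw staging layers often are).
    cols = ", ".join(f'"{c}" TEXT' for c in header)
    conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
    # Bulk-insert the data rows with parameterized SQL.
    placeholders = ", ".join("?" for _ in header)
    rows = list(reader)
    conn.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
loaded = ingest_csv(conn, "stg_patients", "id,name\n1,Ada\n2,Grace\n")
```

Against Snowflake, the same pattern would typically use `COPY INTO` from a stage rather than row-by-row inserts, but the framework idea — one parameterized function per ingestion pattern — carries over.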