

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with expertise in Snowflake, offering a remote contract in the United States. Key skills include SQL, Python, and data warehousing. Experience with ETL/ELT pipelines and DevOps practices is required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 13, 2025
Project duration: Unknown
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed:
#Monitoring #Data Engineering #SQL (Structured Query Language) #Data Pipeline #ETL (Extract, Transform, Load) #R #Security #Data Warehouse #Databases #Data Governance #Databricks #DevOps #Python #Data Quality #Programming #Snowflake
Role description
Role: Data Engineer with Snowflake
Location: United States (Remote)
Required Skills
• Proficiency in SQL and at least one programming language (e.g., Python).
• Snowflake (mandatory), plus exposure to Databricks (a minimal connection sketch follows this list).
• Familiarity with data warehousing concepts and tools.
• Strong problem-solving and communication skills.
• Knowledge of DevOps practices and CI/CD pipelines.
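To illustrate the SQL + Python + Snowflake combination above, here is a minimal sketch that queries Snowflake from Python using the official connector (snowflake-connector-python). The account, credentials, warehouse, database, and table names are placeholders for this illustration only; the posting does not specify them.

```python
# Minimal sketch (illustrative only): querying Snowflake from Python with the
# official connector. Account, credential, warehouse, database, and table
# names are placeholders, not details from this posting.
import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # placeholder warehouse
    database="RAW",             # placeholder database
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Simple aggregate over a hypothetical ORDERS table.
    cur.execute(
        "SELECT order_date, COUNT(*) AS order_count "
        "FROM orders GROUP BY order_date ORDER BY order_date"
    )
    for order_date, order_count in cur.fetchall():
        print(order_date, order_count)
finally:
    conn.close()
```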
Responsibilities
• Build and optimize data models and data warehouses (e.g., Snowflake, Databricks).
• Integrate data from various sources, including APIs, databases, and third-party platforms.
• Develop and implement data transformation logic using SQL and Python with Snowflake integrations to cleanse, enrich, and conform data to business rules.
• Design, develop, and maintain robust ETL/ELT pipelines using appropriate tools.
• Ensure data quality, integrity, and consistency across systems.
• Collaborate with cross-functional teams to understand data requirements and deliver solutions.
• Monitor and troubleshoot data pipeline performance and reliability.
• Implement data governance and security best practices.
• Automate data validation and monitoring processes (a minimal validation sketch follows this list).
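As a rough sketch of the data-quality and automation responsibilities above, the snippet below runs a handful of SQL checks against Snowflake after a load and fails if any check finds offending rows. The checks, table, and column names are hypothetical examples, not requirements taken from this posting.

```python
# Minimal sketch (illustrative only): automated data-quality checks run after
# a pipeline load. Table and column names are hypothetical.
import os

import snowflake.connector  # pip install snowflake-connector-python

# Each check is a SQL statement that returns a single count of offending rows.
CHECKS = {
    "null_customer_ids": "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL",
    "duplicate_order_ids": (
        "SELECT COUNT(*) FROM ("
        "SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1)"
    ),
    "future_order_dates": "SELECT COUNT(*) FROM orders WHERE order_date > CURRENT_DATE",
}


def run_checks(conn) -> dict:
    """Run every check and return {check_name: offending_row_count}."""
    results = {}
    cur = conn.cursor()
    for name, sql in CHECKS.items():
        cur.execute(sql)
        results[name] = cur.fetchone()[0]
    return results


if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # placeholder
        database="RAW",            # placeholder
        schema="PUBLIC",
    )
    try:
        failures = {name: n for name, n in run_checks(conn).items() if n > 0}
    finally:
        conn.close()
    if failures:
        # In a real pipeline this would alert or page; here it just fails loudly.
        raise SystemExit(f"Data-quality checks failed: {failures}")
    print("All data-quality checks passed.")
```

In practice a job like this would be scheduled by the orchestration or CI/CD layer referenced under Required Skills; the posting does not name a specific tool.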
Regards
Praveen Kumar
Talent Acquisition Group - Strategic Recruitment Manager
praveen.r@themesoft.com | Themesoft Inc