

Snowflake Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Engineer in Greensboro, NC, offering a long-term contract (12+ months) at $90/hr on W2 or $100/hr on C2C. It requires 10+ years of Data Engineering experience and expertise in Snowflake, ETL/ELT, and data modeling.
Country: United States
Currency: $ USD
Day rate: $720
Date discovered: July 16, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Greensboro, NC
Skills detailed: #Bash #Data Integration #Data Science #Azure #ETL (Extract, Transform, Load) #Data Warehouse #Data Pipeline #Matillion #Scala #Integration Testing #Docker #Cloud #Agile #EDW (Enterprise Data Warehouse) #SQL (Structured Query Language) #Data Ingestion #Scrum #Schema Design #Storage #Visualization #Data Integrity #Security #Kubernetes #GDPR (General Data Protection Regulation) #Snowflake #Data Privacy #Data Security #Fivetran #Data Lifecycle #BI (Business Intelligence) #Data Modeling #Microsoft Power BI #Data Governance #AWS (Amazon Web Services) #Data Engineering #Scripting #Data Storage #Data Quality #Python
Role description
Job Title: Snowflake Engineer
Job Location: Greensboro, NC (Onsite)
Job Duration: Long-term Contract (12+ months)
Pay Rate: $90/hr on W2 & $100/hr on C2C
Job Description:
The successful candidate will design, build, and maintain a robust, scalable data infrastructure to support business intelligence, analytics, and data science initiatives. Responsibilities span the end-to-end data lifecycle: data ingestion, transformation, storage, and accessibility for various stakeholders.
Key Responsibilities:
• Design, develop, and maintain efficient and scalable data pipelines to extract, transform, and load data from various sources.
• Build and maintain data models within the Enterprise Data Warehouse, ensuring data is organized and accessible for analysis and reporting.
• Gather business reporting requirements and translate them into technical specifications for data solutions.
• Develop and implement ETL/ELT solutions to support data integration needs (a minimal sketch follows this list).
• Integrate data from external partner applications and other internal systems.
• Develop and support dashboards, analytics, and reports using Power BI.
• Perform integration testing to ensure data quality and system functionality.
• Design and implement data security measures and access controls.
• Collaborate with cross-functional teams to ensure data solutions meet business needs.
• Ensure data integrity and accuracy across the EDW solution and source systems.
• Provide support for production data environments and troubleshoot data-related issues.
• Participate in cutover planning and execution for new data solutions.
• Stay up-to-date with industry trends and emerging technologies in data engineering.
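The ELT responsibility above is, in practice, mostly warehouse-side SQL orchestrated by a small amount of glue code. As a minimal, hedged sketch of that pattern, the Python below loads staged files into Snowflake with COPY INTO and upserts them into a target table with MERGE, using the snowflake-connector-python package. Every identifier here (account, warehouse, stage, and table names) is a hypothetical placeholder, not a detail taken from this posting.

```python
# Minimal ELT sketch for Snowflake, assuming the snowflake-connector-python
# package. All identifiers below are hypothetical placeholders.
import snowflake.connector

# In a real pipeline these values would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_service",     # hypothetical
    password="***",         # hypothetical
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

# Load ("EL"): copy raw files from a named stage into a staging table.
LOAD_SQL = """
COPY INTO STG_ORDERS
FROM @RAW_STAGE/orders/
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
ON_ERROR = 'ABORT_STATEMENT'
"""

# Transform ("T"): upsert staged rows into the curated fact table in-warehouse.
MERGE_SQL = """
MERGE INTO ANALYTICS.CORE.FCT_ORDERS t
USING STG_ORDERS s
  ON t.ORDER_ID = s.ORDER_ID
WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT, t.UPDATED_AT = s.UPDATED_AT
WHEN NOT MATCHED THEN
  INSERT (ORDER_ID, AMOUNT, UPDATED_AT)
  VALUES (s.ORDER_ID, s.AMOUNT, s.UPDATED_AT)
"""

try:
    cur = conn.cursor()
    cur.execute(LOAD_SQL)
    cur.execute(MERGE_SQL)
finally:
    conn.close()
```

In production this sequencing would typically live in a tool such as Matillion rather than hand-rolled Python, but the load-then-merge shape is the same.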
Required Skills and Experience:
• Extensive Data Engineering Experience: A minimum of 10 years of proven experience in Data Engineering, including designing, developing, and supporting complex data environments.
• Data Warehouse Design and Development:
  - Experience designing, implementing, and optimizing enterprise data warehouses (EDWs).
  - Expertise in data modeling and schema design (e.g., relational, dimensional) to ensure efficient data storage and retrieval.
• Snowflake: Hands-on experience with Snowflake data warehousing, including optimization techniques.
• ETL/ELT and Data Integration:
  - Deep understanding of and experience with ETL/ELT processes and tools.
  - Proficiency with Matillion and Fivetran HVR for building data pipelines.
  - Experience integrating data from various sources using APIs, EDI, SFTP, and other methods.
• Data Modeling and Design:
  - Strong understanding of data modeling principles and best practices.
  - Ability to translate business requirements into efficient data models.
• Data Quality and Governance:
  - Experience implementing data quality checks and validation processes (see the validation sketch after this list).
  - Familiarity with data governance principles and best practices.
  - Experience implementing and maintaining data security and access controls.
• Data Security:
  - Experience implementing industry-standard data security measures, including encryption and access controls.
  - Familiarity with data privacy regulations (e.g., GDPR).
• Testing and Validation:
  - Experience designing and executing integration tests for data pipelines and the EDW.
  - Ability to implement robust error handling and recovery mechanisms.
• Collaboration and Communication:
  - Ability to work collaboratively in cross-functional teams, including data scientists, analysts, and business stakeholders.
  - Strong verbal and written communication skills, with the ability to explain complex technical concepts to non-technical audiences.
• Agile Methodologies: Experience working in Agile/Scrum development methodologies.
• Data Integrity and Accuracy: Experience ensuring data integrity and accuracy across various data sources.
• BI and/or EDW Lifecycle Implementation: Experience with multiple BI and/or EDW lifecycle implementations (e.g., blueprinting, build/test, go-live, support).
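To make the data quality requirement above concrete, here is a small, hedged sketch of post-load validation checks run against a Snowflake table: row count, NULL keys, and duplicate keys. The table name, key column, and connection values are illustrative assumptions, not details from this posting.

```python
# Illustrative post-load data quality checks for a Snowflake table.
# Table, column, and connection values are hypothetical placeholders.
import snowflake.connector

def run_checks(conn, table: str, key_column: str) -> list[str]:
    """Return human-readable failures; an empty list means all checks passed."""
    failures = []
    cur = conn.cursor()

    # The table should not be empty after a load.
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    if cur.fetchone()[0] == 0:
        failures.append(f"{table}: no rows loaded")

    # The business key should never be NULL.
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
    nulls = cur.fetchone()[0]
    if nulls > 0:
        failures.append(f"{table}.{key_column}: {nulls} NULL keys")

    # The business key should be unique.
    cur.execute(f"SELECT COUNT(*) - COUNT(DISTINCT {key_column}) FROM {table}")
    dupes = cur.fetchone()[0]
    if dupes > 0:
        failures.append(f"{table}.{key_column}: {dupes} duplicate keys")

    return failures

conn = snowflake.connector.connect(account="my_account", user="etl_service",
                                   password="***", warehouse="LOAD_WH",
                                   database="ANALYTICS", schema="CORE")
try:
    problems = run_checks(conn, "FCT_ORDERS", "ORDER_ID")
    if problems:
        # Failing loudly keeps bad data from silently reaching reports.
        raise RuntimeError("Data quality checks failed: " + "; ".join(problems))
finally:
    conn.close()
```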
Preferred Skills:
• Experience with external partner integrations, including eCommerce, EDI, WMS, and B2B integration.
• Experience with Snowflake Marketplace integrations.
• Proficiency in SQL and scripting languages (e.g., Python, Bash).
• Experience with data visualization tools (e.g., Power BI).
• Cloud computing experience (e.g., AWS, Azure, Google Cloud).
• Familiarity with containerization technologies (e.g., Docker, Kubernetes).
Regards,
Vishwajeet Verma