Snowflake Data Engineer - ON SITE

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer, onsite in Columbus, OH; the contract length is unspecified. Pay is up to $70/hr. Key skills include Snowflake, ETL/ELT, Big Data, and strong SQL. A BS/BA degree (or equivalent experience) and 9 years of relevant experience are required. Snowflake certification is highly desirable.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 13, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Columbus, Ohio Metropolitan Area
-
🧠 - Skills detailed
#Monitoring #Database Architecture #Cloud #SAS #HDFS (Hadoop Distributed File System) #Documentation #Consulting #Kerberos #Hadoop #Snowflake #Data Integration #"ETL (Extract, Transform, Load)" #Java #Scripting #Data Analysis #SAML (Security Assertion Markup Language) #Security #Data Security #StreamSets #Unit Testing #Data Warehouse #Unix #Python #Data Migration #Big Data #PySpark #Migration #Data Engineering #Quality Assurance #Oracle #AWS (Amazon Web Services) #Libraries #Snowpark #Databases #Data Profiling #EDW (Enterprise Data Warehouse) #Data Quality #Datasets #SnowPipe #Spark (Apache Spark) #Cloudera #Scala #SQL (Structured Query Language) #Data Ingestion #Linux #Sqoop (Apache Sqoop) #Compliance #SQL Queries #Leadership #Data Governance #Agile #Shell Scripting #Deployment #Data Framework #Metadata #Impala
Role description
Company and Role: Dedicated Tech Services, Inc. (DTS) is an award-winning IT consulting firm based in Columbus, OH. We have an opening for a Snowflake Data Engineer to support the critical ODM Enterprise Data Warehouse (EDW) M&O effort: migrating weekly, monthly, and yearly production data from the current Big Data environment to the Snowflake environment, running ELT jobs, and checking data quality across disparate data sources in Snowflake to achieve ODM's strategic and long-term business goals.

Highlights and Benefits:
• Onsite – Columbus, Ohio
• W2 hourly pay rate up to $70/hr, or salaried equivalent
• Government environment
• Direct W2 hourly or salaried applicants only (no corp-to-corp subcontractors, third parties, or agencies)
• Paid time off and holidays for salaried employees
• 401K, billable bonus, and health, life, vision, dental, and short-term disability insurance options for all
• DTS is a proud Women Business Enterprise (WBE) and Woman Owned Small Business (WOSB)!
• Check out our benefits and company information at www.dtsdelivers.com!

Job Description: We are hiring an experienced Snowflake Data Engineer to join our team as a direct W2 salaried or hourly employee. You will:
• Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.
• Provide Snowflake database technical support in developing reliable, efficient, and scalable solutions for various projects on Snowflake.
• Ingest the existing data, frameworks, and programs from the ODM EDW IOP Big Data environment into the ODM EDW Snowflake environment using best practices.
• Design and develop Snowpark features in Python; understand the requirements and iterate.
• Interface with the open-source community and contribute to Snowflake's open-source libraries, including Snowpark Python and the Snowflake Python Connector.
• Create, monitor, and maintain role-based access controls, virtual warehouses, Tasks, Snowpipe, and Streams on Snowflake databases to support different use cases.
• Performance-tune Snowflake queries and procedures; recommend and document Snowflake best practices.
• Explore new Snowflake capabilities, perform POCs, and implement them based on business requirements.
• Create and maintain Snowflake technical documentation, ensuring compliance with data governance and security policies.
• Implement Snowflake user/query log analysis, history capture, and user email alert configuration.
• Enable data governance in Snowflake, including row- and column-level data security using secure views and dynamic data masking features.
• Perform data analysis, data profiling, data quality checks, and data ingestion across layers using Big Data/Hadoop/Hive/Impala queries, PySpark programs, and UNIX shell scripts.
• Follow the organization's coding standard document; create mappings, sessions, and workflows per the mapping specification document.
• Perform gap and impact analysis of ETL and IOP jobs for new requirements and enhancements.
• Create mock-up data, perform unit testing, and capture result sets for jobs developed in lower environments.
• Update the production support run book and Control-M schedule document with each production release.
• Create and update design documents, and provide detailed descriptions of workflows after every production release.
• Continuously monitor production data loads, fix issues, update the issue tracker document, and identify performance issues.
• Performance-tune long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches.
• Perform quality assurance checks and post-load reconciliation, and communicate with vendors to receive corrected data.
• Participate in ETL/ELT code reviews and design reusable frameworks.
• Create change requests, work plans, test results, and BCAB checklist documents for code deployments to production, and perform code validation post-deployment.
• Work with the Snowflake Admin, Hadoop Admin, ETL, and SAS Admin teams on code deployments and health checks.
• Create a reusable Audit Balance Control framework that captures reconciliation, mapping parameters, and variables, and serves as a single point of reference for workflows.
• Create Snowpark and PySpark programs to ingest historical and incremental data (see the illustrative sketch after this list).
• Create Sqoop scripts to ingest historical data from the EDW Oracle database into Hadoop IOP, and create Hive table and Impala view creation scripts for dimension tables.
• Participate in meetings to continuously upgrade functional and technical expertise.
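For illustration only, a minimal Snowpark for Python sketch of the ingestion and dynamic data masking work described above might look like the following. All connection parameters, stage, table, column, role, and policy names are hypothetical placeholders, not details taken from this posting.

    # Illustrative sketch only; all names below are hypothetical placeholders.
    from snowflake.snowpark import Session

    connection_parameters = {
        "account": "<account>",
        "user": "<user>",
        "password": "<password>",
        "role": "EDW_ENGINEER",       # hypothetical role
        "warehouse": "EDW_WH",        # hypothetical virtual warehouse
        "database": "ODM_EDW",        # hypothetical database
        "schema": "STAGING",
    }
    session = Session.builder.configs(connection_parameters).create()

    # Ingest a staged weekly extract into a Snowflake table (hypothetical names).
    weekly_df = session.read.parquet("@edw_stage/weekly/claims_2025_w33.parquet")
    weekly_df.write.mode("append").save_as_table("CLAIMS_WEEKLY")

    # Column-level data security via a dynamic data masking policy (hypothetical column).
    session.sql("""
        CREATE MASKING POLICY IF NOT EXISTS mask_ssn AS (val STRING)
        RETURNS STRING ->
          CASE WHEN CURRENT_ROLE() IN ('EDW_ANALYST') THEN val ELSE '***MASKED***' END
    """).collect()
    session.sql(
        "ALTER TABLE CLAIMS_WEEKLY MODIFY COLUMN member_ssn SET MASKING POLICY mask_ssn"
    ).collect()

    session.close()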
Required Skills and Experience:
• Required education: BS/BA degree or equivalent experience.
• 9 years of experience with Big Data/Hadoop on data warehousing or data integration projects.
• 9 years of experience in the analysis, design, development, support, and enhancement of ETL/ELT with Cloudera Big Data technologies (Hadoop, MapReduce, Sqoop, PySpark, Hive, Impala, etc.).
• 9 years of strong development experience creating Sqoop scripts, PySpark programs, HDFS commands, StreamSets pipelines, Hive/Impala queries, and Unix shell scripts.
• 9 years of experience writing Hadoop/Hive/Impala scripts for gathering table stats post data load (a simplified PySpark illustration follows this list).
• 6 years of hands-on experience with cloud databases.
• 3 years of hands-on data migration experience from Big Data to Snowflake.
• 3 years of hands-on experience with the Snowflake platform, including Snowpipe and Snowpark.
• Proficiency in data warehousing, data migration, and Snowflake (essential).
• Strong experience implementing, executing, and maintaining data integration solutions.
• Strong SQL experience (Oracle and Hadoop Hive/Impala).
• Expertise implementing complex ETL/ELT logic.
• Strong experience with SnowSQL, PL/SQL, and writing Snowflake procedures in SQL, Python, or Java.
• Experience with Snowflake performance optimization and monitoring.
• Experience building datasets; familiarity with PHI and PII data.
• Strong database architecture, problem-solving, and critical-thinking skills.
• Experience with the AWS platform.
• Experience writing complex SQL queries and performance tuning (Hive/Impala explain plans).
• Accountability for ETL/ELT design documentation.
• Ability to apply ETL/ELT standards and centralized metadata practices.
• Effective communication, presentation, and organizational skills.
• Good experience with Visio, Excel, PowerPoint, and Word.
• Familiarity with Agile and Waterfall project methodologies.
• Ability to prioritize and work independently with minimal supervision.
• Basic knowledge of UNIX/Linux shell scripting.
• Excellent written and oral communication skills with technical and business teams.
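As a simplified illustration of the post-load stats and reconciliation work referenced above, a minimal PySpark sketch might look like this; the database, table, and column names are hypothetical placeholders.

    # Illustrative sketch only; database, table, and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("edw_post_load_reconciliation")
        .enableHiveSupport()   # read Hive tables on the Big Data (IOP) cluster
        .getOrCreate()
    )

    # Gather simple post-load stats for one weekly partition.
    claims = spark.table("odm_edw.claims_weekly").where(F.col("load_week") == "2025-W33")
    stats = claims.agg(
        F.count(F.lit(1)).alias("row_count"),
        F.countDistinct("claim_id").alias("distinct_claims"),
        F.sum("paid_amount").alias("total_paid"),
    )
    stats.show(truncate=False)

    # These figures would then be compared with the corresponding counts and sums
    # computed on the Snowflake side as part of the reconciliation step.
    spark.stop()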
Desired Skills and Experience:
• Snowflake certification (highly desirable).
• Experience with Snowpark using Python (preferred).
• Development experience with both Snowpipe and Snowpark.
• Experience with data migration from a Big Data environment to a Snowflake environment.
• Strong understanding of Snowflake capabilities such as Snowpipe, Streams, and Tasks.
• Knowledge of security (SAML, SCIM, OAuth, OpenID, Kerberos, policies, entitlements, etc.).
• Experience with system DRP (disaster recovery planning) for Snowflake systems.
• Demonstrated leadership, analytical, and problem-solving skills.
• Ability to work independently as well as part of a team.
• Stays abreast of current technologies in the assigned IT area.
• Establishes facts and draws valid conclusions.
• Recognizes patterns and opportunities for improvement throughout the organization.
• Ability to discern critical from minor problems and to innovate new solutions.

US citizens and those authorized to work in the US are encouraged to apply. We are unable to sponsor at this time. Dedicated Tech Services, Inc. is an Equal Opportunity Employer.