Data Warehouse Analyst

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Warehouse Analyst in Columbus, Ohio; the contract length and pay rate are unspecified. Key requirements include 4-6 years in data warehousing, 2-3 years with Snowflake, and relevant certifications.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date discovered
August 13, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
On-site
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Columbus, OH
🧠 - Skills detailed
#Monitoring #Data Pipeline #Database Architecture #Cloud #HDFS (Hadoop Distributed File System) #Documentation #Project Management #Kerberos #Programming #Hadoop #Snowflake #Data Integration #ETL (Extract, Transform, Load) #Java #Scripting #StreamSets #SAML (Security Assertion Markup Language) #Security #YARN (Yet Another Resource Negotiator) #Data Warehouse #Unix #Python #Data Migration #Big Data #PySpark #Migration #Oracle #AWS (Amazon Web Services) #Kudu #Snowpark #Databases #Zookeeper #SnowPipe #Spark (Apache Spark) #Cloudera #SQL (Structured Query Language) #Kafka (Apache Kafka) #Linux #Sqoop (Apache Sqoop) #SQL Queries #Leadership #Agile #Shell Scripting #Metadata #Impala #Database Performance
Role description
Hello, this is Sheetal from ABSI. We are urgently looking for a Data Warehouse Consultant; the position is on-site in Columbus, Ohio.

POSITION: Data Warehouse Consultant
LOCATION: Columbus, Ohio (on-site)

Job Description

REQUIRED Skill Sets:
• Proficiency in data warehousing, data migration, and Snowflake is essential for this role.
• Strong experience in the implementation, execution, and maintenance of data integration technology solutions.
• Minimum 4-6 years of hands-on experience with cloud databases.
• Minimum 2-3 years of hands-on data migration experience from a Big Data environment to a Snowflake environment.
• Minimum 2-3 years of hands-on experience with the Snowflake platform, including Snowpipe and Snowpark.
• Strong experience with SnowSQL and PL/SQL, and expertise in writing Snowflake procedures using SQL, Python, or Java (see the illustrative sketch after this list).
• Experience optimizing Snowflake database performance and monitoring it in real time.
• Strong database architecture, critical-thinking, and problem-solving abilities.
• Experience with AWS platform services.
• Snowflake certification is highly desirable.
• Snowpark with Python is preferred for building data pipelines.
• 8+ years of experience with Big Data and Hadoop on data warehousing or data integration projects.
• Analysis, design, development, support, and enhancement of ETL/ELT in a data warehouse environment built on Cloudera Big Data technologies (minimum 8-9 years' experience with Hadoop, MapReduce, Sqoop, PySpark, Spark, HDFS, Hive, Impala, StreamSets, Kudu, Oozie, Hue, Kafka, YARN, Python, Flume, Zookeeper, Sentry, and Cloudera Navigator), along with Oracle SQL/PL-SQL, Unix commands, and shell scripting.
• Strong development experience (minimum 8-9 years) creating Sqoop scripts, PySpark programs, HDFS commands, HDFS file formats (Parquet, Avro, ORC, etc.), StreamSets pipelines, job scheduling, Hive/Impala queries, and Unix/shell scripting.
• Writing Hadoop/Hive/Impala scripts (minimum 8-9 years' experience) for gathering stats on tables after data loads.
• Strong SQL experience (Oracle and Hadoop Hive/Impala, etc.).
• Writing complex SQL queries and tuning them based on Hadoop/Hive/Impala explain-plan results.
• Experience building data sets and familiarity with PHI and PII data.
• Expertise implementing complex ETL/ELT logic.
• Accountability for ETL/ELT design documentation.
• Basic knowledge of Unix/Linux shell scripting.
• Use of ETL/ELT standards and practices toward establishing and following a centralized metadata repository.
• Good experience working with Visio, Excel, PowerPoint, Word, etc.
• Effective communication, presentation, and organizational skills.
• Familiarity with project management methodologies such as Waterfall and Agile.
• Ability to establish priorities and follow through on projects, paying close attention to detail with minimal supervision.
• Required education: BS/BA degree or a combination of education and experience.
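For illustration only (this sketch is not part of the client's job description): a minimal example of the kind of Snowflake procedure the role calls for, written against the Snowpark Python API. The STAGING.ORDERS and DW.ORDERS_RECENT tables, the ORDER_DATE column, and the procedure name are all hypothetical, and the sketch assumes a Session configured with valid Snowflake credentials.

```python
# Minimal sketch, assuming a configured Snowflake session.
# STAGING.ORDERS, DW.ORDERS_RECENT, and ORDER_DATE are hypothetical names.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, current_date, dateadd, lit

def load_recent_orders(session: Session, days: int) -> str:
    """Append the last `days` days of a staging table into a reporting table."""
    recent = (
        session.table("STAGING.ORDERS")  # hypothetical source table
        .filter(col("ORDER_DATE") >= dateadd("day", lit(-days), current_date()))
    )
    n = recent.count()
    recent.write.mode("append").save_as_table("DW.ORDERS_RECENT")  # hypothetical target
    return f"appended {n} rows"

# Registering it as a stored procedure so it can be called from SQL
# (requires the snowflake-snowpark-python package in the account):
# session.sproc.register(load_recent_orders, name="LOAD_RECENT_ORDERS",
#                        packages=["snowflake-snowpark-python"], replace=True)
# Then, from any SQL client: CALL LOAD_RECENT_ORDERS(7);
```

Registered this way, the logic executes inside a Snowflake warehouse rather than on a client machine, which is the usual reason to prefer Snowpark procedures over pulling data out for external transformation.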
β€’ Has experience with System DRP for Snowflake systems β€’ Demonstrate effective leadership, analytical and problem-solving skills. β€’ Required excellent written and oral communication skills with technical and business teams. β€’ Ability to work independently, as well as part of a team. β€’ Stay abreast of current technologies in area of IT assigned. β€’ Establish facts and draw valid conclusions. β€’ Recognize patterns and opportunities for improvement throughout the entire organization. β€’ Ability to discern critical from minor problems and innovate new solutions. Sheetal Agarwal American Business Solutions Inc 8850 Whitney Drive Lewis Center, OH 43035 Cell : 614-541-3190 Work : 614-888-2227 Ext. 4692 Fax: 888-699-8866 Mail to: sheetal@absiusa.com