Talent Groups

Snowflake Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Developer in Dallas, TX (Hybrid) on a contract basis. Requires 3-5 years of coding experience, proficiency in SQL, Python or Java, and familiarity with SDLC, CI/CD, and technologies like Kafka, Apache Spark, and Snowflake.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
May 16, 2026
πŸ•’ - Duration
Unknown
-
🏝️ - Location
Hybrid
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Deployment #Snowflake #Kafka (Apache Kafka) #Computer Science #Scripting #Mathematics #Spark (Apache Spark) #HDFS (Hadoop Distributed File System) #Sybase #Apache Iceberg #Python #Hadoop #Apache Spark #JSON (JavaScript Object Notation) #ETL (Extract, Transform, Load) #Java #SQL (Structured Query Language)
Role description
Role: Snowflake Developer
Location: Dallas, TX (Hybrid)
Duration: Contract

Job Description:

Basic Qualifications

Education:
• Bachelor's or Master's degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field.

Experience:
• Minimum of 3-5 years of professional "hands-on-keyboard" coding experience in a collaborative, team-based environment.
• Ability to troubleshoot SQL, plus basic scripting experience.

Languages:
• Professional proficiency in Python or Java.

Methodology:
• Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices, plus Kubernetes (K8s) deployment experience.

Technical Stack Requirements:
While candidates are not expected to be experts in every tool, the collective team must cover the following technologies:
• Extraction & Logic: Kafka, ANSI SQL, FTP, Apache Spark
• Data Formats: JSON, Avro, Parquet
• Platforms: Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ