Engineering Square

Sr. Snowflake Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Snowflake Data Engineer with a contract length of over 6 months, paying $84.50 per hour. Located in Columbia, MD (remote with occasional meetings), it requires 5+ years in data engineering, strong SQL, Python/Snowpark skills, and experience in healthcare or insurance.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
544
-
🗓️ - Date
November 7, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Columbia, MD
-
🧠 - Skills detailed
#Data Modeling #ETL (Extract, Transform, Load) #DevOps #SQL (Structured Query Language) #Data Governance #Jenkins #Batch #Deployment #Alation #Collibra #Snowpark #Data Engineering #Data Architecture #Python #Automation #Scala #dbt (data build tool) #Data Quality #Data Pipeline #Snowflake #Cloud #Kafka (Apache Kafka)
Role description
Sr. Snowflake Data Engineer role. Must be able to convert to a permanent role after one year. C2C rate: $84.50 per hour. Location: Columbia, MD (a few on-site meetings per month).

We are seeking an experienced Senior Data Engineer to join our health insurance client. This role is a key contributor in designing, developing, and optimizing cloud-based data solutions using Snowflake. You'll leverage advanced Snowflake capabilities, build modern data pipelines, and enable scalable analytics and reporting to support enterprise operations. The ideal candidate demonstrates deep Snowflake and SQL expertise, hands-on experience with Snowpark (Python), and a strong foundation in data architecture, governance, and automation.

Key Responsibilities
• Develop and optimize data pipelines and transformations in Snowflake using SQL and Snowpark (Python).
• Build and manage Streams, Tasks, and Materialized Views for real-time and batch data operations.
• Implement CI/CD pipelines and automate deployments (e.g., Jenkins).
• Collaborate with architects and analysts to design scalable data models and enforce data governance standards.
• Monitor and optimize pipeline performance, reliability, and cost.
• Integrate Snowflake with tools such as dbt and Kafka for orchestration and streaming.

Required Skills
• 5+ years in data engineering; 3+ years hands-on with Snowflake Data Cloud.
• Strong SQL and Python/Snowpark skills.
• Experience with data modeling, DevOps/CI-CD, and data governance (Collibra, Alation, or Purview).
• Proven ability to troubleshoot, optimize, and deliver scalable solutions.

Preferred
• Experience with dbt, Kafka, or data quality tools (Great Expectations, Monte Carlo).
• Snowflake certification (SnowPro Core or Advanced).
• Background in regulated industries such as healthcare or insurance.

Soft Skills
• Strong analytical and communication abilities.
• Collaborative, proactive, and adaptable in fast-paced environments.
• Commitment to quality, automation, and continuous improvement.
Job Type: Contract
Pay: $65.00 - $68.00 per hour
Expected hours: 8 per week
Work Location: Remote