

Quantum World Technologies Inc.
Data Analyst
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Analyst specializing in healthcare, with a contract length of "unknown" and a pay rate of "unknown." Key skills include Python, SQL, ETL pipelines, AWS, Databricks, Snowflake, and FHIR experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Spark (Apache Spark) #Data Analysis #ETL (Extract, Transform, Load) #GitLab #PySpark #AWS (Amazon Web Services) #Agile #Python #Databricks #SQL (Structured Query Language) #Terraform #Spark SQL #SQS (Simple Queue Service) #FHIR (Fast Healthcare Interoperability Resources) #S3 (Amazon Simple Storage Service) #Snowflake #Cloud
Role description
Healthcare Data Analyst
Job Mode: Hybrid (2 days onsite, 3 days remote)
Requirements:
• Proficient in Python and SQL
• Build and manage efficient ETL pipelines using Databricks workflows or another orchestration framework
• Familiarity with both structured and semi-structured data, and with ingesting and processing it using PySpark
• Fundamental AWS services (S3, SQS) or equivalent services on other clouds
• Terraform and GitLab CI/CD
• Query tuning and performance optimization in SQL and/or Spark SQL
• Familiarity with data warehousing (Snowflake or similar)
• Spark SQL
• AWS, Databricks, Snowflake
• Experience working for a cloud-based data services provider serving large healthcare clients
• FHIR experience
• Ability to communicate with technical and non-technical stakeholders
• Experience handling large volumes of data
• Root Cause Analysis
• Organizational skills
• Agile development methodology
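As a rough illustration of the semi-structured ingestion and ETL work described above, here is a minimal sketch in plain Python using only the standard library. In practice this kind of step would run as a PySpark job in Databricks reading from S3; the record fields and data shown here are hypothetical, not from the role description.

```python
import json
import io

# Hypothetical newline-delimited JSON input, standing in for
# semi-structured records that would normally be read from S3.
# Note the second record is missing the "age" field.
raw = io.StringIO(
    '{"id": "p1", "name": {"given": "Ada", "family": "Lee"}, "age": 34}\n'
    '{"id": "p2", "name": {"given": "Bo", "family": "Kim"}}\n'
)

def flatten(record: dict) -> dict:
    """Flatten one nested record into a tabular row (the 'transform' step)."""
    name = record.get("name", {})
    return {
        "id": record["id"],
        "given_name": name.get("given"),
        "family_name": name.get("family"),
        "age": record.get("age"),  # tolerate missing fields in semi-structured data
    }

# Extract + transform: parse each JSON line and flatten it.
rows = [flatten(json.loads(line)) for line in raw if line.strip()]
print(rows[0]["given_name"])  # Ada
print(rows[1]["age"])         # None
```

The same flatten-and-load pattern translates directly to PySpark, where `spark.read.json(...)` infers a schema over nested fields and column selection replaces the per-record dictionary handling.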






