Sigma Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sigma Developer/Data Engineer (Remote, USA) on a contract basis, requiring 7+ years of experience, expertise in Sigma Computing, strong SQL and scripting skills, and familiarity with ETL tools and cloud platforms.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 19, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Remote
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
United States
🧠 - Skills detailed
#PySpark #Airflow #Data Modeling #Azure Data Factory #AWS (Amazon Web Services) #Python #Indexing #SQL (Structured Query Language) #BI (Business Intelligence) #Scala #Data Warehouse #GCP (Google Cloud Platform) #Redshift #Spark (Apache Spark) #Synapse #GIT #Cloud #BigQuery #Data Pipeline #ADF (Azure Data Factory) #ETL (Extract, Transform, Load) #Azure #Snowflake #Data Engineering #dbt (data build tool) #DevOps #Scripting #Security
Role description
One of my clients is looking for a Sigma Developer/Data Engineer - USA (Remote) for a contract role.

Required Skills & Qualifications:
● 7+ years of experience as a Data Engineer, BI Developer, or similar role.
● Hands-on expertise with Sigma Computing (dashboard design, data modeling, embedding, row-level security).
● Strong proficiency in SQL and scripting languages (Python, Scala, or PySpark).
● Experience with ETL/ELT tools (dbt, Airflow, Glue, Azure Data Factory, etc.).
● Knowledge of cloud platforms (AWS, Azure, or GCP) and their data services.
● Solid understanding of data warehousing concepts (star/snowflake schema, OLAP, partitioning, indexing).
● Experience with modern data warehouses (Snowflake, BigQuery, Redshift, or Synapse).
● Familiarity with CI/CD, Git, and DevOps practices for data pipelines.
● Strong problem-solving, communication, and collaboration skills.

If interested, please share your updated resume.