

E-Solutions
Big Data Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Big Data Developer with 6+ years of Snowflake experience, strong skills in Informatica IICS, Databricks, SQL, and cloud platforms (Azure/AWS/GCP). Contract length and pay rate are unspecified.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
February 28, 2026
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Minneapolis, MN
-
Skills detailed
#Scripting #IICS (Informatica Intelligent Cloud Services) #Azure #Data Governance #Automation #Metadata #Big Data #dbt (data build tool) #Spark (Apache Spark) #ETL (Extract, Transform, Load) #PySpark #Data Quality #Data Management #Cloud #Data Modeling #AWS (Amazon Web Services) #Airflow #SQL (Structured Query Language) #S3 (Amazon Simple Storage Service) #Snowflake #Informatica #GCP (Google Cloud Platform) #Storage #Python #Databricks
Role description
• 6+ years of hands-on experience with Snowflake Data Cloud.
• Strong experience with Informatica IICS (mappings, taskflows, ingestion).
• Hands-on experience with Databricks, Spark/PySpark, and SQL.
• Strong SQL, data modeling, ETL/ELT concepts, and performance tuning.
• Cloud experience in Azure / AWS / GCP.
• Experience integrating Snowflake with cloud object storage (S3/Azure Blob/GCS).
Additional Skills (Good to Have)
• Python scripting experience for automation.
• Experience with dbt, Airflow, or other orchestration tools.
• Knowledge of data governance, data quality frameworks, and metadata management.
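The Snowflake-to-object-storage integration called for above is typically done by registering an external stage over a bucket and bulk-loading it with COPY INTO. As a minimal sketch, the helper below just assembles those two SQL statements; the stage, table, bucket, and storage-integration names are all hypothetical, and real use would execute the statements through a Snowflake connector with proper credentials.

```python
def stage_and_copy_sql(stage: str, table: str, s3_url: str,
                       file_format: str = "PARQUET") -> list[str]:
    """Build the two Snowflake statements that register an S3 external
    stage and bulk-load its files into a table. Names are illustrative."""
    create_stage = (
        f"CREATE STAGE IF NOT EXISTS {stage} "
        f"URL = '{s3_url}' "
        # hypothetical storage integration created separately by an admin
        f"STORAGE_INTEGRATION = my_s3_integration"
    )
    copy_into = (
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (TYPE = {file_format})"
    )
    return [create_stage, copy_into]

# Example: load Parquet files from a bucket path into raw.events.
for sql in stage_and_copy_sql("raw_stage", "raw.events",
                              "s3://example-bucket/events/"):
    print(sql)
```

The same pattern carries over to Azure Blob or GCS by swapping the stage URL and the storage integration; the COPY INTO side is unchanged.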






