

Zillion Technologies, Inc.
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "Unknown" and a pay rate of "$40-$60 per hour." Required skills include Python, PySpark, SQL, and experience with Snowflake, ETL tools, and data modeling. A Bachelor's degree and 5+ years in data technologies are essential.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
December 30, 2025
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
McLean, VA
-
Skills detailed
#Informatica #Collibra #SaaS (Software as a Service) #NoSQL #IICS (Informatica Intelligent Cloud Services) #Metadata #Angular #Data Analysis #SQL (Structured Query Language) #Data Architecture #Data Warehouse #Data Mart #Databases #MongoDB #Agile #XML (eXtensible Markup Language) #PostgreSQL #Spark SQL #Talend #Computer Science #JDBC (Java Database Connectivity) #PySpark #Snowflake #Data Management #Physical Data Model #ETL (Extract, Transform, Load) #Data Engineering #Spark (Apache Spark) #Data Lake #BI (Business Intelligence) #Automation #REST API #Java #JSON (JavaScript Object Notation) #Python #Data Modeling #Classification #API (Application Programming Interface) #Spring Boot #REST (Representational State Transfer) #RDBMS (Relational Database Management System) #Big Data
Role description
β’ The candidate must have experience with the following:
β’ PySpark, Python, SQL
β’ Snowflake or any large data warehouse
β’ Any RDBMS and NoSQL databases
β’ Experience with IICS or other ETL tools, Collibra, and an understanding of data modeling are preferred. Experience working with ServiceNow databases is also preferred.
β’ Experience with Selenium is nice to have.
Must-Have Qualifications: Python, PySpark, SQL, Agile practices
Preferred: Snowflake and Informatica
Qualifications:
β’ Bachelor's degree in computer science, information technology, or a related field; advanced studies/degree preferred.
β’ 5+ years of extensive knowledge and experience with data technologies for data analytics, data lakes/marts/warehouses, and SQL/NoSQL databases (DB2, MongoDB, PostgreSQL).
β’ Big Data technologies (Spark or PySpark), ETL tools (Informatica, Talend), REST APIs, and integration/EAI technologies such as Informatica.
β’ 3+ years of experience with technologies including Web Service APIs, XML, JSON, JDBC, Java, and Python.
β’ 3+ years working with SaaS platforms such as Snowflake, Collibra, Mongo/MongoDB Atlas.
β’ Knowledge of enterprise data models, information classification, metadata models, taxonomies, and ontologies.
β’ Exposure to full-stack enterprise application development (Angular, Spring Boot, automation testing using Selenium).
β’ 5-7 years of experience in a logical/physical data modeling, data architecture, data analysis, and data management role.
β’ Experience with different query languages such as PL/SQL, T-SQL, and ANSI SQL.
β’ Experience with database technologies such as DB2, PostgreSQL, Snowflake.
β’ Knowledge of data warehousing and business intelligence concepts including data mesh, data fabric, data lake, data warehouse, and data marts.
