

Glint Tech Solutions
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 7–10 years of experience, focusing on building ETL/ELT pipelines and managing data platforms like Snowflake and BigQuery. Strong SQL and programming skills in Python, Scala, or Java are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
April 21, 2026
-
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Jersey City, NJ
-
🧠 - Skills detailed
#Schema Design #Monitoring #Cloud #Data Pipeline #Spark (Apache Spark) #Python #Programming #Data Modeling #ETL (Extract, Transform, Load) #Data Engineering #Big Data #ML (Machine Learning) #Snowflake #BigQuery #Batch #Data Processing #S3 (Amazon Simple Storage Service) #Java #SQL (Structured Query Language) #Data Quality #Scala #Kafka (Apache Kafka)
Role description
Title: Data Engineer
Join a forward-thinking and innovative organization that values growth, collaboration, and cutting-edge technology. We're looking for a skilled Data Engineer to build scalable data solutions that power analytics and machine learning.
Key Responsibilities:
Build and maintain ETL/ELT data pipelines (batch & streaming)
Manage data platforms like Snowflake, BigQuery, and S3
Work with distributed systems such as Spark and Kafka
Ensure data quality, validation, and monitoring
Optimize performance for scalable data processing
Requirements:
7–10 years of experience in Data Engineering
Strong SQL and programming skills (Python/Scala/Java)
Experience with big data tools and cloud data platforms
Knowledge of data modeling and schema design