

Intellibee Inc
ETL Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Data Engineer on a contract of more than 6 months, offering a competitive pay rate. Key skills include strong Java experience, ETL pipeline development, SQL expertise, and familiarity with Snowflake, AWS, Azure, or GCP.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 6, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
McLean, VA
-
🧠 - Skills detailed
#Python #SQL Queries #Data Pipeline #SQL (Structured Query Language) #Cloud #Snowflake #Datasets #Redshift #Kafka (Apache Kafka) #GCP (Google Cloud Platform) #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Hadoop #BigQuery #Data Engineering #Big Data #Databases #Java #Azure
Role description
We are seeking an experienced Data Engineer with a strong Java background to support our data platforms and ETL operations. The ideal candidate will have hands-on experience building and maintaining ETL pipelines, writing complex SQL queries, and applying software engineering best practices. This contract role offers a potential path to full-time conversion based on performance.
Job Description:
Strong experience developing ETL jobs and data pipelines
Experience with data warehousing (Snowflake, Redshift, BigQuery)
Exposure to big data tools (Spark, Hadoop, Kafka)
Ability to write and optimize complex SQL queries
Solid software development background in Java (Python is a plus)
Experience working with relational databases and large datasets
Familiarity with cloud environments (AWS/Azure/GCP preferred)