

Intellibus
Data Engineer – SQL Expert
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer – SQL Expert, offering a contract length of "unknown" with a pay rate of "unknown." Key skills include AWS, Snowflake, ETL processes, and programming in Python or Java. Requires 10+ years of data engineering experience and a relevant degree.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 9, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New Jersey, United States
-
🧠 - Skills detailed
#Data Security #Scala #S3 (Amazon Simple Storage Service) #Redshift #DynamoDB #GIT #Apache Spark #Apache Airflow #Data Wrangling #Data Engineering #Documentation #Computer Science #Big Data #Jira #Spark (Apache Spark) #Java #NoSQL #BigQuery #Snowflake #Database Migration #PostgreSQL #Talend #Airflow #Compliance #Monitoring #Python #Cloud #Security #Data Modeling #Agile #Kafka (Apache Kafka) #Programming #SQL (Structured Query Language) #Data Migration #Data Pipeline #ETL (Extract, Transform, Load) #Migration #AWS (Amazon Web Services) #dbt (data build tool) #Project Management #Databases #MySQL #Unix
Role description
Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech Firms. Recently, our team has spearheaded the Conversion & Go Live of apps that support the backbone of the Financial Trading Industry.
Are you a data enthusiast with a natural ability for analytics? We’re looking for skilled Data/Analytics Engineers to fill multiple roles for our exciting new client. This is your chance to shine, demonstrating your dedication and commitment in a role that promises both challenge and reward.
What We Offer:
A dynamic environment where your skills will make a direct impact. The opportunity to work with cutting-edge technologies and innovative projects. A collaborative team that values your passion and focus.
We are looking for Engineers who can:
• Develop and deliver AWS-based data solutions with hands-on cloud engineering expertise.
• Build and manage big data pipelines using Snowflake & Postgres or similar Spark-based platforms.
• Design and implement parallel ETL processes to optimize resources and processing speed.
• Architect data models and modern data solutions, including cloud technologies.
• Collaborate cross-functionally to translate business needs into technical requirements.
• Drive performance monitoring and reliability of the Data Platform.
• Lead the creation of technical specifications, designs, and end-to-end documentation.
• Ensure runbook SLA compliance and oversee daily cloud platform operations.
• Utilize tools like Git, Jira, and agile methodologies for project management.
• Ensure data security and compliance with relevant regulations.
Key Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 10+ years of experience in data engineering, with a strong focus on building and managing data pipelines.
• Proficiency in programming languages such as Python, Java, or Scala.
• Extensive experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB).
• Demonstrated experience with cloud data platforms (e.g., AWS) and their data services (e.g., S3, Redshift, Snowflake, BigQuery, Data Factory).
• Solid understanding of ETL/ELT principles and experience with tools like Apache Airflow, dbt, or similar.
• Experience with big data technologies such as Apache Spark and Kafka.
• Strong understanding of data modeling, data warehousing concepts, and dimensional modeling.
• Excellent problem-solving, analytical, and communication skills.
• Ability to work independently and as part of a collaborative team.
We work closely with:
• Data Wrangling
• ETL
• Talend
• Jasper
• Java
• Python
• Unix
• AWS
• Data Warehousing
• Data Modeling
• Database Migration
• RBAC model
• Data migration
Our Process
• Schedule a 15-minute Video Call with someone from our Team
• 4 Proctored GQ Tests (< 2 hours)
• 30-45 min Final Video Interview
• Receive Job Offer
If you are interested, please apply and our team will contact you within the hour.