

Intellibus
Senior Data Engineer – AWS, Snowflake & ETL Pipelines
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer focused on AWS, Snowflake, and ETL pipelines. Contract length is unspecified, with a competitive pay rate. Key skills include ETL, SQL, Python, AWS, and Snowflake proficiency. A Bachelor's degree in a related field is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 8, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Maryland, United States
🧠 - Skills detailed
#Batch #JSON (JavaScript Object Notation) #AWS (Amazon Web Services) #Scripting #SQS (Simple Queue Service) #Delta Lake #Talend #SNS (Simple Notification Service) #Snowflake #Unix #Lambda (AWS Lambda) #Data Integration #Cloud #Data Wrangling #SQL (Structured Query Language) #Databricks #Data Migration #Migration #Airflow #Data Integrity #Data Warehouse #Python #Database Design #Data Modeling #Database Migration #Data Engineering #ETL (Extract, Transform, Load) #Leadership #Java #Clustering #BI (Business Intelligence) #Linux #Data Pipeline #Regular Expressions #Kafka (Apache Kafka) #Computer Science #S3 (Amazon Simple Storage Service) #Documentation #Scala #Snowpipe #EC2 #Automation
Role description
Imagine working at Intellibus to engineer platforms that impact billions of lives around the world. With your passion and focus we will accomplish great things together!
Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech Firms. Recently, our team has spearheaded the Conversion & Go Live of apps that support the backbone of the Financial Trading Industry.
Are you a data enthusiast with a natural ability for analytics? We’re looking for skilled Data/Analytics Engineers to fill multiple roles for our exciting new client. This is your chance to shine, demonstrating your dedication and commitment in a role that promises both challenge and reward.
What We Offer:
A dynamic environment where your skills will make a direct impact. The opportunity to work with cutting-edge technologies and innovative projects. A collaborative team that values your passion and focus.
We are looking for Engineers who can:
• Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake.
• Implement and automate ETL processes using Snowflake features such as Snowpipe, Streams, and Tasks (see the sketch after this list).
• Design efficient data models and schemas in Snowflake to support reporting, analytics, and BI workloads.
• Optimize data warehouse performance and scalability using clustering, partitioning, and materialized views.
• Integrate Snowflake with AWS services (S3, Glue, Lambda, EC2, Step Functions) and external data sources.
• Implement data synchronization and validation processes to ensure consistency and data integrity.
• Monitor and optimize query performance and resource usage using query profiling and workload management.
• Define and maintain RBAC models, data masking, encryption, and tokenization standards.
• Set up and manage AWS infrastructure (S3, EC2, SQS, SNS, Glue) to support scalable data workflows.
• Build and orchestrate data pipelines using Airflow or equivalent workflow automation tools.
• Work with MSK Kafka Connect and Delta Lake (Databricks) for real-time and batch integrations.
• Drive SQL performance tuning, database optimization, and continuous improvement of data systems.
• Contribute to design discussions, architecture decisions, and best practices documentation.
• Perform data integration with partner platforms, e.g., MSK Kafka Connect and Delta Lake (Databricks).
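To make the Snowpipe, Streams, and Tasks bullet concrete, here is a minimal sketch of that pattern driven through the Snowflake Python connector. It is illustrative only: every object name (raw_db, orders_stage, etl_wh, the five-minute schedule) and the placeholder credentials are assumptions, not details from this role.

```python
# Minimal sketch (not this role's actual pipeline) of the Snowpipe +
# Streams + Tasks pattern, run through the Snowflake Python connector.
# All object names below are invented placeholders.
import snowflake.connector

DDL = [
    # Landing table: a single VARIANT column for semi-structured JSON.
    "CREATE TABLE IF NOT EXISTS raw_db.public.orders_raw (v VARIANT)",
    "CREATE TABLE IF NOT EXISTS analytics_db.public.orders (raw_record VARIANT)",
    # Snowpipe: auto-ingests new files from an external stage (assumed to exist).
    """CREATE PIPE IF NOT EXISTS raw_db.public.orders_pipe
         AUTO_INGEST = TRUE
         AS COPY INTO raw_db.public.orders_raw
            FROM @raw_db.public.orders_stage
            FILE_FORMAT = (TYPE = 'JSON')""",
    # Stream: records change rows (CDC) on the landing table.
    """CREATE STREAM IF NOT EXISTS raw_db.public.orders_stream
         ON TABLE raw_db.public.orders_raw""",
    # Task: every 5 minutes, move new rows downstream if the stream has data.
    """CREATE TASK IF NOT EXISTS raw_db.public.orders_task
         WAREHOUSE = etl_wh
         SCHEDULE = '5 MINUTE'
         WHEN SYSTEM$STREAM_HAS_DATA('raw_db.public.orders_stream')
         AS INSERT INTO analytics_db.public.orders (raw_record)
            SELECT v FROM raw_db.public.orders_stream
            WHERE METADATA$ACTION = 'INSERT'""",
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK raw_db.public.orders_task RESUME",
]

conn = snowflake.connector.connect(
    account="your_account",   # placeholder credentials
    user="your_user",
    password="your_password",
)
try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()
```

The division of labor is the point of the pattern: the pipe handles continuous ingestion from the stage, the stream captures change rows on the landing table, and the task merges them downstream on a schedule.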
Key Skills & Qualifications:
• ETL: Strong experience in designing and developing ETL workflows for large-scale data integration.
• SQL: Expert-level SQL skills for querying, tuning, and data transformation.
• Python: Hands-on experience with Python (Boto3, JSON handling, data structures); see the sketch after this list.
• UNIX/Linux: Proficiency in file operations, scripting, and regular expressions.
• AWS: Deep experience with EC2, Glue, S3, Lambda, Step Functions, SQS/SNS for scalable cloud-based data pipelines.
• Snowflake: Proficient in Snowflake development (Snowpipe, Streams, Tasks, Stored Procedures).
• Database Modeling: Strong understanding of database design, data warehousing, and CDC mechanisms.
• Airflow: Working knowledge of Airflow for orchestrating data workflows and automation.
• Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience).
• Excellent communication skills and the ability to collaborate with cross-functional teams.
• Leadership experience in mentoring developers and driving technical excellence.
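As a small illustration of the Python skills called out above (Boto3, JSON handling, data structures), here is a hedged sketch that reads a JSON object from S3 and publishes a summary message to SQS. The bucket, key, and queue URL are hypothetical placeholders.

```python
# Hedged sketch, not this role's actual codebase: read a JSON array from S3
# and publish a summary message to SQS with Boto3. Bucket, key, and queue
# URL below are invented placeholders.
import json

import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")

def forward_s3_object(bucket: str, key: str, queue_url: str) -> None:
    """Fetch a JSON array from S3 and push its record count to SQS."""
    obj = s3.get_object(Bucket=bucket, Key=key)
    records = json.loads(obj["Body"].read())  # assumes the file is a JSON array
    sqs.send_message(
        QueueUrl=queue_url,
        MessageBody=json.dumps({"key": key, "record_count": len(records)}),
    )

if __name__ == "__main__":
    forward_s3_object(
        bucket="example-data-bucket",           # placeholder
        key="landing/orders/2025-10-08.json",   # placeholder
        queue_url="https://sqs.us-east-1.amazonaws.com/123456789012/etl-events",
    )
```

In a pipeline like the one described above, a function of this shape would typically run inside a Lambda handler or an Airflow task rather than as a standalone script.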
We work closely with:
• Data Wrangling
• ETL
• Talend
• Jasper
• Java
• Python
• Unix
• AWS
• Data Warehousing
• Data Modeling
• Database Migration
• RBAC model
• Data migration
Our Process
• Schedule a 15 min Video Call with someone from our Team
• 4 Proctored GQ Tests (< 2 hours)
• 30-45 min Final Video Interview
• Receive Job Offer
If you are interested, please apply and our team will contact you within the hour.