Aureon Consulting

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "unknown" and a pay rate of "unknown." Work location is "remote." Key skills include Node.js, TypeScript, AWS serverless, event-driven systems, and data pipelines. Experience with S3-based data lakes is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
May 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Des Moines, IA
-
🧠 - Skills detailed
#Data Quality #NoSQL #SQL (Structured Query Language) #Data Lake #Lambda (AWS Lambda) #ETL (Extract, Transform, Load) #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Data Engineering #Data Pipeline #TypeScript #Python #SQS (Simple Queue Service) #Databases #Data Governance #AWS Glue #Scala #Data Processing #Cloud #SNS (Simple Notification Service)
Role description
NO C2C or 3rd-party inquiries please. DIRECT APPLICANTS ONLY.
Senior Data Engineer (Full Stack / AWS / Event-Driven)
We're partnering with a technology-driven organization to find a Full Stack + Data Engineer who enjoys working across backend APIs, event-driven systems, and modern data platforms. This is a great fit for an engineer who likes building cloud-native services while also enabling scalable, reliable data pipelines and lakehouse architectures.
What You'll Be Doing
• Build and maintain backend APIs using Node.js, TypeScript, and Express
• Design and support AWS serverless solutions (Lambda, Step Functions)
• Implement and support event-driven architectures using SNS, SQS, and EventBridge
• Develop and maintain data pipelines / ETL workflows
• Contribute to S3-based data lakes or lakehouse architectures
• Partner with engineering and data teams to ensure data quality, reliability, and performance
Required Experience
• Strong hands-on experience with Node.js, TypeScript, and Express
• AWS serverless experience (Lambda, Step Functions)
• Experience with event-driven systems (SNS, SQS, EventBridge)
• Background building or supporting data pipelines or ETL processes
• Working knowledge of S3-based data lakes or lakehouse concepts
• Working knowledge of SQL and NoSQL databases
Nice to Have
• AWS Glue
• Iceberg and Parquet
• Lakehouse / medallion architecture (Bronze / Silver / Gold)
• Python for data processing or ETL
• Experience with high-volume event ingestion
• Exposure to data governance and data quality best practices
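For candidates unfamiliar with the event-driven side of this stack: the SNS fan-out pattern the role mentions (one published message delivered to several independent consumers, e.g. SQS queues) can be sketched in plain TypeScript. This is an illustrative in-memory stand-in only; all names are invented for the example and no AWS SDK is involved.

```typescript
// Illustrative in-memory stand-in for SNS-style fan-out (hypothetical types; not the AWS SDK).
type Handler<E> = (event: E) => void;

class Topic<E> {
  private subscribers: Handler<E>[] = [];

  // Analogous to an SQS queue subscribing to an SNS topic.
  subscribe(handler: Handler<E>): void {
    this.subscribers.push(handler);
  }

  // Analogous to publishing a message: every subscriber receives its own copy.
  publish(event: E): void {
    for (const handler of this.subscribers) handler(event);
  }
}

// Example: an order event fanned out to two independent consumers.
interface OrderPlaced { orderId: string; amount: number }

const orders = new Topic<OrderPlaced>();
const audit: string[] = [];
let revenue = 0;

orders.subscribe((e) => audit.push(e.orderId));     // e.g. a pipeline landing raw events
orders.subscribe((e) => { revenue += e.amount; });  // e.g. an aggregation consumer

orders.publish({ orderId: "o-1", amount: 25 });
orders.publish({ orderId: "o-2", amount: 75 });
```

The point of the pattern is that publishers never know who consumes an event, so new pipelines can be attached without touching existing services; in AWS the same decoupling comes from SNS/EventBridge routing to SQS-backed Lambdas.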
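The "Bronze / Silver / Gold" medallion layering listed under nice-to-haves can likewise be illustrated with a minimal, self-contained TypeScript sketch. The record shapes below are invented for the example; in a real lakehouse each layer would be Parquet/Iceberg tables on S3 rather than in-memory arrays.

```typescript
// Bronze: raw records as they might land in a data lake — untyped and possibly dirty.
type BronzeRecord = { id?: string; amount?: string; ts?: string };

// Silver: validated, typed records.
interface SilverRecord { id: string; amount: number; ts: Date }

// Bronze -> Silver: coerce types and drop malformed rows.
function toSilver(rows: BronzeRecord[]): SilverRecord[] {
  return rows.flatMap((r) => {
    const amount = Number(r.amount);
    const ts = r.ts ? new Date(r.ts) : undefined;
    if (!r.id || Number.isNaN(amount) || !ts || Number.isNaN(ts.getTime())) return [];
    return [{ id: r.id, amount, ts }];
  });
}

// Silver -> Gold: aggregate to a business-level view (total amount per day).
function toGold(rows: SilverRecord[]): Map<string, number> {
  const byDay = new Map<string, number>();
  for (const r of rows) {
    const day = r.ts.toISOString().slice(0, 10);
    byDay.set(day, (byDay.get(day) ?? 0) + r.amount);
  }
  return byDay;
}

// A malformed row (non-numeric amount) is filtered out at the Silver layer.
const gold = toGold(toSilver([
  { id: "a", amount: "10.5", ts: "2026-05-09T12:00:00Z" },
  { id: "b", amount: "oops", ts: "2026-05-09T13:00:00Z" },
]));
```

Each layer only ever reads from the one below it, which is why data-quality checks concentrate at the Bronze-to-Silver boundary.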