Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 16, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
New York, NY
-
🧠 - Skills detailed
#Snowflake #S3 (Amazon Simple Storage Service) #Data Security #Security #Athena #Data Storage #SQL (Structured Query Language) #Spark (Apache Spark) #Python #Data Transformations #Tableau #Business Analysis #Agile #Cloud #Data Science #Data Lake #GitHub #Data Quality #ETL (Extract, Transform, Load) #Scrum #AWS Glue #Redshift #BI (Business Intelligence) #Datasets #Microsoft Power BI #Jenkins #Lambda (AWS Lambda) #Monitoring #Scala #AWS (Amazon Web Services) #Data Engineering #Compliance #Data Warehouse #Data Modeling #Storage #Visualization #PySpark
Role description
Key Responsibilities
• Design, develop, and maintain ETL pipelines using AWS Glue, Glue Studio, and Glue Catalog.
• Ingest, transform, and load large datasets from structured and unstructured sources into AWS data lakes and warehouses.
• Work with S3, Redshift, Athena, Lambda, and Step Functions for data storage, querying, and orchestration.
• Build and optimize PySpark/Scala scripts within AWS Glue for complex transformations.
• Implement data quality checks, lineage, and monitoring across pipelines.
• Collaborate with business analysts, data scientists, and product teams to deliver reliable data solutions.
• Ensure compliance with data security, governance, and regulatory requirements (BFSI preferred).
• Troubleshoot production issues and optimize pipeline performance.
Required Qualifications
• 15+ years of experience in Data Engineering, with at least 8 years on AWS cloud data services.
• Strong expertise in AWS Glue, S3, Redshift, Athena, Lambda, Step Functions, and CloudWatch.
• Proficiency in PySpark, Python, and SQL for ETL and data transformations.
• Experience in data modeling (star, snowflake, and dimensional models) and performance tuning.
• Hands-on experience with data lake/data warehouse architecture and implementation.
• Strong problem-solving skills and the ability to work in Agile/Scrum environments.
Preferred Qualifications
• Experience in the BFSI / Wealth Management domain.
• AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
• Familiarity with CI/CD pipelines for data engineering (CodePipeline, Jenkins, GitHub Actions).
• Knowledge of BI/visualization tools such as Tableau, Power BI, and QuickSight.
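To illustrate the "data quality checks" responsibility above, here is a minimal, hedged sketch in plain Python of the kind of validation gate an ETL job might run on a batch of records before loading them downstream. The function name, field names, and thresholds are all hypothetical, not part of this posting; a real Glue job would express similar checks over Spark DataFrames.

```python
# Hypothetical data-quality gate for a batch of ingested records.
# All names (check_quality, account_id, balance) are illustrative only.

def check_quality(rows, required_fields, min_rows=1):
    """Return a list of human-readable issues found in `rows`.

    An empty list means the batch passes; otherwise each entry
    describes one violation (too few rows, or a missing field).
    """
    issues = []
    # Guard against an empty or truncated extract.
    if len(rows) < min_rows:
        issues.append(f"row count {len(rows)} below minimum {min_rows}")
    # Flag rows where a required field is absent or blank.
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing required field '{field}'")
    return issues

batch = [
    {"account_id": "A-1", "balance": 100.0},
    {"account_id": "", "balance": 25.5},  # blank id should be flagged
]
problems = check_quality(batch, required_fields=["account_id", "balance"])
# problems -> ["row 1: missing required field 'account_id'"]
```

In a pipeline, a non-empty result would typically fail the job or route the batch to a quarantine location rather than loading it into the warehouse.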