AWS Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer with 8+ years of experience in Python, PySpark, Apache Airflow, and AWS services. Contract length is unspecified, and the pay rate is unknown. The location is hybrid in Owings Mills, MD.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 30, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Owings Mills, MD
-
🧠 - Skills detailed
#Docker #Python #Data Migration #dbt (data build tool) #Migration #SQL (Structured Query Language) #"ETL (Extract, Transform, Load)" #Apache Airflow #Agile #Cloud #Data Pipeline #EDW (Enterprise Data Warehouse) #Airflow #Vault #AWS (Amazon Web Services) #Data Lake #Kubernetes #Data Engineering #Data Mart #Data Warehouse #Spark (Apache Spark) #PySpark
Role description
AWS Data Engineer (W2 & C2C)
Location: Owings Mills, MD - hybrid onsite (local candidates only)
In-person end-client interview in Owings Mills, MD
Job description:
• 8+ years of solid hands-on experience in Python, PySpark, Apache Airflow, and AWS services (EKS, EMR, HashiCorp Vault, Glue, Docker, Kubernetes).
• Strong hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement.
• Experience creating data pipelines and orchestrating them with Apache Airflow.
• 10+ years of overall experience with data migrations and the development of operational data stores, enterprise data warehouses, data lakes, and data marts.
• Good to have: experience with cloud ETL and ELT in a tool such as dbt, Glue, or EMR.
• Excellent communication skills to liaise with business and IT stakeholders.
• Expertise in project planning, execution, and effort estimation.
• Exposure to Agile ways of working.
Note: Interested candidates, please send your resume to sai1@vmcsofttech.com or call 480-407-6917.