Sr. AWS Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. AWS Data Engineer on a long-term contract in NYC, NY, requiring 3 days onsite weekly. Key skills include AWS Lambda, Glue, ETL, Python/Spark, dbt, SQL, and Databricks, along with experience migrating data to AWS Redshift.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date discovered
September 24, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Hybrid (3 days onsite per week)
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
New York City Metropolitan Area
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Cloud #Data Pipeline #Redshift #Python #Snowflake #Macros #AWS Lambda #SQL (Structured Query Language) #Data Engineering #dbt (data build tool) #Data Migration #AWS Glue #Data Architecture #Data Quality #Agile #Spark (Apache Spark) #Data Modeling #Hadoop #Databricks #GitHub #Lambda (AWS Lambda) #Migration #AWS (Amazon Web Services) #Documentation #Aurora
Role description
Our client (a large financial organization) is urgently looking to hire a Sr. AWS Data Engineer on a long-term contract. Location: NYC, NY, with 3 days onsite every week; this is not a remote role. The interview process involves multiple rounds plus a coding exercise.

Major skills: AWS Lambda, AWS Glue, ETL, Python/Spark, dbt, SQL, and Databricks, backed by a strong data background (data pipelines, etc.).

Our client has a very large initiative to migrate their data from Hadoop to AWS Redshift (illustrative sketches follow the list below). Key skills and experience required:

- AWS Redshift, Aurora, AWS Glue, AWS Lambda, etc.
- Extensive experience with data gathering and ingestion (from multiple sources), manipulation, orchestration, and optimization on AWS Cloud, including data migration to Snowflake, Databricks, etc.
- Hands-on experience with Databricks and AWS
- Experience in data architecture, analytics engineering, or data engineering roles
- Practical experience building transformation pipelines using dbt Cloud (models, tests, documentation, macros)
- Solid understanding of data modeling principles, data quality techniques, and pipeline orchestration
- Experience with agile delivery practices and CI/CD pipelines using tools like GitHub Actions, dbt Cloud jobs, or Databricks Repos
- Strong proficiency in SQL and familiarity with using Python in data workflows
- A culture of openness and continuous learning
- Skilled in using ETL and ELT tools
- Proven track record of leading multi-shore teams on sizable cloud + data initiatives
- Experience working in an agile environment
- Excellent communication, presentation, and client interaction skills
- Ability to work in a very dynamic environment with multiple onshore stakeholders
- MUST be willing, available, and ready to work from our client's onsite location in the NJ/NYC area two to three times a week
- Certifications in dbt, Databricks, or AWS are a plus
- Strong analytical and communication skills with a collaborative mindset
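To give candidates a feel for the Glue-to-Redshift pattern at the core of this migration, here is a minimal sketch of an AWS Glue PySpark job that reads Hadoop data landed in S3 and loads it into Redshift. The bucket, the `redshift-conn` connection name, and the table names are hypothetical placeholders, not details from the posting.

```python
# Minimal AWS Glue job sketch: read data exported from a Hadoop cluster
# into S3 (assumed Parquet) and load it into Redshift.
# All paths, connection names, and tables below are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the migrated Hadoop data from its S3 landing zone.
source = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-bucket/hadoop-export/trades/"]},
    format="parquet",
)

# Light data-quality step before loading: drop rows missing the key field.
cleaned = source.filter(lambda row: row["trade_id"] is not None)

# Write to Redshift via a preconfigured Glue connection; Glue stages the
# COPY through the given S3 temp directory.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=cleaned,
    catalog_connection="redshift-conn",
    connection_options={"dbtable": "public.trades", "database": "analytics"},
    redshift_tmp_dir="s3://example-bucket/glue-temp/",
)

job.commit()
```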
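The posting pairs AWS Lambda with Glue, and a common orchestration pattern is a Lambda handler that starts a Glue job when new data lands in S3. A minimal sketch follows, assuming a hypothetical Glue job named `hadoop-to-redshift` and a `--source_path` job argument; neither comes from the posting.

```python
# Minimal Lambda sketch: start a Glue job run when an S3
# object-created event fires. Job name and argument key are placeholders.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # Pull the bucket/key of the newly landed file from the S3 event record.
    record = event["Records"][0]["s3"]
    source_path = f"s3://{record['bucket']['name']}/{record['object']['key']}"

    # Kick off the migration job, passing the file path as a job argument.
    response = glue.start_job_run(
        JobName="hadoop-to-redshift",
        Arguments={"--source_path": source_path},
    )
    return {"JobRunId": response["JobRunId"]}
```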
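Likewise, the CI/CD requirement (GitHub Actions with dbt Cloud jobs) often comes down to triggering a dbt Cloud job run from a pipeline step. Below is a minimal sketch against dbt Cloud's v2 API; the account ID, job ID, and `DBT_CLOUD_TOKEN` environment variable are assumed placeholders.

```python
# Minimal sketch: trigger a dbt Cloud job run from a CI step
# (e.g. a GitHub Actions workflow). IDs and token var are placeholders.
import os

import requests

ACCOUNT_ID = 12345  # placeholder dbt Cloud account ID
JOB_ID = 67890      # placeholder dbt Cloud job ID

def trigger_dbt_job(cause: str) -> int:
    """Start a dbt Cloud job run and return its run ID."""
    resp = requests.post(
        f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/",
        headers={"Authorization": f"Token {os.environ['DBT_CLOUD_TOKEN']}"},
        json={"cause": cause},
    )
    resp.raise_for_status()
    return resp.json()["data"]["id"]

if __name__ == "__main__":
    run_id = trigger_dbt_job("CI run from GitHub Actions")
    print(f"Started dbt Cloud run {run_id}")
```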