SPECTRAFORCE

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Data Engineer on a 7-month contract, hybrid in Newark, NJ, paid on a W2 basis. Key skills include AWS services, Python, SQL, and data pipeline management. Requires 7+ years of data engineering experience and leadership capabilities.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 7, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Newark, NJ
-
🧠 - Skills detailed
#Data Framework #Scripting #Data Science #Cloud #ETL (Extract, Transform, Load) #Lambda (AWS Lambda) #Data Governance #Data Ingestion #Computer Science #API (Application Programming Interface) #Shell Scripting #Python #Programming #AWS (Amazon Web Services) #Athena #Data Quality #Security #SQL (Structured Query Language) #Code Reviews #Terraform #Leadership #Big Data #RDS (Amazon Relational Database Service) #Compliance #Migration #Data Engineering #Strategy #Data Strategy #Data Pipeline #S3 (Amazon Simple Storage Service) #Data Lake #Redshift #Scala #DevOps #Storage #Data Modeling #Spark (Apache Spark) #Data Warehouse
Role description
Job Title: AWS Data Engineer
Duration: 7 months (possibility of extension)
Location: Hybrid (Newark, NJ) / W2 only

Job Summary:
We are seeking an experienced AWS Data Engineer to join our Data Engineering team. As a technical leader, you will be responsible for architecting, implementing, and managing scalable data solutions on AWS. You will provide technical guidance to a team of data engineers, collaborate with cross-functional partners, and ensure best practices in cloud data engineering.

Key Responsibilities:
• Lead the design, development, and optimization of large-scale, reliable, and secure data pipelines and data lake architecture on AWS.
• Architect and implement end-to-end data solutions, including data ingestion, storage, transformation, and analytics using AWS services (Glue, Redshift, S3, Lambda, EMR, Kinesis, Athena, RDS, etc.).
• Mentor and guide a team of data engineers, conducting code reviews and fostering best practices in data engineering and cloud architecture.
• Collaborate with data scientists, analysts, and business stakeholders to translate requirements into scalable and maintainable solutions.
• Oversee the migration of data from legacy systems to AWS-based data lakes and data warehouses.
• Develop and enforce standards for data quality, security, and governance.
• Drive the adoption of DevOps, CI/CD, and infrastructure-as-code practices within the data engineering team.
• Ensure solutions are cost-effective, performant, and aligned with the enterprise data strategy.
• Stay current with advancements in AWS technologies and data engineering trends, and evaluate new tools and frameworks for potential adoption.
• Troubleshoot complex data issues and provide technical leadership in problem resolution.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 7+ years of experience in data engineering, with at least 3 years in a technical leadership or lead engineer role.
• Extensive hands-on experience with AWS data services (Glue, Redshift, S3, Lambda, EMR/Spark, Kinesis, Athena, RDS, API Gateway, etc.).
• Proficiency in Python and SQL; experience with shell scripting and Scala is a plus.
• Strong experience designing, implementing, and managing data lakes, data warehouses, and data ingestion pipelines on AWS.
• Proven experience with ETL/ELT processes, data modeling, and big data frameworks.
• Demonstrated ability to lead, mentor, and coach engineers in a collaborative team environment.
• Experience with DevOps practices, CI/CD pipelines, and infrastructure-as-code tools (e.g., CloudFormation, Terraform).
• Excellent problem-solving, communication, and organizational skills.

Preferred Qualifications:
• AWS Solutions Architect or AWS Data Engineer certification.
• Experience with real-time streaming technologies.
• Knowledge of data governance, compliance, and security best practices.
• Familiarity with Lakehouse architecture and modern data platforms.