Big Data Engineer with AWS - W2 ONLY

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Big Data Engineer with AWS, offering a 12-month W2 contract, hybrid work in Reston, VA. Requires 7+ years in software development, 3+ years in Data Integration with Hadoop and AWS, and strong programming skills in Java, Python, or Scala.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
August 15, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
Hybrid
πŸ“„ - Contract type
W2 Contractor
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Reston, VA
🧠 - Skills detailed
#Hadoop #BI (Business Intelligence) #Big Data #Documentation #Automated Testing #Shell Scripting #Cloudera #Lambda (AWS Lambda) #Spark (Apache Spark) #AWS (Amazon Web Services) #Computer Science #Python #Cloud #Data Integration #Programming #Data Analysis #Scripting #Scala #Sqoop (Apache Sqoop) #Kafka (Apache Kafka) #Data Engineering #S3 (Amazon Simple Storage Service) #Java #SQL (Structured Query Language) #Data Ingestion #Redshift
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, nTech Solutions, is seeking the following. Apply via Dice today!

• This is a W2 role, and candidates must be local to Reston, VA.

Title: Big Data Engineer with AWS

Terms of Employment
W2 Contract-to-Hire, 12 months. This position is hybrid (in office once a week). The office is located in Reston, VA.

Overview
Our client is seeking a Senior Software Engineer to perform technical design, coding, and testing of applications. This role involves serving as a subject matter expert in both customer and internal discussions related to maintaining and enhancing existing software systems. The Senior Software Engineer will also develop software solutions for enterprise environments and provide direct technical support for critical trouble calls.

Responsibilities
• Perform detailed technical design, coding, and testing of applications.
• Develop and/or analyze interface design documentation.
• Serve as a subject matter expert for customer and internal discussions.
• Perform software analysis, including requirements and use case development and design.
• Implement and document source code to design specifications.
• Develop software solutions for enterprise environments.
• Provide direct technical support for high-level, critical trouble calls.
• Mentor junior software engineers.
• Continually evaluate emerging technologies to identify opportunities, trends, and best practices.

Required Skills & Experience
• BA/BS in Computer Science, Information Systems, Information Technology, or a related field with 7+ years of experience in software development, data engineering, and business intelligence, or equivalent experience.
• 3+ years of experience on data integration projects using Hadoop MapReduce, Sqoop, Oozie, Hive, Spark, and other Big Data technologies.
• 2+ years of experience with AWS, leveraging services such as Lambda, S3, Redshift, and Glue.
• 2+ years of strong programming experience in Java, Python, or Scala.
• Working experience building Kafka-based data ingestion/retrieval programs.
• Experience tuning Hadoop, Spark, or Hive parameters for optimal performance.
• Strong SQL query writing and data analysis skills.
• Good shell scripting experience.
• Rigor in high code quality, automated testing, and other engineering best practices.

Preferred Skills & Experience
• Healthcare experience.
• Cloudera Developer certification.