Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Remote, USA) with 5+ years of experience building scalable systems. Key skills include Java, Python, Hadoop, Spark, and cloud platforms such as Azure and GCP. W2 only; no H-1B visa holders or third-party applications.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
August 25, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#BigQuery #Data Modeling #Kubernetes #Hadoop #GraphQL #Spark (Apache Spark) #Trino #API (Application Programming Interface) #Scala #Airflow #Kafka (Apache Kafka) #Python #Presto #Data Engineering #Looker #Azure #Data Lake #GCP (Google Cloud Platform) #Java #AI (Artificial Intelligence)
Role description
One of my clients is looking for a Data Engineer - USA (Remote) for a contract role. Note: no H-1B visa holders, please. W2 only; no C2C; no third parties.

Required Skills & Qualifications:
● 5+ years of experience building scalable, resilient systems.
● Skilled in Java, Python, Scala, Node.js, GraphQL, and API development.
● Hands-on with Hadoop, Hive, Spark, Kubernetes, Airflow, data lakes, Vertex AI, and Presto/Trino.
● Strong in software design, distributed systems, algorithms, and data modeling.
● Experienced with Azure, GCP, Kafka Connect, BigQuery, and Looker.
● Proven record of architecting and delivering large-scale, data-driven applications.

If interested, please send your resume to harsh@hireplusinfotech.com