Yorkshire Global Solutions Inc.

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in New York, NY, with an 8+ year experience requirement. It offers a W2 contract, focusing on Python, SQL, PySpark, and Google Cloud Platform. A Bachelor’s degree in a related field is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 14, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Project Management #SQL (Structured Query Language) #Storage #Data Pipeline #Scala #Data Quality #Security #Cloud #GCP (Google Cloud Platform) #Big Data #Data Storage #BI (Business Intelligence) #Data Engineering #PySpark #Python #Spark (Apache Spark) #Data Framework #Data Accuracy #Programming #Computer Science #Agile
Role description
Job Title: Data Engineer
Location: New York, NY (Onsite)
Experience: 8+ Years
Client: Kforce / Confidential
Contract: W2 Only (No C2C / 1099)

Job Description:
We are seeking a skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support advanced analytics and business intelligence initiatives. This role requires strong expertise in cloud technologies, big data frameworks, and programming languages.

Key Responsibilities:
• Develop and maintain robust data pipelines using Python, PySpark, and SQL (see the sketch after this section).
• Design and optimize data workflows on Google Cloud Platform (GCP).
• Implement and manage data storage solutions using Hive and other big data technologies.
• Collaborate with cross-functional teams to understand data requirements and deliver solutions.
• Conduct functional testing to ensure data accuracy and reliability.
• Use Rally for agile project tracking and task management.
• Ensure data quality, integrity, and security across all systems.

Required Skills & Qualifications:
• Proficiency in Python, SQL, and PySpark.
• Hands-on experience with Google Cloud Platform (GCP) services.
• Strong understanding of Hive and big data ecosystems.
• Experience with functional testing in data engineering environments.
• Familiarity with Rally or similar agile project management tools.
• Bachelor's degree in Computer Science, Engineering, or a related field.
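For context only, a minimal sketch of the kind of PySpark pipeline this role describes, assuming a Spark environment with Hive support and write access to a Google Cloud Storage bucket. The table, column, and bucket names are hypothetical and not taken from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: aggregate completed orders from a Hive table
# and land the result as partitioned Parquet on Google Cloud Storage.
spark = (
    SparkSession.builder
    .appName("daily-order-aggregation")
    .enableHiveSupport()  # read managed Hive tables via the metastore
    .getOrCreate()
)

# Placeholder source table; in practice this would be agreed with the data owners.
orders = spark.table("sales.orders")

daily_totals = (
    orders
    .filter(F.col("order_status") == "COMPLETE")
    .groupBy("order_date")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("total_amount"),
    )
)

# Simple data-quality guard before writing: fail fast if the result is empty.
if daily_totals.rdd.isEmpty():
    raise ValueError("Aggregation produced no rows; check upstream data.")

(
    daily_totals
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("gs://example-bucket/curated/daily_order_totals/")  # placeholder bucket
)

spark.stop()
```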