

Senior Data Engineer - W2 Basis
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Harrisburg, PA, on a long-term W2 contract. Required skills include programming in Python, Scala, or Java; expertise in Big Data technologies; cloud platform proficiency; and experience with ETL tools and data warehousing solutions.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 26, 2025
Project duration: Unknown
Location type: On-site
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Harrisburg, PA
Skills detailed:
#GCP (Google Cloud Platform) #Spark (Apache Spark) #dbt (data build tool) #Synapse #Snowflake #Hadoop #AWS (Amazon Web Services) #Big Data #ETL (Extract, Transform, Load) #Schema Design #BigQuery #Informatica #Databases #Cloud #SQL (Structured Query Language) #Azure #Python #Talend #Scala #Kafka (Apache Kafka) #Airflow #Computer Science #Programming #Redshift #Java #Data Engineering #Data Modeling
Role description
Title: Data Engineer
Location: Harrisburg, PA - Onsite
Duration: Long Term Contract
Note: Looking for consultant to work on W2 basis
Required Skills & Qualifications:
-- Bachelor's/Master's in Computer Science, Information Technology, or related field.
-- Strong programming experience with Python, Scala, or Java.
-- Expertise in Big Data technologies (Spark, Hadoop, Hive, Kafka, etc.).
-- Hands-on experience with SQL and working with large relational databases.
-- Proficiency with cloud platforms (AWS, Azure, or GCP) and cloud-native data services.
-- Familiarity with data warehousing solutions (Snowflake, Redshift, BigQuery, Synapse).
-- Experience with ETL tools/workflows (Airflow, Informatica, Talend, dbt); see the illustrative sketch below.
-- Knowledge of data modeling, schema design, and performance tuning.
-- Strong problem-solving and communication skills.
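To give a concrete sense of the day-to-day work these requirements describe, below is a minimal PySpark ETL sketch: extract raw events, apply a transform, and load a partitioned output for a warehouse to pick up. It is illustrative only; every path, table, and column name in it is a hypothetical placeholder, not a reference to the client's systems.

# Minimal PySpark ETL sketch (hypothetical names throughout).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

# Extract: read raw order events (hypothetical S3 path).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: keep completed orders, then aggregate daily revenue per customer.
daily_revenue = (
    orders
    .where(F.col("status") == "completed")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(
        F.sum("amount").alias("revenue"),
        F.count("*").alias("order_count"),
    )
)

# Load: write output partitioned by date for downstream warehouse ingestion.
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/daily_revenue/")
)

spark.stop()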