

Harvey Nash
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Arlington, Virginia (Hybrid) for a 6-month contract, offering $75/hr. Key skills include Spark, Hadoop, Databricks, Python, and SQL. Experience with AWS, GCP, or Azure is a plus.
Country: United States
Currency: $ USD
Day rate: $600
Date: October 9, 2025
Duration: More than 6 months
Location: Hybrid
Contract: W2 Contractor
Security: Unknown
Location detailed: Arlington, VA
Skills detailed: #GIT #Data Engineering #Spark (Apache Spark) #Big Data #Azure #GCP (Google Cloud Platform) #Splunk #Monitoring #Python #Cloud #Databricks #Agile #Jenkins #SQL (Structured Query Language) #Impala #NiFi (Apache NiFi) #AWS (Amazon Web Services) #Data Processing #Hadoop
Role description
Job Title: Senior Data Engineer
Location: Arlington, Virginia (Hybrid)
Duration: 6 Months
Overview:
We are looking for an experienced data engineer with deep knowledge of big data tools, especially Spark and Hadoop, and of cloud platforms such as Databricks. You will work on complex data processing and analytics, build solutions that scale, and contribute to both backend and frontend work on an agile team.
The team is small and flexible, focused on building tools for credit risk, portfolio optimization, and advertising insights. It creates APIs and user interfaces that make analytics easy to consume in client-facing products. You will also mentor junior team members and lead technical design and development.
The role centers on big data tools and cloud platforms such as Databricks and Spark but is not tied to a single cloud provider. Experience with AWS, GCP, or Azure is a plus but not required, provided you have strong big data processing and engineering skills.
You should have strong analytical and problem-solving skills, communicate well in writing and in person, enjoy collaborating with people across locations and roles, and be motivated, creative, and passionate about innovation.
Required technical skills:
Spark, Hadoop platform & tools (Hive, Impala, NiFi, Oozie, Sqoop)
Databricks
Python, SQL
Nice-to-have technical skills:
Knowledge of Splunk or other alerting and monitoring solutions.
Fluency with Git and Jenkins.
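
For orientation, here is a minimal PySpark sketch of the kind of Spark/SQL aggregation work this role describes. It is illustrative only and not part of the posting: the dataset path, table, and column names (exposures, counterparty_id, exposure_usd) are hypothetical.

    # Illustrative sketch only; not part of the job posting.
    # Shows a simple Spark aggregation of the style the role describes.
    # The path and column names below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("exposure-rollup").getOrCreate()

    # Read a hypothetical exposures dataset (on Databricks this would typically be a Delta table).
    exposures = spark.read.parquet("/data/credit/exposures")

    # Total exposure per counterparty, largest first: a typical risk-style rollup.
    rollup = (
        exposures
        .groupBy("counterparty_id")
        .agg(F.sum("exposure_usd").alias("total_exposure_usd"))
        .orderBy(F.desc("total_exposure_usd"))
    )

    rollup.show(20, truncate=False)
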
A reasonable, good-faith estimate of the pay rate for this position is $75/hr on W2.
Benefits will also be available at the following link: Harvey Nash Benefits
About us:
Harvey Nash is a national, full-service talent management firm specializing in technology positions. Our company was founded with a mission to serve as the talent partner of choice for the information technology industry.
Our company vision has led us to incredible growth and success in a relatively short period of time and continues to guide us today. We are committed to operating with the highest possible standards of honesty and integrity, and with a passionate commitment to our clients, consultants, and employees. We are part of Nash Squared Group, a global professional services organization with over forty offices worldwide.
For more information, please visit us at https://www.harveynashusa.com/