

Senior Data Engineer (Databricks)
Featured Role | Apply direct with Data Freelance Hub
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 9, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #SQL (Structured Query Language) #Azure #Data Lake #Datadog #Data Lakehouse #Git #Automation #AWS (Amazon Web Services) #Spark (Apache Spark) #ML (Machine Learning) #Prometheus #Scala #Data Modeling #Delta Lake #Python #Big Data #REST API #Databricks #MLflow #GCP (Google Cloud Platform) #Cloud #API (Application Programming Interface) #Airflow #ETL (Extract, Transform, Load) #Version Control #REST (Representational State Transfer) #Monitoring #Data Engineering
Role description
One of my clients is looking for a Senior Data Engineer (Databricks) - USA (Remote) for a contract role.
Visa status: US Citizen, Green Card, H4 EAD, or L2 EAD accepted.
No H1-B visa holders at this time.
W2 only; no C2C / no third party.
Please note: Travel to the client location (NY/NJ) is required 2-3 days per month, and all expenses will be covered by the client.
Required Skills & Qualifications:
• 10+ years in data engineering with strong exposure to Databricks and big data tools.
• Proficient in Python or Scala for ETL development.
• Strong understanding of Spark, Delta Lake, and Databricks SQL.
• Familiar with REST APIs, including Databricks REST API usage.
• Cloud Platform: Experience with AWS, Azure, or GCP.
• Data Modeling: Familiarity with data lakehouse concepts and dimensional modeling.
• Version Control & CI/CD: Comfortable using Git and pipeline automation tools.
• Soft Skills: Strong problem-solving abilities, attention to detail, and teamwork.
Nice to Have:
• Certifications: Databricks Certified Data Engineer Associate/Professional.
• Workflow Tools: Experience with Airflow or Databricks Workflows.
• Monitoring: Familiarity with Datadog, Prometheus, or similar tools.
• ML Pipelines: Exposure to MLflow or model integration in pipelines.
If interested, please share your updated resume.