Artech L.L.C.

Sr. Databricks Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Databricks Data Engineer in Portland, OR, offering a contract with a competitive pay rate. Requires 10+ years of data engineering experience, strong retail expertise, and proficiency in Python, SQL, Databricks, Snowflake, and AWS.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 6, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Portland, Oregon Metropolitan Area
-
🧠 - Skills detailed
#MongoDB #Python #Databricks #REST API #Scala #Cloud #Big Data #NoSQL #REST (Representational State Transfer) #Snowflake #Apache Airflow #SQL (Structured Query Language) #Azure #Redis #EC2 #AI (Artificial Intelligence) #Lambda (AWS Lambda) #Data Engineering #Azure Data Factory #ETL (Extract, Transform, Load) #Security #Data Pipeline #Pandas #RDBMS (Relational Database Management System) #ADF (Azure Data Factory) #Spark (Apache Spark) #BI (Business Intelligence) #Talend #S3 (Amazon Simple Storage Service) #Alteryx #Athena #SQL Queries #Hadoop #PySpark #NumPy #MySQL #Airflow #Data Quality #DynamoDB #AWS (Amazon Web Services) #Schema Design #IAM (Identity and Access Management) #Data Processing #AWS Glue #JSON (JavaScript Object Notation) #Programming #Databases #ML (Machine Learning) #Libraries #Triggers
Role description
Job Title: Sr. Databricks Data Engineer
Location: Portland, OR (local or willing to relocate)

Job Description:
We are seeking a highly skilled Databricks Data Engineer with a minimum of 10 years of total experience, including strong expertise in the retail industry. The ideal candidate will design, develop, and optimize data pipelines and architectures that support advanced analytics and business intelligence initiatives. This role requires proficiency in Python, SQL, cloud platforms, and ETL tools within a retail-focused data ecosystem.

Key Responsibilities:
• Design, develop, and maintain scalable data pipelines using Databricks and Snowflake.
• Work with Python libraries such as Pandas, NumPy, PySpark, pyodbc, pymssql, Requests, Boto3, simple-salesforce, and JSON for efficient data processing.
• Optimize and enhance SQL queries, stored procedures, triggers, and schema designs for RDBMS (MSSQL/MySQL) and NoSQL (DynamoDB/MongoDB/Redis) databases.
• Develop and manage REST APIs to integrate various data sources and applications.
• Implement AWS cloud solutions using AWS Data Exchange, Athena, CloudFormation, Lambda, S3, the AWS Console, IAM, STS, EC2, and EMR.
• Use ETL tools such as Apache Airflow, AWS Glue, Azure Data Factory, Talend, and Alteryx to orchestrate and automate data workflows.
• Work with Hadoop and Hive for big data processing and analysis.
• Collaborate with cross-functional teams to understand business needs and build efficient data solutions that drive decision-making in the retail domain.
• Ensure data quality, governance, and security across all data assets and pipelines.

Required Qualifications:
• 10+ years of total experience in data engineering and data processing.
• 6+ years of hands-on experience in Python programming, specifically for data processing and analytics.
• 4+ years of experience with Databricks and Snowflake.
• 4+ years of expertise in SQL development, performance tuning, and RDBMS/NoSQL databases.
• 4+ years of experience designing and managing REST APIs.
• 2+ years of experience with AWS data services.
• 2+ years of hands-on experience with ETL tools such as Apache Airflow, AWS Glue, Azure Data Factory, Talend, or Alteryx.
• 1+ year of experience with Hadoop and Hive.
• Strong understanding of retail industry data needs and best practices.
• Excellent problem-solving, analytical, and communication skills.

Preferred Qualifications:
• Experience with real-time data processing and streaming technologies.
• Familiarity with machine learning and AI-driven analytics.
• Certifications in Databricks, AWS, or Snowflake.

This is an exciting opportunity to work on cutting-edge data engineering solutions in a fast-paced retail environment. If you are passionate about leveraging data to drive business success and innovation, we encourage you to apply!
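The pipeline work described above follows the classic extract-transform-load pattern. As a toy sketch only (not the employer's actual stack): Python's standard library stands in for Databricks/Snowflake, sqlite3 stands in for the warehouse, and the order data is made up for illustration:

```python
import json
import sqlite3

# Hypothetical raw retail orders, standing in for an extracted JSON feed.
RAW_ORDERS = json.dumps([
    {"order_id": 1, "sku": "A-100", "qty": 2, "unit_price": 9.99},
    {"order_id": 2, "sku": "B-200", "qty": 1, "unit_price": 24.50},
    {"order_id": 3, "sku": "A-100", "qty": -1, "unit_price": 9.99},  # invalid row
])

def extract(payload: str) -> list:
    """Extract: parse the raw JSON feed into Python records."""
    return json.loads(payload)

def transform(rows: list) -> list:
    """Transform: enforce a data-quality rule and derive a line total."""
    return [
        (r["order_id"], r["sku"], r["qty"], round(r["qty"] * r["unit_price"], 2))
        for r in rows
        if r["qty"] > 0  # drop rows with non-positive quantities
    ]

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: write cleaned records to a SQL table (sqlite3 stands in for Snowflake)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INT, sku TEXT, qty INT, total REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_ORDERS)), conn)
totals = conn.execute(
    "SELECT sku, SUM(total) FROM orders GROUP BY sku ORDER BY sku"
).fetchall()
```

In a real Databricks/Snowflake deployment the same three stages would typically be PySpark DataFrame reads, transformations, and warehouse writes orchestrated by a tool such as Apache Airflow.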