Digitech

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in San Jose, California (hybrid) on a 12-month contract at up to $65.49/hour. It requires 5+ years of data engineering experience, expertise in Python, SQL, and Azure Databricks, and a background in financial analytics solutions.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
528
-
🗓️ - Date
October 30, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#Computer Science #Cloud #Python #Data Warehouse #Scala #Databases #ETL (Extract, Transform, Load) #Scripting #Azure Databricks #Data Management #Automation #Storage #Data Governance #Data Pipeline #Data Design #Data Engineering #Databricks #Azure #Documentation #SAP #SQL Server #Airflow #Capacity Management #Tableau #Data Quality #BI (Business Intelligence) #Metadata #Data Architecture #SAP Hana #SQL (Structured Query Language) #Microsoft Power BI #Data Modeling
Role description
Job Title: Data Engineer
Location: San Jose, California, USA (Hybrid - minimum 3 days onsite)
Duration: 12 months
Schedule: 40 hours per week | 8 hours per day | Monday to Friday
Compensation: Up to $65.49 USD per hour (W2)
Employment Type: Contract (W2 only - no H-1B / no subcontracting)

Position Overview
The Financial Data Engineer will be responsible for designing and developing analytical solutions that provide deep insights into financial performance metrics and customer journey data. This position requires a technically skilled professional who can combine advanced data engineering practices with strong business acumen to enable self-service analytics, enhance data reliability, and deliver high-quality outcomes within established timelines.

Key Responsibilities
• Design, implement, and maintain analytical data models to support financial and business decision-making.
• Develop scalable data pipelines and ETL processes to source, transform, and integrate data from multiple systems, including SAP HANA, SQL Server, and other relational databases (see the illustrative sketch after this description).
• Collaborate with cross-functional teams to refine data models, ensuring alignment with evolving business and technical requirements.
• Create and maintain comprehensive documentation, including data design, architecture, and workflow specifications.
• Participate in technical reviews, discovery sessions, and modeling exercises to ensure data quality and consistency.
• Monitor, troubleshoot, and optimize performance across data services and applications to ensure stability and scalability.
• Support analytical reporting initiatives for sales, bookings, finance, and subscription-based business areas.
• Drive automation and efficiency in data management through scripting, orchestration, and workflow optimization tools.

Required Qualifications
• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related discipline.
• Minimum of 5 years of professional experience in data engineering or data analytics.
• Proven expertise in Python, SQL/PL-SQL, and Azure Databricks.
• Demonstrated experience in enterprise data warehousing, data modeling, and performance optimization.
• Proficiency in ETL design, development, and testing, including data warehouse metadata modeling and automation.
• Familiarity with Airflow or equivalent orchestration tools (see the DAG sketch after this description).
• Exposure to BI platforms such as Tableau and Power BI.
• Solid understanding of system operations, including capacity management, performance tuning, and troubleshooting (CPU, memory, storage, and network fundamentals).
• Strong problem-solving, analytical, and communication skills with a customer-oriented mindset.

Preferred Skills
• Experience developing analytics solutions for financial, sales, or subscription business domains.
• Advanced understanding of cloud data architectures and data governance practices.
• Ability to work effectively under tight deadlines with minimal supervision while maintaining attention to detail.
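For illustration only, the kind of pipeline named in the responsibilities above might look like the following minimal PySpark sketch on Azure Databricks: pull bookings data from SQL Server over JDBC, derive a financial metric, and land the result in a Delta table for self-service analytics. All hostnames, table and column names, and secret scopes here are hypothetical placeholders, not details from the posting.

from pyspark.sql import functions as F

# `spark` and `dbutils` are provided by the Databricks runtime.
bookings = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://finance-db.example.com:1433;databaseName=sales")
    .option("dbtable", "dbo.bookings")
    .option("user", dbutils.secrets.get("finance", "db-user"))      # secret scope, no literals
    .option("password", dbutils.secrets.get("finance", "db-pass"))
    .load()
)

# Derive net revenue per booking, then aggregate by fiscal quarter and region.
quarterly = (
    bookings
    .withColumn("net_revenue", F.col("gross_amount") - F.col("discount_amount"))
    .groupBy("fiscal_quarter", "region")
    .agg(F.sum("net_revenue").alias("net_revenue"))
)

# Persist as a Delta table that Tableau or Power BI can query directly.
quarterly.write.format("delta").mode("overwrite").saveAsTable("finance.quarterly_net_revenue")

Likewise, a hypothetical Airflow DAG (Airflow 2.4+ with the Databricks provider package) could schedule such a job daily; the job_id and cron expression below are placeholders.

from datetime import datetime
from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="quarterly_net_revenue_etl",
    start_date=datetime(2025, 1, 1),
    schedule="0 6 * * *",  # daily at 06:00
    catchup=False,
) as dag:
    DatabricksRunNowOperator(
        task_id="run_databricks_etl",
        databricks_conn_id="databricks_default",  # connection configured in Airflow
        job_id=12345,  # placeholder Databricks job id
    )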