Future Works, S.C.

Data Engineering Lead

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineering Lead with a contract length of "unknown" and a pay rate of "unknown." It is remote and requires 5+ years of experience in data engineering, proficiency in Python and SQL, and domain knowledge in supply chain or logistics.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Leadership #Libraries #PostgreSQL #Data Quality #Cloud #Strategy #Alation #Business Analysis #AI (Artificial Intelligence) #SQL (Structured Query Language) #Data Processing #Scala #Python #Agile #Quality Assurance #Schema Design #ETL (Extract, Transform, Load) #Data Ingestion #Data Lake #Data Engineering #Stories #Data Architecture #ML (Machine Learning)
Role description
About Future Works
Future Works is a US-focused, AI-native professional services firm building operational AI and data systems. Our founding belief is simple: optimization lowers the need for resources, which increases sustainability and resilience. We are a team of "Time Benders" on a mission to collapse the timelines of operational transformation for high-footprint companies in energy, real estate, data, and beyond. Together, we master velocity. We achieve our mission by building flawless, AI-powered systems. Our technology is already far more advanced than that of boutique services firms, but we are only getting started, and the road from here leads to becoming the world's best Service-as-Software company. Regardless of the specific role, our ideal candidate will always have a keen interest in building symbiotic human-AI systems, both individually and for teams; high energy; high agency; and a certain passion for bending time.

The role
As the Data Engineering Lead, you will build the critical data foundation that makes this transformation possible. You will be accountable for extracting and unifying highly fragmented historical logistics data to power advanced AI simulations. You will work closely with Solutions Architects, Business Analysts, and Full Stack Engineers to establish a secure environment and a canonical schema that translates raw supply chain data into insights that drive speed, efficiency, and fiscal gain.

Key Responsibilities
Data Ingestion & Structuring: Lead the extraction, ingestion, and harmonization of historical freight, order, inventory, and complex carrier rate card data from diverse legacy sources.
Schema Design: Define "Minimum Viable Attributes" and architect a unified, canonical data schema that normalizes client data for downstream consumption.
Pipeline Development: Build robust, reusable ETL/ELT pipelines to facilitate rapid and structured analysis in a highly regulated environment.
Quality Assurance for Modelling: Ensure data quality, completeness, and readiness are sufficient to train predictive machine learning models and run optimization simulations.
Infrastructure Collaboration: Work alongside the Solutions Architect to stand up and operate within a secure, single-tenant sandbox environment that ensures strict data isolation.
Hypothesis Validation: Support the Operations Strategy team by providing the data structures needed to decompose current allocation logic, enabling the identification of wrong-node, mode-mix, and escalation drivers.

What We're Looking For
Required Skills and Experience
5+ years of senior-level data engineering experience with a proven track record of designing data architectures, data lakes/warehouses, and complex ETL/ELT pipelines.
Technical Mastery: Deep proficiency in Python, advanced SQL, and modern data processing libraries.
Cloud Expertise: Hands-on experience deploying secure, high-performance data infrastructure within cloud environments (e.g., PostgreSQL).
Domain Knowledge: Previous experience working with supply chain, logistics, or ERP data (shipment histories, inventory levels, freight billing) is highly preferred.
Agile Execution: Ability to thrive in rapid, hypothesis-driven sprint cycles (12-week models) where the focus is on validation experiments rather than traditional, slow IT delivery.
AI-Native Workflow: Comfort using LLM code assistants (e.g., Cursor, Copilot) and agentic engineering to multiply your productivity and enhance code quality.

Our Culture & Benefits
We believe that breakthrough results are driven by breakthrough experiences. Our culture is built on a foundation of freedom, high performance, and our seven core values.

Our Core Values
AI native - We use AI as a natural extension of our abilities to deliver faster, smarter, and better work.
Delivering joy - We underpromise and overdeliver, creating delight for our clients and our team.
Smooth & fast - We believe speed is the result of precision, flawless systems, and calm execution, not frantic effort.
Selfless - We proactively make work easier for others and measure our success by the success of the team.
Candid - We are direct, honest, and clear with each other, believing that transparent feedback is a gift.
Pursuing excellence - We are obsessed with quality and empower each other to break through boundaries and set new standards.
Ever evolving - We are relentlessly curious and improve every single week, sharing our learnings openly to elevate the entire organization.

Benefits & Perks
Work from anywhere, forever - We are a fully remote and global team. We trust you to manage your time and energy to deliver exceptional results.
Connect deeply - We gather for immersive, all-expenses-paid company retreats in unique locations to connect, learn, and grow together.
Share in the upside - A competitive compensation package including equity, bonuses, and substantial participation in company profits, with a clear growth path to C-Level leadership based on performance.