Optomi

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown", offering a pay rate of "$X". Key skills include 6+ years of data engineering experience, strong SQL, Python/Scala/Java, and experience in FinTech. Remote work is available.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
🗓️ - Date
March 27, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Austin, Texas Metropolitan Area
-
🧠 - Skills detailed
#Data Pipeline #Datadog #Azure #Airflow #Scala #Trino #Data Governance #Compliance #Dimensional Data Models #Kafka (Apache Kafka) #Splunk #AWS (Amazon Web Services) #Batch #Grafana #Spark (Apache Spark) #GCP (Google Cloud Platform) #GitLab #Docker #Java #ML (Machine Learning) #Observability #Snowflake #AI (Artificial Intelligence) #Automation #Data Modeling #Kubernetes #Databricks #Tableau #Data Engineering #Data Integrity #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language)
Role description
DATA FOUNDATIONS ENGINEER

Core Responsibilities

Data Engineering & Architecture
• Design and implement scalable batch and near-real-time data pipelines.
• Develop ETL/ELT workflows optimized for performance and cost.
• Implement dimensional data models and standardize business metrics.
• Instrument APIs and user journeys to capture behavioral and transactional data.

Data Governance & Quality
• Ensure data integrity, governance, privacy, and compliance.
• Maintain reliability and availability of mission-critical systems.

ML & Advanced Use Cases
• Support ML/GenAI pipelines (feature engineering, retraining workflows).
• Enable RAG-based data preparation and AI-driven automation.

Required Qualifications
• 6+ years of experience in data engineering for analytics or ML systems.
• Strong SQL proficiency.
• Experience in Python, Scala, or Java.
• Hands-on experience with Spark, Kafka, and Airflow (or similar).
• Strong understanding of data modeling and lakehouse architectures (e.g., Iceberg).
• Experience with AWS, Azure, or GCP.
• Comfortable participating in a rotating on-call.
• Experience with Snowflake, Databricks, Trino, OLAP/NRT systems, and Superset or Tableau.
• Familiarity with CI/CD, data observability, and infrastructure-as-code.
• Exposure to MLOps and GenAI/RAG pipelines.
• Hands-on experience with LLMs (prompt engineering, fine-tuning, RAG).
• Experience in the FinTech, wallet, or payments domain.
• Nice-to-haves: Docker, Kubernetes, Splunk, Grafana, Scala, GitLab, Spinnaker, Datadog, Rust, Go, or MLOps/GenAI experience.
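For a sense of the ETL and dimensional-modeling work the responsibilities above describe, here is a minimal, self-contained sketch in Python using only the standard library. All table names, column names, and sample records are invented for illustration; a production pipeline at this scale would use tools like Spark, Kafka, and Airflow rather than in-memory SQLite.

```python
import sqlite3

# Hypothetical raw payment events (extract step). In practice these
# would arrive from an instrumented API or a Kafka topic.
raw_events = [
    {"user": "alice", "amount_cents": 1250, "currency": "USD"},
    {"user": "bob",   "amount_cents": 340,  "currency": "USD"},
    {"user": "alice", "amount_cents": 990,  "currency": "USD"},
]

conn = sqlite3.connect(":memory:")
# A tiny star schema: one dimension table, one fact table.
conn.executescript("""
    CREATE TABLE dim_user (
        user_id INTEGER PRIMARY KEY,
        name    TEXT UNIQUE
    );
    CREATE TABLE fact_payment (
        user_id      INTEGER REFERENCES dim_user(user_id),
        amount_cents INTEGER,
        currency     TEXT
    );
""")

# Transform + load: resolve each event's user to a surrogate key,
# then append the measure to the fact table.
for ev in raw_events:
    conn.execute("INSERT OR IGNORE INTO dim_user (name) VALUES (?)",
                 (ev["user"],))
    (uid,) = conn.execute("SELECT user_id FROM dim_user WHERE name = ?",
                          (ev["user"],)).fetchone()
    conn.execute("INSERT INTO fact_payment VALUES (?, ?, ?)",
                 (uid, ev["amount_cents"], ev["currency"]))

# A standardized business metric: total spend per user, in dollars.
totals = dict(conn.execute("""
    SELECT d.name, SUM(f.amount_cents) / 100.0
    FROM fact_payment f
    JOIN dim_user d USING (user_id)
    GROUP BY d.name
"""))
print(totals)  # {'alice': 22.4, 'bob': 3.4}
```

The same shape scales up directly: the dimension insert becomes a slowly-changing-dimension merge, the fact insert becomes a partitioned batch append, and the metric query becomes a governed, shared definition rather than ad-hoc SQL.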