

Tundra Technical Solutions
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a contract of unknown length, paying a day rate of $512 USD. Required skills include advanced SQL, ETL/ELT pipeline experience, and Python; industry experience in BSA/AML or compliance is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
512
-
🗓️ - Date
February 21, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Engineering #Datasets #GitHub #Scala #Snowflake #Airflow #Data Modeling #Compliance #Monitoring #Data Integration #Data Quality #Databricks #Python #BI (Business Intelligence) #Tableau #SQL (Structured Query Language) #Looker #Observability #dbt (data build tool) #ETL (Extract, Transform, Load) #Statistics #Automation #Documentation
Role description
What you’ll be doing:
Build reliable compliance data foundations
• Design, build, and maintain curated datasets and data models (e.g., case, alert, entity, transaction, and risk features) to support compliance controls and analytics.
• Own data quality: implement validation checks, monitoring, lineage, and documentation to ensure datasets are accurate, complete, and audit-ready (a minimal check of this kind is sketched after this list).
• Partner with engineering teams to close upstream data gaps and standardize the events/instrumentation needed for compliance use cases.
Enable and enhance detection systems (TMS/EDD/KYC/Screening)
• Translate compliance control requirements into scalable data structures, feature sets, and output tables that support detection logic and downstream triage workflows.
• Analyze detection outputs to identify opportunities to improve signal quality and operational efficiency (e.g., reduce non-actionable noise while protecting sensitivity).
• Support controlled rollouts of detection enhancements by defining success metrics, measurement approaches, and post-deploy monitoring.
Measure control effectiveness and produce auditable reporting
• Develop measurement frameworks for compliance control effectiveness (coverage, timeliness, stability, and outcome-based metrics).
• Build repeatable reporting pipelines and dashboards that are consumable by Compliance leaders, operations teams, and auditors.
• Contribute to governance artifacts (definitions, metric logic, runbooks) to ensure metrics are consistent and defensible.
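The data-quality responsibilities above are typically codified as executable checks. Purely as a minimal, hypothetical sketch (the curated.alerts and curated.transactions tables and their columns are invented for this example, and any DB-API-style cursor is assumed):

```python
from dataclasses import dataclass

@dataclass
class QualityCheck:
    """A named SQL predicate that returns offending rows; zero rows means the check passes."""
    name: str
    sql: str

# Illustrative checks against hypothetical curated compliance tables.
CHECKS = [
    QualityCheck("alerts_have_entities",
                 "SELECT alert_id FROM curated.alerts WHERE entity_id IS NULL"),
    QualityCheck("no_future_transactions",
                 "SELECT txn_id FROM curated.transactions WHERE txn_ts > CURRENT_TIMESTAMP"),
]

def run_checks(cursor) -> list[str]:
    """Execute each check and return the names of those that found bad rows."""
    failures = []
    for check in CHECKS:
        cursor.execute(check.sql)
        if cursor.fetchone() is not None:  # any offending row means the check failed
            failures.append(check.name)
    return failures
```

In practice, checks like these would run on a schedule and feed the monitoring, lineage, and documentation the role calls for.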
Core qualifications
• 2–6+ years of relevant experience in analytics engineering, data engineering, BI engineering, or data-heavy technical roles (level dependent).
• Advanced SQL skills (complex transformations, performance tuning, data quality checks).
• Experience building and maintaining ETL/ELT pipelines and data models (e.g., dbt, Airflow, or equivalent orchestration/transformation tools); a minimal orchestration sketch follows this list.
• Strong data modeling fundamentals (modular design, reusable patterns, dimensional modeling concepts).
• Working Python skills for automation, pipelines, and analysis.
• Demonstrated ability to execute on ambiguous, cross-functional projects with clear written and verbal communication.
• High ownership mindset with strong attention to detail and comfort operating in regulated environments.
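Since the posting names dbt and Airflow as example tools, here is a minimal Airflow 2.x-style sketch of the kind of pipeline implied; the DAG id, task names, and callables are placeholders, not anything specified by the role:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_events():
    """Placeholder: pull raw events from upstream systems."""

def build_models():
    """Placeholder: transform raw events into curated compliance models."""

def validate_outputs():
    """Placeholder: run data-quality checks before publishing."""

with DAG(
    dag_id="compliance_curated_models",  # hypothetical name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    transform = PythonOperator(task_id="build_models", python_callable=build_models)
    validate = PythonOperator(task_id="validate_outputs", python_callable=validate_outputs)

    # Ordering mirrors an ELT-plus-validation flow.
    extract >> transform >> validate
```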
Nice to haves
• Domain experience in BSA/AML, Transaction Monitoring, sanctions screening, KYC/EDD, fraud, or investigations.
• Familiarity with Snowflake/Databricks and modern warehouse/lakehouse patterns.
• Experience with GitHub-based workflows, CI/CD for data, and data observability tools.
• Experience building dashboards in Looker/Tableau/Superset (or equivalent).
• Applied statistics/experimentation, root-cause analysis, or detection performance measurement (a toy metric sketch follows this list).
• Interest in crypto, blockchain analytics, or on-chain/off-chain data integration.
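For the detection performance measurement item, one common outcome-based metric is alert precision: the share of alerts investigators found actionable. A toy sketch, with disposition labels invented for the example:

```python
def alert_precision(alerts: list[dict]) -> float:
    """Fraction of alerts whose disposition was actionable (hypothetical labels)."""
    if not alerts:
        return 0.0
    actionable = sum(1 for a in alerts if a["disposition"] in {"escalated", "sar_filed"})
    return actionable / len(alerts)

# Hypothetical before/after comparison for a detection-rule change.
baseline = [{"disposition": d} for d in ("closed", "closed", "escalated", "closed")]
candidate = [{"disposition": d} for d in ("escalated", "closed", "sar_filed")]

print(f"baseline precision:  {alert_precision(baseline):.2f}")   # 0.25
print(f"candidate precision: {alert_precision(candidate):.2f}")  # 0.67
```

Coverage, timeliness, and stability metrics from the description would follow the same pattern: a precise definition, then a small, auditable computation.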






