InfoCepts

Lead Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer with a contract length of 6-12 months, offering up to $85.00 per hour. It requires onsite work in Raleigh, NC, Dallas, TX, or Phoenix, AZ, and expertise in Snowflake, dbt Cloud, and data ingestion frameworks.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date
March 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Phoenix, AZ 85074
-
🧠 - Skills detailed
#Looker #Scala #VPC (Virtual Private Cloud) #Storage #Programming #PCI (Payment Card Industry) #AWS (Amazon Web Services) #dbt (data build tool) #Vault #SQL (Structured Query Language) #Data Quality #Compliance #Airflow #DevOps #ThoughtSpot #Documentation #IAM (Identity and Access Management) #Replication #Anomaly Detection #ETL (Extract, Transform, Load) #Statistics #SnowPipe #Data Ingestion #Data Vault #UAT (User Acceptance Testing) #Collibra #Microsoft Power BI #Databases #GIT #Cloud #Leadership #Datadog #Classification #BI (Business Intelligence) #Data Engineering #Terraform #Observability #Data Pipeline #Macros #Schema Design #Python #Qlik #S3 (Amazon Simple Storage Service) #Clustering #Kafka (Apache Kafka) #Automation #Splunk #Snowflake #Security #Deployment
Role description
Skills: Digital: Qlik Sense, Digital: Snowflake, Data Build Tool
Experience Required: 6-8 years
Location: ONSITE - Raleigh NC, Dallas TX, Phoenix AZ
Description: You will design, build, and operate secure, audited, and cost-efficient data pipelines on Snowflake, from raw ingestion to Data Vault 2.0 models and onward to business-friendly consumption layers (mart/semantic). You'll use Qlik/Glue/ETLs for ingestion, dbt Cloud for modeling/testing, MWAA/Airflow and/or dbt Cloud's orchestration for scheduling, and Terraform (with HashiCorp practices) for infrastructure-as-code. The ideal candidate must have hands-on experience with data ingestion frameworks and with Snowflake platform database/schema design, security, networking, and governance that satisfy regulatory and compliance audit requirements.
Responsibilities
Modeling & Warehousing
• Design and implement scalable data ingestion frameworks.
• Implement Raw → DV 2.0 (Hubs/Links/Satellites) → Consumption patterns in dbt Cloud with robust tests (unique/not null/relationships/freshness).
• Build performant Snowflake objects (tables, streams, tasks, materialized views) and optimize clustering/micro-partitioning.
Orchestration
• Author and operate Airflow (MWAA) DAGs and/or dbt Cloud jobs; design idempotent, rerunnable, lineage-tracked workflows with SLAs/SLOs (see the sketch after this list).
Security & Governance
• Enforce RBAC/ABAC, network policies/rules, masking/row access policies, tags, data classification, and least-privilege role hierarchies.
• Operationalize audit-ready controls (change management, approvals, runbooks, separation of duties, evidence capture).
IaC & DevOps
• Use CI/CD flows, Terraform, and Git branching for code promotion.
Data Quality & Observability
• Bake tests into dbt; implement contract checks, reconciliations, and anomaly alerts.
• Monitor with Snowflake ACCOUNT_USAGE/INFORMATION_SCHEMA and event tables; forward logs/metrics to SIEM/APM (e.g., Splunk, Datadog).
Cost & Performance
• Right-size warehouses; configure auto-suspend/auto-resume, multi-cluster for concurrency, resource monitors, and query optimization.
Compliance
• Build controls and evidence to satisfy internal audit and SOX/GLBA/FFIEC/PCI-like expectations.
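For illustration, the following is a minimal sketch of the kind of idempotent, rerunnable MWAA workflow the Orchestration responsibility describes, assuming the Airflow Amazon, Snowflake, and dbt Cloud provider packages are installed. The bucket, stage, table, connection IDs, and job ID are hypothetical placeholders, not details from this posting.

```python
# Hypothetical MWAA DAG: land files in S3, COPY into a Snowflake raw table,
# then run a dbt Cloud job that builds the DV 2.0 models and their tests.
import pendulum
from airflow import DAG
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator
from airflow.providers.dbt.cloud.operators.dbt import DbtCloudRunJobOperator

with DAG(
    dag_id="raw_ingest_to_dv",  # illustrative name
    start_date=pendulum.datetime(2026, 1, 1, tz="UTC"),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2},
) as dag:
    # Wait for the day's files; {{ ds }} keys each run to its logical date,
    # so backfills and reruns always target the same partition.
    wait_for_files = S3KeySensor(
        task_id="wait_for_files",
        bucket_name="example-raw-bucket",
        bucket_key="landing/{{ ds }}/*.parquet",
        wildcard_match=True,
        aws_conn_id="aws_default",
    )

    # COPY INTO consults Snowflake's load metadata and skips files it has
    # already ingested, which makes reruns of this task idempotent.
    load_raw = SnowflakeOperator(
        task_id="load_raw",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO RAW.LANDING.ORDERS
            FROM @RAW.LANDING.ORDERS_STAGE/{{ ds }}/
            FILE_FORMAT = (TYPE = PARQUET)
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
        """,
    )

    # Trigger the dbt Cloud job and poll until it finishes.
    run_dbt = DbtCloudRunJobOperator(
        task_id="run_dbt_job",
        dbt_cloud_conn_id="dbt_cloud_default",
        job_id=12345,  # placeholder job ID
        check_interval=60,
        timeout=3600,
    )

    wait_for_files >> load_raw >> run_dbt
```

Because every task is keyed to the run's logical date and the load step is a metadata-aware COPY, the whole DAG can be retried or backfilled without duplicating rows.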
Qualifications
• Bachelor's Degree and 6 years of experience in advanced data engineering, enterprise architecture, and project leadership, OR
• High School Diploma or GED and 10 years of experience in advanced data engineering, enterprise architecture, and project leadership.
Preferred
Snowflake Platform (hands-on, production):
• Secure account setup: databases/schemas/stages, RBAC/ABAC role design, grants, network policies/rules, storage integrations.
• Data protection: Dynamic Data Masking, Row Access Policies, tag-based masking, PII classification/lineage tagging.
• Workloads & features: Streams/Tasks, Snowpipe, external tables, file formats, copy options, retry & dedupe patterns.
• Operations: warehouse sizing, multi-cluster, resource monitors, Time Travel & Fail-safe, cross-region/account replication.
• Networking concepts: AWS PrivateLink/S3 access patterns, external stages, (at least) high-level familiarity with VPC/DNS/endpoint flows.
dbt Cloud:
• Dimensional + Data Vault 2.0 modeling in dbt (H/L/S), snapshots, seeds, exposures, Jinja/macros, packages, artifacts.
• Testing and documentation discipline; deployment environments (DEV/QA/UAT/PROD) and job orchestration.
Orchestration:
• Airflow (MWAA): Operators/Sensors (dbt, Snowflake, S3), XComs, SLAs, retries, backfills, alerting, and modular DAG design.
• Experience deciding when to run in dbt Cloud orchestration vs. Airflow, and integrating both cleanly.
Data Quality & Observability:
• Contract tests, reconciliations, freshness SLAs, anomaly detection; surfacing lineage and test results to stakeholders.
• Query tuning (profiling, pruning, statistics awareness, result caching).
Audit & Controls:
• Change control with approvals/evidence, break-glass procedures, production access separation, audit log retention/immutability.
• Runbooks, PIR/RCAs, control mapping (e.g., to SOX/GLBA/PCI-like controls where relevant).
Programming & Cloud:
• Python (ETL utils, Airflow tasks), SQL (advanced), and AWS basics (S3, IAM, CloudWatch, MWAA fundamentals).
Bonus Skills:
• Snowflake governance: data classification at scale, Universal Search, tags + masking automation.
• Iceberg/external table strategies; Kafka or event-driven ingestion patterns.
• Great Expectations; Monte Carlo/Anomalo/Atlan/Collibra/BigID integrations.
• dbt: advanced macros, dbt mesh, custom materializations, Slim CI, state comparison, deferral, exposures to BI lineage.
• BI/Semantic: ThoughtSpot/Looker/Power BI metric-layer design; semantic modeling concepts.
• Packaging & distribution: internal dbt packages, reusable Terraform modules, cookie-cutter project templates.
• Platform engineering: FinOps for Snowflake, cost charge-back/show-back, warehouse auto-tuning utilities.
• Security engineering: SCIM/SSO (Okta), MFA patterns, service-account hardening, ephemeral credentials.
• SRE practices: SLIs/SLOs, on-call runbooks, incident management.
• MUST HAVE: Bachelor's Degree and 6 years of experience in advanced data engineering, enterprise architecture, and project leadership, OR High School Diploma or GED and 10 years of experience in advanced data engineering, enterprise architecture, and project leadership.
Pay: Up to $85.00 per hour
Work Location: In person
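As a companion illustration of the warehouse sizing, auto-suspend/auto-resume, multi-cluster, and resource-monitor controls named above, here is a minimal sketch assuming snowflake-connector-python. The account, user, object names, and credit quota are hypothetical, not details from this posting.

```python
# Hypothetical cost-control setup: a resource monitor capping monthly spend,
# attached to a right-sized, auto-suspending, multi-cluster warehouse.
import snowflake.connector

DDL = [
    # Cap monthly credits; notify at 80% and suspend the warehouse at 100%.
    """CREATE OR REPLACE RESOURCE MONITOR transform_rm
       WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 80 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND""",
    # Small default size, 60-second auto-suspend, and up to 3 clusters
    # for concurrency spikes, all governed by the monitor above.
    """CREATE OR REPLACE WAREHOUSE transform_wh
       WAREHOUSE_SIZE = XSMALL AUTO_SUSPEND = 60 AUTO_RESUME = TRUE
       MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 3 SCALING_POLICY = STANDARD
       RESOURCE_MONITOR = transform_rm""",
]

with snowflake.connector.connect(
    account="example_account",       # placeholder account locator
    user="example_user",             # placeholder user
    authenticator="externalbrowser", # SSO login; swap for your auth method
) as conn:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
```

Keeping these statements in version-controlled code (or equivalent Terraform modules) also produces the change-management evidence trail that the audit and compliance expectations above call for.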