

Solution Architect (12hrs/week)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Solution Architect focused on Snowflake data ingestion and warehousing, working 12 hours per week, fully remote. The contract runs until the end of August 2025 at $70-80/hr. Key requirements include Fivetran expertise, strong SQL, and 12+ years of data engineering experience.
Country: United States
Currency: $ USD
Day rate: $640
Date discovered: July 17, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Data Quality #Monitoring #Data Pipeline #Consulting #Data Engineering #Data Ingestion #Fivetran #Workday #Vault #Computer Science #dbt (data build tool) #Azure #GCP (Google Cloud Platform) #ETL (Extract, Transform, Load) #Data Management #Metadata #SQL (Structured Query Language) #Cloud #AWS (Amazon Web Services) #Data Integration #Scala #Data Architecture #Snowflake #Data Vault #Data Warehouse #Airflow
Role description
Job Title: Solution Architect
• No C2C / No Sponsorship
• Location: Remote
Hours: 12 hours per week; some work can be done after hours, but availability during business hours is required for some calls.
Duration: Contract until the end of August 2025, with the opportunity to be assigned to other project codes.
Pay: $70-80/hr
SOLUTIONS ARCHITECT - SNOWFLAKE DATA INGESTION & WAREHOUSING (FIVETRAN FOCUS)
Snowflake Professional Services enables enterprise customers to unlock the full value of the Snowflake platform by providing expert guidance on data integration, platform design, and scalable data architecture. We are seeking a Senior Solutions Architect with hands-on expertise in Fivetran-based ingestion pipelines and Snowflake data warehousing best practices to lead and support customers through successful, production-ready implementations.
This role is fully focused on data movement, modeling, and operationalization: helping teams architect high-quality, maintainable pipelines from diverse sources into well-designed Snowflake environments.
AS A SENIOR SOLUTIONS ARCHITECT, YOU WILL:
• Serve as a technical lead on data ingestion and warehousing engagements, specializing in Fivetran-to-Snowflake pipelines
• Design, implement, and optimize data ingestion workflows across systems like Salesforce, NetSuite, Workday, and custom sources via Fivetran
• Architect scalable and performant Snowflake data models using dimensional, normalized, or data vault approaches
• Guide customers through data pipeline orchestration, transformation (ELT), and dependency management (see the sketch after this list)
• Provide best practices on data quality, metadata management, lineage, and monitoring
• Lead platform configurations: warehouse sizing, resource scaling, access controls, and cost optimization
• Collaborate with customer data teams to design future-state architecture and build detailed technical roadmaps
• Work across Snowflake internal teams to ensure platform alignment and adoption of new features
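As an illustration of the ELT work described above, the sketch below deduplicates rows that Fivetran lands in a raw Snowflake schema into a staging view, keeping the most recently synced version of each record. All identifiers (database, schema, table names, and connection parameters) are hypothetical examples, not taken from the posting, and the _FIVETRAN_SYNCED metadata column is assumed to follow Fivetran's usual naming:

```python
# Hedged sketch: build a deduplicated staging view over a Fivetran-landed table.
# All database/schema/table names and connection parameters are hypothetical.
import snowflake.connector

# Fivetran typically adds a _FIVETRAN_SYNCED timestamp to landed tables; here we
# keep only the most recently synced row per business key (ID).
DEDUP_SQL = """
CREATE OR REPLACE VIEW ANALYTICS.STAGING.STG_SALESFORCE_ACCOUNT AS
WITH ranked AS (
    SELECT
        a.*,
        ROW_NUMBER() OVER (
            PARTITION BY a.ID
            ORDER BY a._FIVETRAN_SYNCED DESC
        ) AS rn
    FROM RAW.SALESFORCE.ACCOUNT AS a
)
SELECT * EXCLUDE (rn)
FROM ranked
WHERE rn = 1
"""

def build_staging_view() -> None:
    # Placeholder credentials; in practice these come from a secrets manager
    # or environment configuration, not literals in code.
    conn = snowflake.connector.connect(
        account="example_account",
        user="example_user",
        password="example_password",
        warehouse="TRANSFORM_WH",
        role="TRANSFORMER",
    )
    try:
        conn.cursor().execute(DEDUP_SQL)
    finally:
        conn.close()

if __name__ == "__main__":
    build_staging_view()
```

In practice a transformation like this would more commonly live in a dbt model scheduled by an orchestrator such as Airflow; the standalone script simply keeps the sketch self-contained.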
REQUIRED SKILLS & QUALIFICATIONS:
• Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience
• 12+ years in data engineering, analytics, or architecture, including significant experience with Snowflake implementations
• Strong hands-on expertise with Fivetran, including connector setup, sync optimization, and schema change management
• Expert-level SQL skills, including window functions, CTEs, performance tuning, and analytics functions
• Proficiency in Snowflake platform features, including role-based access control (RBAC), data sharing, secure views, and time travel
• Solid understanding of modern ELT design patterns and orchestration tools such as dbt or Airflow
• Experience designing high-volume, high-performance data pipelines in a cloud-native environment
• Familiarity with data warehouse testing frameworks and implementation of data validation checks (a minimal sketch follows this list)
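As a minimal sketch of the kind of data validation checks mentioned above (table names, columns, and freshness windows are hypothetical, and a real engagement would more likely use a testing framework such as dbt tests), two simple checks run directly against Snowflake might look like this:

```python
# Hedged sketch of simple data validation checks executed in Snowflake.
# Table names, columns, thresholds, and the connection are hypothetical.
import snowflake.connector

CHECKS = {
    # Fails if the landed table received no rows in the last 24 hours.
    "freshness": """
        SELECT COUNT(*) = 0
        FROM RAW.SALESFORCE.ACCOUNT
        WHERE _FIVETRAN_SYNCED >= DATEADD(hour, -24, CURRENT_TIMESTAMP())
    """,
    # Fails if the assumed primary key column is ever NULL.
    "not_null_id": """
        SELECT COUNT(*) > 0
        FROM RAW.SALESFORCE.ACCOUNT
        WHERE ID IS NULL
    """,
}

def run_checks(conn) -> list[str]:
    """Run each check query and return the names of the checks that failed."""
    failures = []
    cur = conn.cursor()
    for name, sql in CHECKS.items():
        # Each query returns a single BOOLEAN that is TRUE when the check fails.
        failed = cur.execute(sql).fetchone()[0]
        if failed:
            failures.append(name)
    return failures
```

Each query is written to return TRUE on failure, so the caller can collect failing check names for alerting or to fail a pipeline run.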
PREFERRED EXPERIENCE:
• Experience with dbt for transformation logic and model versioning
• Experience working with third-party APIs or webhook-based integrations into Snowflake
• Cloud certifications (AWS, Azure, or GCP), especially related to data engineering
• Experience in customer-facing roles within product or consulting organizations
• Snowflake SnowPro Core and Advanced Architect or Data Engineer certifications