Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect on a 6+ month contract, offering remote work in the U.S. It requires 10+ years in data engineering; expertise in Snowflake, dbt, and Tableau; strong SQL skills; and experience with Python, JavaScript, and data warehousing.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 18, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
Remote
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
United States
🧠 - Skills detailed
#JavaScript #Data Quality #Data Engineering #dbt (data build tool) #HTML (Hypertext Markup Language) #Visualization #Scala #Code Reviews #ETL (Extract, Transform, Load) #GitHub #Tableau #Version Control #SQL (Structured Query Language) #Python #Oracle #Snowflake #Security #Jira #Cloud #Data Architecture #Workday #Data Pipeline #SQL Queries #Computer Science #Data Access #Data Ingestion
Role description
The client is seeking a hands-on Data Architect to lead the design and development of its enterprise-wide data platform. You will architect scalable solutions from data ingestion to visualization, enabling actionable insights across the business. As a strategic and technical leader, you'll collaborate with executives and cross-functional teams to gather requirements, define data models, and build analytics solutions using modern tools such as Snowflake, dbt, and Tableau. This is a high-impact role requiring both architectural vision and the ability to code and optimize the data stack directly.

Responsibilities
• Architect and implement data pipelines from ingestion to visualization using Snowflake, dbt, and Tableau.
• Model and transform non-uniform data for analytics and dashboards.
• Optimize platform performance and ensure data quality, reliability, and security (see the sketch at the end of this listing).
• Develop reusable components, SQL queries, and scripts using Python, JavaScript, and HTML.
• Collaborate across teams to gather requirements and deliver data-driven insights.
• Provide mentorship and code reviews for junior engineers.
• Manage data access controls, CI/CD pipelines, and version control in GitHub.

Qualifications
• Bachelor's degree in Computer Science or related field.
• 10+ years of data engineering experience; 8+ years with data warehousing (Snowflake preferred).
• Strong SQL skills; experience with dbt Cloud and semantic modeling.
• Expertise in Tableau dashboarding and data source creation.
• 2+ years of Python; familiarity with JavaScript/HTML preferred.
• Experience integrating with tools such as Salesforce, Oracle, Workday, Jira, and Anaplan.
• Strong communication skills and stakeholder engagement experience.

Type: Contract (6+ months)
Location: United States (Remote)
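To give a concrete sense of the data-quality responsibility above, here is a minimal Python sketch of an automated check run against Snowflake. It is a sketch only: the table names (analytics.staging.stg_orders), the check definitions, and the environment-variable credentials are assumptions for illustration, not details from the posting; it uses the snowflake-connector-python client.

```python
# Minimal sketch of automated data-quality checks against Snowflake.
# Assumptions (not from the posting): table names, check SQL, and
# environment-variable credentials; requires snowflake-connector-python.
import os

import snowflake.connector

# Each check maps a name to a query that counts rows violating the rule.
CHECKS = {
    "stg_orders_null_customer_id":
        "select count(*) from analytics.staging.stg_orders where customer_id is null",
    "stg_orders_negative_amount":
        "select count(*) from analytics.staging.stg_orders where amount < 0",
}


def run_checks() -> int:
    """Run every check and return how many of them failed."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "ANALYTICS_WH"),
    )
    failed = 0
    try:
        cur = conn.cursor()
        for name, sql in CHECKS.items():
            bad_rows = cur.execute(sql).fetchone()[0]
            if bad_rows:
                failed += 1
                print(f"{name}: FAIL ({bad_rows} bad rows)")
            else:
                print(f"{name}: OK")
    finally:
        conn.close()
    return failed


if __name__ == "__main__":
    # A non-zero exit code lets a CI/CD step (e.g., in GitHub Actions) flag failures.
    raise SystemExit(1 if run_checks() else 0)
```

In a dbt Cloud setup like the one described, checks of this kind would more likely live as dbt tests in the project itself; the script simply shows the shape of the work.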