

Senior Data Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect with a contract length of "unknown" and a day rate of $680 USD. It is fully remote, requiring 10–12+ years of experience in data engineering, advanced SQL proficiency, and expertise in BigQuery on GCP.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
680
🗓️ - Date discovered
June 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Philadelphia, PA
🧠 - Skills detailed
#Cloud #AWS (Amazon Web Services) #Data Warehouse #Debugging #Data Quality #BigQuery #Azure #Datasets #GCP (Google Cloud Platform) #Data Pipeline #Scala #Automation #SQL (Structured Query Language) #SQL Server #Data Architecture #Data Engineering #Agile #Airflow
Role description
Data Architect / Senior Data Engineer
Location: Fully remote - NO C2C for this position
Overview:
We’re seeking an experienced Data Architect / Senior Data Engineer to help design and implement robust data solutions that support business goals and enable informed decision-making. This role requires both strategic architectural insight and hands-on engineering skills to develop scalable data systems on cloud platforms. You’ll work closely with cross-functional teams to ensure data quality, drive solutioning efforts, and support enterprise data initiatives.
Key Responsibilities:
• Design and implement end-to-end data architecture and data flows aligned with business and technical requirements
• Build and optimize data warehouses and data pipelines, particularly in BigQuery (GCP preferred)
• Process and manipulate large, complex datasets using advanced SQL
• Review, troubleshoot, and optimize data code for accuracy and performance
• Create proofs of concept (POCs) to validate architectural approaches and data solutions
• Collaborate with business stakeholders and technical teams to define data strategies and drive adoption
• Work within Agile development frameworks and contribute to sprint planning and delivery
• Use tools such as Airflow for orchestration and automation of data workflows
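As a rough illustration of the kind of advanced SQL work the responsibilities above describe (window functions for deduplicating and ranking records in a warehouse table), the sketch below uses Python's built-in sqlite3 so it is self-contained; the table, columns, and data are invented for the example, and in this role the equivalent query would run against BigQuery, where `QUALIFY` shortens the pattern.

```python
import sqlite3

# In-memory database standing in for a warehouse table; all names here
# are hypothetical, not taken from the actual role or client systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE provider_visits (
    provider_id INTEGER,
    visit_date  TEXT,
    charge      REAL
);
INSERT INTO provider_visits VALUES
    (1, '2025-06-01', 120.0),
    (1, '2025-06-15', 200.0),
    (2, '2025-06-03',  80.0),
    (2, '2025-06-20',  95.0),
    (2, '2025-06-21',  60.0);
""")

# Window function: keep only each provider's most recent visit --
# a common dedup/ranking pattern when processing large datasets.
latest = conn.execute("""
    SELECT provider_id, visit_date, charge FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY provider_id
                   ORDER BY visit_date DESC
               ) AS rn
        FROM provider_visits
    ) WHERE rn = 1
    ORDER BY provider_id
""").fetchall()

print(latest)  # [(1, '2025-06-15', 200.0), (2, '2025-06-21', 60.0)]
```

In BigQuery the inner subquery could be replaced by a single `SELECT ... QUALIFY ROW_NUMBER() OVER (...) = 1`, but the windowed-subquery form shown here is portable across SQL Server and most warehouses named in this posting.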
Required Qualifications:
• 10–12+ years of experience in data engineering, architecture, or solutioning roles
• Strong hands-on experience with SQL, data warehouse development, and cloud platforms
• Advanced SQL proficiency for processing large datasets and data troubleshooting
• Experience with BigQuery and the Google Cloud Platform (GCP) is strongly preferred
• Will also consider candidates with AWS or Azure experience if other qualifications align
• Strong knowledge of Airflow, SQL Server, and enterprise data warehousing practices
• Proven experience reviewing and debugging data code
• Excellent communication skills – able to engage with both technical and business stakeholders
• Experience working in an Agile environment
Preferred Qualifications:
• Background in healthcare or provider data systems
• Experience building and validating architectural proofs of concept