

Jobs via Dice
System Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a System Architect on a 12+ month remote contract, offering competitive pay. Key skills include AWS architecture, data pipeline design (EMR/Spark, AWS Glue), SQL proficiency, and strong experience in analytics solutions. AWS certifications preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 6, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#PySpark #ETL (Extract, Transform, Load) #Leadership #Data Management #Documentation #Indexing #Data Pipeline #Compliance #SQL (Structured Query Language) #AWS Glue #Normalization #Snowflake #Cloud #Security #Data Modeling #Monitoring #Data Governance #DevOps #Schema Design #Batch #Scala #Data Engineering #Data Lake #Metadata #AWS (Amazon Web Services) #BI (Business Intelligence) #Consulting #Data Lineage #Strategy #Amazon Redshift #Agile #Data Quality #Spark (Apache Spark) #Terraform #S3 (Amazon Simple Storage Service) #Data Processing #Storage #Infrastructure as Code (IaC) #Data Ingestion #Redshift #IAM (Identity and Access Management) #Automation #Data Lifecycle
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Fourways Consulting Services, is seeking the following. Apply via Dice today!
Title: System Architect
Location: Remote Position
Duration: 12+ Month Contract
Responsibilities:
Architecture & Technical Leadership
• Provide AWS and VAEC architecture leadership for an analytics team supporting VA LGY initiatives.
• Define end-to-end reference architectures, design patterns, and engineering standards for analytics solutions.
• Lead architecture reviews, ensuring solutions meet security, compliance, scalability, resiliency, and performance requirements.
Data Platform & Pipeline Design
• Architect and design data ingestion and transformation pipelines using EMR/Spark, AWS Glue, and related AWS services.
• Design and optimize Amazon Redshift architectures, including cluster sizing, distribution/sort keys, workload management, and query tuning.
• Create and maintain data models (conceptual/logical/physical) aligned to analytics and reporting use cases.
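The Redshift design choices named above (distribution and sort keys) can be sketched as table DDL. The schema below, a hypothetical sales fact table, is an illustration invented for this sketch, not something from the posting.

```python
# Sketch of Redshift table DDL illustrating distribution/sort key choices.
# The table and column names are hypothetical examples.

def build_fact_table_ddl(table: str, dist_key: str, sort_keys: list) -> str:
    """Assemble a CREATE TABLE statement with KEY distribution and a compound sort key."""
    columns = ",\n    ".join([
        "sale_id BIGINT",
        "customer_id BIGINT",      # high-cardinality join column -> candidate DISTKEY
        "sale_date DATE",          # common range filter -> leading SORTKEY
        "amount DECIMAL(12,2)",
    ])
    return (
        f"CREATE TABLE {table} (\n    {columns}\n)\n"
        f"DISTSTYLE KEY\nDISTKEY ({dist_key})\n"
        f"COMPOUND SORTKEY ({', '.join(sort_keys)});"
    )

ddl = build_fact_table_ddl("analytics.fact_sales",
                           "customer_id", ["sale_date", "customer_id"])
print(ddl)
```

Choosing the most frequently joined high-cardinality column as the DISTKEY co-locates join rows on the same slice, while a compound sort key leading with the common filter column lets Redshift skip blocks during range scans.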
Performance Tuning & Reliability
• Tune Spark jobs and pipeline performance (e.g., partitioning strategy, executor sizing, shuffle optimization).
• Improve Redshift performance through schema design, query optimization, compression, and vacuum/analyze strategies.
• Establish monitoring, alerting, and runbooks for production operations; drive root-cause analysis and continuous improvement.
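The executor-sizing work mentioned above often starts from a back-of-envelope calculation. The sketch below encodes one common heuristic; the cluster dimensions and reservation figures are assumptions for illustration, not numbers from the posting.

```python
# Back-of-envelope Spark executor sizing (a common heuristic, not an official formula):
# reserve one core and some memory per node for the OS/daemons, cap executors at
# ~5 cores each for I/O throughput, and leave headroom for off-heap overhead.

def size_executors(nodes: int, cores_per_node: int, mem_per_node_gb: int):
    usable_cores = cores_per_node - 1            # reserve 1 core per node
    cores_per_executor = 5                       # commonly cited throughput sweet spot
    executors_per_node = usable_cores // cores_per_executor
    total_executors = executors_per_node * nodes - 1   # leave one slot for the driver
    usable_mem = mem_per_node_gb - 8             # reserve ~8 GB per node for OS/daemons
    mem_per_executor = usable_mem // executors_per_node
    heap_gb = int(mem_per_executor * 0.93)       # ~7% goes to memory overhead
    return total_executors, cores_per_executor, heap_gb

print(size_executors(nodes=10, cores_per_node=16, mem_per_node_gb=64))
# → (29, 5, 16)
```

These numbers would then map onto `spark.executor.instances`, `spark.executor.cores`, and `spark.executor.memory`, and get refined against observed shuffle and GC behavior.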
Client Advisory & Solutioning
• Identify gaps or problems in client systems and assess technical needs against business strategy.
• Develop solution options, including trade-off analysis and cost implications (e.g., compute/storage choices, workload patterns).
• Produce plans and diagrams that show how solutions will work, and present them clearly to both technical and non-technical audiences.
Governance, Security & Compliance
• Ensure architecture aligns with required industry regulations and security standards (e.g., data handling, access controls, auditing).
• Partner with security and governance stakeholders to implement data protection, least privilege, and data lifecycle management practices.
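One concrete form the data-lifecycle bullet takes on AWS is an S3 lifecycle configuration. The sketch below uses the dictionary shape accepted by boto3's `put_bucket_lifecycle_configuration`; the prefix and retention periods are illustrative assumptions, since real retention would follow the client's records schedule.

```python
# Illustrative S3 lifecycle configuration in the shape accepted by boto3's
# put_bucket_lifecycle_configuration. The prefix and day counts are hypothetical.

lifecycle_config = {
    "Rules": [
        {
            "ID": "analytics-raw-tiering",
            "Filter": {"Prefix": "raw/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 90, "StorageClass": "GLACIER"},   # cold-tier after 90 days
            ],
            "Expiration": {"Days": 2555},                  # delete after ~7 years
        }
    ]
}

# With boto3 this would be applied as (bucket name is a placeholder):
# s3.put_bucket_lifecycle_configuration(
#     Bucket="example-analytics-bucket",
#     LifecycleConfiguration=lifecycle_config)
```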
Collaboration & Stakeholder Management
• Provide coaching and technical direction to team members; elevate engineering practices through standards and code/design reviews.
• Maintain ongoing relationships with clients and internal partners; provide timely feedback to production teams to improve delivery outcomes.
Required Skills:
Bachelor's degree
AWS & Analytics Architecture
Strong experience designing analytics solutions on AWS, including services such as:
EMR (Spark), AWS Glue, Amazon Redshift, S3, IAM, CloudWatch
Hands-on experience with data pipeline architecture (batch and/or near-real-time), orchestration, and production operations.
Data Engineering & Modeling
Proficiency in Spark (PySpark/Scala) and SQL; strong grasp of distributed data processing concepts.
Experience with data modeling (dimensional modeling, star/snowflake patterns, normalization trade-offs).
Strong SQL skills for performance analysis and tuning (query plans, indexing concepts where applicable, join strategies).
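The dimensional-modeling requirement above can be illustrated with a tiny in-memory star schema: a fact table of measures keyed by surrogate keys into a dimension table, joined and aggregated for reporting. All data here is invented for the sketch.

```python
# Tiny in-memory star schema: a sales fact table joined to a customer
# dimension via surrogate keys. All rows are invented example data.

dim_customer = {  # surrogate key -> dimension attributes
    1: {"name": "Acme", "region": "East"},
    2: {"name": "Globex", "region": "West"},
}

fact_sales = [  # each fact row carries the dimension's surrogate key
    {"customer_sk": 1, "amount": 100.0},
    {"customer_sk": 2, "amount": 250.0},
    {"customer_sk": 1, "amount": 50.0},
]

# Star-join plus aggregate: revenue by region.
revenue_by_region = {}
for row in fact_sales:
    region = dim_customer[row["customer_sk"]]["region"]
    revenue_by_region[region] = revenue_by_region.get(region, 0.0) + row["amount"]

print(revenue_by_region)  # → {'East': 150.0, 'West': 250.0}
```

In a warehouse the same shape becomes a fact table joined to conformed dimensions in SQL, with the normalization trade-off being narrower fact rows versus extra joins at query time.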
Performance & Cost Optimization
Demonstrated experience with performance tuning for Spark workloads and Redshift queries.
Ability to evaluate and communicate cost implications of architecture choices; implement cost controls and optimization strategies.
Delivery & Documentation
Ability to create clear architecture diagrams, solution documents, and operational runbooks.
Experience collaborating in Agile delivery models and partnering across engineering, product, and operations teams.
Preferred Skills:
Experience with VAEC environments, government cloud implementations, or regulated enterprise environments.
AWS certifications (preferred): Solutions Architect, Data Analytics, Security, or DevOps Engineer.
Infrastructure as Code (IaC): Terraform, CloudFormation, and CI/CD automation.
Data governance and cataloging tools (e.g., data lineage, metadata management) and best practices.
Experience with data quality frameworks, testing strategies, and schema evolution in data lakes/warehouses.
Familiarity with streaming/event-driven architectures (e.g., Kinesis/MSK) and/or lakehouse patterns.
Experience integrating BI tools and semantic layers for analytics consumption.
Soft Skills:
• Consultative communication: Ability to explain technical concepts and trade-offs to non-technical stakeholders clearly.
• Leadership without authority: Drives alignment, standards, and good decisions across teams through influence.
• Problem-solving mindset: Quickly isolates issues, identifies root causes, and proposes pragmatic solutions.
• Customer focus: Builds trust, listens actively, and incorporates feedback to improve outcomes.
• Ownership & accountability: Follows through from design to implementation to operational stability.
• Collaboration: Works effectively across architects, engineers, analysts, security, and business partners.
• Adaptability: Comfortable navigating ambiguity and iterating designs as requirements evolve.






