

Data Architect
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Architect contract position in Stamford, CT, requiring 12+ years of experience; the pay rate is not specified. Key skills include Data Modeling, ETL/ELT, and proficiency in cloud platforms (AWS/Azure/GCP). A relevant degree and certification are preferred.
Country: United States
Currency: Unknown
Day rate: Unknown
Date discovered: September 28, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Stamford, CT
Skills detailed:
#Data Architecture #GDPR (General Data Protection Regulation) #SQL Server #ETL (Extract, Transform, Load) #Data Pipeline #Spark (Apache Spark) #Oracle #Data Modeling #Migration #Compliance #Data Management #Data Governance #Kafka (Apache Kafka) #Computer Science #Data Warehouse #Looker #Tableau #MongoDB #Azure #Security #Data Engineering #Data Lake #SQL (Structured Query Language) #Hadoop #Databases #API (Application Programming Interface) #Cloud #Data Quality #BigQuery #Redshift #MDM (Master Data Management) #Leadership #GCP (Google Cloud Platform) #BI (Business Intelligence) #AWS (Amazon Web Services) #Metadata #Scala #Microsoft Power BI #Big Data #Data Integration #Synapse #Data Security #NoSQL
Role description
Job Type: Contract
Job Category: IT
Job Title: Data Architect
Location: Stamford, CT
Experience: 12+ Years
Job Summary: We are seeking a highly experienced Data Architect to lead the design, implementation, and governance of enterprise-level data solutions. This role is responsible for creating scalable data architectures, ensuring data quality, and aligning data initiatives with business objectives.
Key Responsibilities:
Define and implement data architecture strategies, standards, and best practices.
Design data models (conceptual, logical, physical) and data integration frameworks.
Partner with business stakeholders, data engineers, and analysts to enable data-driven decision making.
Establish data governance, metadata management, and master data management frameworks.
Architect and optimize data lakes, data warehouses, and cloud-native solutions (AWS/Azure/GCP).
Ensure security, compliance, and scalability across all data platforms.
Provide technical leadership to engineering teams on data pipelines, ETL/ELT processes, and data APIs.
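As context for the data pipeline and ETL/ELT responsibilities listed above, here is a minimal, illustrative extract-transform-load sketch in Python; the orders.csv source file, its columns, and the fact_orders target table are hypothetical assumptions, not details taken from this posting.

# Minimal ETL sketch (illustrative only): extract rows from a CSV source,
# apply a basic data-quality transform, and load into a SQLite target.
# File, column, and table names here are hypothetical, not from the posting.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    # Extract: read raw rows from the source file.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    # Transform: enforce a simple data-quality rule and normalize types.
    cleaned = []
    for row in rows:
        if not row.get("order_id"):
            continue  # skip rows missing their business key
        cleaned.append((row["order_id"], row["customer"], float(row["amount"])))
    return cleaned

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    # Load: upsert the cleaned rows into a warehouse-style fact table.
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id TEXT PRIMARY KEY, customer TEXT, amount REAL)"
    )
    con.executemany("INSERT OR REPLACE INTO fact_orders VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    # Write a tiny sample source so the sketch runs end to end.
    with open("orders.csv", "w", newline="") as f:
        f.write("order_id,customer,amount\n1001,Acme,250.00\n,NoKey,10.00\n")
    load(transform(extract("orders.csv")))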
Core Skills Required:
Expertise in Data Modeling, Data Warehousing, and ETL/ELT pipelines (a star-schema sketch follows this list).
Strong proficiency with SQL/NoSQL databases (Oracle, SQL Server, MongoDB, Cassandra, etc.).
Hands-on with Big Data technologies (Hadoop, Spark, Kafka).
Experience in Cloud Data Platforms (AWS Redshift, Azure Synapse, GCP BigQuery).
Knowledge of Data Governance, MDM, and Metadata Management.
Proficiency in API/Data Integration frameworks.
Strong understanding of Data Security and Compliance (GDPR, HIPAA, etc.).
Familiarity with BI/Analytics tools (Tableau, Power BI, Looker).
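To ground the Data Modeling and Data Warehousing expectation noted above, here is a small, hypothetical star-schema sketch (one fact table plus two dimension tables) expressed as SQLite DDL driven from Python; the dim_customer, dim_date, and fact_sales names and columns are illustrative assumptions rather than anything specified for this role.

# Hypothetical star-schema sketch for a small sales warehouse.
# Table and column names (dim_customer, dim_date, fact_sales) are
# illustrative assumptions, not taken from the job description.
import sqlite3

DDL = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    region        TEXT
);
CREATE TABLE IF NOT EXISTS dim_date (
    date_key       INTEGER PRIMARY KEY,  -- e.g. 20250928
    calendar_date  TEXT,
    fiscal_quarter TEXT
);
CREATE TABLE IF NOT EXISTS fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
"""

if __name__ == "__main__":
    con = sqlite3.connect(":memory:")
    con.executescript(DDL)  # physical model: the dimensional tables above
    # A typical analytical query joins the fact table to its dimensions.
    con.execute(
        "SELECT d.fiscal_quarter, SUM(f.amount) "
        "FROM fact_sales f JOIN dim_date d ON f.date_key = d.date_key "
        "GROUP BY d.fiscal_quarter"
    )
    con.close()

A physical schema like this would normally be preceded by the conceptual and logical models called out in the responsibilities above; the star layout keeps analytical joins simple and predictable.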
Preferred Qualifications:
Bachelor's/Master's in Computer Science, Data Engineering, or related field.
Certification in AWS/Azure/GCP Data Architect track.
Prior experience in enterprise-scale data modernization or cloud migration projects.
Required Skills
PERFORMANCE ARCHITECT