

Senior Data Architect
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect in Reading, PA, requiring 14+ years of tech experience, including 12+ years in data engineering. Contract length is 6+ months with a market pay rate. Onsite presence is needed 2-3 days per week.
Country
United States
Currency
$ USD
Day rate
600
Date discovered
August 7, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Reading, PA
Skills detailed
#Data Lineage #Spark (Apache Spark) #SQL (Structured Query Language) #ERWin #Data Modeling #API (Application Programming Interface) #Cloud #Kafka (Apache Kafka) #Qlik #Strategy #Leadership #NoSQL #Data Ingestion #Metadata #Data Lake #Talend #Data Management #Data Integration #Python #Tableau #AWS (Amazon Web Services) #Data Architecture #Visualization #Data Pipeline #Data Engineering #PySpark #Data Warehouse #ETL (Extract, Transform, Load) #Agile #QlikView #Datasets #Scala #Databases #MDM (Master Data Management) #Big Data #Data Governance
Role description
DATA SOLUTION ARCHITECT
Location: Reading, PA - Must be onsite 2-3 days per week
6+ months contract
Pay rate: Market/DOE
Client: Transportation
Job Duties:
• Serve as technical leader on the data engineering team: define IT process and technology strategies, stay familiar with emerging technology trends, define a roadmap, and build consensus with senior IT leadership
• Help drive innovation by staying current with the rapidly developing data engineering landscape and sharing knowledge internally and with customers
• Define and maintain the data engineering strategy and roadmap, and build consensus with senior IT leadership
• Meet with senior and mid-level management to align business process initiatives and strategies with the current and planned Penske Data Engineering & Analytics architecture
• Evaluate data engineering and analytics tools, techniques, and approaches; determine their impact on strategic objectives; and manage overall implementation
Minimum Skills Required:
• 14+ years of overall technology experience required
• 12+ years of experience in data engineering, data modeling, data warehousing, master data management, reference data management, data lineage, data governance, and metadata management required
• 7+ years of experience defining data & analytics architecture and implementing multiple large-volume projects across on-premises and cloud environments, preferably AWS
• 5+ years of experience collaborating with Agile teams preferred
• Expertise in designing, validating, and implementing multiple projects across hybrid infrastructure (cloud to on-premises and vice versa)
• Expertise in setting up data lakes and analytical environments
• Expertise with data engineering tools such as Talend, BODS, Python, PySpark, Kafka, and APIs
• Expertise with relational SQL and NoSQL databases
• Experience building data pipelines, data ingestion, data integration, and data preparation processes, as well as traditional data warehouses and data marts (see the brief sketch after this list)
• Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
• Strong experience with data modeling techniques using tools such as Erwin and ER Studio
• Experience with visualization tools such as QlikView, Qlik Sense, and Tableau
• Experience with message queuing, stream processing, and highly scalable "big data" data stores
• Experience with big data tools such as Kafka
• Strong analytical skills for working with complex datasets
• Expert knowledge of appropriate design frameworks and patterns, and experience implementing them
• Ability to continually develop knowledge of industry-wide technology strategies and best practices
• Able to communicate highly technical concepts in a clear, concise manner to non-technical individuals
• Experience prioritizing technology needs within set budget requirements
• Excellent interpersonal and decision-making skills
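The pipeline, ingestion, and PySpark/Kafka items above describe hands-on build work rather than any specific system, so the following is only a minimal, illustrative PySpark batch-ingestion sketch. The bucket paths, column names, and table layout are hypothetical placeholders, not details from this posting.

# Minimal, illustrative PySpark ingestion sketch; assumes a working Spark installation.
# All paths and column names (order_id, order_ts, amount) are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingestion-sketch").getOrCreate()

# Read raw JSON landed in a data lake "raw" zone (placeholder location).
raw = spark.read.json("s3a://example-lake/raw/orders/")

# Basic preparation: type casting, deduplication, and a load-date partition column.
prepared = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .withColumn("load_date", F.current_date())
)

# Write a partitioned, curated Parquet dataset that downstream marts and BI tools can query.
(prepared.write
         .mode("append")
         .partitionBy("load_date")
         .parquet("s3a://example-lake/curated/orders/"))

spark.stop()

Partitioning the curated zone by load date keeps it incrementally loadable, which is the same pattern behind the traditional warehouse and data mart experience the requirements call for.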