

Data Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect in the Pharma domain, based in Indianapolis, US, on a 12+ month contract; the pay rate is not disclosed. It requires 12+ years of experience, expertise in database management and ETL processes, and familiarity with big data technologies.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 3, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
1099 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Indianapolis, IN
-
🧠 - Skills detailed
#Airflow #Data Lake #Kafka (Apache Kafka) #AWS (Amazon Web Services) #Data Modeling #Spark (Apache Spark) #Microsoft Power BI #Hadoop #ETL (Extract, Transform, Load) #Oracle #Documentation #Database Management #Data Profiling #Database Systems #ML (Machine Learning) #Apache Airflow #MySQL #Data Science #PostgreSQL #Data Integration #Scala #Data Storage #Databases #Azure #Data Architecture #Computer Science #Data Management #Compliance #AI (Artificial Intelligence) #Physical Data Model #Data Integrity #SQL (Structured Query Language) #Data Engineering #Data Pipeline #BI (Business Intelligence) #Big Data #Security #GDPR (General Data Protection Regulation) #Storage #SQL Server #Data Warehouse #DevOps #Snowflake #Data Quality #Tableau #Cloud #Microsoft Azure #Data Governance
Role description
Position: Data Architect - Pharma Domain
Location: Indianapolis, US (Onsite)
Duration: 12+ month contract
Position Summary:
• We are seeking a skilled and experienced Data Architect to design, optimize, and oversee our organization's data systems and infrastructure, especially for the Customer Support Program. The ideal candidate will play a key role in ensuring the organization's data is organized, secure, and effectively leveraged for decision-making and innovation.
• This position is responsible for providing support for ETL/ELT processes and file movement of data. The key responsibilities will be to process and move data between different compute and storage services, as well as on-premises data sources, at specified intervals. The role is also responsible for the creation, scheduling, orchestration, and management of data pipelines; a minimal orchestration sketch follows below.
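As a rough illustration of the pipeline creation, scheduling, and orchestration described above, here is a minimal Apache Airflow 2.x DAG sketch (Airflow appears in the skills list); the DAG id, schedule, task names, and task logic are placeholder assumptions, not details of the actual program.

```python
# Minimal Apache Airflow DAG sketch (Airflow 2.4+ "schedule" kwarg assumed):
# move a daily extract from an on-premises source into cloud storage, then
# load it into the warehouse. All names and logic are illustrative placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_source(**context):
    """Pull the daily file from an on-premises source (placeholder logic)."""
    print(f"Extracting data for {context['ds']}")


def load_to_warehouse(**context):
    """Load the staged file into the warehouse (placeholder logic)."""
    print(f"Loading data for {context['ds']}")


with DAG(
    dag_id="customer_support_daily_load",   # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                      # run at a specified interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract = PythonOperator(task_id="extract_from_source",
                             python_callable=extract_from_source)
    load = PythonOperator(task_id="load_to_warehouse",
                          python_callable=load_to_warehouse)

    extract >> load   # orchestration: load runs only after extract succeeds
```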
Key Responsibilities:
Data Architecture Design & Implementation
• Design and implement robust, scalable, and efficient data architectures to support current and future business requirements.
• Develop and maintain conceptual, logical, and physical data models (see the sketch after this list).
• Ensure seamless integration of data from multiple sources into a unified structure.
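As a purely illustrative example of turning a logical model into a physical one, the sketch below renders a one-to-many "customer has many support cases" relationship with SQLAlchemy; every table, column, and relationship name is a hypothetical placeholder, not the program's actual model.

```python
# Sketch: a logical one-to-many relationship expressed as a physical model
# with SQLAlchemy. Entity and column names are hypothetical placeholders.
from sqlalchemy import Column, DateTime, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()


class Customer(Base):
    __tablename__ = "customer"
    customer_id = Column(Integer, primary_key=True)
    name = Column(String(200), nullable=False)
    cases = relationship("SupportCase", back_populates="customer")


class SupportCase(Base):
    __tablename__ = "support_case"
    case_id = Column(Integer, primary_key=True)
    customer_id = Column(Integer, ForeignKey("customer.customer_id"), nullable=False)
    status = Column(String(30), nullable=False)
    opened_at = Column(DateTime, nullable=False)
    customer = relationship("Customer", back_populates="cases")


if __name__ == "__main__":
    # Emit the physical DDL against an in-memory SQLite engine for illustration.
    engine = create_engine("sqlite:///:memory:", echo=True)
    Base.metadata.create_all(engine)
```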
Data Management & Optimization
• Establish and enforce data governance policies, including data quality, privacy, and security standards.
• Optimize database systems and data pipelines for performance, reliability, and scalability.
• Conduct data profiling, cleansing, and transformation activities to ensure data integrity (illustrated below).
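For illustration only, a minimal data-profiling pass of the kind mentioned above could look like the pandas sketch below; the column names and checks are assumptions, not the program's actual schema or data-quality rules.

```python
# Minimal data-profiling sketch with pandas: row counts, null counts, duplicate
# keys, and column dtypes for a staged extract. Column names are illustrative.
import pandas as pd


def profile(df: pd.DataFrame, key: str) -> dict:
    """Return simple data-quality metrics for a staged extract."""
    return {
        "row_count": len(df),
        "null_counts": df.isna().sum().to_dict(),
        "duplicate_keys": int(df.duplicated(subset=[key]).sum()),
        "dtypes": df.dtypes.astype(str).to_dict(),
    }


if __name__ == "__main__":
    sample = pd.DataFrame(
        {"case_id": [1, 2, 2, 4], "status": ["open", "closed", "closed", None]}
    )
    print(profile(sample, key="case_id"))
```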
Collaboration & Stakeholder Management
• Work closely with business stakeholders, analysts, and developers to gather data requirements and translate them into technical solutions.
• Collaborate with IT and DevOps teams to deploy and manage databases and data solutions.
• Provide guidance to data engineers, analysts, and other team members on best practices and architectural principles.
Technology Evaluation & Adoption
• Stay updated on emerging data technologies and recommend tools and platforms that align with business needs.
• Evaluate, select, and implement data storage solutions (e.g., data lakes, data warehouses, cloud solutions).
• Drive adoption of modern data architecture approaches such as event-driven architecture, real-time processing, or distributed data systems (see the sketch below).
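As a hedged illustration of the event-driven, real-time processing mentioned above (Kafka is listed as a nice-to-have), the sketch below publishes a hypothetical support-case event with the kafka-python client; the broker address, topic name, and payload are placeholders.

```python
# Event-driven sketch with kafka-python: publish a support-case event to a
# topic that downstream real-time consumers can process. Broker and topic
# names are placeholders, not actual infrastructure details.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",                      # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"case_id": 1234, "status": "opened", "source": "customer_support"}
producer.send("support-case-events", value=event)            # hypothetical topic
producer.flush()   # block until the event has actually been delivered
```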
Documentation & Reporting:
• Develop comprehensive documentation for data structures, processes, and architectural designs.
• Provide regular updates to management on the status and performance of data systems.
Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, with 12+ years of experience.
• Proven experience as a Data Architect, Data Engineer, or similar role.
• Expertise in database management systems (e.g., SQL Server, Oracle, PostgreSQL, MySQL).
• Proficiency in data modeling tools (e.g., ER/Studio, Lucidchart) and techniques.
• Hands-on experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure, Google Cloud).
• Strong understanding of ETL processes, data pipelines, and data integration.
• Basic Salesforce knowledge.
• Knowledge of data governance, security practices, and compliance standards (e.g., GDPR, HIPAA).
• Familiarity with modern tools such as Snowflake, Kafka, or Apache Airflow is a plus.
Key Competencies:
• Excellent analytical and problem-solving skills.
• Strong communication and interpersonal skills for collaboration with technical and non-technical teams.
• Ability to handle multiple projects and prioritize effectively.
• A proactive approach to learning and adopting new technologies.
Preferred Skills:
• Experience with machine learning frameworks or data science tools.
• Familiarity with BI tools (e.g., Tableau, Power BI).
• Certification in cloud or database technologies (e.g., AWS Data Analytics, Microsoft Azure Data Engineer).
• Exposure to AI and Salesforce is an added advantage.