

Data Architect
Featured Role | Apply direct with Data Freelance Hub
This is a remote, contract-to-hire Data Architect position; the pay rate is not specified. Candidates should have 10-15 years of experience with SQL, MongoDB, ETL, Power BI, Kafka, and Python. A bachelor's degree in a relevant field is required.
Country
United States
Currency
$ USD
Day rate
Not specified
Date discovered
May 30, 2025
Project duration
Unknown
Location type
Remote
Contract type
Unknown
Security clearance
Unknown
Location detailed
United States
Skills detailed
#Security #AWS (Amazon Web Services) #SQL (Structured Query Language) #Microsoft Power BI #Data Integration #Data Pipeline #Data Management #Storage #Visualization #Data Transformations #Data Processing #NoSQL #Cloud #Data Extraction #Scala #Data Engineering #Documentation #MongoDB #Computer Science #Data Modeling #Kafka (Apache Kafka) #Data Storage #Predictive Modeling #Python #RDBMS (Relational Database Management System) #GCP (Google Cloud Platform) #Data Architecture #BI (Business Intelligence) #Apache Airflow #Database Management #Database Performance #Data Strategy #JSON (JavaScript Object Notation) #Automation #Data Quality #Data Governance #Azure #Strategy #ETL (Extract, Transform, Load) #Data Science #Talend #Airflow #Databases
Role description
As a Data Architect, you will be responsible for designing, creating, and managing the data architecture for the organization. You will ensure the integration and management of data across various platforms, optimizing processes for data storage, retrieval, and visualization. The ideal candidate will play a key role in shaping our data strategy and infrastructure.
This is a remote, contract-to-hire opportunity at Omni Fiber. We are unable to provide sponsorship at this time.
Responsibilities:
• Design and implement scalable, efficient data architectures to meet the organization's data needs, including both relational (SQL) and NoSQL (MongoDB) databases.
• Develop, manage, and optimize ETL (Extract, Transform, Load) pipelines to facilitate seamless data integration across multiple platforms. Ensure that data is transformed and delivered accurately and efficiently to end users.
• Work with Kafka to design and implement real-time data pipelines, ensuring efficient streaming and an event-driven architecture (a minimal sketch follows this list).
• Manage and optimize SQL, MongoDB, and JSON-based data stores to ensure high performance, security, and reliability. Monitor and improve database performance by applying best practices.
• Collaborate with business intelligence teams to design and implement interactive, real-time dashboards and reports in Power BI. Develop data models that give business users actionable insights.
• Establish and enforce data governance standards, ensuring that data quality, security, and integrity are maintained across the organization's platforms.
• Use Python for data processing, automation, and analytics, enabling advanced data transformations and predictive modeling.
• Work closely with cross-functional teams, including data engineers, analysts, and business stakeholders, to keep business needs and the data architecture aligned.
• Create and maintain comprehensive documentation for all data architecture, data flows, and data pipeline models. Promote and enforce best practices in data management, modeling, and integration.
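For context on the streaming and ETL responsibilities above, the minimal sketch below shows one common pattern: a Python consumer reading JSON events from a Kafka topic, applying a light transformation, and loading rows into a SQL table. The topic name, broker address, table schema, and the kafka-python library are illustrative assumptions, not details taken from this posting.

# Minimal sketch: consume JSON events from Kafka, transform them,
# and load them into a SQL table. Topic, broker, and schema are
# hypothetical; SQLite stands in for the real warehouse target.
import json
import sqlite3

from kafka import KafkaConsumer  # assumes the kafka-python package

conn = sqlite3.connect("warehouse.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS orders ("
    "order_id TEXT PRIMARY KEY, amount_usd REAL, event_ts TEXT)"
)

consumer = KafkaConsumer(
    "orders.events",                     # hypothetical topic
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Transform: keep only the fields the reporting layer needs.
    row = (event["id"], float(event["amount"]), event["timestamp"])
    conn.execute(
        "INSERT OR REPLACE INTO orders (order_id, amount_usd, event_ts) "
        "VALUES (?, ?, ?)",
        row,
    )
    conn.commit()

In practice the loop would write to the production RDBMS or warehouse and handle batching, retries, and schema evolution; the sketch only illustrates the consume-transform-load shape of the work.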
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field (or equivalent experience).
• 10-15 years of proven experience as a Data Architect or in a similar role, with a focus on SQL, MongoDB, ETL, Power BI, JSON, Kafka, and Python.
• Strong expertise in SQL and experience with relational database management systems (RDBMS).
• Experience with MongoDB and designing scalable, high-performance NoSQL solutions.
• Hands-on experience with ETL processes, including data extraction, transformation, and loading from a variety of sources.
• Proficiency in designing and creating interactive reports and dashboards in Power BI.
• Strong understanding of data warehousing concepts and data modeling techniques.
• Experience with Kafka for data pipeline orchestration and streaming data architecture.
• Proficiency in Python for data processing, automation, and analytics.
• Experience working with JSON-based data formats and APIs for data integration (see the sketch after this list).
• Experience with cloud data platforms (AWS, Azure, GCP) is a plus.
• Excellent problem-solving, analytical, and troubleshooting skills.
• Strong communication skills and the ability to collaborate with both technical and non-technical teams.
• Ability to work independently and handle multiple tasks in a fast-paced environment.
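To illustrate the JSON/API integration qualification above, here is a minimal Python sketch of pulling JSON records from a REST API and flattening them into tabular rows for loading. The endpoint URL, pagination scheme, and field names are hypothetical assumptions, not part of this posting.

# Minimal extract-transform-load sketch over a hypothetical JSON API.
import csv

import requests  # assumes the requests package

API_URL = "https://api.example.com/v1/customers"  # hypothetical endpoint

def extract(url: str) -> list[dict]:
    """Fetch all pages of JSON records (assumes a simple 'next' link)."""
    records, next_url = [], url
    while next_url:
        payload = requests.get(next_url, timeout=30).json()
        records.extend(payload.get("results", []))
        next_url = payload.get("next")
    return records

def transform(records: list[dict]) -> list[dict]:
    """Keep a flat subset of fields suitable for a relational target."""
    return [
        {
            "customer_id": r.get("id"),
            "name": r.get("name"),
            "state": r.get("address", {}).get("state"),
        }
        for r in records
    ]

def load(rows: list[dict], path: str = "customers.csv") -> None:
    """Write flattened rows out; a real pipeline would target the warehouse."""
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["customer_id", "name", "state"])
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    load(transform(extract(API_URL)))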
Preferred Qualifications:
• Certification in cloud technologies or Power BI.
• Experience with data pipeline orchestration tools such as Apache Airflow or Talend (a brief Airflow sketch follows).
• Familiarity with other data visualization tools or technologies.
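For the orchestration item above, the sketch below shows a daily ETL run wired up as an Apache Airflow 2.x DAG. The DAG id, schedule, and task callables are illustrative assumptions; the placeholders would be replaced by the real extraction, transformation, and load logic.

# Minimal sketch of a daily ETL DAG with Apache Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")        # placeholder for the real extraction step

def transform():
    print("clean and reshape")       # placeholder for the real transformation step

def load():
    print("write to the warehouse")  # placeholder for the real load step

with DAG(
    dag_id="daily_etl_sketch",       # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load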