

Tachyon Technologies
Cloud Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a long-term contract for a Cloud Architect specializing in AWS. Key skills include AWS services (S3, Glue, Step Functions), Python, Spark, PostgreSQL, and Boomi. Experience in data migration and cloud adoption is essential. Pay rate is unknown.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 18, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Apache Airflow #Airflow #Database Management #Data Pipeline #Data Modeling #BI (Business Intelligence) #AWS (Amazon Web Services) #Data Integration #Scala #Data Quality #ETL (Extract, Transform, Load) #Boomi #Data Migration #Python #Database Design #Cloud #Data Engineering #Security #PostgreSQL #Tableau #Microsoft Power BI #Databases #Apache Spark #Migration #Programming #S3 (Amazon Simple Storage Service) #AWS S3 (Amazon Simple Storage Service) #AWS Glue #Data Processing #Leadership #Data Governance #Data Analysis #Spark (Apache Spark) #Data Architecture
Role description
Role: AWS Enterprise Architect
Position Type: Long-term contract
Job Description:
We are seeking a highly skilled Sr. AWS Architect with expertise in database management, data integration, and orchestration to join our team. The ideal candidate will have hands-on experience in designing, building, and managing large-scale data pipelines on AWS, integrating enterprise systems, and enabling data-driven insights through dashboards and reporting solutions. This role requires deep expertise in AWS services (S3, Glue, Step Functions), data engineering (Python, Spark), integration tools (Boomi), and databases (PostgreSQL), along with strong architecture and problem-solving skills.
Key Responsibilities:
• Design and architect scalable, secure, and cost-efficient data solutions on AWS Cloud.
• Develop and maintain ETL pipelines using AWS Glue, Python, and Apache Spark.
• Implement workflow orchestration using AWS Step Functions or Apache Airflow.
• Design, optimize, and manage relational databases, especially PostgreSQL.
• Integrate enterprise applications and data sources using Boomi or other middleware tools.
• Collaborate with data analysts and BI teams to build dashboards and reports for business insights.
• Ensure data quality, governance, and security across pipelines and platforms.
• Provide architectural guidance, best practices, and mentorship to engineering teams.
• Troubleshoot performance issues, optimize data pipelines, and ensure high system availability.
• Support migration and modernization initiatives, including legacy to AWS cloud transitions.
Professional Skills:
• Proven expertise as an AWS Data Architect / Data Engineer with end-to-end solution design.
• Hands-on experience with AWS S3, Glue ETL, Step Functions, and Airflow orchestration.
• Strong programming skills in Python with Apache Spark for large-scale data processing.
• Solid knowledge of PostgreSQL database design, optimization, and administration.
• Experience with Boomi (or similar iPaaS tools) for integration and orchestration.
• Familiarity with building dashboards and reports (Tableau, QuickSight, Power BI preferred).
• Strong understanding of data modeling, data governance, and security best practices.
• Excellent problem-solving skills and ability to troubleshoot complex data issues.
• Strong communication and collaboration skills to work with business, technical, and leadership teams.
• Experience in data migration, cloud adoption, and modernization projects is a plus.
