

Senior Data Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect with 10+ years of experience, based in Atlanta, Georgia or Richmond, Virginia. Contract length is unspecified, with a pay rate of "$XX/hour." Key skills include SQL, Python, ETL tools, and cloud platforms.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Atlanta, GA
🧠 - Skills detailed
#Informatica #Spark (Apache Spark) #Talend #Data Governance #GIT #Python #AWS (Amazon Web Services) #Data Storage #BigQuery #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Kafka (Apache Kafka) #Metadata #Security #Airflow #Data Processing #Data Lake #Apache NiFi #Data Lineage #SQL (Structured Query Language) #Agile #Scala #Infrastructure as Code (IaC) #Data Modeling #PostgreSQL #SQL Server #DevOps #Snowflake #Scrum #Data Quality #Data Architecture #Data Warehouse #Compliance #Jenkins #MongoDB #Cloud #NiFi (Apache NiFi) #Storage #Terraform #Documentation #Data Pipeline #Azure #Databases #Data Engineering #Data Management #Redshift
Role description
Job Title: Data Architect / Data Engineer (W2)
Experience Required: 10+ Years
Location: Atlanta, Georgia OR Richmond, Virginia
Visa: H1B, USC, GC, H4 EAD, L2 EAD
Job Description:
We are seeking a highly skilled and experienced Data Architect / Data Engineer with 10+ years of experience to join our growing data team. The ideal candidate will be responsible for designing and developing robust, scalable data architectures and pipelines to support enterprise-level analytics and reporting solutions.
Key Responsibilities:
• Design and implement scalable and efficient data architectures for enterprise data platforms.
• Develop and manage data pipelines and ETL processes using modern data engineering tools.
• Collaborate with cross-functional teams to understand data requirements and translate them into solutions.
• Ensure data quality, integrity, and governance across platforms.
• Work with cloud platforms (AWS, Azure, or GCP) to build and manage data lakes and warehouses.
• Optimize data storage and retrieval for performance and scalability.
• Create and maintain technical documentation and data flow diagrams.
• Implement best practices in data modeling, metadata management, and data lineage tracking.
Required Skills & Qualifications:
• 10+ years of hands-on experience in data architecture and engineering.
• Proficient in SQL, Python, and/or Scala.
• Strong experience with ETL tools (e.g., Apache NiFi, Informatica, Talend, Airflow).
• Deep knowledge of relational and non-relational databases (e.g., SQL Server, PostgreSQL, MongoDB, Cassandra).
• Experience with data warehouse solutions (Snowflake, Redshift, BigQuery, etc.).
• Cloud experience with AWS, Azure, or Google Cloud Platform.
• Familiarity with CI/CD processes and tools (e.g., Git, Jenkins).
• Strong understanding of data governance, security, and compliance standards.
• Excellent communication and problem-solving skills.
Preferred Qualifications:
• Certifications in cloud platforms or data engineering tools.
• Experience with real-time data processing tools (Kafka, Spark Streaming, etc.).
• Knowledge of DevOps practices and infrastructure as code (Terraform, CloudFormation).
• Background in working within Agile/Scrum environments.