

Sr. Cloud Developer (Active Secret or Above Required)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Cloud Developer with an Active Secret clearance; the contract length and pay rate are unspecified. Key skills include AWS services, ETL pipeline development, data governance, and experience in SAFe Agile environments.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 23, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Yes
-
📍 - Location detailed
Clarksburg, WV
-
🧠 - Skills detailed
#Stories #Data Governance #Infrastructure as Code (IaC) #Splunk #Computer Science #Security #Logging #Monitoring #Kubernetes #Kanban #Schema Design #S3 (Amazon Simple Storage Service) #VPC (Virtual Private Cloud) #Java #Data Encryption #Documentation #Network Security #AWS (Amazon Web Services) #Jira #Data Management #Ansible #Deployment #Automation #Docker #Prometheus #API (Application Programming Interface) #Compliance #Metadata #Anomaly Detection #AWS CloudWatch #ETL (Extract, Transform, Load) #Data Engineering #Scala #Scrum #Data Privacy #Data Catalog #Grafana #Lambda (AWS Lambda) #Data Integration #Agile #Scripting #Python #DevOps #Data Processing #Cloud #Jenkins #Redshift #Storage #Data Transformations
Role description
Please apply only if you hold an Active Secret (or higher) clearance
Job Description:
Summary:
The Senior Cloud Developer will design, implement, and maintain cloud-based infrastructure to support the CJIS Division’s Data Mesh architecture, ensuring efficient, secure, and scalable data management across enterprise systems. This role focuses on developing robust Extract, Transform, Load (ETL) pipelines, automating deployments, managing AWS-native data services, and fostering data governance to enable self-service analytics and interoperability. The Senior Cloud Developer will collaborate within a SAFe Agile framework to deliver high-quality data products and support organizational adoption of data mesh principles.
Responsibilities:
• Participate in SAFe Agile ceremonies (Program Increment Planning, Sprint Planning, Daily Standups, Sprint Reviews, Retrospectives) to align development with strategic objectives.
• Collaborate with Product Owners, Product Managers, and data domain teams to refine and prioritize the product backlog, ensuring alignment with data mesh and governance standards.
• Design and deploy domain-specific data products, adhering to organizational standards for schema design, data transformations, and storage solutions.
• Develop and maintain scalable ETL pipelines using AWS services (e.g., S3, Redshift, Glue, Lake Formation, Lambda) to support self-service analytics and data domain requirements.
• Implement and manage Infrastructure as Code (IaC) using tools like Ansible, Packer, and AWS CloudFormation to automate cloud deployments and ensure configuration consistency.
• Build and maintain CI/CD pipelines using tools like Bamboo to enable efficient software development and deployment in the cloud.
• Utilize monitoring and logging tools (e.g., AWS CloudWatch, CloudTrail, Splunk, Prometheus, Grafana) for proactive performance monitoring, anomaly detection, and troubleshooting of cloud and data infrastructure.
• Enforce cloud security best practices, including network security, data encryption, multi-factor authentication, and integration with AWS Security Hub, to protect sensitive data assets.
• Maintain a data catalog to enhance discoverability, lineage tracking, and usage transparency across domains.
• Support the operation, maintenance, and reliability of the organization’s Data Mesh software, ensuring high availability.
• Lead training initiatives for data stewards and users on data mesh principles, governance practices, and self-service analytics tools to drive cultural adoption.
• Document work in Jira (Epics, Features, User Stories) and Confluence, maintaining detailed backlogs and ensuring transparency in progress tracking.
• Ensure compliance with data governance standards, including data privacy, access controls, and lineage requirements, using governance tools to automate workflows.
Required Skills:
• 7+ years of experience in AWS cloud development, with a focus on data engineering and DevOps.
• Expertise in AWS data services, including S3, Redshift, Glue, Lake Formation, and Lambda, for building scalable ETL pipelines and data infrastructure.
• Proficiency in Data Mesh architecture and decentralized data management principles.
• Strong knowledge of data governance, including data privacy, access controls, lineage, and metadata management, with experience using governance tools.
• Advanced scripting skills in languages such as Python, Shell, or Java for automation and data processing.
• Experience with IaC tools (e.g., Ansible, Packer, AWS CloudFormation) for automating cloud infrastructure deployments.
• Proficient in CI/CD pipelines using tools like Bamboo, Jenkins, or similar for efficient software delivery.
• Expertise in monitoring and logging tools (e.g., AWS CloudWatch, CloudTrail, Splunk, Prometheus, Grafana) for performance optimization and troubleshooting.
• Strong understanding of cloud security practices, including VPC configuration, data encryption, multi-factor authentication, and AWS Security Hub integration.
• Proven experience in SAFe Agile environments, using Scrum, Kanban, Jira, and Confluence for work management and documentation.
• Ability to lead training on data mesh principles, governance, and analytics tools for technical and non-technical stakeholders.
• Excellent collaboration and communication skills to work with cross-functional data domain teams and stakeholders.
Preferred Qualifications:
• Experience with containerization technologies (e.g., Docker, Kubernetes, ECS) for scalable data processing.
• Familiarity with RESTful API development for data integration and interoperability.
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• Active Secret clearance
Work Environment:
• Primarily remote, with occasional on-site requirements at the CJIS facility in Clarksburg, WV, for equipment pickup or meetings.
• Core hours: 9:00 AM–4:00 PM ET, with potential weekend/non-business hours for maintenance and on-call support (response within one hour).