

AWS DevOps Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS DevOps Engineer in Dallas, TX, for 12 months at a competitive rate. Requires 10+ years of experience, strong AWS skills, data pipeline management, SharePoint administration, and scripting proficiency. Legal tech industry experience is preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 30, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Dallas, TX
Skills detailed: #Snowflake #Cloud #Security #Terraform #IAM (Identity and Access Management) #S3 (Amazon Simple Storage Service) #Scripting #Bash #Datadog #Libraries #ETL (Extract, Transform, Load) #Documentation #Automation #SharePoint #Data Engineering #Python #Data Integrity #Data Warehouse #Scala #RDS (Amazon Relational Database Service) #AWS DevOps #Data Pipeline #AWS (Amazon Web Services) #DevOps #Monitoring #Infrastructure as Code (IaC) #Logging
Role description
Position: AWS DevOps Engineer
Location: Dallas, TX (Day 1 onsite)
Experience required: 10+ years
Duration: 12 months
Job Description:
We are seeking an experienced DevOps Engineer to join our team for a critical infrastructure rebuild project. The ideal candidate will manage and monitor our cloud infrastructure on AWS, ensure the reliability and performance of our data warehouse solution, and support our automation and integration workflows. This role requires a strong background in AWS, data pipeline management, and SharePoint administration.
Key Responsibilities
• AWS Infrastructure Management:
Manage and monitor AWS resources, including RDS and other services used for the data warehouse.
Ensure the security, scalability, and performance of the AWS environment.
Manage costs and optimize resource utilization.
• Data Warehouse and Pipeline Management:
Support the data engineering team in building and maintaining the data pipeline from various sources to the Snowflake data warehouse.
Monitor the health and performance of the data pipeline, ensuring data integrity and availability.
Troubleshoot and resolve any issues related to the data infrastructure.
• SharePoint Administration:
Administer the SharePoint environment, including managing permissions, libraries, and site collections.
Support the 2-way file synchronization process between Filevine and SharePoint, ensuring its reliability and performance.
Troubleshoot any issues related to SharePoint and the file synchronization process.
• Monitoring and Support:
Implement and manage monitoring and alerting solutions for the entire infrastructure (a rough sketch of this kind of check follows this list).
Provide support to the development team for any infrastructure-related issues.
Participate in on-call rotation for critical infrastructure support.
• Collaboration and Documentation:
Work closely with the development and data engineering teams to ensure the infrastructure meets their needs.
Create and maintain clear and comprehensive documentation for the infrastructure and all related processes.
Participate in project planning, status updates, and architecture discussions.
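
As a rough illustration of the monitoring and support duties above (not taken from the posting itself), the sketch below uses boto3 to pull recent CloudWatch metrics for an RDS instance backing the warehouse; the region, instance identifier, and alert thresholds are hypothetical assumptions.

```python
"""Minimal RDS health-check sketch for the monitoring duties above.
Assumes AWS credentials are available in the environment; the region,
instance name, and thresholds are hypothetical."""
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # assumed region


def latest_rds_metric(instance_id: str, metric: str, unit: str) -> float | None:
    """Return the most recent 5-minute average of an AWS/RDS CloudWatch metric."""
    now = datetime.now(timezone.utc)
    resp = cloudwatch.get_metric_statistics(
        Namespace="AWS/RDS",
        MetricName=metric,
        Dimensions=[{"Name": "DBInstanceIdentifier", "Value": instance_id}],
        StartTime=now - timedelta(minutes=15),
        EndTime=now,
        Period=300,
        Statistics=["Average"],
        Unit=unit,
    )
    points = sorted(resp["Datapoints"], key=lambda p: p["Timestamp"])
    return points[-1]["Average"] if points else None


if __name__ == "__main__":
    # "warehouse-db" is a placeholder instance identifier, not from the posting.
    cpu = latest_rds_metric("warehouse-db", "CPUUtilization", "Percent")
    free = latest_rds_metric("warehouse-db", "FreeStorageSpace", "Bytes")
    if cpu is not None and cpu > 80:
        print(f"WARNING: CPU utilization at {cpu:.1f}%")
    if free is not None and free < 20 * 1024**3:  # under ~20 GiB free
        print(f"WARNING: only {free / 1024**3:.1f} GiB of free storage left")
```

In practice, checks like these would more likely live behind CloudWatch alarms or a Datadog monitor than a standalone script; the snippet only shows the shape of the work.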
Required Skills and Qualifications
• Strong experience with AWS:
Demonstrable experience managing AWS resources, including RDS, S3, and IAM.
• Experience with data pipelines and ETL processes:
A solid understanding of how to build and manage data pipelines.
• Proficiency with SharePoint administration:
Experience managing SharePoint Online, including permissions, sites, and libraries.
• Experience with monitoring and logging tools:
Familiarity with tools such as CloudWatch, Datadog, or similar.
• Strong scripting skills:
Proficiency in a scripting language such as Python or Bash (see the sketch after this list).
• Excellent problem-solving and communication skills.
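
As a small example of the scripting proficiency requested above (an illustration, not part of the posting), the sketch below checks whether today's pipeline extracts have landed in an S3 bucket before downstream loads run; the bucket name, prefix layout, and source-system names are hypothetical.

```python
"""Pipeline-landing check sketch for the scripting requirement above.
Assumes a date-partitioned S3 layout; bucket and source names are placeholders."""
from datetime import date

import boto3

s3 = boto3.client("s3")


def todays_extracts(bucket: str, source: str) -> list[str]:
    """List object keys under today's prefix for one source system."""
    prefix = f"raw/{source}/{date.today():%Y/%m/%d}/"  # hypothetical layout
    keys: list[str] = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        keys.extend(obj["Key"] for obj in page.get("Contents", []))
    return keys


if __name__ == "__main__":
    # Placeholder bucket and source-system names, not from the posting.
    for source in ("filevine", "sharepoint"):
        keys = todays_extracts("dw-landing-zone", source)
        print(f"{source}: {len(keys)} objects ({'OK' if keys else 'MISSING'})")
```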
Desired Skills
• Experience with Snowflake (a brief connection-check sketch follows this list).
• Experience with Workato or other integration platforms.
• Knowledge of infrastructure as code (IaC) tools such as Terraform or CloudFormation.
• Experience in the legal tech industry.
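
For the Snowflake item above, a load-freshness check scripted against the warehouse might look like the minimal sketch below. It uses the standard snowflake-connector-python client; the credentials, warehouse, database, schema, table, and column names are all hypothetical placeholders.

```python
"""Snowflake load-freshness sketch for the desired skill above.
All connection parameters and object names are hypothetical placeholders."""
import os

import snowflake.connector  # pip install snowflake-connector-python


def rows_loaded_today(table: str) -> int:
    """Count rows in `table` whose load timestamp falls on the current date."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="DW_WH",      # placeholder warehouse
        database="ANALYTICS",   # placeholder database
        schema="RAW",           # placeholder schema
    )
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE loaded_at >= CURRENT_DATE")
        return cur.fetchone()[0]
    finally:
        conn.close()


if __name__ == "__main__":
    print("rows loaded today:", rows_loaded_today("FILEVINE_MATTERS"))  # placeholder table
```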