

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Phoenix, AZ, with a contract length of "unknown" and a pay rate of "unknown." Candidates should have 5+ years in data engineering, expertise in AWS and Azure Data Factory, and strong SQL skills.
Country
United States
Currency
$ USD
-
Day rate
608
-
Date discovered
July 30, 2025
Project duration
Unknown
-
Location type
On-site
-
Contract type
Unknown
-
Security clearance
Unknown
-
Location detailed
Phoenix, AZ
-
Skills detailed
#SQL (Structured Query Language) #Azure Data Factory #Amazon RDS (Amazon Relational Database Service) #Agile #Snowflake #AWS RDS (Amazon Relational Database Service) #Cloud #PostgreSQL #Data Security #Qlik #Security #SQL Server #ADF (Azure Data Factory) #Terraform #IAM (Identity and Access Management) #DynamoDB #S3 (Amazon Simple Storage Service) #Microsoft Power BI #Virtualization #dbt (data build tool) #Azure #Storage #Lambda (AWS Lambda) #ETL (Extract, Transform, Load) #Java #Tableau #Automation #ML (Machine Learning) #NoSQL #AWS S3 (Amazon Simple Storage Service) #AWS Lambda #Replication #Data Engineering #Computer Science #Statistics #Data Lake #Python #Scala #RDS (Amazon Relational Database Service) #SQL Queries #Data Pipeline #Data Ingestion #AWS (Amazon Web Services) #DevOps #GIT #Infrastructure as Code (IaC) #Oracle #AI (Artificial Intelligence) #BI (Business Intelligence)
Role description
Responsibilities
Kforce has a client that is seeking a Data Engineer in Phoenix, AZ. This is a critical, hands-on role where you'll help build and support foundational data infrastructure. You'll be responsible for designing, implementing, and optimizing cloud-based data pipelines and storage solutions, while documenting processes and supporting cross-functional teams.
Key Responsibilities:
• Administer and optimize Amazon RDS for PostgreSQL: backups, tuning, replication, and patching
• Design secure, highly available PostgreSQL environments (cloud-native and hybrid)
• Implement and support DynamoDB solutions with AWS Lambda/Kinesis integration
• Build scalable ETL pipelines using Azure Data Factory
• Ingest data from Oracle, SQL Server, and flat files into AWS
• Develop and maintain data lakes on AWS S3
• Automate ETL/ELT processes using Python, shell scripts, and cloud-native tools (a minimal sketch of one such step follows this list)
• Collaborate on CI/CD, infrastructure as code (Terraform/CloudFormation), and Git workflows
• Enable BI and analytics access to structured/semi-structured data
• Participate in Agile ceremonies and contribute to best practices in data engineering
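As referenced in the automation bullet above, here is a minimal, illustrative sketch of one flat-file-to-S3 ingestion step of the kind this role automates. It assumes hypothetical file, bucket, and key-prefix names and that pandas, boto3, and AWS credentials are available in the environment; it is not the client's actual pipeline, which per the posting runs through Azure Data Factory and other cloud-native tooling.

```python
"""Minimal sketch of one automated flat-file-to-S3 ingestion step.

All names below (SOURCE_FILE, BUCKET, PREFIX) are hypothetical placeholders,
not values taken from the posting.
"""
from datetime import datetime, timezone

import boto3
import pandas as pd

SOURCE_FILE = "exports/orders.csv"   # hypothetical flat-file extract
BUCKET = "example-data-lake"         # hypothetical S3 data-lake bucket
PREFIX = "raw/orders"                # hypothetical key prefix


def run_ingest(source_file: str = SOURCE_FILE) -> str:
    """Extract a flat file, apply a light transform, and land it in S3."""
    # Extract: read the flat-file export (e.g., dumped from Oracle or SQL Server).
    df = pd.read_csv(source_file)

    # Transform: normalize column names and stamp the load time.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df["loaded_at"] = datetime.now(timezone.utc).isoformat()

    # Load: write the object under a date-partitioned key in the S3 data lake.
    key = f"{PREFIX}/load_date={datetime.now(timezone.utc):%Y-%m-%d}/orders.csv"
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=df.to_csv(index=False).encode("utf-8"),
    )
    return key


if __name__ == "__main__":
    print(f"Wrote s3://{BUCKET}/{run_ingest()}")
```

In practice a step like this would be scheduled and parameterized by the orchestration layer (Azure Data Factory, Lambda, or a CI/CD-deployed job) rather than run by hand.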
Requirements
• Bachelor's degree in Computer Science, Engineering, Statistics, or related field
• 5+ years in data engineering and cloud integration
• 3+ years of experience with AWS data services and Azure Data Factory
• 3+ years of experience in data ingestion and automation using Python or Java
• 3+ years of hands-on experience with AWS (RDS, S3, Lambda, Kinesis, IAM, DynamoDB)
• 3+ years building pipelines with Azure Data Factory
• Experience with DevOps, CI/CD, Git, and data security best practices
• Deep expertise in PostgreSQL and RDS administration
• Familiarity with NoSQL systems (DynamoDB preferred)
• Strong SQL skills and working knowledge of Python or Java
• Proven ability to write efficient, optimized SQL queries
• Exposure to BI tools (Power BI, Qlik, Tableau), ML/AI, and data virtualization (e.g., Denodo) is a plus
• Bonus: Experience with Snowflake, dbt, or other modern data stack tools
The pay range is the lowest to highest compensation we reasonably in good faith believe we would pay at posting for this role. We may ultimately pay more or less than this range. Employee pay is based on factors like relevant education, qualifications, certifications, experience, skills, seniority, location, performance, union contract and business needs. This range may be modified in the future.
We offer comprehensive benefits including medical/dental/vision insurance, HSA, FSA, 401(k), and life, disability & ADD insurance to eligible employees. Salaried personnel receive paid time off. Hourly employees are not eligible for paid time off unless required by law. Hourly employees on a Service Contract Act project are eligible for paid sick leave.
Note: Pay is not considered compensation until it is earned, vested and determinable. The amount and availability of any compensation remains in Kforce's sole discretion unless and until paid and may be modified in its discretion consistent with the law.
This job is not eligible for bonuses, incentives or commissions.
Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
By clicking "Apply Today" you agree to receive calls, AI-generated calls, text messages or emails from Kforce and its affiliates, and service providers. Note that if you choose to communicate with Kforce via text messaging the frequency may vary, and message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You will always have the right to cease communicating via text by using key words such as STOP.