Noblesoft Solutions

Data Engineer (Need Local to St Pete, FL and ONLY on W2)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer based in St. Petersburg, FL, offering a hybrid work arrangement. Contract length and pay rate are unspecified. Key skills include SQL, Oracle, AWS services, Python, and ETL/ELT pipeline experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 21, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
St. Petersburg, FL
-
🧠 - Skills detailed
#Data Engineering #Data Orchestration #Python #ETL (Extract, Transform, Load) #Pandas #Redshift #Airflow #Java #AWS (Amazon Web Services) #SageMaker #Data Warehouse #AWS Glue #IAM (Identity and Access Management) #S3 (Amazon Simple Storage Service) #Data Catalog #Lambda (AWS Lambda) #Oracle #Version Control #ML (Machine Learning) #Apache Airflow #SQL (Structured Query Language) #Schema Design #Automation #Databases #Data Modeling #Data Science #AWS SageMaker
Role description
This is a hybrid position in St. Petersburg, FL. Candidates must be local to this location; no relocations allowed.

Required
• Strong proficiency with SQL and hands-on experience working with Oracle databases.
• Experience designing and implementing ETL/ELT pipelines and data workflows.
• Hands-on experience with AWS data services such as S3, Glue, Redshift, Lambda, and IAM.
• Proficiency in Python for data engineering (pandas, boto3, pyodbc, etc.).
• Solid understanding of data modeling, relational databases, and schema design.
• Familiarity with version control, CI/CD, and automation practices.
• Ability to collaborate with data scientists to align data structures with model and analytics requirements.

Preferred
• Experience integrating data for use in AWS SageMaker or other ML platforms.
• Exposure to MLOps or ML pipeline orchestration.
• Familiarity with data cataloging and governance tools (AWS Glue Catalog, Lake Formation).
• Knowledge of data warehouse design patterns and best practices.
• Experience with data orchestration tools (e.g., Apache Airflow, Step Functions).
• Working knowledge of Java is a plus.
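To give a flavor of the day-to-day work described above, here is a minimal, illustrative ETL sketch in Python using pandas (one of the libraries the posting names). The extract step is stubbed with sample data so the sketch runs on its own; in the actual role, extraction would query Oracle (e.g., via pyodbc) and loading would push to S3 via boto3. All table, column, and bucket names here are hypothetical placeholders, not taken from the posting.

```python
# Illustrative ETL sketch only; names are placeholders, not from the posting.
import io

import pandas as pd


def extract() -> pd.DataFrame:
    # Stand-in for an Oracle query (e.g., via pyodbc); stubbed so the
    # sketch is self-contained and runnable.
    return pd.DataFrame(
        {"order_id": [1, 2, 2, 3],
         "amount": ["10.5", "20.0", "20.0", None]}
    )


def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Typical cleanup: deduplicate keys, enforce numeric types, fill gaps.
    out = df.drop_duplicates(subset="order_id").copy()
    out["amount"] = pd.to_numeric(out["amount"]).fillna(0.0)
    return out


def load(df: pd.DataFrame) -> bytes:
    # Serialize to CSV bytes; a real loader would hand this body to
    # boto3.client("s3").put_object(Bucket=..., Key=..., Body=body).
    buf = io.StringIO()
    df.to_csv(buf, index=False)
    return buf.getvalue().encode()


body = load(transform(extract()))
```

In a production pipeline of the kind described, these three steps would typically be wrapped as tasks in an orchestrator such as Apache Airflow or AWS Step Functions.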