SGS Technologie

Sr. Data Engineer (SQL+Python+AWS)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Data Engineer (SQL+Python+AWS) on a 12+ month contract in St. Petersburg, FL. Key skills include strong SQL, AWS experience, and Python proficiency. A B.S. in a related field and 5+ years of experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
December 5, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
St. Petersburg, FL
-
🧠 - Skills detailed
#Java #Data Science #AWS SageMaker #Scala #Pandas #Oracle #Batch #DevOps #Apache Airflow #SageMaker #AWS Glue #Cloud #SQL (Structured Query Language) #IAM (Identity and Access Management) #Lambda (AWS Lambda) #AWS (Amazon Web Services) #Airflow #Data Catalog #Security #Redshift #Data Ingestion #S3 (Amazon Simple Storage Service) #Databases #Data Quality #Data Pipeline #Data Orchestration #Python #Version Control #Data Modeling #ETL (Extract, Transform, Load) #Computer Science #Predictive Modeling #ML (Machine Learning) #Schema Design #Data Warehouse #Data Engineering #Automation
Role description
SGS Technologie is looking for a Sr. Data Engineer (SQL+Python+AWS) for a 12+ month contract (potential extension, or may convert to full-time), hybrid in St. Petersburg, FL 33716, with a direct financial client. W2 only; open to US citizens and Green Card holders.
Notes from the Hiring Manager:
• Setting up Python environments and data structures to support the Data Science/ML team.
• No prior Data Science or Machine Learning experience is required.
• The role involves building new data pipelines and managing file-loading connections.
• Strong SQL skills are essential.
• Contract-to-hire position.
• Hybrid role based in St. Petersburg, FL (33716) only.
Duties:
This role builds and maintains the data pipelines that connect Oracle-based source systems to AWS cloud environments, providing well-structured data for analysis and machine learning in AWS SageMaker. It involves working closely with data scientists to deliver scalable data workflows as a foundation for predictive modeling and analytics (a minimal sketch of this pipeline pattern appears at the end of this listing).
• Develop and maintain data pipelines to extract, transform, and load data from Oracle databases and other systems into AWS environments (S3, Redshift, Glue, etc.).
• Collaborate with data scientists to ensure data is prepared, cleaned, and optimized for SageMaker-based machine learning workloads.
• Implement and manage data ingestion frameworks, including batch and streaming pipelines.
• Automate and schedule data workflows using AWS Glue, Step Functions, or Airflow.
• Develop and maintain data models, schemas, and cataloging processes for discoverability and consistency.
• Optimize data processes for performance and cost efficiency.
• Implement data quality checks, validation, and governance standards.
• Work with DevOps and security teams to comply with RJ standards.
Skills:
Required:
• Strong proficiency with SQL and hands-on experience with Oracle databases.
• Experience designing and implementing ETL/ELT pipelines and data workflows.
• Hands-on experience with AWS data services such as S3, Glue, Redshift, Lambda, and IAM.
• Proficiency in Python for data engineering (pandas, boto3, pyodbc, etc.).
• Solid understanding of data modeling, relational databases, and schema design.
• Familiarity with version control, CI/CD, and automation practices.
• Ability to collaborate with data scientists to align data structures with model and analytics requirements.
Preferred:
• Experience integrating data for use in AWS SageMaker or other ML platforms.
• Exposure to MLOps or ML pipeline orchestration.
• Familiarity with data cataloging and governance tools (AWS Glue Catalog, Lake Formation).
• Knowledge of data warehouse design patterns and best practices.
• Experience with data orchestration tools (e.g., Apache Airflow, Step Functions).
• Working knowledge of Java is a plus.
Education:
B.S. in Computer Science, MIS, or a related field, plus a minimum of five (5) years of related experience, or an equivalent combination of education, training, and experience.
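For candidates gauging the day-to-day work, below is a minimal sketch of the core pipeline pattern the duties describe: pulling a query result out of Oracle with Python (pyodbc + pandas) and staging it in S3 as Parquet via boto3 for downstream Glue/Redshift/SageMaker use. The DSN, credentials, bucket, table, and object key are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch, not a reference implementation. Assumes an Oracle ODBC
# driver/DSN is configured and pyarrow is installed for Parquet output.
import io

import boto3
import pandas as pd
import pyodbc

ORACLE_DSN = "DSN=oracle_src;UID=etl_user;PWD=change_me"  # hypothetical DSN
S3_BUCKET = "example-data-lake"                           # hypothetical bucket


def extract_to_s3(query: str, s3_key: str) -> None:
    """Run a SQL query against Oracle and land the result in S3 as Parquet."""
    with pyodbc.connect(ORACLE_DSN) as conn:
        df = pd.read_sql(query, conn)  # load the result set into a DataFrame

    buf = io.BytesIO()
    df.to_parquet(buf, index=False)  # columnar format plays well with Glue/Redshift

    boto3.client("s3").put_object(Bucket=S3_BUCKET, Key=s3_key, Body=buf.getvalue())


if __name__ == "__main__":
    extract_to_s3(
        "SELECT * FROM trades WHERE trade_date = TRUNC(SYSDATE)",  # hypothetical table
        "raw/trades/trades.parquet",                               # hypothetical key
    )
```

For the "automate and schedule data workflows" duty, here is one way the extract above might be wired into an Airflow schedule, assuming Airflow 2.4+; the DAG ID, task ID, and imported module name are made up for illustration.

```python
# Minimal Airflow 2.x sketch scheduling the extract daily.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

from oracle_extract import extract_to_s3  # hypothetical module holding the sketch above

with DAG(
    dag_id="oracle_to_s3_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # `schedule` requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_trades",
        python_callable=extract_to_s3,
        op_args=[
            "SELECT * FROM trades WHERE trade_date = TRUNC(SYSDATE)",
            "raw/trades/trades.parquet",
        ],
    )
```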