Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date discovered
September 9, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#SQL (Structured Query Language) #Database Design #Azure #Documentation #Business Analysis #Data Modeling #SQLAlchemy #Python #Databases #Data Quality #Data Pipeline #Deployment #Grafana #Schema Design #Docker #Code Reviews #Computer Science #Data Science #Monitoring #FastAPI #Data Engineering
Role description
NO THIRD PARTIES, PLEASE

TITLE: Python Data Engineer
CLIENT: Power Industry
LOCATION: California
TYPE: Contract
RATE: DOE
URGENCY: Open to interview and hire
WORK SCHEDULE: Remote

SUMMARY: We are seeking a skilled senior-level Python Data Engineer to join our team in Houston, TX, supporting one of our key clients based in California. As a Senior Python Data Engineer, you will be responsible for designing, implementing, and maintaining data pipelines to support our data-driven business. You will also develop and maintain the client's APIs using the FastAPI framework and SQLAlchemy ORM. Experience working with LLMs is a plus!

The ideal candidate will have 10+ years of experience in data engineering with a focus on Python, and will work closely with our Data Scientists and Business Analysts to ensure that data is properly collected, processed, and analyzed to generate insights and drive business decisions. You must have a good understanding of SQL. Experience with Postgres is a plus, although not a requirement.

Responsibilities:
· Design, develop, and maintain RESTful APIs using Python and the FastAPI framework.
· Build APIs and integrate them with various systems and platforms.
· Work closely with our database team to ensure seamless integration with Postgres.
· Write efficient, reusable code using best practices.
· Ensure code quality, including automated tests and code reviews.
· Collaborate with Data Scientists and Business Analysts to ensure data quality and accuracy.
· Develop and maintain data models and data schemas.
· Deploy and manage applications in Azure using Docker.
· Set up monitoring and metrics using Grafana.
· Create and maintain documentation for data pipelines and processes.
· Monitor data pipelines to identify and address issues in a timely manner.
· Work with cross-functional teams to ensure data is properly integrated across systems.
Requirements (preference for candidates with power industry experience):
· Bachelor's degree in Computer Science, Information Systems, or a related field
· Minimum 3 years of experience developing APIs using Python
· Strong experience with FastAPI and the SQLAlchemy ORM
· Strong experience building APIs and integrating them with various systems and platforms
· Experience with Postgres or other relational databases
· Experience with Docker and Azure deployment is a plus
· Familiarity with Grafana or similar monitoring tools is a plus
· Strong understanding of software development principles, design patterns, and best practices
· Knowledge of SQL and database design principles is a plus
· Experience with data modeling and schema design
· Strong problem-solving and analytical skills
· Excellent communication and collaboration skills