Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a 12-month remote contract for a Data Engineer, offering competitive pay. Key skills include 7+ years of data integration experience, cloud architecture in state-approved environments, and proficiency in Python, SQL, and Airflow.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 5, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Schema Design #Python #Data Architecture #Azure #Monitoring #GCP (Google Cloud Platform) #Snowflake #Airflow #Data Engineering #Data Integration #SQL (Structured Query Language) #Cloud #ETL (Extract, Transform, Load) #Normalization #BigQuery #Logging #AWS (Amazon Web Services) #Redshift #API (Application Programming Interface) #Data Modeling #Databricks
Role description
Job Title: Data Architect
Duration: 12 months, with high possibility of extension
Location: Remote

Core duties
• Design the API-driven overlay framework; integrate MFMP, FLAIR/PALM, FACTS, VIP, P-Card, SunBiz, and STMS (via API and/or secure SFTP flat files).
• Data modeling, normalization, lineage, quality, and logging/monitoring.

Must-have
• 7+ years of data integration (APIs, ETL/ELT), secure file exchanges (SFTP), and schema design.
• Cloud architecture in state-approved environments (AWS Gov, Azure Gov, or GCP with FedRAMP).
• Strong with one or more of: Python, SQL, Airflow, Databricks/Snowflake/Redshift/BigQuery.

Nice-to-have
• AWS/Azure data certifications (e.g., AWS Data Analytics, Azure Data Engineer).
• Experience with government financial/procurement data.
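To illustrate the must-have skill set (Python, SQL, and ETL over flat-file exchanges such as SFTP drops), here is a minimal sketch of an extract-transform-load flow using only the Python standard library. All names in it (the sample file, the `vendor_payments` table, the column names) are hypothetical illustrations, not details from this posting, and a production pipeline would run steps like these as Airflow tasks rather than inline:

```python
import csv
import io
import sqlite3

# Hypothetical flat-file extract, standing in for a file
# delivered over secure SFTP.
RAW_CSV = """vendor_id,vendor_name,amount
V001, Acme Corp ,1250.00
V002,Beta LLC,980.50
"""

def extract(text: str) -> list[dict]:
    """Extract: parse the delimited flat file into row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: trim stray whitespace and cast amounts to numbers."""
    return [
        (r["vendor_id"].strip(), r["vendor_name"].strip(), float(r["amount"]))
        for r in rows
    ]

def load(records: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: upsert records into a normalized target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS vendor_payments ("
        "vendor_id TEXT PRIMARY KEY, vendor_name TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO vendor_payments VALUES (?, ?, ?)", records
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM vendor_payments").fetchone()[0]

conn = sqlite3.connect(":memory:")
row_count = load(transform(extract(RAW_CSV)), conn)
```

The primary-key upsert (`INSERT OR REPLACE`) keeps the load idempotent, so re-delivering the same file does not duplicate rows, which matters for the scheduled, retry-prone jobs this role describes.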