

American Unit, Inc
W2 Only - Local to North Carolina Only - ETL Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a W2 ETL Data Engineer position based in North Carolina, offering a competitive pay rate for an unspecified contract length. Key skills include Python, SQL, and data quality assurance, with a focus on ETL processes and data integration.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 8, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Raleigh, NC
-
🧠 - Skills detailed
#Data Access #Data Analysis #DevOps #Quality Assurance #Data Engineering #SQL (Structured Query Language) #BI (Business Intelligence) #ETL (Extract, Transform, Load) #Python #Security #S3 (Amazon Simple Storage Service) #Monitoring #Data Quality #Cloud #Agile #Snowflake #Cybersecurity #Data Integration
Role description
We are seeking a skilled mid-level+ Data Engineer to join our team and focus on quality assurance, quality checking, and ETL processes. The successful candidate will be responsible for ensuring the integrity and accuracy of data transferred from a shared file transfer service to an S3 bucket and subsequently into and through our Snowflake data platform. This data will be utilized by downstream applications and reporting systems. These applications and the corresponding consumed data are critical to business process execution.
Key Responsibilities:
• Quality Assurance & Quality Checking: Implement and maintain data quality checks to ensure the accuracy and reliability of data throughout the ETL process.
• ETL Processes: Design, develop, and optimize ETL workflows to efficiently transfer data from file transfer services to S3 buckets and Snowflake.
• Data Integration: Ensure seamless data integration into data platform, enabling efficient consumption by downstream applications and reporting tools.
• Data Quality Management: Address data quality challenges, including inconsistencies in source data that do not meet ingestion requirements, which can lead to load failures or data backouts.
• Collaboration: Work closely with business owners, data analysts, business intelligence teams, and other stakeholders to understand data requirements and deliver high-quality data solutions.
• Monitoring & Troubleshooting: Monitor pipelines, identify issues, and implement solutions to preserve data flow and integrity.
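The quality-checking responsibility above might look something like the following pre-load validation step. This is a minimal, hypothetical sketch (the field names and rules are illustrative assumptions, not from the posting); in practice, checks like these would run between the S3 landing zone and the Snowflake load so that a single bad source row does not trigger a full load failure or backout:

```python
from dataclasses import dataclass, field

# Hypothetical required schema for incoming records -- illustrative only.
REQUIRED_FIELDS = {"id", "amount", "event_date"}

@dataclass
class QualityReport:
    passed: list = field(default_factory=list)    # (record, []) tuples safe to load
    rejected: list = field(default_factory=list)  # (record, errors) tuples to quarantine

def check_record(record: dict) -> list:
    """Return a list of quality violations for one record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "amount" in record:
        try:
            float(record["amount"])
        except (TypeError, ValueError):
            errors.append("amount is not numeric")
    return errors

def validate_batch(records: list) -> QualityReport:
    """Split a batch into loadable rows and rejects before loading,
    rather than letting the warehouse load fail on the whole batch."""
    report = QualityReport()
    for rec in records:
        errors = check_record(rec)
        (report.rejected if errors else report.passed).append((rec, errors))
    return report
```

Separating rejects from loadable rows, as sketched here, is one common way to address the "inconsistencies in source data" challenge the posting describes: clean rows proceed to Snowflake while rejects are quarantined for review.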
Qualifications (Knowledge/Skills/Abilities):
• Demonstrated mid-level+ experience in data engineering, with an emphasis on data quality assurance and ETL processes.
• Expertise in Python, PyPI, and SQL.
• Expert analytical and problem-solving skills.
• Strong understanding of cybersecurity principles related to code development, DevOps, and data access.
• Understanding of fundamental public-cloud capabilities.
• Proven capacity to comprehend business needs and convert them into technical requirements.
• Demonstrated excellence in communication and collaboration abilities.
• Proven capacity to define success, deliver, and operate in an agile setting.






