Aptonet Inc

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Atlanta, GA (Hybrid) with a contract length of "unknown" and a pay rate of "competitive." Key skills include 3+ years in data engineering, strong SQL, ETL/ELT tools, Python, and cloud platforms (AWS/GCP/Azure).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 5, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta Metropolitan Area
-
🧠 - Skills detailed
#ML (Machine Learning) #Snowflake #Scala #SQL (Structured Query Language) #Strategy #Kafka (Apache Kafka) #Scripting #GCP (Google Cloud Platform) #Airflow #Data Access #ETL (Extract, Transform, Load) #BigQuery #Python #Redshift #Cloud #Data Science #Data Quality #dbt (data build tool) #Monitoring #AWS (Amazon Web Services) #Azure #Data Engineering #Data Strategy #Data Warehouse #DevOps #Data Modeling
Role description
Data Engineer
Location: Atlanta, GA (Hybrid)
Work Authorization: Must be authorized to work in the U.S. (sponsorship not available)

About the Role
We’re looking for a Data Engineer to join our team in Atlanta. If you love building scalable data systems, optimizing pipelines, and enabling data-driven decisions, you’ll feel right at home here. This role collaborates closely with engineering, analytics, and business stakeholders to turn raw data into meaningful, trusted insights.

What You’ll Do
• Design, build, and maintain scalable ETL/ELT pipelines and data workflows
• Develop and optimize data models to support analytics and applications
• Work with internal teams to understand data requirements and deliver reliable solutions
• Ensure data quality, integrity, and governance across systems
• Build tools to automate data delivery and monitoring
• Support cloud-based data infrastructure and pipelines (AWS/GCP/Azure)
• Partner with analysts and data scientists to enhance data accessibility and performance

What You Bring
• 3+ years of professional experience in data engineering or a similar role
• Strong SQL skills and experience with modern data warehouses (Snowflake, Redshift, BigQuery, etc.)
• Experience with ETL/ELT tools and frameworks (dbt, Airflow, etc.)
• Python expertise (or a similar scripting language)
• Experience with cloud platforms (AWS, GCP, or Azure)
• Solid understanding of data modeling, warehousing concepts, and distributed systems
• Strong problem-solving skills and a proactive, collaborative mindset

Nice-to-Have
• Experience supporting real-time data streaming (Kafka, Kinesis, etc.)
• Experience with CI/CD and DevOps tooling
• Exposure to machine learning pipelines or MLOps environments

Why Join
• Meaningful work with a growing team
• Opportunity to influence architecture and data strategy
• Collaborative, supportive culture
• Competitive pay + benefits