

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 3-4 years of experience, offering a contract of over 6 months. Key skills include SQL, Python, ETL frameworks, and experience with Snowflake and Apache Spark. U.S. citizenship required; no visa sponsorship available.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 14, 2025
Project duration: More than 6 months
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Hawaiian Gardens, CA
Skills detailed: #SQL (Structured Query Language) #Spark (Apache Spark) #Data Pipeline #Data Analysis #Data Governance #Compliance #Consulting #Python #Data Modeling #Programming #Monitoring #dbt (data build tool) #Data Engineering #Security #Snowflake #Apache Spark #Cloud #GCP (Google Cloud Platform) #Data Quality #Azure #ETL (Extract, Transform, Load) #Scala #Automation #Kafka (Apache Kafka) #BigQuery #Data Processing #AWS (Amazon Web Services) #Batch #Data Ingestion #Schema Design #Big Data
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, AaraTechnologies Inc, is seeking the following. Apply via Dice today!
Data Engineer (3-4 Years Experience)
Employment Type: Full-time (Contract or Contract-to-Hire)
Experience Level: Mid-level (3-4 years)
Company: Aaratech Inc
Eligibility: Open to U.S. Citizens only. We do not offer visa sponsorship.
About Aaratech Inc Aaratech Inc is a specialized IT consulting and staffing company that places elite engineering talent into high-impact roles at leading U.S. organizations. We focus on modern technologies across cloud, data, and software disciplines. Our client engagements offer long-term stability, competitive compensation, and the opportunity to work on cutting-edge data projects.
Position Overview
We are seeking a Data Engineer with 3-4 years of experience to join a client-facing role focused on building and maintaining scalable data pipelines, robust data models, and modern data warehousing solutions. You'll work with a variety of tools and frameworks, including Apache Spark, Snowflake, and Python, to deliver clean, reliable, and timely data for advanced analytics and reporting.
Key Responsibilities
• Design and develop scalable data pipelines to support batch and real-time processing
• Implement efficient Extract, Transform, Load (ETL) processes using tools like Apache Spark and dbt
• Develop and optimize SQL queries for data analysis and warehousing
• Build and maintain data warehousing solutions using platforms like Snowflake or BigQuery
• Collaborate with business and technical teams to gather requirements and create accurate data models
• Write reusable and maintainable Python code for data ingestion, processing, and automation
• Ensure end-to-end data processing integrity, scalability, and performance
• Follow best practices for data governance, security, and compliance
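As a rough illustration of the batch ETL pattern these responsibilities describe, here is a minimal sketch in plain Python with the standard library (a real pipeline would use Spark, dbt, or Snowflake as listed above). All source data, field names, and data-quality rules here are hypothetical, chosen only to show the extract/transform/load shape:

```python
import sqlite3

def extract():
    # Stand-in for reading from a source system (files, an API, a queue).
    # These rows and fields are invented for illustration.
    return [
        {"order_id": 1, "amount": "19.99", "region": "US"},
        {"order_id": 2, "amount": "bad", "region": "US"},     # unparseable amount
        {"order_id": 3, "amount": "5.00", "region": None},    # missing region
    ]

def transform(rows):
    # Apply simple data-quality rules: amount must parse, region must be present.
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (TypeError, ValueError):
            continue  # drop rows with unparseable amounts
        if row["region"] is None:
            continue  # drop rows missing a region
        clean.append((row["order_id"], amount, row["region"]))
    return clean

def load(rows, conn):
    # Load into a warehouse table; sqlite stands in for Snowflake/BigQuery here.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # only the one valid row survives the quality checks → 1
```

The same extract/transform/load split scales up directly: in Spark the transform step becomes DataFrame operations, and the load step writes to a warehouse table instead of sqlite.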
Required Skills & Experience
• 3-4 years of experience in Data Engineering or a similar role
• Strong proficiency in SQL and Python
• Experience with Extract, Transform, Load (ETL) frameworks and building data pipelines
• Solid understanding of data warehousing concepts and architecture
• Hands-on experience with Snowflake, Apache Spark, or similar big data technologies
• Proven experience in data modeling and data schema design
• Exposure to data processing frameworks and performance optimization techniques
• Familiarity with cloud platforms like AWS, Google Cloud Platform, or Azure
Nice to Have
• Experience with streaming data pipelines (e.g., Kafka, Kinesis)
• Exposure to CI/CD practices in data development
• Prior work in consulting or multi-client environments
• Understanding of data quality frameworks and monitoring strategies