

Jobs via Dice
Early Career Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an Early Career Data Engineer in Redmond, WA, offering a 4-month contract at a competitive pay rate. Key skills include Python, SQL, Airflow, and Spark. Requires 2 years of data engineering experience and a relevant Bachelor's degree.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
February 19, 2026
Duration
3 to 6 months
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Redmond, WA
-
Skills detailed
#Data Science #Scala #Data Quality #Azure #Data Engineering #Data Modeling #Data Warehouse #Data Governance #Python #Security #Data Pipeline #Cloud #Airflow #Kubernetes #Data Accuracy #Documentation #BigQuery #Data Security #Observability #Redshift #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Consulting #AWS (Amazon Web Services) #Snowflake #GCP (Google Cloud Platform) #Requirements Gathering #SQL (Structured Query Language) #Monitoring #Data Analysis #Computer Science
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Talent Software Services, Inc, is seeking the following. Apply via Dice today!
AI Data Engineer 1
Job Summary: Talent Software Services is in search of an AI Data Engineer for a contract position in Redmond, WA. The opportunity will be four months with a strong chance for a long-term extension.
Position Summary: This team is responsible for designing, developing, and maintaining data platforms. You will have the opportunity to work closely with stakeholders across the company to gather business requirements, build data models, and ensure data quality and accessibility. Your expertise in Python, SQL, Airflow, and Spark will be crucial in optimizing our data infrastructure and enabling data-driven decision-making. This role will contribute to designing, developing, and maintaining efficient and reliable data pipelines.
Primary Responsibilities/Accountabilities:
• Data Platform: Design, build, and maintain scalable data platforms and pipelines using Python, SQL, Airflow, and Spark.
• Business Requirements Gathering: Collaborate with stakeholders to understand and translate business requirements into technical specifications.
• Data Modeling: Develop and implement data models that support analytics and reporting needs.
• Data Quality and Governance: Ensure data accuracy, consistency, and reliability by implementing robust data validation and quality checks.
• Stakeholder Collaboration: Work with cross-functional teams, including data analysts, data scientists, and business leaders, to deliver high-quality data solutions.
• Performance Optimization: Continuously monitor and optimize data pipelines for performance, scalability, and cost-efficiency.
• Monitoring and Observability: Build and implement monitoring and observability metrics to ensure data quality and detect anomalies in data pipelines.
• Documentation and Communication: Maintain clear and comprehensive documentation of data processes and communicate technical concepts effectively to non-technical stakeholders.
Qualifications:
• Experience: 2 years of experience in data engineering and infrastructure.
• Technical Skills: Proficiency in data warehouse management, Python, SQL, Airflow, and Spark.
• Data Pipeline Expertise: Strong experience in building and maintaining robust data pipelines and ETL processes.
• Analytical Skills: Ability to gather business requirements and debug ingestion and other data warehouse issues.
• Communication: Excellent verbal and written communication skills, with the ability to convey technical information to non-technical audiences.
• Collaboration: Proven ability to work effectively in a collaborative, cross-functional environment.
• Education: A Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
• Data Engineering background that includes Python, SQL, Kubernetes, Airflow, and Scala.
Preferred:
• Experience with cloud platforms such as AWS, Google Cloud Platform, or Azure.
• Familiarity with data warehousing technology (e.g., Delta Lake, Azure Fabric, Snowflake, Redshift, BigQuery).
• Knowledge of data governance and data security best practices.
If this job is a match for your background, we would be honoured to receive your application!
Providing consulting opportunities to TALENTed people since 1987, we offer a host of opportunities, including contract, contract to hire, and permanent placement. Let's talk!