

Data Engineer - 25-02915
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with a 6+ month contract, paying $70-$75/hr on W2, fully remote. Key skills include Python, Spark, SQL, Snowflake, and AWS. Requires 4+ years in data engineering and a BS/MS in a related field.
Country: United States
Currency: $ USD
Day rate: 600
Date discovered: July 15, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: United States
Skills detailed
#Data Lakehouse #Data Management #Apache Spark #Datasets #Storytelling #Apache Airflow #Snowflake #Data Transformations #Metadata #Data Architecture #Scala #ETL (Extract, Transform, Load) #Airflow #SQL (Structured Query Language) #Data Profiling #dbt (data build tool) #Agile #Databases #Data Engineering #Shell Scripting #Spark (Apache Spark) #Data Science #Data Quality #Tableau #Documentation #Computer Science #Spark SQL #Scripting #BI (Business Intelligence) #Data Governance #Java #Python #AWS (Amazon Web Services) #Visualization #Data Pipeline #Data Catalog #Data Lineage #Cloud #Data Lake
Job Description
LeadStack Inc. is an award-winning, certified minority-owned (MBE) staffing services provider and one of the nation's fastest-growing contingent-workforce firms. As a recognized industry leader in contingent workforce solutions and a certified Great Place to Work, we're proud to partner with some of the most admired Fortune 500 brands in the world.
Job Title: Data Engineer
Location: Fully Remote
Duration: 6+ months with possible extension
W2 only
Pay rate: $70-$75/hr on W2
Top skills: Python, Spark, SQL, Snowflake, Airflow, dbt, AWS, Shell Scripting
Role Description:
We are seeking a skilled and driven Data Engineer to join our dynamic team. In this role, you will acquire, clean, and structure data from diverse sources, enabling the organization to derive actionable insights and make informed decisions. You will collaborate closely with analysts, other engineering teams, and stakeholders to ensure the optimal design and delivery of BI solutions, reports, dashboards, KPIs, and alerts. Your expertise in data visualization, analysis, and integration will be critical in uncovering trends, patterns, and opportunities for product enhancements.
You will also engage with various functional groups, leading data discovery, assessing data samples, and preparing data for hypothesis testing and statistical analysis. The Data Engineer will champion and establish BI and data-visualization services that empower analysts and data scientists, bringing clarity and insight to our data landscape. If you're ready to make an impactful difference in a fast-paced environment, driving data-enabled decisions and innovation, we invite you to take the lead as our Data Engineer.
Responsibilities:
• Build and maintain complex data pipelines that integrate data from a wide variety of sources, including APIs, Salesforce, AWS, Data Cloud, and databases.
• Design and develop optimized data models and scalable data structures in Salesforce Data Cloud and Snowflake to support reliable analytics, reporting, and downstream data products.
• Perform comprehensive data profiling to identify data gaps and continuously improve data quality across ingestion and transformation layers.
• Implement robust data transformation logic to convert raw data into golden records and trustworthy analytical datasets.
• Build automated data quality checks and validation rules within pipelines to ensure accuracy, completeness, and consistency of critical datasets.
• Partner with product analysts, product engineering, and other stakeholders to understand instrumentation needs for insights and analytics.
• Collaborate with data architects to define and maintain data governance, naming standards, and data lineage documentation.
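As a rough illustration of the automated data-quality checks and validation rules mentioned above, a pipeline validation step might look like the following minimal Python sketch. All function names, field names, and rules here are hypothetical, not taken from the posting.

```python
# Minimal sketch of automated data-quality checks for a pipeline.
# Rule names, required fields, and thresholds are assumptions for illustration.

def check_completeness(rows, required_fields):
    """Return the records that are missing any required field."""
    return [r for r in rows if any(r.get(f) in (None, "") for f in required_fields)]

def check_uniqueness(rows, key):
    """Return True if every record has a distinct value for `key`."""
    keys = [r[key] for r in rows]
    return len(keys) == len(set(keys))

def run_quality_checks(rows):
    """Run all checks and report failures instead of silently loading bad data."""
    failures = {}
    incomplete = check_completeness(rows, required_fields=["id", "email"])
    if incomplete:
        failures["completeness"] = incomplete
    if not check_uniqueness(rows, key="id"):
        failures["uniqueness"] = "duplicate ids found"
    return failures

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},               # fails the completeness rule
    {"id": 2, "email": "c@example.com"},  # duplicate id fails uniqueness
]
print(run_quality_checks(records))
```

In a real pipeline, checks like these would typically run as a validation task inside an orchestrator such as Airflow, failing the run (or quarantining records) before bad data reaches downstream datasets.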
Job Requirements:
• BS/MS degree in Computer Science, Engineering, or a related field.
• 4+ years of experience in data engineering or ELT/ETL development using SQL, Shell Scripting, Python, and Java.
• Strong proficiency in Apache Spark, Python, and advanced SQL for large-scale data transformation and analytics.
• Hands-on experience with Apache Airflow or equivalent orchestration tools for managing data workflows.
• Experience with dbt for building and managing modular, version-controlled, and tested data transformations.
• Proven ability to build, deploy, and optimize data pipelines on AWS infrastructure.
• Experience with data lakehouse architectures and familiarity with modern data mesh principles.
• Strong analytical skills with the ability to work with complex data models, derive insights, and recommend actionable strategies.
• Familiarity with Salesforce data models and processes is a plus.
• Proficiency in Tableau for developing interactive dashboards and visual storytelling.
• Exposure to data governance, metadata management, and data cataloging tools.
• Ability to work effectively in an agile, distributed, and fast-paced environment, with a high degree of self-management, clear communication, and commitment to delivery timelines.
To know more about current opportunities at LeadStack, please visit us at https://leadstackinc.com/careers/
Should you have any questions, feel free to call me at (513) 318-4502 or send an email to waseem.ahmad@leadstackinc.com