

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a contract of unknown length at a pay rate of $70-75/hour W2. It requires 4+ years of data engineering experience, proficiency in Python, Spark, SQL, and AWS, and a BS/MS in a related field. Remote work only.
Country
United States
-
Currency
$ USD
-
Day rate
600
-
Date discovered
July 15, 2025
-
Project duration
Unknown
-
Location type
Remote
-
Contract type
W2 Contractor
-
Security clearance
Unknown
-
Location detailed
California, United States
-
Skills detailed
#Data Lakehouse #Data Management #Apache Spark #Datasets #Storytelling #Apache Airflow #Snowflake #Data Transformations #Metadata #Data Architecture #Scala #ETL (Extract, Transform, Load) #Airflow #SQL (Structured Query Language) #Data Profiling #dbt (data build tool) #Agile #Databases #Data Engineering #Shell Scripting #Spark (Apache Spark) #Data Quality #Tableau #Documentation #Computer Science #Spark SQL #Scripting #Data Governance #Java #Python #AWS (Amazon Web Services) #Data Pipeline #Data Catalog #Data Lineage #Cloud #Data Lake
Role description
PAY: $70-75/hour W2. Our company offers consultants a suite of benefits after a qualification period, including health, vision, dental, life, and disability insurance.
This is a fully remote role.
W2 candidates only.
Manager Notes
• Preferred top skill set: experience as a data engineer at other tech companies, working with cloud technologies.
• Must-have skills: Python, Spark, SQL, Snowflake, Airflow, dbt, AWS, shell scripting
• Teachable skills: Salesforce Agentforce and Data Cloud
Responsibilities:
• Build and maintain complex data pipelines that integrate data from a wide variety of sources, including APIs, Salesforce, AWS, Data Cloud, and databases.
• Design and develop optimized data models and scalable data structures in Salesforce Data Cloud and Snowflake to support reliable analytics, reporting, and downstream data products.
• Perform comprehensive data profiling to identify data gaps and continuously improve data quality across ingestion and transformation layers.
• Implement robust data transformation logic to convert raw data into golden records and trustworthy analytical datasets.
• Build automated data quality checks and validation rules within pipelines to ensure accuracy, completeness, and consistency of critical datasets.
• Partner with product analysts, product engineering, and other stakeholders to understand instrumentation needs for insights and analytics.
• Collaborate with data architects to define and maintain data governance, naming standards, and data lineage documentation.
Job Requirements:
• BS/MS degree in Computer Science, Engineering, or a related field.
• 4+ years of experience in data engineering or ELT/ETL development using SQL, shell scripting, Python, and Java.
• Strong proficiency in Apache Spark, Python, and advanced SQL for large-scale data transformation and analytics.
• Hands-on experience with Apache Airflow or equivalent orchestration tools for managing data workflows.
• Experience with dbt for building and managing modular, version-controlled, and tested data transformations.
• Proven ability to build, deploy, and optimize data pipelines on AWS infrastructure.
• Experience with data lakehouse architectures and familiarity with modern data mesh principles.
• Strong analytical skills, with the ability to work with complex data models, derive insights, and recommend actionable strategies.
• Familiarity with Salesforce data models and processes is a plus.
• Proficiency in Tableau for developing interactive dashboards and visual storytelling.
• Exposure to data governance, metadata management, and data cataloging tools.
• Ability to work effectively in an agile, distributed, fast-paced environment, with a high degree of self-management, clear communication, and commitment to delivery timelines.
Who We Are:
The Fountain Group is a nationwide staffing firm with over 80 Fortune 100-500 clients. Since 2001, TFG has maintained a consistent standard of excellence, and our work is broadly recognized every year through numerous industry performance awards. Our success is a team effort.
Browse our website below for additional information on our company.
The Fountain Group
3407 W Martin Luther King Jr. Dr. Tampa, FL 33607
"We work in Life Sciences, Clinical, Engineering, IT, and more. Above all, we specialize in people."