

IT America Inc
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead Data Engineer on a 6-month contract with extensions. The pay rate is undisclosed and the role is remote. Key skills required include Python, SQL, AWS, ETL, and Snowflake. Prior Capital One experience is mandatory.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: October 23, 2025
Duration: More than 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: United States
Skills detailed: #GitHub #Splunk #Bash #Airflow #Data Engineering #SQL (Structured Query Language) #Scripting #ETL (Extract, Transform, Load) #Data Pipeline #AWS (Amazon Web Services) #Agile #Kafka (Apache Kafka) #Java #Python #Spark (Apache Spark) #Databricks #Snowflake #Jira #Jenkins #Cloud #Big Data #Lambda (AWS Lambda) #S3 (Amazon Simple Storage Service) #AWS S3 (Amazon Simple Storage Service)
Role description
We have 2 openings for this role and urgently need candidates. Candidates must be former Capital One employees.
Job Title
• Lead Data Engineer
6 months + extensions
Remote
Worksite Location
• 1680 Capital One Dr, McLean, VA 22102, USA, or REMOTE
Top Skills - Must Haves
• Python
• SQL
• AWS
• Data
• ETL
• Integration
• Big data
• Agile
• Snowflake
• Databricks
• EMR
• Glue
• Lambda
• CloudWatch
• S3
Top Skills' Details
Discover's historical credit card data will reside in the Capital One ecosystem, in OneStream's OneLake location. These engineering teams will be responsible for picking the data up, transforming it, and loading it into the target state, which could be Snowflake or AWS S3.
6 Data Engineering teams, each with 2 mid-level Data Engineers (former Capital One employees required)
• Able to work independently
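The pickup → transform → load flow described above can be sketched in plain Python. This is a minimal, hypothetical illustration of the pipeline shape (scripting, not app dev); the record fields, the `run_pipeline` helper, and the in-memory stand-ins for OneLake and the target are assumptions, not actual OneStream/OneLake, Snowflake, or Capital One APIs.

```python
def transform(record: dict) -> dict:
    """Normalize one historical credit-card record into a target-state shape.
    Field names here are illustrative placeholders."""
    return {
        "account_id": record["acct"].strip(),
        "posted_date": record["date"],  # assumed already ISO-8601 in this sketch
        "amount_cents": round(float(record["amount"]) * 100),
    }

def run_pipeline(source_records, load_to_target) -> int:
    """Pick up records, transform each, and hand them to a loader callable
    (in practice: a Snowflake writer or an S3 sink). Returns the row count."""
    loaded = 0
    for record in source_records:
        load_to_target(transform(record))
        loaded += 1
    return loaded

# Usage with in-memory stand-ins for the OneLake source and the target:
onelake_records = [{"acct": " 1001 ", "date": "2024-05-01", "amount": "12.34"}]
target = []
count = run_pipeline(onelake_records, target.append)
```

In the actual role this loop would run as a distributed Spark/Databricks job rather than a single-process script, but the extract-transform-load structure is the same.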
Tech Stack: Must Have / Nice to Have
Must Have
• AWS (EMR/Glue, S3, Lambda, CloudWatch), Python/Java, Spark, Databricks, SQL, Bash
• Python use: scripting, building new data pipelines (not app dev)
• Data warehousing experience (Snowflake)
• APIs
• Agile engineering practices & JIRA usage
Nice to Have
• Kafka, Airflow, open table formats (Delta, Hudi, Iceberg), Splunk, New Relic, GitHub, Jenkins; prior Capital One experience highly preferred
Can be remote or located in Capital One core locations (listed in order of preference)
- Chicago
- McLean
- Richmond
- Plano
Secondary Skills - Nice to Haves
• Kafka
• Airflow