Convergenz

Data Engineer (W2 ONLY)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (W2 ONLY) with an unknown contract length and a day rate of $640. Key skills include AWS, Snowflake, Delta Lake, and data pipeline development. A Bachelor's degree and 5 years of relevant experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
October 31, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Arlington, VA
-
🧠 - Skills detailed
#Data Ingestion #Computer Science #AWS (Amazon Web Services) #Airflow #Python #Data Engineering #Forecasting #Lambda (AWS Lambda) #Snowflake #Libraries #Data Science #ML (Machine Learning) #Visualization #Data Pipeline #BI (Business Intelligence) #Cloud #Data Lake #Data Processing #ODBC (Open Database Connectivity) #API (Application Programming Interface) #Delta Lake #S3 (Amazon Simple Storage Service) #Data Warehouse #Quality Assurance #ETL (Extract, Transform, Load) #Microsoft Power BI #Data Governance
Role description
We are building the industry's most advanced Data Analytics capability. Join a green-field opportunity and take responsibility for developing cloud-native Data Science and Business Intelligence capabilities.

The role includes:
• Managing and expanding our Data Warehouse solution, which leverages Snowflake, Dagster, Arrow-based streaming, and Delta Lake (delta-rs); a minimal sketch of this stack appears after the description
• Ensuring resilient data pipelines supporting API, SFTP, database, and streaming sources
• Supporting Business Intelligence solutions built on Snowflake, Power BI, and AWS technologies such as S3 and Lambda for enterprise clients (see the second sketch below)
• Working closely with our Data Science team to implement machine learning, forecasting, and simulation models
• Working closely with senior management to develop metrics, reporting, and analysis solutions that deliver data-driven insights
• Implementing Data Governance best practices
• Implementing automated quality assurance best practices

Qualifications required:
• A Bachelor's degree in Computer Science or another technical field, plus 5 years of experience with AWS and data pipeline development
• Experience with Data Lake (delta-rs), Dagster/PyArrow, Polars, and ECS
• Strong knowledge of data warehousing methodologies and data modelling concepts

The following technical experience is a strong plus:
• Dagster (or Airflow) for orchestration and ETL solutions
• Data ingestion with Arrow libraries such as ADBC, Arrow-ODBC, and PyArrow
• Data processing with Polars data frames
• Scaling with ECS
• Data persistence with Delta Lake (delta-rs)
• Experience with AWS cloud-based data technologies for developing data pipelines, and coding proficiency in Python
• Experience with data visualization tools such as QuickSight or Power BI
• The ability to explain complex technical material to non-technical audiences
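As an illustration of how the core stack named above typically fits together, here is a minimal sketch of a single pipeline stage: a Dagster asset that ingests rows over ODBC as Arrow record batches, transforms them with Polars, and appends the result to a Delta Lake table via delta-rs. The asset name, query, connection string, and S3 path are hypothetical placeholders, not details from this posting.

```python
import polars as pl
import pyarrow as pa
from arrow_odbc import read_arrow_batches_from_odbc
from dagster import asset
from deltalake import write_deltalake


@asset
def daily_orders() -> None:
    # Stream the source result set as Arrow record batches over ODBC.
    reader = read_arrow_batches_from_odbc(
        query="SELECT order_id, amount, ordered_at FROM orders",
        connection_string="Driver={PostgreSQL};Server=db.example.com;Database=sales",  # placeholder
        batch_size=65_536,
    )
    batches = list(reader)
    if not batches:
        return  # nothing new to load

    # Assemble the batches into a Polars DataFrame via Arrow.
    df = pl.from_arrow(pa.Table.from_batches(batches))

    # A light transform: drop non-positive amounts, derive a date column.
    df = df.filter(pl.col("amount") > 0).with_columns(
        pl.col("ordered_at").dt.date().alias("order_date")
    )

    # Append to a Delta table on S3 via delta-rs (bucket name is hypothetical).
    write_deltalake(
        "s3://example-warehouse/delta/daily_orders",
        df.to_arrow(),
        mode="append",
    )
```

In practice the schedule, retries, and credentials would come from Dagster resources and the runtime environment; the sketch only shows the data path.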
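The S3-and-Lambda side of the Business Intelligence stack often amounts to small event-driven transforms. The sketch below, again with a hypothetical bucket layout and key names, shows a Lambda handler subscribed to S3 ObjectCreated events that rewrites a newly landed CSV as Parquet for downstream querying.

```python
import io

import boto3
import polars as pl

s3 = boto3.client("s3")


def handler(event, context):
    # One invocation may carry several S3 records; process each new object.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Read the raw CSV into a Polars DataFrame.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        df = pl.read_csv(io.BytesIO(body))

        # Write it back as Parquet under a separate prefix (hypothetical layout).
        buf = io.BytesIO()
        df.write_parquet(buf)
        s3.put_object(
            Bucket=bucket,
            Key=key.replace("landing/", "curated/").replace(".csv", ".parquet"),
            Body=buf.getvalue(),
        )
```

Note that real S3 event payloads URL-encode object keys, and a production handler would decode them and cope with malformed files; those details are omitted to keep the sketch short.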