Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a remote Data Engineer contract opportunity with a New York-based team, lasting three to six months. Key skills include strong SQL, ETL/ELT pipeline development, and experience with cloud data platforms (Snowflake preferred).
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
360
πŸ—“οΈ - Date discovered
September 25, 2025
πŸ•’ - Project duration
3 to 6 months
🏝️ - Location type
Remote
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
United States
🧠 - Skills detailed
#Security #NoSQL #MySQL #Data Engineering #AWS (Amazon Web Services) #Data Pipeline #Data Warehouse #Data Analysis #Data Modeling #ETL (Extract, Transform, Load) #Bash #Azure #Data Processing #Hadoop #Snowflake #Big Data #Data Framework #Kafka (Apache Kafka) #BI (Business Intelligence) #Data Quality #Automation #Spark (Apache Spark) #Cloud #PostgreSQL #Scripting #Airflow #Databases #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Python
Role description
We're looking for a skilled Data Engineer to join a growing team based in New York on a remote basis. This is a contract opportunity where you'll design, build, and maintain robust data pipelines and analytics solutions.

What You'll Do:
• Develop, optimize, and maintain ETL/ELT pipelines for large-scale data processing.
• Work with relational and NoSQL databases, data warehouses, and cloud data platforms (Snowflake preferred).
• Collaborate with data analysts, engineers, and business stakeholders to deliver actionable insights.
• Implement best practices for data quality, security, and governance.
• Automate and streamline data workflows using Python, SQL, or similar tools (see the sketch after this description).

Must-Have Skills & Experience:
• Strong SQL and database experience (PostgreSQL, MySQL, or similar).
• ETL/ELT pipeline development experience.
• Experience with cloud data platforms (Snowflake preferred).
• Scripting and automation skills (Python, Bash, or similar).
• Understanding of data modeling, warehousing, and BI tools.

Nice-to-Have:
• Experience with big data frameworks (Spark, Hadoop, or Kafka).
• Familiarity with workflow orchestration tools (Airflow preferred).
• Cloud certifications (AWS, Azure, GCP) a plus.

This is a contract role expected to last up to six months. If you're a hands-on Data Engineer looking for your next challenge and open to remote work, get in touch now.
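To give a concrete flavour of the day-to-day work, here is a minimal sketch of an ETL pipeline in Python using Airflow (listed above as a preferred orchestration tool). This is illustrative only, not an actual pipeline from the role: the DAG name, task functions, and schedule are all hypothetical placeholders.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Hypothetical step: pull the previous day's records from a source system.
    ...


def transform_orders(**context):
    # Hypothetical step: clean, deduplicate, and apply data-quality checks.
    ...


def load_to_warehouse(**context):
    # Hypothetical step: bulk-load the transformed data into a warehouse
    # staging table (e.g. Snowflake).
    ...


with DAG(
    dag_id="orders_daily_etl",        # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform", python_callable=transform_orders)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    # Linear extract -> transform -> load dependency chain.
    extract >> transform >> load

In practice the same extract/transform/load structure applies whether the orchestrator is Airflow or something else; the role's must-haves (SQL, Python scripting, warehouse experience) map directly onto the three steps above.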