TechTriad

Data Engineer - Hybrid in NY

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in New York, hybrid with 3 days onsite weekly. Contract length is unspecified; pay is competitive. Requires 5+ years in data engineering, proficiency in Python, SQL, AWS, and Terraform, and experience with machine learning and data visualization tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
October 23, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid (3 days onsite per week)
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
New York City Metropolitan Area
-
🧠 - Skills detailed
#Deployment #ETL (Extract, Transform, Load) #Version Control #Data Management #AWS Glue #Pandas #GIT #Mathematics #Data Pipeline #AI (Artificial Intelligence) #Programming #Libraries #Linux #Jupyter #Tableau #Matplotlib #Computer Science #Kafka (Apache Kafka) #Terraform #Data Engineering #ML (Machine Learning) #Cloud #Python #Infrastructure as Code (IaC) #React #Scala #TypeScript #Visualization #Java #SQL (Structured Query Language) #DevOps #BI (Business Intelligence) #AWS (Amazon Web Services) #SageMaker
Role description
USC or GC only; locals preferred. 3 days onsite per week.

Summary: You will work with a variety of data sets to design, prototype, and deploy analytical models, applications, and data visualizations for both internal users and customers. You will leverage your data engineering skills to transform raw data into analytical and statistical models, deriving meaningful insights and presenting business recommendations based on your results. You will collaborate with cross-functional teams, such as product, technology, operations, and risk, to streamline data pipelines and to create automated solutions. You will use innovative technologies, including open-source software, AI and machine-learning libraries, and cloud-native analytics, to build scalable data platforms that automate and enhance the delivery of our new payment services.

Qualifications:
• Bachelor's degree or the equivalent in a scientific or quantitative discipline, such as Mathematics, Engineering, Computer Science, Economics, or Physical Sciences
• 5+ years of experience or equivalent in the data engineering/science field as a programmer, analyst, engineer, or scientist
• Hands-on development experience using programming languages, ideally Python, with statistical and data visualization libraries, such as pandas and matplotlib
• Experience with data structures, algorithms, and database query languages, such as SQL
• Familiarity with git code repositories and interactive editors, such as Jupyter notebooks, Sublime, or VS Code
• Familiarity with cloud services for data management and analysis, such as AWS Glue
• Experience managing a variety of large-scale data sets on Linux, Windows, or macOS

Required Qualifications:
• Infrastructure as code using Terraform: building and managing cloud infrastructure (primarily AWS), writing reusable Terraform modules, and collaborating with DevOps and development teams to support scalable, secure, and automated deployments
• Understanding of cloud services, version control (Git), and CI/CD practices is essential
• Any graduate-level coursework or degree in a scientific or quantitative discipline, with original research work using statistical methods, analytic models, or quantitative techniques
• Experience with machine learning concepts, algorithms, and tools, such as Amazon SageMaker
• Experience building data visualizations with business intelligence tools, such as AWS QuickSight or Tableau
• Experience developing web applications with Java, TypeScript, and/or front-end UI frameworks, such as React, ideally leveraging real-time streaming events from Kafka or Amazon Kinesis
• An enthusiasm to learn and to teach others