

Data Engineer (Local to NY or NJ; Recent Financial Domain Experience Mandatory)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 10+ years of experience, located in New York, NY (Hybrid). Requires recent financial domain experience, strong Python skills, SQL proficiency, and familiarity with databases like DB2 and Snowflake. Contract position.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 17, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: New Haven, CT
Skills detailed: #Databases #NumPy #Jira #Unix #Data Processing #Deployment #Time Series #Greenplum #Kafka (Apache Kafka) #Artifactory #Debugging #Anomaly Detection #Snowflake #Logging #Pandas #SQL (Structured Query Language) #Python #Pytest #Data Quality #PostgreSQL #JSON (JavaScript Object Notation) #Clustering #GIT #Monitoring #Programming #Data Engineering #ML (Machine Learning) #SQL Queries
Role description
Experience: 10+ years
Job Description:
Position Details
Title: Data Engineer
Location: New York, NY (Hybrid). Candidates must be local to NY or NJ; recent financial domain experience is mandatory.
Position Type: Contract
Skills Required:
- Strong proficiency in Python with experience developing production-grade data processing pipelines.
- In-depth knowledge of database concepts, SQL queries, and stored procedures.
- Familiarity with databases such as DB2, Greenplum, Snowflake, PostgreSQL, KDB, and SingleStore.
- Expertise in Artifactory, Pandas, NumPy and object-oriented programming in Python.
- Strong data analytics skills and the ability to quickly spot data issues.
- Working knowledge of Unix and handling files in various formats like CSV and JSON.
- Experience with handling messages on Kafka.
- Knowledge of Git repositories, Jira tracking, and job scheduling tools like Autosys.
- Proficiency in writing unit tests (e.g., using pytest).
- Self-starter with the ability to work in a fast-paced environment and manage multiple projects.
- Finance data domain knowledge is preferred.
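As a hypothetical illustration of the pipeline and unit-testing skills listed above (not part of the posting; it uses only the Python standard library, and field names such as `symbol` and `qty` are made up): a small CSV-to-JSON processing step with a pytest-style test.

```python
import csv
import io
import json

def load_trades(csv_text):
    """Parse a CSV of trade rows, rejecting non-positive quantities."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    bad = [r for r in rows if int(r["qty"]) <= 0]
    if bad:
        raise ValueError(f"{len(bad)} row(s) with non-positive qty")
    return rows

def to_json(rows):
    """Serialize the parsed rows as a JSON array string."""
    return json.dumps(rows)

# pytest-style unit test: discovered and run with `pytest`
def test_load_trades_rejects_bad_qty():
    try:
        load_trades("symbol,qty\nIBM,100\nMSFT,-5\n")
        assert False, "expected ValueError"
    except ValueError:
        pass
```

The same shape scales to production pipelines: a validated load step, a serialization step, and a test per failure mode.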
Good to Have:
- Experience working on data-quality-related infrastructure
- Knowledge of anomaly detection algorithms and techniques, particularly isolation forest, clustering, time series analysis, and pattern mining
- Understanding of model performance monitoring, model debugging, and logging systems
- Some experience with containerization and deployment of ML services in enterprise environments
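The anomaly-detection bullet above can be sketched with a deliberately simple stand-in: the posting names isolation forests, but a plain z-score test on a time series shows the same flag-the-outliers idea with no third-party dependencies (an illustrative sketch, not the team's actual method).

```python
import statistics

def flag_anomalies(series, z_thresh=3.0):
    """Return the indices of values more than z_thresh population
    standard deviations from the series mean (simple z-score test)."""
    mean = statistics.fmean(series)
    stdev = statistics.pstdev(series)
    if stdev == 0:
        return []  # constant series: nothing stands out
    return [i for i, x in enumerate(series) if abs(x - mean) / stdev > z_thresh]
```

For example, `flag_anomalies([10.0] * 20 + [100.0])` flags only the final spike at index 20.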