

TechnoSphere, Inc.
Data Developer (Data Engineer)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Developer (Data Engineer); the contract length and pay rate are unspecified. Key skills include Python programming, ETL pipeline development, real-time data integration, and financial markets expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 6, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Docker #Python #Monitoring #Scala #Cloud #Kafka (Apache Kafka) #Kubernetes #Azure #GCP (Google Cloud Platform) #Data Engineering #ETL (Extract, Transform, Load) #Data Modeling #Data Pipeline #Pandas #Deployment #Spark (Apache Spark) #Version Control #GIT #PySpark #NumPy #DevOps #AWS (Amazon Web Services) #Schema Design #Data Processing #Automation #Programming #Libraries
Role description
Specific skills required:
Python Programming:
Strong proficiency in Python for data processing, analysis, and automation.
Experience with libraries such as Pandas, NumPy, and PySpark.
Ability to write efficient, maintainable, and scalable code for data pipelines and analytics.
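As an illustration of the kind of Pandas-based processing this role calls for, the sketch below builds a small trade table and aggregates notional value by symbol. The data and column names are hypothetical, not taken from the posting.

```python
import pandas as pd

# Hypothetical trade records, standing in for data an ETL step might extract.
trades = pd.DataFrame({
    "symbol": ["AAPL", "AAPL", "MSFT"],
    "price": [189.5, 190.0, 410.2],
    "qty": [100, 50, 200],
})

# Transform: compute notional value per trade, then aggregate per symbol.
trades["notional"] = trades["price"] * trades["qty"]
summary = trades.groupby("symbol", as_index=False)["notional"].sum()
```

The same groupby-aggregate pattern scales to PySpark with a near-identical API when the data outgrows a single machine.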
Data Engineering & Development:
Experience designing and building robust ETL/ELT pipelines.
Familiarity with data modeling, data warehousing concepts, and schema design.
Proficiency in working with structured and unstructured data.
Experience with version control systems (e.g., Git) and CI/CD pipelines.
Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and containerization (e.g., Docker, Kubernetes) is a plus.
Real-Time Data Feeds:
Experience integrating data platforms with real-time messaging systems (Solace/ION/Kafka).
Deep understanding of real-time market data feeds and data processing techniques.
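A minimal sketch of the per-message processing a real-time feed handler performs: decoding a tick and deriving a mid price. The JSON schema (symbol/bid/ask) is an assumption for illustration; actual Solace, ION, or Kafka feeds carry venue-specific formats, and the transport layer is omitted here.

```python
import json

def parse_tick(raw: bytes) -> dict:
    """Decode one JSON-encoded market-data tick and compute the mid price.

    Assumed message shape: {"symbol": str, "bid": float, "ask": float}.
    """
    msg = json.loads(raw)
    msg["mid"] = (msg["bid"] + msg["ask"]) / 2
    return msg

# Example message, as it might arrive from a Kafka topic's consumer poll.
tick = parse_tick(b'{"symbol": "EURUSD", "bid": 1.0850, "ask": 1.0852}')
```

In production this function would sit inside a consumer loop, with the derived fields written onward to a downstream topic or store.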
Financial Markets:
Strong domain knowledge in Fixed Income, FX, and Equities trading.
DevOps:
Implement DevOps practices to automate the build, test, and deployment of data solutions.
Monitoring Tools:
Experience using tools like ITRS to monitor performance and health of data systems.






