

RV Global Soft Co., Ltd.
Senior Data Engineers/ Architects - Remote
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer/Architect on a contract basis, offering $60.00 - $65.00 per hour, with a focus on cloud platforms, big data technologies, and ETL processes. The position is fully remote and requires strong programming and database design skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
November 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Remote
-
🧠 - Skills detailed
#Agile #Azure #Hadoop #Programming #Security #Informatica #Looker #SQL (Structured Query Language) #Java #AWS (Amazon Web Services) #SQL Server #Data Lake #Python #ETL (Extract, Transform, Load) #Cloud #Compliance #Big Data #Talend #Database Design #API (Application Programming Interface) #Scala #Storage #Data Framework #Data Security #Oracle #Unix #Data Architecture #Bash #Spark (Apache Spark) #Data Engineering #Data Warehouse #BI (Business Intelligence) #Shell Scripting #Data Pipeline #VBA (Visual Basic for Applications) #Apache Hive #Databases #Datasets #Visualization #Scripting
Role description
Overview
We are seeking experienced Senior Data Engineers and Architects to join our dynamic team remotely. This role involves designing, developing, and maintaining scalable data solutions that support our organization's analytics and business intelligence initiatives. The ideal candidate will have a strong background in cloud platforms, big data technologies, and data warehousing, with the ability to lead complex data projects in an agile environment. This position offers the opportunity to work on cutting-edge data architecture and contribute to strategic decision-making through advanced analytics.
Duties
Design, develop, and optimize large-scale data pipelines using ETL tools such as Informatica and Talend, along with custom Python, Bash, or shell scripting.
Architect and implement data models for data warehouses utilizing SQL Server, Oracle, and Azure Data Lake to ensure efficient storage and retrieval.
Build and maintain robust data architectures leveraging AWS, Azure Data Lake, Hadoop, Spark, Apache Hive, and related big data technologies.
Develop RESTful APIs for seamless integration of data services across platforms.
Collaborate with cross-functional teams to understand data requirements and translate them into scalable solutions.
Lead efforts in database design, schema development, and performance tuning for optimal analytics performance.
Support model training and analysis activities by providing clean, well-structured datasets.
Ensure compliance with best practices in data security, governance, and quality standards within an agile development environment.
Stay current with emerging technologies such as Linked Data and advanced analytics tools to continuously improve data infrastructure.
Skills
Extensive experience with cloud platforms including AWS and Azure Data Lake.
Strong proficiency in programming languages such as Java, Python, VBA, Bash (Unix shell), and Shell Scripting.
Deep understanding of big data frameworks like Hadoop, Spark, Apache Hive, and related ecosystems.
Proven expertise in ETL processes using Informatica, Talend, or similar tools.
Solid knowledge of SQL Server, Oracle databases, Data Warehouse design principles, and database modeling.
Familiarity with Looker for business intelligence visualization.
Experience with RESTful API development for system integrations.
Strong data analysis skills, including model training and advanced analytics techniques.
Knowledge of database design principles combined with strong problem-solving capabilities in a fast-paced environment.
Agile methodology experience to manage project workflows efficiently.
This role is ideal for seasoned professionals passionate about building innovative data solutions that drive strategic insights across organizations while working remotely in a flexible environment.
Job Type: Contract
Pay: $60.00 - $65.00 per hour
Expected hours: 40 per week
Work Location: Remote





