

SMX Services & Consulting, Inc.
Data Analyst Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Analyst Developer on a contract of unspecified length with an undisclosed pay rate. It requires 3-5 years of experience in ETL pipeline development and complex SQL, along with strong expertise in Oracle PL/SQL and Azure Data Lake.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 17, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Hollywood, FL
🧠 - Skills detailed
#RDBMS (Relational Database Management System) #Data Management #Data Cleansing #SAP BODS (BusinessObjects Data Services) #Data Engineering #Azure #Synapse #Database Schema #Computer Science #Data Architecture #Data Analysis #Data Design #Oracle #Data Profiling #Data Warehouse #SAP #Data Lake #ETL (Extract, Transform, Load)
Role description
What the role will do:
• Conduct analysis of existing data structures together with the data architect.
• Perform low-level design of the data feeds into the Oracle database schemas and the Data Lake in Azure Synapse.
• Participate in data analysis and data design meetings: clearly articulate ideas, gather detailed business needs and expected outcomes, share best practices, and ensure the database design is reviewed for structural integrity.
• Perform data profiling and data analysis to build an understanding of the data (see the sketch after this list).
• Migrate large data sets to new data structures.
• Clean, merge, manage, and ensure the quality of data sets.
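For a flavor of the data-profiling work above, here is a minimal sketch in Python, assuming the python-oracledb driver; the connection details, the CUSTOMERS table, and its column names are hypothetical placeholders, not details from this posting.

```python
# A minimal column-profiling sketch, assuming the python-oracledb driver.
# The connection details, table, and columns below are hypothetical placeholders.
import oracledb

# Identifiers cannot be bind variables, so they are interpolated here;
# only use trusted, hard-coded names with this pattern.
PROFILE_SQL = """
    SELECT COUNT(*)                                        AS row_count,
           COUNT(DISTINCT {col})                           AS distinct_vals,
           SUM(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END)  AS null_count
    FROM {table}"""

def profile_column(conn, table, col):
    """Return row count, cardinality, and null count for one column."""
    with conn.cursor() as cur:
        cur.execute(PROFILE_SQL.format(table=table, col=col))
        rows, distinct, nulls = cur.fetchone()
    return {"column": col, "rows": rows, "distinct": distinct, "nulls": nulls}

if __name__ == "__main__":
    conn = oracledb.connect(user="analyst", password="***", dsn="dbhost/ORCLPDB1")
    for col in ("CUSTOMER_ID", "EMAIL", "SIGNUP_DATE"):
        print(profile_column(conn, "CUSTOMERS", col))
    conn.close()
```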
Qualifications:
• Bachelor's Degree in Information Technology, Computer Science, Data Analytics, Business Administration-Information Processing, or a related discipline - Preferred
• 3-5 years building pipelines that feed a data warehouse or other data stores using ETL tools or pipelines - Required
• 3-5 years writing complex SQL for data analysis, data cleansing, and segmentation (a cleansing sketch follows this list) - Required
• 5-6 years in systems and data impact analysis, logical and physical database design, building ERDs, and conducting database design review sessions, with sound knowledge of RDBMS, data engineering principles, and the tenets of data management - Required
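For a flavor of the SQL-based cleansing and merging named above, a minimal sketch, again assuming python-oracledb; the STG_CUSTOMERS staging table and DW_CUSTOMERS warehouse table are hypothetical.

```python
# A minimal cleanse-and-merge sketch, assuming python-oracledb; STG_CUSTOMERS
# and DW_CUSTOMERS are hypothetical staging and warehouse tables.
import oracledb

CLEANSE_MERGE_SQL = """
    MERGE INTO DW_CUSTOMERS tgt
    USING (
        -- Normalize emails and collapse duplicates to one row per customer.
        SELECT CUSTOMER_ID,
               MAX(TRIM(UPPER(EMAIL))) AS EMAIL,
               MAX(SIGNUP_DATE)        AS SIGNUP_DATE
        FROM   STG_CUSTOMERS
        WHERE  CUSTOMER_ID IS NOT NULL
        GROUP  BY CUSTOMER_ID
    ) src
    ON (tgt.CUSTOMER_ID = src.CUSTOMER_ID)
    WHEN MATCHED THEN UPDATE
        SET tgt.EMAIL = src.EMAIL, tgt.SIGNUP_DATE = src.SIGNUP_DATE
    WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, EMAIL, SIGNUP_DATE)
        VALUES (src.CUSTOMER_ID, src.EMAIL, src.SIGNUP_DATE)"""

def cleanse_and_merge(conn):
    """Run the cleanse-and-merge statement and return rows affected."""
    with conn.cursor() as cur:
        cur.execute(CLEANSE_MERGE_SQL)
        affected = cur.rowcount
    conn.commit()
    return affected
```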
Technical Skills: Strong hands-on experience with Oracle PL/SQL, relational database principles and design, building Azure pipelines to feed a data warehouse or data lake, and building ETL scripts with any standard ETL tool (SAP BODS preferred), plus the ability to develop scripts to schedule ETL jobs (one possible approach is sketched below).
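One common way to schedule ETL jobs on an Oracle backend is DBMS_SCHEDULER; below is a minimal sketch driven from Python. The job name, the ETL_PKG.LOAD_WAREHOUSE procedure, and the connection details are hypothetical placeholders, not requirements from this posting.

```python
# A minimal job-scheduling sketch using Oracle's DBMS_SCHEDULER from Python,
# assuming python-oracledb; the job name, the ETL_PKG.LOAD_WAREHOUSE procedure,
# and the connection details are hypothetical placeholders.
import oracledb

CREATE_JOB_PLSQL = """
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_ETL_LOAD',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'ETL_PKG.LOAD_WAREHOUSE',   -- hypothetical PL/SQL procedure
    repeat_interval => 'FREQ=DAILY; BYHOUR=2',     -- run every night at 02:00
    enabled         => TRUE);
END;"""

def schedule_nightly_etl(conn):
    """Register the nightly warehouse-load job on the database scheduler."""
    with conn.cursor() as cur:
        cur.execute(CREATE_JOB_PLSQL)

if __name__ == "__main__":
    conn = oracledb.connect(user="etl_admin", password="***", dsn="dbhost/ORCLPDB1")
    schedule_nightly_etl(conn)
    conn.close()
```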