

Analytics Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for an Analytics Data Engineer based in Scottsdale, AZ or Chicago, IL, on a contract of unspecified duration. The pay rate is competitive. Required skills include Python, Hadoop, Hive SQL, and SAS-to-Python conversion expertise. Fin-tech experience is preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 13, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Chicago, IL
Skills detailed: #Monitoring #Storage #SAS #Data Processing #ML (Machine Learning) #Documentation #Hadoop #Java #Data Science #Python #Data Engineering #Indexing #AWS (Amazon Web Services) #Databases #Observability #Data Profiling #Spark (Apache Spark) #Scala #SQL (Structured Query Language) #Data Ingestion #S3 (Amazon Simple Storage Service)
Role description
Role: Analytics Data Engineer
Locations: Scottsdale, AZ or Chicago, IL (parking and commuter passes are not covered)
Must have a valid GC or be a US Citizen
MUST HAVE PYTHON and HADOOP
1. Help convert legacy code-based assets to modern, high-performance tools (SAS to Python)
2. Existing data processing scripts, including data movement, cleaning, and aggregation
3. Value Testing Process: scores a potential customer's data through our models to help determine the value of client solutions
4. SQL/Hive query performance tuning and enhancement
5. Develop a shared toolkit to automate certain data science processes
6. Data profiling
7. Feature importance and effectiveness evaluation
8. Automate documentation of model development processes
9. Assist in upskilling the existing team
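As a rough illustration of the SAS-to-Python conversion work described above, a simple SAS PROC MEANS-style aggregation often maps onto a pandas groupby. This is only a hedged sketch; the dataset and column names here are hypothetical, not taken from the posting:

```python
import pandas as pd

# Hypothetical input resembling a SAS dataset: one row per transaction.
df = pd.DataFrame({
    "region": ["East", "East", "West", "West"],
    "amount": [100.0, 150.0, 200.0, 50.0],
})

# Rough SAS equivalent:
#   PROC MEANS DATA=txns MEAN SUM;
#     CLASS region;
#     VAR amount;
#   RUN;
summary = (
    df.groupby("region", as_index=False)["amount"]
      .agg(mean_amount="mean", total_amount="sum")
)
print(summary)
```

Real conversions would also need to handle SAS-specific behavior (missing-value semantics, BY-group processing, formats), which has no one-line pandas equivalent.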
Overall Purpose
Technical expert responsible for building and modernizing Data Science Tools and Data Assets.
Essential Functions
• Write data infrastructure and analytics software tools for Data Science.
• Guide Data Scientists on best practices in software engineering.
• Modernize the existing SAS and Hive SQL codebase into higher-performance structures that leverage parallel computing capabilities.
• Instrument data feeds for observability, profiling, and monitoring.
• Develop shared resources to support Data Science work, such as automated profiling and analysis tools.
• Write detailed, complete, and enduring documentation to enable long-term support.
• Support the company's commitment to risk management and protect the integrity and confidentiality of systems and data.
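The data-feed profiling and monitoring functions listed above could start from something as small as a per-column profile report. A minimal sketch in pandas, assuming hypothetical feed and field names not taken from the posting:

```python
import pandas as pd

def profile_feed(df: pd.DataFrame) -> pd.DataFrame:
    """Basic per-column profile for monitoring a data feed:
    dtype, null rate, and distinct count. A sketch only."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_rate": df.isna().mean(),
        "n_distinct": df.nunique(),  # nunique() excludes nulls
    })

# Hypothetical incoming feed with some missing values.
feed = pd.DataFrame({
    "account_id": [1, 2, 2, None],
    "balance": [10.0, 20.0, None, 40.0],
})
report = profile_feed(feed)
print(report)
```

In practice such a profile would be scheduled per ingestion run and compared against prior runs to alert on drift, which is one way to read the observability requirement.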
Minimum Qualifications
• Education and/or experience typically obtained through a Bachelor's degree
• Hive SQL
• Spark
• Software/data engineering in Python
• Machine Learning toolkits in Python/Spark
• Building reusable frameworks for data ingestion and storage in Hadoop or AWS/S3
• Relational databases: optimizing/tuning queries, indexing, partitioning, etc.
• SAS, with the ability to convert SAS code to Python
• Strong communication skills
• Must pass a background check and drug screen
Preferred Qualifications
• 2-4 years of prior demonstrated experience
• Scala/Java experience
• Fin-tech experience
Thanks and Regards,
Jeet Kumar Thapa
Technical Recruiter
Oreva Technologies Inc.
P: 972-996-6477 Ext: 323
E: jeet.t@orevatech.com
L: https://www.linkedin.com/in/jeet-kumar-thapa-816873155/
A: 1320 Greenway Drive, Suite 460, Irving, TX 75038
W: https://orevatech.com/