

Connect Tech+Talent
Compression Expert
Featured Role | Apply direct with Data Freelance Hub
This is a remote contract Compression Expert position requiring 7+ years of experience. Key skills include lossless compression algorithms, C/C++ mastery, AI-based compression, and expertise in cloud environments. Familiarity with statistics and data pipeline integration is essential.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
March 17, 2026
Duration
Unknown
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#Cloud #Cybersecurity #Security #C++ #AI (Artificial Intelligence) #Data Pipeline #Big Data #Statistics #Neural Networks #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Data Management
Role description
Compression Expert
Remote
Contract
Years Of Experience: 7+
• Subject matter expertise including, but not limited to: current lossless compression algorithms (Huffman coding, etc.); how compression is implemented and optimized across the data supply chain in multiple high-volume settings (cloud, data center, etc.); data/cybersecurity; and information theory, with deep knowledge of entropy and the theoretical limits of compression (Shannon's source coding theorem)
β’ Expert in developing and optimizing lossless algorithms
• Mastery of C or C++ for memory management and performance-critical implementations, including delta-compression data management techniques
β’ Expertise in Transform Coding, quantization, and psychovisual/psychoacoustic modeling for media-specific compression
β’ Strong foundations in Statistics, probability distributions, and Linear Algebra for modeling complex data
β’ Proficient in AI-Based Compression in high-volume environments using Neural Networks (e.g., Autoregressive Models) to model data patterns for higher compression ratios
• Understanding of SIMD (Single Instruction, Multiple Data) and GPU acceleration to parallelize encoding and decoding tasks
β’ Performance Benchmarking: Ability to analyze trade-offs between compression ratio, memory usage, and latency
β’ Critical thinking to identify redundancies in specialized data types and question theoretical standards
β’ Understanding of advanced techniques like Bits-Back ANS that allow for lossless compression using latent variable models
• Experience designing system integrations that define how neural compression components fit into existing big data pipelines and cloud environments such as AWS or Google Cloud
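As an illustrative sketch (not part of the role description), the relationship the first requirement references — Huffman coding against the entropy bound from Shannon's source coding theorem — can be demonstrated in a few lines of Python. The average Huffman code length for a symbol distribution always falls within one bit of the distribution's entropy:

```python
from collections import Counter
import heapq
import math

def entropy_bits_per_symbol(data: bytes) -> float:
    """Shannon entropy: the theoretical lower bound (bits/symbol)
    for any lossless code over this symbol distribution."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def huffman_code_lengths(data: bytes) -> dict:
    """Compute Huffman code lengths for each byte in the input."""
    counts = Counter(data)
    if len(counts) == 1:  # degenerate case: one distinct symbol
        return {next(iter(counts)): 1}
    # Heap entries: (frequency, tiebreaker, {symbol: code_length}).
    heap = [(c, i, {sym: 0}) for i, (sym, c) in enumerate(counts.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)
        f2, _, b = heapq.heappop(heap)
        # Merging two subtrees deepens every leaf in both by one bit.
        merged = {s: depth + 1 for s, depth in {**a, **b}.items()}
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

data = b"abracadabra abracadabra"
H = entropy_bits_per_symbol(data)
lengths = huffman_code_lengths(data)
counts = Counter(data)
avg = sum(counts[s] * l for s, l in lengths.items()) / len(data)
# Source coding theorem: H <= avg < H + 1 for an optimal prefix code.
print(f"entropy = {H:.3f} bits/symbol, Huffman average = {avg:.3f}")
```

The sample input and variable names here are arbitrary; the point is only that the measured average code length sits between the entropy and entropy plus one bit, which is the bound the theorem guarantees for any optimal prefix code.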




