The Naked Scientists
Huffman coding being replaced with ANS - ongoing revolution in data compression?


Offline Jarek Duda (OP)

  • Sr. Member
  • http://th.if.uj.edu.pl/~dudaj/
« on: 22/01/2017 07:12:46 »
Besides the imaginary data-compression revolution in a certain HBO TV series, it turns out a lot has actually changed in the real world in recent years - much better compressors that you can simply download and use.

One reason is more efficient entropy coding.
Huffman coding (e.g. A -> 010) is fast, but it approximates symbol probabilities with powers of 1/2 and so spends at least one whole bit per symbol, while e.g. a symbol of probability 0.99 carries only ~0.0145 bits of information. Arithmetic coding repaired this inaccuracy, but at the price of being an order of magnitude slower (more costly).
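The gap is easy to check numerically: a symbol of probability p carries -log2(p) bits of information, while a Huffman code cannot assign it a codeword shorter than one bit. A quick sketch (illustrative numbers only):

```python
import math

def info_bits(p):
    """Shannon information content of a symbol of probability p, in bits."""
    return -math.log2(p)

p = 0.99
print(f"optimal cost: {info_bits(p):.4f} bits per symbol")  # ~0.0145 bits
print("Huffman cost: 1 bit per symbol at minimum")
print(f"overhead:     ~{1 / info_bits(p):.0f}x")
```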
This compromise has recently been ended by ANS coding, which is both fast (cheap) and accurate:
wiki: https://en.wikipedia.org/wiki/Asymmetric_Numeral_Systems
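To give a flavour of how it works, here is a toy rANS coder - a sketch for illustration only (the symbol names and frequencies are made up, and a Python big integer stands in for the renormalized machine-word state of real coders). The message is folded, symbol by symbol, into a single integer x, and decoding peels symbols back off in reverse order:

```python
def rans_encode(symbols, freq, M, x=1):
    """Fold symbols into one big integer state x (toy rANS, no renormalization).
    freq maps symbol -> integer frequency; frequencies must sum to M."""
    cum, c = {}, 0
    for s in freq:                # cumulative frequency table
        cum[s] = c
        c += freq[s]
    for s in symbols:
        x = (x // freq[s]) * M + cum[s] + (x % freq[s])
    return x

def rans_decode(x, freq, M, n):
    """Recover n symbols from state x; rANS is last-in-first-out."""
    cum, c = {}, 0
    for s in freq:
        cum[s] = c
        c += freq[s]
    out = []
    for _ in range(n):
        slot = x % M              # which sub-range of [0, M) the state falls in
        s = next(t for t in freq if cum[t] <= slot < cum[t] + freq[t])
        x = freq[s] * (x // M) + slot - cum[s]   # exact inverse of the encode step
        out.append(s)
    return out[::-1]              # undo the LIFO order

freq = {'A': 3, 'B': 1}           # P(A) = 3/4, P(B) = 1/4, M = 4
msg = "ABAAABAA"
x = rans_encode(msg, freq, 4)
assert rans_decode(x, freq, 4, len(msg)) == list(msg)
```

The compressed size is x.bit_length() bits, which stays within a few bits of the Shannon limit Σ -log2(p_s); production coders keep x bounded in a machine word and stream bits out during renormalization.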

In 2013, arithmetic coding decoded at ~50 MB/s per core; the analogous task is now handled by ANS at ~1500 MB/s on the same processor - a nearly 30x software speedup for what was the bottleneck of data compressors.
benchmarks: https://sites.google.com/site/powturbo/entropy-coder

ANS has been used in data compressors since 2014, for example in Apple's LZFSE (currently the default on Apple platforms) and in Facebook's excellent, free Zstandard (zstd) - 2-5x faster than gzip while providing much better compression:
https://github.com/facebook/zstd/
7-zip with zstd: https://mcmilk.de/projects/7-Zip-zstd/
Total Commander plugin: http://totalcmd.net/plugring/zstdwcx.html
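For anyone who wants to try it, the zstd command-line tool that ships with the package above covers the common cases; the file names here are just examples:

```shell
zstd corpus.txt                      # compress -> corpus.txt.zst (original is kept)
zstd -19 corpus.txt -o corpus.zst    # higher level = better ratio, slower (1-19; --ultra allows up to 22)
zstd -d corpus.txt.zst -o out.txt    # decompress
zstd -b1 -e5 corpus.txt              # benchmark compression levels 1 through 5
```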

« Last Edit: 22/01/2017 12:30:53 by Jarek Duda »
Offline evan_au

  • Global Moderator
Re: Ongoing revolution in data compression
« Reply #1 on: 22/01/2017 10:27:55 »
Please rephrase the title as a question, as per forum guidelines - Moderator.
©The Naked Scientists® 2000–2017 | The Naked Scientists® and Naked Science® are registered trademarks created by Dr Chris Smith. Information presented on this website is the opinion of the individual contributors and does not reflect the general views of the administrators, editors, moderators, sponsors, Cambridge University or the public at large.