Algorithm based on LLMs doubles lossless data compression rates
(techxplore.com)
Where I work, we've been looking into data compression optimized by an ML system. We have a shit-ton of parameters, and the algorithm compares the number of significant figures in each parameter to its byte size, then truncates wherever doing so doesn't cost any fidelity. So far it looks promising, with a really good compression factor, but we still need to do more work on simplifying the decompression at the receiving end.
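The sig-fig trick is easy to illustrate without any ML in the loop. A minimal sketch in Python, assuming nothing about the actual pipeline (shortest_exact and the sample values are made up here, and the comparison against 8 raw bytes stands in for "byte size"): for each float64 parameter, find the fewest significant figures that still round-trip to the exact same float, and keep the shorter form only when it wins.

    def shortest_exact(x: float, max_sig_figs: int = 17) -> str:
        """Fewest significant figures whose round trip reproduces x exactly."""
        for d in range(1, max_sig_figs + 1):
            s = f"{x:.{d}g}"
            if float(s) == x:  # truncation causes no loss of fidelity
                return s
        return repr(x)  # 17 sig figs always round-trips a float64

    for p in [3.14, 0.5, 6.62607015e-34, 1.0 / 3.0]:
        s = shortest_exact(p)
        print(f"{p!r} -> {s!r} ({len(s)} bytes vs 8 raw)")

Parameters that were only ever specified to a few figures compress well; genuinely full-precision values (like 1/3) don't, so a scheme like this would fall back to the raw bytes for those.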
I wouldn't have thought an LLM was the right technology to use for something like this.
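There is a standard reason it fits, though: any lossless compressor is a probability model plus an entropy coder, and an entropy coder spends about -log2(p) bits on a symbol the model assigns probability p, so better prediction directly means fewer bits. A toy sketch of just that bound, with a character-frequency model standing in for the LLM (the function name and sample string are made up, and this reports the ideal coding length rather than implementing the arithmetic coder):

    import math
    from collections import Counter

    def ideal_code_length_bits(text: str) -> float:
        # Shannon bound: a symbol with model probability p costs
        # about -log2(p) bits under an ideal entropy coder.
        counts = Counter(text)
        total = len(text)
        return sum(-math.log2(counts[ch] / total) for ch in text)

    msg = "the better the model predicts the data, the fewer bits it costs"
    print(f"{len(msg) * 8} raw bits -> ~{ideal_code_length_bits(msg):.0f} bits")

The catch is that the decoder needs the same model to undo the coding, which is exactly why a pretrained LLM is attractive: both ends hold an identical predictor, so none of the model's knowledge has to travel with the data.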