AI Says It Can Compress Better Than FLAC?! Hold My Entropy (Ep. 268)
Data Science at Home - A podcast by Francesco Gadaleta
Categories:
Can AI really out-compress PNG and FLAC? Or is it just another overhyped tech myth? In this episode of Data Science at Home, Frag dives deep into the wild claims that Large Language Models (LLMs) like Chinchilla 70B are beating traditional lossless compression algorithms. But before you toss out your FLAC collection, let's break down Shannon's Source Coding Theorem and why entropy sets the ultimate limit on lossless compression.

We explore:
- How LLMs leverage probabilistic patterns for compression
- Why compression efficiency doesn't equal general intelligence
- The practical (and ridiculous) challenges of using AI for compression
- Whether AI can actually BREAK Shannon's limit, or whether it's just an illusion

If you love AI, algorithms, or just enjoy some good old myth-busting, this one's for you. Don't forget to hit subscribe for more no-nonsense takes on AI, and join the conversation on Discord. Let's decode the truth together.

Join the discussion on the new Discord channel of the podcast: https://discord.gg/4UNKGf3

Don't forget to subscribe to our new YouTube channel: https://www.youtube.com/@DataScienceatHome

References:
Have you met Shannon? https://datascienceathome.com/have-you-met-shannon-conversation-with-jimmy-soni-and-rob-goodman-about-one-of-the-greatest-minds-in-history/
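For listeners who want to poke at the numbers themselves, here is a minimal Python sketch of the "model as compressor" idea discussed in the episode. It uses a toy, hard-coded symbol distribution in place of an actual model like Chinchilla 70B (that substitution is purely an assumption for illustration): a probabilistic model implies an ideal code length of -log2 p(symbol) bits per symbol, which is what an arithmetic coder driven by those probabilities would approach, while Shannon's entropy is the average-case floor no lossless scheme can beat.

import math

# Toy "model": a probability distribution over symbols.
# In the LLM-as-compressor setup these would be the model's
# next-token probabilities; here they are hard-coded for illustration.
model_probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

message = "aabacabda"

# Ideal code length under the model: sum of -log2 p(symbol) over the message.
# An arithmetic coder driven by these probabilities gets within a few bits of this.
ideal_bits = sum(-math.log2(model_probs[s]) for s in message)

# Shannon's source-coding limit: H(X) bits per symbol on average,
# assuming symbols are drawn i.i.d. from model_probs.
entropy = -sum(p * math.log2(p) for p in model_probs.values())

print(f"ideal code length under the model: {ideal_bits:.2f} bits")
print(f"entropy bound: {entropy:.3f} bits/symbol "
      f"({entropy * len(message):.2f} bits for {len(message)} symbols)")

Running it prints the total ideal code length for the toy message alongside the entropy bound; the better the model's probabilities match the true source, the closer the two numbers get, and no lossless scheme can do better than the entropy on average.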