This concept forms the basis of information theory. In digital communications, it sets an absolute limit, the channel capacity, on the rate at which data can be transmitted reliably over a channel. That limit is theoretical; the practical challenge is to design coding schemes that bring real systems ever closer to it.
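As a rough illustration of how these quantities are computed, the sketch below evaluates the Shannon entropy of a source distribution and the Shannon-Hartley capacity of a band-limited channel. The specific probabilities, bandwidth, and signal-to-noise ratio are illustrative assumptions, not values taken from the article.

```python
import math

def shannon_entropy(probs):
    """Entropy H(X) = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative values (assumptions, not from the article):
# a biased four-symbol source and a 3 kHz channel at 30 dB SNR.
source = [0.5, 0.25, 0.125, 0.125]
print(f"Source entropy: {shannon_entropy(source):.3f} bits/symbol")  # 1.750

snr_db = 30
print(f"Capacity: {channel_capacity(3000, 10 ** (snr_db / 10)):.0f} bits/s")  # ~29902
```

No coding scheme can reliably carry data faster than the capacity figure on such a channel; the practical goal is to approach it as closely as possible.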