Breaking Math Podcast

Hosted by Gabriel Hesch and Autumn Phaneuf, who hold advanced degrees in electrical engineering and industrial engineering/operations research respectively, Breaking Math discusses mathematics both as a pure field all its own and as the language of science, engineering, and even creativity. Breaking Math brings you the absolute best in interdisciplinary science discussions, bringing together experts in fields including artificial intelligence, neuroscience, evolutionary biology, physics, chemistry, materials science, and more, to discuss where humanity is headed.

Website: breakingmath.io
Linktree: linktree.com/breakingmathmedia
Email: breakingmathpodcast@gmail.com

24: Language and Entropy (Information Theory in Language)


Information theory was founded in 1948 by Claude Shannon, and is a way of both qualitatively and quantitatively describing the limits and processes involved in communication. Roughly speaking, when two entities communicate, there is a message, a medium, noise, encoding, and decoding; through this process, information is transferred between them. The amount of information that can be transmitted is increased or decreased by manipulating any of these variables. One of the practical, and original, applications of information theory is the modeling of language. So what is entropy? How can we say language has it? And what structures within language, viewed through information theory, reveal deep insights about the nature of language itself?
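As a rough illustration of the entropy idea in the episode description (a minimal sketch, not taken from the episode itself), Shannon entropy can be estimated from the character frequencies of a piece of text:

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Entropy in bits per symbol of the text's character distribution."""
    counts = Counter(text)
    total = len(text)
    # H = -sum(p * log2(p)) over the probability p of each symbol
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Four equally likely symbols give maximal entropy; pure repetition gives none.
print(shannon_entropy("abcd"))  # 2.0 bits per symbol
print(shannon_entropy("aaaa"))  # 0.0 bits per symbol
```

Because letters in a real language are far from equally likely (and are further constrained by spelling and grammar), English text carries noticeably fewer bits per character than a uniform alphabet would, which is one of the structural facts the episode explores.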

---

This episode is sponsored by
· Anchor: The easiest way to make a podcast. https://anchor.fm/app

Support this podcast: https://anchor.fm/breakingmathpodcast/support

This show is part of the Spreaker Prime Network. If you are interested in advertising on this podcast, contact us at https://www.spreaker.com/show/5545277/advertisement


March 7, 2018 · 46m