Learntropy: A SimpleGuru Summary
- Piotr Wozniak's concept of Learntropy was strongly influenced by Shannon Entropy.
- Shannon Entropy is an objective measure of the amount of information in an information source; it can be mathematically quantified and is measured in units called bits.
- Wozniak's Learntropy is a subjective measurement - it depends on the current state of the learner's prior knowledge and other factors.
- Everyone's subjective measure of the learntropy of a given information source will be different.
¶ Main Nuggets
- Important distinction between information and meaning:
- The measure of information does not depend on the brain.
- The measure of meaning MUST involve the interpretation of the brain itself.
- Shannon Entropy measures information.
- Learntropy measures meaning.
I think a lot of people get scared off from reading the full Pleasure of Learning article because it begins with some stuff about Shannon Entropy. It's actually not too hard a concept to grasp. Here's my attempt at breaking it down.
- Claude Shannon worked at Bell Labs (he joined in 1941 and published "A Mathematical Theory of Communication" there in 1948).
- His work was strongly influenced by the invention of the telephone and the telegraph.
- Shannon wanted to quantify how much information could be reliably sent over a communication channel - even a noisy one.
- In answering that question, he created a new branch of mathematics called Information Theory.
- Each time you compress a file, or zip and unzip one, you are using technologies that depend on the theoretical advances Shannon made (see the sketch after this list).
- Shannon defined entropy as the average level of "information", "surprise", or "uncertainty" in a random variable.
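To make "measured in bits" concrete: Shannon defined the entropy of a source as H(X) = -Σ p(x) log₂ p(x), the average surprisal across all of the source's possible symbols. Here's a minimal Python sketch - my own illustration, not something from the article - that estimates entropy from byte frequencies and shows the compression connection mentioned above: predictable data compresses well, unpredictable data barely at all.

```python
import math
import os
import zlib
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Estimate Shannon entropy from byte frequencies: H = -sum(p * log2(p))."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

low = b"abababab" * 128   # two symbols, perfectly predictable: ~1 bit per byte
high = os.urandom(1024)   # uniform random bytes: near the 8 bits-per-byte maximum

for label, data in (("low entropy", low), ("high entropy", high)):
    bits = entropy_bits_per_byte(data)
    packed = len(zlib.compress(data))
    print(f"{label}: {bits:.2f} bits/byte, {len(data)} bytes -> {packed} compressed")
```

The predictable string shrinks to a few dozen bytes, while the random one actually grows slightly - a compressor can't beat the entropy of its input.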
Fun fact:
Shannon originally wasn't sure what to call his new concept. He asked John Von Neumann for help - Von Neumann said "You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage!"
So don't be too discouraged if you don't fully understand the entropy analogy.
- Wozniak's concept of learntropy is derived from Shannon's Entropy.
- But Wozniak created a new concept, "Learntropy", because of the importance of the distinction between information and meaning.
Piotr Wozniak's concept of learntropy is derived from [Shannon's Entropy] - the average level of "information", "surprise", or "uncertainty" in a random variable. Woz says: "it is highly imprecise to evoke the term of entropy in the context of efficient learning." Why? Because entropy in information theory does not consider the semantic aspects of information - the interpretation of a signal's meaning, the effect of prior knowledge on that interpretation, and so on. That is why he created his own concept, learntropy.
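Wozniak never pins learntropy down with a formula, so treat this as my own toy analogy rather than his definition: reuse Shannon's surprisal measure, -log₂ p, but let each learner supply their own prior probability p for the same message. The "surprise" then becomes brain-dependent, which is exactly the subjectivity learntropy is meant to capture. The priors below are invented for illustration.

```python
import math

def surprisal_bits(prior: float) -> float:
    """Shannon surprisal of an event: -log2(p). The less expected, the more bits."""
    return -math.log2(prior)

# The same message reaches two different brains; each assigns its own prior
# probability to the claim before hearing it (numbers invented for illustration).
priors = {"novice": 0.05, "expert": 0.95}

for learner, p in priors.items():
    print(f"{learner}: {surprisal_bits(p):.2f} bits of surprise")
```

The novice gets over 4 bits of surprise from the very same sentence that gives the expert almost none - the signal is identical, but the meaning is not.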
More info here: https://supermemo.guru/wiki/Learntropy and it's discussed quite a bit in the Pleasure of Learning article.
- Wow!
- EUREKA!!
- These are reactions to information with high learntropy: a surprising signal landing on a brain primed to find it meaningful.