Picking up on yesterday’s post on entropy, I see that this discussion of the cost of moving energy / data pops up in many ways.
There is a lot of good writing on how weak ties are good for you. Facebook prospers by driving ‘many lightweight interactions over time.’ These are various embraces of complexity.
It seems perverse to introduce this cost of data into one’s ecosystem, but no! Entropy is good for you.
In information theory, “entropy” is the term used to describe how much actual information there is in any given set of data; it is a quantifiable value analogous to the amount of randomness in a system.
Unpredictable information carries high entropy.
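A minimal sketch of what “quantifiable” means here, using the standard Shannon formula H = −Σ p·log₂(p) (the function name and sample strings are illustrative, not from the original):

```python
from collections import Counter
from math import log2

def shannon_entropy(data: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A perfectly predictable string carries no information...
print(shannon_entropy("aaaaaaaa"))  # 0.0 bits per symbol
# ...while a maximally unpredictable one carries the most.
print(shannon_entropy("abcdefgh"))  # 3.0 bits per symbol
```

The surprising (unpredictable) message is the one with the higher number, which is the intuition behind the rest of the post.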
And herein is the opportunity.
Brian Christian suggests:
- We gain the most insight into a question when we take it to the friend, colleague, or mentor whose reaction and response we are least certain of.
- To gain the most insight into a person, we should ask the question whose answer we are least certain of.
The article concludes:
High-entropy information lets us learn new things at a faster, more efficient rate.
This is a recurring theme: an embrace of complexity.
More on the organizational leadership imperatives of all this in this Forbes article.
←This Much We Know.→