Entropy
*planted: 24/06/2023 · last tended: 24/06/2023*
The definitions of entropy in thermodynamics and in information theory are formally equivalent: Gibbs entropy and Shannon entropy have the same functional form, differing only by Boltzmann's constant and the base of the logarithm.
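As a minimal sketch of that correspondence (the standard textbook forms, not spelled out in this note):

```latex
% Gibbs entropy (thermodynamics), over microstate probabilities p_i:
S = -k_B \sum_i p_i \ln p_i
% Shannon entropy (information theory), over symbol probabilities p_i:
H = -\sum_i p_i \log_2 p_i
% The two coincide up to a constant factor: S = (k_B \ln 2)\, H.
```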
Information requires differentiation: in its absence, signals degenerate into mere noise, and this noise is another way of describing entropy. In this sense, a system with high entropy might be better described not as disordered but as homogenised or uniform. This way of seeing things will be important in addressing the globalising phase of capitalism, which tends precisely to erase diversity.
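A minimal sketch of why "high entropy" and "homogeneity" go together (my own illustration; the example distributions are made up): Shannon entropy is maximised by the uniform distribution, so the noisiest signal is precisely the one whose symbols are all equally likely.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A differentiated signal: one symbol dominates, so each message is informative.
peaked = [0.85, 0.05, 0.05, 0.05]
# A homogenised signal: every symbol equally likely, i.e. pure noise.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # ~0.85 bits
print(shannon_entropy(uniform))  # 2.0 bits, the maximum for four symbols
```

Flattening the peaked distribution towards the uniform one can only raise the entropy: removing the diversity between the symbols' probabilities and maximising entropy are the same operation.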
1. Elsewhere
1.1. In my garden
Notes that link to this note (AKA backlinks).