Entropy is the name of two related concepts, one in thermodynamics and one in information theory. In thermodynamics, entropy refers to the amount of unavailable energy, or the degree of disorder, inside a given system. The second law of thermodynamics states that thermodynamic entropy increases over time, meaning that heat spreads out, eventually creating a more or less random distribution of hot molecules (and therefore a more or less homogeneous temperature throughout the space). For instance, if heat is spread out throughout a room, entropy is higher than if all the heat is concentrated in one corner. When entropy is lower, or heat is more concentrated, the system can do work, or make things move (as in Nefastis’s box, which tries to raise a piston by concentrating all the heat in one part of the container). Pynchon also crosses his discussion of thermodynamic entropy with a consideration of information entropy from mathematics.
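The parallel between the two entropies can be sketched numerically. The snippet below is an illustration, not anything from the novel: it uses Shannon's information-entropy formula to show that a "spread-out" distribution (like heat dispersed evenly through a room) has higher entropy than a "concentrated" one (like heat gathered in one corner). The distributions chosen are hypothetical examples.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Heat spread evenly across four regions of a room: maximal disorder.
spread_out = [0.25, 0.25, 0.25, 0.25]

# Nearly all the heat concentrated in one corner: low disorder,
# the kind of imbalance a system can exploit to do work.
concentrated = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(spread_out))    # 2.0 bits (the maximum for 4 regions)
print(shannon_entropy(concentrated))  # much lower
```

Note how the uniform distribution reaches the maximum entropy for four outcomes, mirroring the thermodynamic idea that a homogeneous temperature leaves no usable energy.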