Yes, you’re just using the wrong term. Entropy isn’t chaos; in thermodynamics it’s a measure of how much energy has been dissipated by irreversible processes, so that it can no longer be used to do work. You can’t undo pulling the tree branch into its component pieces; the process is irreversible.
The term entropy isn’t wrong here; I was more referring to the other terms I was using. “Entropy” itself is heavily overloaded, and some uses of it come from information theory and combinatorics, e.g. information entropy, configuration entropy, etc.
That’s the reasoning I’m going by here: in those uses, the entropy of a system is proportional to the log of the number of possible configurations the system can be in, and the bottom system is clearly a lot more constrained than the top one.
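A toy sketch of that combinatorial reading (the site and piece counts here are made up purely for illustration, not taken from either picture):

```python
import math

def config_entropy(num_microstates: int) -> float:
    # Boltzmann-style configuration entropy in units of k_B:
    # S / k_B = ln(Omega), where Omega counts the possible configurations.
    return math.log(num_microstates)

# Hypothetical example: 10 pieces that can occupy any of 100 sites
# (unconstrained) vs. only 20 sites (constrained).
omega_loose = math.comb(100, 10)  # number of placements, unconstrained
omega_tight = math.comb(20, 10)   # number of placements, constrained

# The less constrained system has more configurations, hence higher
# configuration entropy.
print(config_entropy(omega_loose) > config_entropy(omega_tight))  # True
```

The point being: fewer allowed arrangements means a smaller Omega, and therefore lower entropy in this combinatorial sense, regardless of any thermodynamic bookkeeping.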