Talk:Entropy (information theory)


Grammar, please?

Would someone familiar with the terms and practices of the field AND with the English language please go through and fix this?

Page Clarity

This is one of the worst-written pages I've ever encountered on Wikipedia; it sounds as though it was copied directly from a poorly written upper-level textbook on the subject. The major problem is clarity: it uses long-winded, unnecessary phrases where simple ones would be both synonymous and much clearer to the reader, and it has an inflated vocabulary. I don't understand the subject matter well enough myself to do a complete revision of it, though I am going to go through and clean up anything I am sure I know the meaning of.

If someone with more knowledge of the subject could do more in-depth work on it, that'd be very helpful. This is supposed to be an *encyclopedia* entry - that is, it's supposed to be a reasonably easily understood explanation of a complex topic. Currently the "complex topic" part is more than covered, but there's not nearly enough of the "easily understood" part. In particular, the intro paragraph needs *heavy* revision, as it's the main thing non-technical readers will look at if they encounter this topic.

To forestall complaints of "It's a complex topic so it NEEDS complex language!": Yes, that's true, but there's a difference between technical terminology used to explain something and incomprehensible masses of unnecessarily elevated vocabulary and tortured phrasing.

Posturing and Silliness

The glorification of the field of information entropy as conceptually distinct and more fundamental, compared to statistical mechanics and thermostatistics, is quite oblivious to the pre-existing vastness and sophistication of this basic field (statistical mechanics), of which the Shannon paper and subsequent efforts are merely offshoots. The statements under this "comparison" are rife with patently false claims about what "entropy" was previously limited to in statistical mechanics (which, contrary to this silly paragraph, was defined by probability distributions, non-observably small fluctuations, and other such concepts that fundamentally define entropy!). "Information entropy" is a small conceptual contribution to a vast, existing body of development on entropy theory, from which this paragraph falsely distinguishes it. The author of this piece should be much better educated on the subject, rather than making this display of ignorance and presuming it corresponds to the pre-existing field. 2602:306:CF87:A200:6D19:A65A:57A0:F49D (talk) 18:44, 7 January 2016 (UTC)

If your general objections are valid, it's difficult to make corrections without a specific example or examples of what you object to. Can you state specifically the text that you object to? PAR (talk) 01:01, 8 January 2016 (UTC)
It is my impression that the author of the article (or their source) might be referring to the concept of entropy in the phenomenological theory of thermodynamics, which predates statistical physics. This would explain the odd phrasing "no reference to any probability distribution", which is clearly absurd when taken as a commentary on statistical physics. Unfortunately, this isn't made clear in the text, and the paragraph's intro "At an everyday practical level" doesn't help at all. 193.29.81.234 (talk) 12:34, 6 August 2024 (UTC)

Mistake in “Further properties”?

I think the proof in the second point of the section "Further properties" is incorrect. -log(x) (negative log) is a convex function, so Jensen's inequality should have the opposite sign. This can be checked with a very simple example, e.g. p_0 = 3/4 and p_1 = 1/4. Moreover, note that the right-hand side of the first inequality is the Rényi-2 entropy and the left-hand side is the Shannon entropy, and indeed the latter must be greater than or equal to the former, not less than or equal as stated in the article. Finally, I believe H(p) < log(n) is proved in a different way in the reference given (Cover and Thomas). If what I am saying is correct, I suggest replacing this line with a proper proof, e.g. the one from Cover and Thomas. Qlodo (talk) 09:11, 20 September 2023 (UTC)

@Qlodo: checking your example, I agree the text looks wrong. I've rephrased it to omit a proof (and to be slightly clearer about what n is). Are you able to add a correct proof, from Cover and Thomas or another text you can access? When you find a mistake on Wikipedia, the etiquette is to fix it yourself. — Bilorv (talk) 21:14, 24 September 2023 (UTC)
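For reference, here is a minimal numerical sketch of Qlodo's example (assuming the two-outcome distribution p_0 = 3/4, p_1 = 1/4 and base-2 logarithms, so both entropies are in bits; this check is illustrative only and not part of the article):

import math

p = [3/4, 1/4]

# Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)
shannon = -sum(pi * math.log2(pi) for pi in p)

# Renyi-2 (collision) entropy in bits: H_2(p) = -log2(sum_i p_i^2)
renyi2 = -math.log2(sum(pi ** 2 for pi in p))

print(shannon)  # ~0.8113 bits
print(renyi2)   # ~0.6781 bits
assert shannon >= renyi2  # Shannon entropy >= Renyi-2 entropy, as stated above

The Shannon value exceeds the Rényi-2 value for this example, consistent with the correction described in the thread above.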

Confusion between cross-entropy and KL divergence

In the article, one can find the following definition for KL divergence:

D_{KL}(p \| q) = -\sum_x p(x) \log q(x)

But this is the definition of the cross-entropy H(p, q). The relationship between the two is

H(p, q) = H(p) + D_{KL}(p \| q)

And this page should link to cross-entropy at https://en.wikipedia.org/wiki/Cross-entropy.

Rambip (talk) 09:12, 1 September 2024 (UTC)
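For reference, a minimal numerical sketch of the relationship stated above, H(p, q) = H(p) + D_KL(p || q). The two distributions below are made up purely for illustration, and base-2 logarithms are assumed:

import math

# Two arbitrary example distributions over the same three outcomes (illustrative only).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Shannon entropy of p: H(p) = -sum_x p(x) log2 p(x)
shannon = -sum(px * math.log2(px) for px in p)

# Cross-entropy of p relative to q: H(p, q) = -sum_x p(x) log2 q(x)
cross = -sum(px * math.log2(qx) for px, qx in zip(p, q))

# KL divergence: D_KL(p || q) = sum_x p(x) log2(p(x) / q(x))
kl = sum(px * math.log2(px / qx) for px, qx in zip(p, q))

# The identity H(p, q) = H(p) + D_KL(p || q) should hold up to floating-point error.
assert abs(cross - (shannon + kl)) < 1e-12
print(shannon, kl, cross)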