What Is Information?
Not Entropy; Not Negative Entropy. Why the True Nature of Information Is Triadic, Not Dyadic.
By Sungchul Ji, Ph.D.
Emeritus Professor of Theoretical Cell Biology
Ernest Mario School of Pharmacy, Rutgers University
We live in an age of deep confusion about one of the most foundational concepts of science and philosophy: information. Despite its pivotal role in everything from quantum computing and genetics to black hole thermodynamics and artificial intelligence, information is often reduced to an oversimplified slogan [1]:
“Information is entropy.” Or, slightly more colorfully,
“Information is negative entropy.”
But these claims are not only misleading—they are logically invalid. They collapse a fundamentally triadic, energetic, and teleological process into a dyadic mathematical identity. In doing so, they obscure the very mechanisms by which meaning, purpose, and biological structure arise.
What Is Information, Really?
Let’s start from first principles. According to a revised formulation based on Shannon’s foundational work and enriched by biosemiotics, thermodynamics, and Aristotelian metaphysics, information emerges when a system selects from a set of possible messages [2].
More precisely:
I = log2(n/m)
This is a simplified version of the Shannon formula [2], where:
n = the number of available choices (potential messages),
m = the number of messages actually selected by a mechanism or agent,
I = the information content of the selection, measured in bits.
This deceptively simple expression hides a powerful truth:
Information is generated not by randomness, but by selection.
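As a minimal numerical sketch of this formula (the helper name and the example numbers are invented for illustration, not taken from the article), assuming equally likely messages:

from math import log2

def selection_information(n: int, m: int) -> float:
    # Bits generated when a mechanism narrows n equally likely
    # possibilities down to m selected messages: I = log2(n/m).
    if not 0 < m <= n:
        raise ValueError("require 0 < m <= n")
    return log2(n / m)

print(selection_information(8, 1))   # picking 1 of 8 messages -> 3.0 bits
print(selection_information(64, 4))  # narrowing 64 options to 4 -> 4.0 bits
print(selection_information(8, 8))   # selecting everything -> 0.0 bits

Note that when m = n nothing is excluded and the formula yields zero bits: information comes from the act of selection, not from the size of the possibility set alone.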
Selection Is Triadic
For any act of information to occur, three irreducible elements must be present:
Possibility — A set of alternative messages (Shannon’s “uncertainty”)
Actuality — One or more messages selected from that set
Constraint — An energy-driven mechanism or agent that performs the selection
This mirrors the Peircean semiotic triad:
Sign – Object – Interpretant
And it also aligns with Aristotle’s four causes [5] (material, formal, efficient, and final), which are remarkably modern in this context.
Therefore, information is not a thing. It is a process—a selection-driven transformation of potential into meaningful structure.
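A minimal sketch of the triad in code, assuming a toy predicate stands in for the energy-driven constraint (the function and variable names are invented for illustration):

from math import log2
from typing import Callable, Iterable

def select(possibilities: Iterable[str],
           constraint: Callable[[str], bool]) -> tuple[list[str], float]:
    # Possibility -> Actuality via a Constraint, returning the selected
    # messages together with the bits of information the selection generates.
    pool = list(possibilities)                    # Possibility: the message set
    chosen = [p for p in pool if constraint(p)]   # Actuality: what survives
    if not chosen:
        raise ValueError("the constraint selected nothing")
    return chosen, log2(len(pool) / len(chosen))  # I = log2(n/m)

# Usage: four candidate messages, a constraint that keeps exactly one.
chosen, bits = select(["AUG", "UAA", "UAG", "UGA"], lambda msg: msg == "AUG")
print(chosen, bits)  # ['AUG'] 2.0

What a bare predicate cannot capture is the energetic cost of performing the selection, which the article returns to under Landauer’s Principle below.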
Why “Information = Entropy” Is False
Let us now be precise. The popular identification of information with entropy is a form of category mistake, rooted in what may be called the Triad-to-Dyad Collapse (TDC) Fallacy:
The error of reducing a three-term process (potential, selection, result) to a two-term identity (information = entropy), thereby erasing the mechanism of selection and its energetic cost.
This fallacy manifests in:
The assumption that entropy and information are interchangeable,
The belief that no agent or mechanism is needed to extract structure from randomness,
The neglect of teleology—that is, purpose—as essential to biological or semantic information.
Selection Requires Energy
Landauer’s Principle tells us that erasing one bit of information costs at least kT ln 2 in energy, where k is Boltzmann’s constant and T the absolute temperature; at room temperature this is about 0.018 eV, or 2.9 × 10⁻²¹ J [3]. This reveals a fundamental truth:
Information is thermodynamically expensive.
In your genome, in your brain, in every measurement you make—selection dissipates energy into heat to create meaning. A truly random message has high entropy, but zero semantic content. Only when filtered, interpreted, or embodied in a system with purpose does it become information.
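As a rough back-of-the-envelope sketch (the physical constants are standard; the genome-scale example is invented for illustration), assuming room temperature:

from math import log

K_B = 1.380649e-23     # Boltzmann constant, J/K
EV = 1.602176634e-19   # joules per electronvolt
T_ROOM = 300.0         # approximate room temperature, K

def landauer_bound(temperature_k: float = T_ROOM) -> float:
    # Minimum energy dissipated to erase one bit: k * T * ln 2.
    return K_B * temperature_k * log(2)

one_bit = landauer_bound()
print(f"{one_bit:.2e} J per bit")         # ~2.87e-21 J
print(f"{one_bit / EV:.3f} eV per bit")   # ~0.018 eV
print(f"{6e9 * one_bit:.2e} J for 6e9 bit erasures")  # ~1.7e-11 J

The magnitude is tiny, but it is strictly nonzero: every act of selection carries an irreducible thermodynamic price.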
Final Insight: Information Is Constraint with Purpose
Information is not entropy, but a selection from entropy.
It is the logarithmic trace of constraints imposed on disorder, driven by agents or systems with goals and energy budgets.
To define information as “negative entropy” is like defining a house as “negative rubble.” It misses the blueprint, the builder, and the reason for building it.
Closing Thought
As we move deeper into the 21st century—into the age of artificial intelligence, synthetic biology, and conscious matter—we must outgrow the crude slogans of 20th-century thermodynamics [1]. We must rediscover what Aristotle [5] and Peirce [6] understood intuitively:
Information is selection for a purpose—not entropy, not randomness, and certainly not free.
References:
[1] Wicken, J. S. (1987). Entropy and Information: Suggestions for Common Language. Philosophy of Science 54:176–193.
[2] Entropy (information theory). https://en.wikipedia.org/wiki/Entropy_(information_theory)
[3] Landauer’s principle. https://en.wikipedia.org/wiki/Landauer%27s_principle
[4] Ji, S. (2018). The Cell Language Theory: Connecting Mind and Matter. World Scientific Publishing, New Jersey.
[5] Four causes. https://en.wikipedia.org/wiki/Four_causes
[6] Charles Sanders Peirce. https://en.wikipedia.org/wiki/Charles_Sanders_Peirce