A more general definition of entropy was proposed by Boltzmann (1877) as S = k ln W, where k is Boltzmann’s constant (in units of J⋅K⁻¹) and W is the number of possible states of the system, tying entropy to statistical mechanics. Szilard (1929) suggested that entropy is fundamentally a measure of the information content of a system. Shannon (1948) defined informational entropy as \(S = -\sum_{i} p_i \log_b p_i\), where \(p_i\) is the probability of finding message number i in the defined message space and b is the base of the logarithm (typically 2, giving units of bits). Landauer (1961) proposed that informational entropy is interconvertible with thermodynamic entropy, such that a computational operation that erases 1 bit of information generates at least k ln 2 of thermodynamic entropy. This prediction has recently been verified experimentally in several independent studies (Bérut et al. 2012; Jun et al. 2014; Hong et al. 2016; Gaudenzi et al. 2018).
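To make these two definitions concrete, the sketch below (plain Python; the function name, the example probabilities, and the 300 K temperature are illustrative assumptions, not values from the text) computes Shannon entropy in bits and the Landauer bound of k ln 2 on the thermodynamic entropy generated per erased bit.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy of a discrete distribution; base 2 gives bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Landauer bound: minimum thermodynamic entropy generated per erased bit.
k_B = 1.380649e-23               # Boltzmann constant, J/K (exact SI value)
dS_min = k_B * math.log(2)       # ~9.57e-24 J/K per erased bit

T = 300.0                        # example temperature, K (assumption)
q_min = T * dS_min               # minimum heat dissipated per erased bit, ~2.9e-21 J

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit for two equiprobable messages
print(dS_min, q_min)
```

At room temperature the bound corresponds to roughly 3 × 10⁻²¹ J of heat per erased bit, which is the order of magnitude probed in the cited experiments.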
The equivalency of thermodynamic and informational entropy suggests that critical points of instability and subsequent self-organization observed in thermodynamic systems may be observable in computational systems as well. Indeed, this agrees with observations in cellular automata (e.g., Langton 1986; 1990) and neural networks (e.g., Wang et al. 1990; Inoue and Kashima 1994), which self-organize to maximize informational entropy production (e.g., Solé and Miramontes 1995). The source of additional information used for self-organization has been identified as bifurcation and deterministic chaos (Langton 1990; Inoue and Kashima 1994; Solé and Miramontes 1995; Bahi et al. 2012) as defined by Devaney (1986). This may provide an explanation for the phenomenon termed emergence, known since classical antiquity (Aristotle, c. 330 BCE) but lacking a satisfactory explanation (refer to Appendix A for discussion on deterministic chaos, and Appendix B for discussion on emergence). It is also in full agreement with extensive observations of deterministic chaos in chemical (e.g., Nicolis 1990; Györgyi and Field 1992), physical (e.g., Maurer and Libchaber 1979; Mandelbrot 1983; Shaw 1984; Barnsley et al. 1988) and biological (e.g., May 1975; Chay et al. 1995; Jia et al. 2012) dissipative structures and systems.
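As a concrete illustration of bifurcation and deterministic chaos in the sense of Devaney, the sketch below iterates the logistic map, a standard textbook example rather than one of the systems cited above; all parameter values are illustrative assumptions. In the chaotic regime a perturbation of the initial state of 10⁻⁶ grows until the two trajectories no longer track each other, whereas below the first bifurcation the same perturbation decays.

```python
def logistic_map(r, x0, n):
    """Iterate x_{t+1} = r * x_t * (1 - x_t), a canonical example of
    bifurcation and deterministic chaos."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Sensitive dependence on initial conditions in the chaotic regime (r = 4):
a = logistic_map(4.0, 0.200000, 50)
b = logistic_map(4.0, 0.200001, 50)   # initial states differ by 1e-6
print(abs(a[-1] - b[-1]))             # trajectories have diverged

# Below the first bifurcation (r = 2.5) the same perturbation dies out:
c = logistic_map(2.5, 0.200000, 50)
d = logistic_map(2.5, 0.200001, 50)
print(abs(c[-1] - d[-1]))             # difference shrinks toward 0
```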
This theoretical framework establishes a deep fundamental connection between cybernetic and biological systems, and implicitly predicts that as more work is put into cybernetic systems composed of hierarchical dissipative structures, their complexity increases, allowing more possibilities for coupled feedback and emergence at increasingly higher levels. Such high-level self-organization is routinely exploited in machine learning, where artificial neural networks (ANNs) self-organize in response to inputs from the environment, much as neurons do in the brain (e.g., Lake et al. 2017; Fong et al. 2018). The recent development of a highly organized (low-entropy) immutable information carrier, in conjunction with ANN-based artificial intelligence (AI) and distributed computing systems, presents new possibilities for self-organization and emergence.
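As a minimal illustration of ANN self-organization in response to environmental inputs, the sketch below trains a one-dimensional Kohonen self-organizing map; this particular architecture, and every name and parameter in it, are our illustrative choices rather than methods named in the text.

```python
import random
import math

def train_som(data, grid_size=8, epochs=200, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 1-D Kohonen self-organizing map: unit weights self-organize
    so that neighboring units come to represent nearby regions of input space."""
    random.seed(seed)
    dim = len(data[0])
    weights = [[random.random() for _ in range(dim)] for _ in range(grid_size)]
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)               # decaying learning rate
        sigma = sigma0 * (1.0 - t / epochs) + 0.5   # shrinking neighborhood
        x = random.choice(data)
        # Best-matching unit: the unit whose weights are closest to the input.
        bmu = min(range(grid_size),
                  key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
        for i in range(grid_size):
            h = math.exp(-((i - bmu) ** 2) / (2.0 * sigma ** 2))
            weights[i] = [w + lr * h * (v - w) for w, v in zip(weights[i], x)]
    return weights

# Two input clusters; after training, opposite ends of the map tend to
# specialize on different clusters, i.e. structure emerges from the inputs alone.
data = [[0.1, 0.1], [0.15, 0.05], [0.9, 0.9], [0.85, 0.95]]
print(train_som(data))
```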