Our analysis considers two time points in the evolution of the system, denoted $t$ and $t'$ with $t < t'$, with the corresponding dynamics encoded as transition probabilities. We consider features generated from the system via conditional probabilities, which are supervenient on the underlying system; that is, if the complete state of the system at time $t$ could be known with full accuracy, such a feature would provide no additional predictive power about future states at times $t' > t$. We formalise this in the following definition.

Definition 1: Let $V_t$ be a stochastic process. $V_t$ is said to be supervenient on the underlying process $X_t$ if $V_t \to X_t \to X_{t'}$ forms a Markov chain for all $t' > t$.

The condition above is equivalent to requiring that $V_t$ is statistically independent of $X_{t'}$ given $X_t$. Fig 2 illustrates the relationship between a supervenient feature and the underlying system.
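To make the supervenience condition concrete, the sketch below (my own illustration, not part of the original analysis) estimates the conditional mutual information $I(V_t; X_{t'} \mid X_t)$ from samples of a small synthetic system; for a feature that is a function of $X_t$, the estimate should be approximately zero, reflecting the Markov condition of Definition 1. The toy system, variable names, and plug-in estimator are all assumptions made for this example.

```python
import numpy as np
from collections import Counter

def cond_mutual_info(v, xf, x):
    """Plug-in estimate of I(V; X' | X) in bits from paired discrete samples."""
    n = len(v)
    joint = Counter(zip(v, xf, x))
    p_vx  = Counter(zip(v, x))
    p_xfx = Counter(zip(xf, x))
    p_x   = Counter(x)
    cmi = 0.0
    for (vi, xfi, xi), c in joint.items():
        p_vxfx = c / n
        cmi += p_vxfx * np.log2(
            (p_vxfx * (p_x[xi] / n)) /
            ((p_vx[(vi, xi)] / n) * (p_xfx[(xfi, xi)] / n))
        )
    return cmi

# Toy 2-bit system, sampled i.i.d. across trials: the next state flips the
# lowest bit of X_t with small probability.
rng = np.random.default_rng(0)
x_t  = rng.integers(0, 4, size=200_000)          # state in {0,1,2,3}
flip = rng.random(size=x_t.shape) < 0.1
x_tp = np.where(flip, x_t ^ 1, x_t)              # noisy update
v_t  = x_t % 2                                   # feature = lowest bit of X_t (supervenient)

print(cond_mutual_info(v_t, x_tp, x_t))          # ~0: V_t adds nothing beyond X_t
```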
Our theory builds on the Partial Information Decomposition (PID) framework [14], which provides powerful tools for reasoning about information in multivariate systems. In brief, PID decomposes the information that $n$ source variables $X^n = (X_1, \dots, X_n)$ provide about a target variable $Y$ into information atoms:

$$I(X^n; Y) = \sum_{\alpha \in \mathcal{A}} I_\partial^\alpha(X^n; Y),$$

where $\mathcal{A}$ is the set of antichain collections of sources [14]. Intuitively, for $\alpha \in \mathcal{A}$, $I_\partial^\alpha$ represents the information provided redundantly by the collections of variables in $\alpha$ but not by any of their subsets. For example, for $n = 2$ sources, $\alpha = \{\{1\}\{2\}\}$ corresponds to the information about $Y$ provided by both sources (their redundancy), and $\alpha = \{\{i\}\}$ corresponds to the information provided by $X_i$ alone. Most interestingly, $\alpha = \{\{12\}\}$ corresponds to the information provided by the two sources jointly but not separately, usually referred to as informational synergy.

A drawback of PID is that the number of information atoms (i.e. the cardinality of $\mathcal{A}$) grows super-exponentially with the number of sources. It is therefore useful to coarse-grain the decomposition according to particular criteria. Here we introduce the notion of $k$th-order synergy among $n$ variables, computed as

$$\mathrm{Syn}^{(k)}(X^n; Y) := \sum_{\alpha \in \mathcal{S}_k} I_\partial^\alpha(X^n; Y), \qquad \mathcal{S}_k := \{\alpha \in \mathcal{A} : |a| > k \ \text{for all}\ a \in \alpha\}.$$

Intuitively, $\mathrm{Syn}^{(k)}$ corresponds to the information about the target that is provided by the whole of $X^n$ but is not contained in any set of $k$ or fewer parts when these are considered separately from the rest; accordingly, $\mathcal{S}_k$ contains only collections whose groups all have more than $k$ sources. Similarly, we introduce the unique information of $X_j$ with respect to groups of at most $k$ other variables, computed as

$$\mathrm{Un}^{(k)}(X_j; Y \mid X^n_{-j}) := \sum_{\alpha \in \mathcal{U}^j_k} I_\partial^\alpha(X^n; Y),$$

where $X^n_{-j}$ denotes all the variables in $X^n$ other than $X_j$, and $\mathcal{U}^j_k$ collects the atoms delivered by $X_j$ but not by any group of $k$ or fewer variables of $X^n_{-j}$. Put simply, $\mathrm{Un}^{(k)}(X_j; Y \mid X^n_{-j})$ is the information about $Y$ carried by $X_j$ that no group of $k$ or fewer variables within $X^n_{-j}$ possesses. Note that these coarse-grained terms can be used to build a general decomposition of $I(X^n; Y)$, described in S1 Appendix (Section 1), whose properties are proven in S1 Appendix (Section 2).

A peculiarity of PID is that it prescribes the structure of the information atoms and the relationships between them, but not a specific functional form for computing them. In fact, only one information atom needs to be specified in order to determine the whole PID, usually the redundancy between all individual elements [14]. Multiple concrete functional forms have been proposed in the PID literature; see e.g. [15-18]. A particular method for computing the full set of information atoms, based on a recent PID [19], is discussed in the section "Measuring emergence via synergistic channels". Conveniently, our theory does not depend on the specific functional form of the PID, but only on a few basic properties stated precisely in S1 Appendix (Section 2). The theory can therefore be instantiated with any PID, as long as those properties are satisfied. Importantly, as shown in the section "Practical criteria for large systems", the theory also yields practical measures that are independent of the chosen PID.
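As an illustration of how the atoms fit together in the simplest case, the sketch below computes a two-source PID of the XOR gate using the I_min redundancy measure of [14]; this is just one of the functional forms mentioned above, chosen here for simplicity, and the example itself (XOR with uniform inputs, function names mi and specific_info) is my own. For XOR, all of the information ends up in the $\{\{12\}\}$ atom, i.e. it is purely synergistic.

```python
import numpy as np

# Joint distribution p(x1, x2, y) for the XOR gate: Y = X1 XOR X2, inputs uniform.
p = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p[x1, x2, x1 ^ x2] = 0.25

def mi(pxy):
    """Mutual information (bits) of a 2-D joint distribution."""
    px, py = pxy.sum(1), pxy.sum(0)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum())

def specific_info(p_sy):
    """Specific information I(S; Y = y) for each y, as used by the I_min redundancy."""
    py = p_sy.sum(0)
    p_s = p_sy.sum(1)
    vals = []
    for y in range(p_sy.shape[1]):
        cond = p_sy[:, y] / py[y]
        nz = cond > 0
        vals.append(float((cond[nz] * np.log2(cond[nz] / p_s[nz])).sum()))
    return np.array(vals), py

p1y = p.sum(1)                          # p(x1, y)
p2y = p.sum(0)                          # p(x2, y)

i1, i2 = mi(p1y), mi(p2y)               # information from each source alone
i_joint = mi(p.reshape(4, 2))           # information from both sources together

# Williams-Beer redundancy: expected minimum specific information over sources.
s1, py = specific_info(p1y)
s2, _ = specific_info(p2y)
red = float((py * np.minimum(s1, s2)).sum())    # atom {{1}{2}}

unq1 = i1 - red                                 # atom {{1}}
unq2 = i2 - red                                 # atom {{2}}
syn = i_joint - red - unq1 - unq2               # atom {{12}}: the synergy

print(red, unq1, unq2, syn)   # XOR: 0.0 0.0 0.0 1.0 -> all information is synergistic
```

With three or more sources the same bookkeeping runs over the full antichain lattice, which is where the super-exponential growth mentioned above becomes a practical obstacle and motivates the coarse-grained quantities Syn and Un.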
3.3 Defining causal emergence
Equipped with the tools of PID, we can now introduce a formal definition of causal emergence.

Definition 2: Given a system described by $X_t$, a supervenient feature $V_t$ is said to exhibit $k$th-order causal emergence if $\mathrm{Un}^{(k)}(V_t; X_{t'} \mid X_t) > 0$.

Accordingly, causal emergence takes place when a supervenient feature has irreducible causal power, i.e. when the causal effect it exerts is not mediated by any of the parts of the system. In other words, $V_t$ represents an emergent collective property of the system if it: 1) contains dynamically relevant information, in the sense that it predicts the future evolution of the system; and 2) this information is over and above what is given by groups of $k$ parts of the system considered separately. To better understand the implications of this definition, we examine some of its basic properties.
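For reference, the display below restates Definition 2 in the notation introduced above and unpacks the two verbal conditions; this is a restatement for readability, not an additional result.

```latex
% Definition 2, restated: a supervenient feature V_t exhibits k-th-order
% causal emergence iff its k-th-order unique information about the system's
% future state is strictly positive,
\[
  \mathrm{Un}^{(k)}\!\left(V_t ;\, X_{t'} \mid X_t\right) \;>\; 0,
\]
% i.e. V_t carries predictive information about X_{t'} that is not carried by
% any group of k or fewer parts of X_t. The two verbal conditions correspond to
% (i) I(V_t; X_{t'}) > 0 (dynamical relevance) and (ii) the inequality above
% (irreducibility); under a non-negative PID, (ii) already implies (i).
```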
Here we evaluate our practical emergence criterion (Proposition 1) on two well-known systems: Conway's Game of Life [27] and Reynolds' flocking model [28]. Both are widely regarded as paradigmatic examples of emergent behaviour and have been studied extensively in the complexity and artificial life literature [29], which makes them natural test cases for our approach. Technical details of the simulations are provided in S1 Appendix, Section 5.

Conway's Game of Life. A well-known feature of the Game of Life is the existence of particles: coherent, self-sustaining structures responsible for transmitting and modifying information [30]. These particles have been the subject of extensive study, with detailed taxonomies and classification schemes available [29, 31]. To test whether particles are emergent, we simulate the evolution of a 15x15 square array of cells, which we treat as a binary vector $X_t$ with $n = 225$. As initial conditions, we consider configurations corresponding to a "particle collider" setup, in which two particles of known types face each other (Fig 4). In each trial the system is randomised by varying the particles' positions, types, and relative displacement. Once an initial configuration is chosen, the well-known Game of Life update rules [27] are applied 1000 times, yielding the final state; simulations show that this interval is long enough for the system to settle after the collision.

To apply the criterion in Eq (10), we need to choose a candidate emergent feature. Here we consider a symbolic, discrete-valued vector $V_t$ that encodes the particle types present on the grid: each component of $V_t$ indicates whether a particle of type $j$ is present at time $t$, irrespective of its position or orientation. With these variables, we compute the quantities in Eq (10) using a Bayesian estimator of mutual information [32]. The result is that, as expected, the causal-emergence criterion is satisfied. Moreover, we find that the corresponding measure of downward causation onto individual parts is several orders of magnitude smaller; errors represent the standard deviation computed over surrogate data, as described in S1 Appendix, Section 5. By Proposition 1, these two results suggest that particle dynamics in the Game of Life may be not only emergent but also causally decoupled from their substrate.
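The criterion itself is easy to compute once discrete time series for the parts and the candidate feature are available. The sketch below is a minimal, self-contained illustration rather than the authors' pipeline: it uses a plug-in mutual-information estimator instead of the Bayesian estimator of [32], a toy parity system instead of the Game of Life collider, and it assumes the practical criterion of Proposition 1 has the form Psi = I(V_t; V_t') - sum_j I(X_t^j; V_t'), reconstructed from its verbal description (Eq (10) itself is not reproduced in this section).

```python
import numpy as np
from collections import Counter

def mi_plugin(a, b):
    """Plug-in mutual information (bits) between two discrete sample arrays."""
    n = len(a)
    pab = Counter(zip(a, b))
    pa, pb = Counter(a), Counter(b)
    return sum((c / n) * np.log2((c / n) / ((pa[x] / n) * (pb[y] / n)))
               for (x, y), c in pab.items())

def psi(v_t, v_tp, parts_t):
    """Practical criterion: I(V_t; V_t') minus the sum of single-part terms."""
    return mi_plugin(v_t, v_tp) - sum(mi_plugin(xj, v_tp) for xj in parts_t)

# Toy micro dynamics: 3 binary parts; each part's next state copies the current
# global parity, independently flipped with small probability eps. An odd number
# of parts ensures the macroscopic parity actually propagates forward in time.
rng = np.random.default_rng(1)
n_parts, n_samples, eps = 3, 100_000, 0.05
X_t = rng.integers(0, 2, size=(n_parts, n_samples))
parity = X_t.sum(axis=0) % 2
noise = rng.random((n_parts, n_samples)) < eps
X_tp = (parity[None, :] + noise) % 2

# Candidate macro feature: the parity of the system at each time step.
V_t, V_tp = parity, X_tp.sum(axis=0) % 2

print(psi(V_t, V_tp, X_t))   # > 0: parity is flagged as causally emergent
```

On this toy system the single-part terms vanish (no individual bit says anything about the future parity), so Psi reduces to I(V_t; V_t') and comes out clearly positive; the Game of Life analysis above follows the same recipe, with particle-type codes as V_t and the Bayesian estimator of [32] in place of the plug-in estimate.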
References

1. Gibb S, Findlay RH, Lancaster T. The Routledge Handbook of Emergence. Routledge; 2019.
2. Bedau M. Downward causation and the autonomy of weak emergence. Principia: An International Journal of Epistemology. 2002;6(1):5–50.
3. Chalmers DJ. Strong and Weak Emergence. Oxford University Press; 2006. p. 244–256.
4. Chang AY, Biehl M, Yu Y, Kanai R. Information Closure Theory of Consciousness. arXiv preprint arXiv:190913045. 2019.
5. Bedau MA. Weak emergence. Noûs. 1997;31:375–399.
6. Seth AK. Measuring autonomy and emergence via Granger causality. Artificial Life. 2010;16(2):179–196. pmid:20067405
7. Hoel EP, Albantakis L, Tononi G. Quantifying causal emergence shows that macro can beat micro. Proceedings of the National Academy of Sciences. 2013;110(49):19790–19795. pmid:24248356
8. Hoel EP. When the map is better than the territory. Entropy. 2017;19(5):188.
9. Klein B, Hoel E. The emergence of informative higher scales in complex networks. Complexity. 2020;2020.
10. Pearl J. Causality: Models, Reasoning and Inference. Cambridge University Press; 2000.
11. Rosas F, Mediano PAM, Ugarte M, Jensen HJ. An information-theoretic approach to self-organisation: Emergence of complex interdependencies in coupled dynamical systems. Entropy. 2018;20(10). pmid:33265882
12. Bressler SL, Seth AK. Wiener–Granger causality: A well established methodology. Neuroimage. 2011;58(2):323–329. pmid:20202481
13. James RG, Ayala BDM, Zakirov B, Crutchfield JP. Modes of information flow. arXiv preprint arXiv:180806723. 2018.
14. Williams PL, Beer RD. Nonnegative decomposition of multivariate information. arXiv preprint arXiv:10042515. 2010.
15. Ay N, Polani D, Virgo N. Information decomposition based on cooperative game theory. arXiv preprint arXiv:191005979. 2019.
16. Lizier JT, Bertschinger N, Jost J, Wibral M. Information decomposition of target effects from multi-source interactions: Perspectives on previous, current and future work. Entropy. 2018;20(4). pmid:33265398
17. James R, Emenheiser J, Crutchfield J. Unique information via dependency constraints. Journal of Physics A: Mathematical and Theoretical. 2018.
18. Ince RA. Measuring multivariate redundant information with pointwise common change in surprisal. Entropy. 2017;19(7):318.
19. Rosas F, Mediano P, Rassouli B, Barrett A. An operational information decomposition via synergistic disclosure. arXiv preprint arXiv:200110387. 2020.
20. Mediano PA, Rosas F, Carhart-Harris RL, Seth AK, Barrett AB. Beyond integrated information: A taxonomy of information dynamics phenomena. arXiv preprint arXiv:190902297. 2019.
21. Kraskov A, Stögbauer H, Grassberger P. Estimating mutual information. Physical Review E. 2004;69(6):066138.
22. Lizier JT. JIDT: An information-theoretic toolkit for studying the dynamics of complex systems. Frontiers in Robotics and AI. 2014;1:37.
23. McGill WJ. Multivariate information transmission. Psychometrika. 1954;19(2):97–116.
24. Timme N, Alford W, Flecker B, Beggs JM. Synergy, redundancy, and multivariate information measures: An experimentalist’s perspective. Journal of Computational Neuroscience. 2014;36(2):119–140. pmid:23820856
25. Rosas FE, Mediano PAM, Gastpar M, Jensen HJ. Quantifying high-order interdependencies via multivariate extensions of the mutual information. Physical Review E. 2019;100:032305. pmid:31640038
26. Rassouli B, Rosas FE, Gündüz D. Data Disclosure under Perfect Sample Privacy. IEEE Transactions on Information Forensics and Security. 2019.
27. Conway J. The game of life. Scientific American. 1970;223(4):4.
28. Reynolds CW. Flocks, Herds and Schools: A Distributed Behavioral Model. vol. 21. ACM; 1987.
29. Adamatzky A, Durand-Lose J. Collision-based computing. Springer; 2012.
30. Lizier J. The local information dynamics of distributed computation in complex systems. University of Sydney; 2010.
31. Wolfram S. A new kind of science. Wolfram Media; 2002.
32. Archer E, Park I, Pillow J. Bayesian and quasi-Bayesian estimators for mutual information from discrete data. Entropy. 2013;15(5):1738–1755.
33. Vicsek T. Universal patterns of collective motion from minimal models of flocking. In: 2008 IEEE Conference on Self-Adaptive and Self-Organizing Systems. IEEE; 2008. p. 3–11.
34. Chao Z, Nagasaka Y, Fujii N. Long-term asynchronous decoding of arm motion using electrocorticographic signals in monkey. Frontiers in Neuroengineering. 2010;3:3. pmid:20407639
35. Dehaene S. Consciousness and the brain: Deciphering how the brain codes our thoughts. Penguin; 2014.
36. Turkheimer FE, Hellyer P, Kehagia AA, Expert P, Lord LD, Vohryzek J, et al. Conflicting emergences. Weak vs. strong emergence for the modelling of brain function. Neuroscience & Biobehavioral Reviews. 2019;99:3–10. pmid:30684520
37. Corning PA. The synergism hypothesis: On the concept of synergy and its role in the evolution of complex systems. Journal of Social and Evolutionary Systems. 1998;21(2):133–172.
38. Rueger A. Physical emergence, diachronic and synchronic. Synthese. 2000;124(3):297–322.
39. Breuer HP, Petruccione F, et al. The Theory of Open Quantum Systems. Oxford University Press; 2002.
40. Smith TH, Vasilyev O, Abraham DB, Maciołek A, Schmidt M. Interfaces in confined Ising models: Kawasaki, Glauber and sheared dynamics. Journal of Physics: Condensed Matter. 2008;20(49):494237.
41. Corning PA. The re-emergence of “emergence”: A venerable concept in search of a theory. Complexity. 2002;7(6):18–30.
42. Anderson PW. More is different. Science. 1972;177(4047):393–396. pmid:17796623
43. Anderson PW. Basic notions of condensed matter physics. CRC Press; 2018.
44. Kauffman S, Clayton P. On emergence, agency, and organization. Biology and Philosophy. 2006;21(4):501–521.
45. Kauffman SA. A World Beyond Physics: The Emergence and Evolution of Life. Oxford University Press; 2019.
46. Bishop RC, Atmanspacher H. Contextual emergence in the description of properties. Foundations of Physics. 2006;36(12):1753–1777.
47. Atmanspacher H, beim Graben P. Contextual emergence. Scholarpedia. 2009;4(3):7997. pmid:19731148
48. Jensen HJ, Pazuki RH, Pruessner G, Tempesta P. Statistical mechanics of exploding phase spaces: Ontic open systems. Journal of Physics A: Mathematical and Theoretical. 2018;51(37):375002.
49. Mediano PAM, Seth AK, Barrett AB. Measuring Integrated Information: Comparison of Candidate Measures in Theory and Simulation. Entropy. 2018;21(1). pmid:33266733
50. Barrett A, Mediano P. The Phi measure of integrated information is not well-defined for general physical systems. Journal of Consciousness Studies. 2019;26(1-2):11–20.
51. Seth AK, Barrett AB, Barnett L. Granger causality analysis in neuroscience and neuroimaging. Journal of Neuroscience. 2015;35(8):3293–3297. pmid:25716830
52. Chang AYC, Biehl M, Yu Y, Kanai R. Information closure theory of consciousness. Frontiers in Psychology. 2020;11:1504. pmid:32760320
53. Takens F. Detecting strange attractors in turbulence. In: Dynamical Systems and Turbulence. Springer; 1981. p. 366–381.
54. Cliff OM, Prokopenko M, Fitch R. An information criterion for inferring coupling of distributed dynamical systems. Frontiers in Robotics and AI. 2016;3:71.
55. Tajima S, Kanai R. Integrated information and dimensionality in continuous attractor dynamics. Neuroscience of Consciousness. 2017;2017(1):nix011. pmid:30042844
56. Wilting J, Priesemann V. Inferring collective dynamical states from widely unobserved systems. Nature Communications. 2018;9(1):1–7. pmid:29899335
57. Kent A. Toy Models of Top Down Causation. arXiv preprint arXiv:190912739. 2019.
Cosma Rohilla Shalizi and Cristopher Moore. “What Is a Macrostate? Subjective Observations and Objective Dynamics.” ArXiv:Cond-Mat/0303625, March 29, 2003. http://arxiv.org/abs/cond-mat/0303625.
Erik Hoel. “When the Map Is Better Than the Territory.” Entropy 19, no. 5 (April 26, 2017): 188. https://doi.org/10.3390/e19050188
E. P. Hoel, L. Albantakis, and G. Tononi. “Quantifying Causal Emergence Shows That Macro Can Beat Micro.” Proceedings of the National Academy of Sciences 110, no. 49 (December 3, 2013): 19790–95. https://doi.org/10.1073/pnas.1314922110.
Thomas Varley and Erik Hoel. Emergence as the conversion of information: A unifying theory[J]. https://arxiv.org/pdf/2104.13368.pdf
L. Barnett and A. K. Seth. Dynamical independence: discovering emergent macroscopic processes in complex dynamical systems[J]. https://arxiv.org/pdf/2106.06511.pdf
Fernando E. Rosas, et al. Reconciling emergences: An information-theoretic approach to identify causal emergence in multivariate data[J]. https://arxiv.org/abs/2004.08220
II. Applications of Causal Emergence
Klein, Brennan, and Erik Hoel. “The Emergence of Informative Higher Scales in Complex Networks.” ArXiv:1907.03902 [Physics], January 21, 2020. http://arxiv.org/abs/1907.03902.
Klein, Brennan, and Erik Hoel. “Uncertainty and Causal Emergence in Complex Networks.” ArXiv:1907.03902 [Physics], July 8, 2019. http://arxiv.org/abs/1907.03902.
Griebenow, Ross, Brennan Klein, and Erik Hoel. “Finding the Right Scale of a Network: Efficient Identification of Causal Emergence through Spectral Clustering.” ArXiv:1908.07565 [Physics], August 20, 2019. http://arxiv.org/abs/1908.07565.
Simon Mattsson, Eric J. Michaud, and Erik Hoel. “Examining the Causal Structures of Deep Neural Networks Using Information Theory.” ArXiv:2010.13871 [Cs], October 26, 2020. http://arxiv.org/abs/2010.13871.
Hoel, Erik, and Michael Levin. “Emergence of Informative Higher Scales in Biological Systems: A Computational Toolkit for Optimal Prediction and Control.” Communicative & Integrative Biology 13, no. 1 (January 1, 2020): 108–18. https://doi.org/10.1080/19420889.2020.1802914.
Chvykov, Pavel, and Erik Hoel. “Causal Geometry.” ArXiv:2010.09390 [Hep-Th, Physics:Physics], October 19, 2020. http://arxiv.org/abs/2010.09390.
Bogdan-Eduard-Mădălin Mursa, Laura Dioşan, Anca Andreica. Network motifs: A key variable in the equation of dynamic flow between macro and micro layers in Complex Networks. Knowledge-Based Systems, Volume 213, 15 February 2021, 106648.
Gutenkunst, Ryan N., Joshua J. Waterfall, Fergal P. Casey, Kevin S. Brown, Christopher R. Myers, and James P. Sethna. “Universally Sloppy Parameter Sensitivities in Systems Biology Models.” PLoS Computational Biology 3, no. 10 (2007): e189. https://doi.org/10.1371/journal.pcbi.0030189.
Machta, B. B., R. Chachra, M. K. Transtrum, and J. P. Sethna. “Parameter Space Compression Underlies Emergent Theories and Predictive Models.” Science 342, no. 6158 (November 1, 2013): 604–7. https://arxiv.org/pdf/1303.6738.
Machine Learning and Renormalization
Li, Shuo-Hui, and Lei Wang. “Neural Network Renormalization Group.” Physical Review Letters 121, no. 26 (December 26, 2018). https://doi.org/10.1103/PhysRevLett.121.260601.
HY Hu, SH Li, L Wang, YZ You. Machine learning holographic mapping by neural network renormalization group. Physical Review Research, 2020. https://journals.aps.org/prresearch/pdf/10.1103/PhysRevResearch.2.023369
Hong-Ye Hu, Dian Wu, Yi-Zhuang You, Bruno Olshausen, Yubei Chen: RG-Flow: A hierarchical and explainable flow model based on renormalization group and sparse prior, https://arxiv.org/pdf/2010.00029.pdf
Koch-Janusz, M., Ringel, Z. Mutual information, neural networks and the renormalization group. Nature Phys 14, 578–582 (2018). https://www.nature.com/articles/s41567-018-0081-4
Shuo-Hui Li. Learning Non-linear Wavelet Transformation via Normalizing Flow. https://arxiv.org/abs/2101.11306
Information theory
Tononi, Giulio, Melanie Boly, Marcello Massimini, and Christof Koch. “Integrated Information Theory: From Consciousness to Its Physical Substrate.” Nature Reviews Neuroscience 17, no. 7 (July 2016): 450–61. https://doi.org/10.1038/nrn.2016.44.
David Krakauer: The information theory of individuality, https://link.springer.com/article/10.1007/s12064-020-00313-7#Sec10
Others
Packard, N. H., J. P. Crutchfield, J. D. Farmer, and R. S. Shaw. “Geometry from a Time Series.” Physical Review Letters 45, no. 9 (September 1, 1980): 712–16. https://doi.org/10.1103/PhysRevLett.45.712.
Shalizi, Cosma Rohilla, Kristina Lisa Shalizi, and James P. Crutchfield. “An Algorithm for Pattern Discovery in Time Series.” ArXiv:Cs/0210025, November 26, 2002. http://arxiv.org/abs/cs/0210025.
Defining Emergence
Kim J. Emergence: Core ideas and issues[J]. Synthese, 2006, 151(3): 547-559.
Kivelson S, Kivelson S A. Defining emergence in physics[J]. NPJ Quantum Materials, 2016, 1(1): 1-2.
Bonabeau E. Predicting the unpredictable[J]. Harvard Business Review, 2002, 80(3): 109-116.
Use of emergence
Holman P. Engaging emergence: Turning upheaval into opportunity[M]. Berrett-Koehler Publishers, 2010.
Walker, S. I., Cisneros, L., and Davies, P. C. W.. Evolutionary Transitions and Top-Down Causation[A]. Proceedings of the ALIFE 2012: The Thirteenth International Conference on the Synthesis and Simulation of Living Systems[C]. East Lansing, Michigan: ASME, 2012, 283-290. https://doi.org/10.1162/978-0-262-31050-5-ch038
Theodore P. Pavlic, Alyssa M. Adams, Paul C. W. Davies, et al. Self-referencing cellular automata: A model of the evolution of information control in biological systems[J]. 2014. https://arxiv.org/pdf/1405.4070.pdf
Nomura T. Formal description of autopoiesis for analytic models of life and social systems[C]//Proc. 8th Int. Conf. Artificial Life (ALIFE VIII). 2002: 15-18. https://dl.acm.org/doi/10.5555/860295.860299
Hofmeyr J H S. A biochemically-realisable relational model of the self-manufacturing cell[J]. Biosystems, 2021: 104463.
Letelier, J. C., Marı́n, G., & Mpodozis, J. (2003). Autopoietic and (M,R) systems. Journal of Theoretical Biology, 222(2), 261–272. doi:10.1016/s0022-5193(03)00034-1
Gánti T. Chemoton theory: theory of living systems[M]. Springer Science & Business Media, 2003.