The Science of Reduction and the Nature of Complexity

The elegance and beauty of scientific and mathematical results are at times judged by the compactness of their presentation. Science looks for unifying theories, and new ideas are analyzed component by component using models. The evolution of artificial intelligence brings an argument for complexity.

© Alina Grubnyak @ Unsplash

“The Universe in a Nutshell” is what Stephen Hawking wanted to express. In mathematics, Euler’s identity, e^(iπ) + 1 = 0, is one of the most beautiful expressions of profound simplicity. Einstein’s famous equation, E = mc², Newton’s law F = ma, even Pythagoras’ theorem are all examples of supreme conciseness and clarity in laws of nature and fundamental equations. Their presentation carries an aesthetic value, but researchers know that to deduce, or at least understand, them there is a long and bumpy road ahead, full of preliminaries and, above all, models.

Natural science inevitably works with models. Before a hypothesis or a theory can be formulated, the complexity of nature must be put into an essential form, partly abstract, subject to certain conventions. Among them: a willingness to reduce, to set aside parameters that are too specific.

Modeling Nature

You met such models in your very first years of studying physics: point-like light sources; objects stripped of their shape and material substance that become simple “bodies” traveling frictionlessly through air; connecting wires with zero electrical resistance; and many more. The mere look at the things around us sometimes provides such models involuntarily, when we “regularize” shapes: thinking of our planet or our eyes as spheres, of blood vessels as long, narrow cylinders, or of coniferous trees as “cones”.

Such models are fundamental in all disciplines of science. One of the important challenges a researcher faces is to find the parameters that are essential to their experiment and, complementarily, those they can ignore. That’s why an abstract ― hence general ― result, such as a mathematical theorem, is so powerful: it’s not about specifics but describes or prescribes properties shared by a large number of objects.

Research based on models ― the very attempt to express phenomena through theoretical frameworks that simplify ― is reductionism. Understanding an object or concept by extracting its essentials, or through analogies and reductions to previous partial results that are then combined into the big picture, works against complexity. The whole is treated, above all, as the sum of its parts, and the attention shifts to those parts.

When necessary, one can adapt the model to suit the research: to collect sufficient data about the trajectory of an object, it is enough to follow the movement of its center of mass. The same object submerged in hot water will behave according to the laws of small particles and thermal agitation. It is therefore clear which details one can ignore without missing the essential, even if the decision is made case by case.

What Is in the Details?

There are, however, situations in science where details make not only the difference but the whole picture. The age of artificial intelligence brought them into focus, but such cases ― namely complex systems ― have been studied by mathematicians, physicists, and others for more than half a century.

The 2024 Nobel Prize in Physics, awarded to Geoffrey Hinton and John J. Hopfield, raised some controversy. The two researchers have been working on problems pertaining to artificial intelligence: machine learning and artificial neural networks. Their competence, and the influence of their results on their peers, are beyond any discussion or doubt. The main question was what any of it has to do with physics.

There’s also something else, beyond the results themselves: an almost philosophical point that has to do with one’s approach to scientific research. The physicist Adam Frank argues in the American magazine The Atlantic that the praise and prestige of Hinton’s and Hopfield’s work show, above all, that physics has a lot to learn from the life sciences, and that instead of reductionism it could embrace complexity.

Towards a Complex Science

Regardless of the predictability that a set of equations can offer, even when they are not deterministic or precise in the strict sense, the evolution of a multicellular system (alive or not) does not follow the same rules of physical modeling. There may be emergent behavior ― a feature of many systems made of a huge number of components that interact with each other ― but the intrinsic motivation, the telos, is not there. Some experts argue that if an AI model “wants” to do something, it is because the programmers who made it purposefully included the code that led to the preconditions of this initiative, even if the connection between the initial stage of programming and the subsequent evolution may not be clear.

But this isn’t (only) about the agency, will, or consciousness that AI may or may not possess. Adam Frank argues that awarding the Nobel Prize in Physics for research related to artificial intelligence could launch a new era of physics, even a new way of doing science.

Complexity ― the more-than-unpredictable behavior of systems made of a huge number of interacting components, with feedback, uncertainty, and nondeterminism, harnessing enormous computational power, which is precisely the case of today’s artificial intelligence models ― brings into focus “a truth that physics can no longer ignore”, as Frank titles his piece.

A physics in which researchers learn more from living systems, or in any case from complex behavior, could produce not only new, surprising results but a new method, a fresh way of studying nature and our surroundings. Reductionism, which looks for models and “theories of everything”, is paralleled ― not necessarily contrasted ― with complexity science, where the only model of a system is the system itself, and all the details are essential, due to the very architecture of the ensemble.

I think that one of the important merits of AI’s recent exponential boom is that it brought back to our attention a body of fundamental research that is decades old. Complex systems, cellular automata, feedback processes, self-regulation, and questions about consciousness, agency, will, and ethics are old, but far from settled. Science, at its intersection with technology, has brought them to our attention again today.
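To make the contrast concrete, here is a minimal sketch (my own illustration, not from the article) of an elementary cellular automaton, one of the classic complex systems mentioned above. Each cell follows a purely local rule, depending only on itself and its two neighbors, yet Rule 110 is known to generate globally intricate, even Turing-complete, behavior: the system as a whole cannot be understood by reducing it to any single part.

```python
def step(cells, rule=110):
    """Apply one update of an elementary cellular automaton.

    Each cell's next state depends only on itself and its two
    neighbors (wrapping at the edges). The 8 possible three-cell
    neighborhoods index into the bits of `rule`, which encodes
    the entire lookup table in a single byte.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=64, steps=30, rule=110):
    """Evolve from a single 'on' cell and return the full history."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        history.append(cells)
    return history

if __name__ == "__main__":
    # Print the evolution row by row; structure emerges from local rules.
    for row in run():
        print("".join("#" if c else "." for c in row))
```

The point of the sketch is the asymmetry it exhibits: the rule fits in one byte, but predicting the long-run pattern generally requires running the system itself ― exactly the situation where, as above, "the only model of a system is the system itself".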

This could be an extra argument for that “new way of making science” that Frank writes about, which could make the 2024 Nobel Prize one of the very few occasions when the scientific community rewarded breakthroughs for their future implications.

