There are two different takes on explaining the origins of the universe, one modern and the other very ancient. The two explanations can be summed up as the Big Bang and the Big Birth theories. Everyone has heard of the Big Bang theory, in which some unexplained event results in an incredibly large amount of inert matter and energy bursting into existence. Life arises much later, in equally unexplained circumstances.
In the Big Birth account the life principle thrusts itself onto the scene right from the beginning. In this context, the universe is a living organism. It has a birth and will inevitably encounter a natural death, just like any other living, non-cancerous organism. However, if the universe is based on the life principle, if it is a living organism, then a natural question arises: Where is its DNA? All life forms are organised and regulated via the genetic code. Where and how is this manifested in this so-called living universe? The genetic code is based on a four-letter alphabet. Where can this be discerned in the workings of the universe? It is known that there are four fundamental forces in the physics of nature; perhaps that might have something to do with it. In ancient times, thinkers like the Stoics claimed that nature was based on a system of four primordial elements, a concept dating back to Empedocles. Perhaps there is something in that ancient form of physics after all. This is a major theme in my book The First Science and the Generic Code, where I endeavour to drag these ancient gems of wisdom into the modern age. In this post, I attempt to summarise some of the key points and to add credence to the notion that our universe is based on the same principle of life as all other living organisms.
A constant theme in my book is that a comprehensive knowledge of reality requires two different takes. Instead of one science, two sciences of a radically different nature are required. This double take on reality is reflected in the two-hemisphere architecture of the brain. Using the biological brain as a kind of metaphor, I associate the left hemisphere of my metaphorical “epistemological brain” with the common paradigm underlying the traditional sciences and axiomatic mathematics. Traditional scientific thinking is distinctively “left side thinking” – analytic, atomist, abstract, dualist, and so on. In the book I sketch out the nature of the science corresponding to the other paradigm, the right side science. The right side science must be non-dualist, synthetic, and non-abstract. Left side science is modern; right side science has its roots in antiquity and calls out for modernisation. This is the task tackled in my book.
Of course, the history of Western civilisation is really a history of left side thinking versus right side thinking. Like anything fundamental, the idea is not new. I am merely giving it a biological twist. The psychiatrist Iain McGilchrist has recently written an excellent book, The Master and His Emissary, along these lines but without the science or mathematics. His book should be companion reading with mine.
The Big Bang
The Big Bang theory is recent, dating back merely to the early twentieth century. Monsignor Lemaître, a Belgian priest and physicist, proposed that the universe expanded from an initial “Primeval Atom.” He described it as an “exploding egg,” which is rather more reminiscent of a Big Birth concept. The modern view is that the starting point can be explained as a singularity in equations involving Einstein’s General Theory of Relativity. At the level of the singularity, the physics breaks down. However, in the minutest instant after, the current scientific explanation kicks in and appears to be remarkably successful. The science of the actual starting point, why and how it all started, remains an enigma. The same can be said of the end. How will it end? Will it ever end? Definitive answers to these questions are also left hanging.
In the book, I characterise the paradigm underlying all of the left side sciences as being based on second order logic but only first order semantics. Second order logic provides science with its capability to generalise and reason abstractly. First order semantics implies that the science employs a flat, nominalist semantics. In order to comprehend something with first order semantics, it suffices to find a suitable label distinguishing that something. To understand is to label. First order semantics is nothing more than a flat nominalism, and this is the nominalist approach of the left side sciences. Non-constructive, axiomatic mathematics employs the same strategy.
Second order logic is incompatible with what I call second order semantics. In order to get second order semantics, something has to give. In the book, I explain the right side scientific paradigm in terms of first order logic and second order semantics. Students of Chrysippus’ Stoic logic will note that his logic is only first order. The logic of Aristotle is more like second order logic and admits abstraction. The Stoics rejected abstraction and only considered things that exist; abstractions don’t exist. The first order logic of Chrysippus, by its nature, cannot support abstraction. This gave the Stoics the possibility of engaging in second order semantics. Instead of merely labelling things (first order semantics), second order semantics expresses knowledge in the form of oppositions and oppositions between oppositions. This leads to a dialectical form of reasoning. The Stoics were pioneers in developing what I call the right side scientific paradigm.
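To make the purely logical part of this distinction concrete (the illustration is mine and is not drawn from the book), first order logic quantifies only over individuals, whereas second order logic also quantifies over properties and relations; it is this quantification over properties that licenses abstraction and generalisation:

\[ \forall x\,\exists y\,\bigl(P(x)\rightarrow R(x,y)\bigr) \quad\text{(first order: only the individuals } x, y \text{ are quantified)} \]

\[ \forall P\,\Bigl(\bigl(P(0)\wedge\forall n\,\bigl(P(n)\rightarrow P(n+1)\bigr)\bigr)\rightarrow\forall n\,P(n)\Bigr) \quad\text{(second order: the property } P \text{ itself is quantified, as in the induction principle)} \]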
In a nutshell, left side reasoning deals with abstract generalisations. Right side reasoning deals with universals that are necessarily non-abstract. The left side reasoning that is so endemic today does not make this distinction between the general and the universal and continually confuses the two. All laws in traditional physics are general laws, not universal laws. They apply to everything, but only in a predetermined situation. For example, the second law of thermodynamics applies to everything in a closed system, everything in a bottle so to speak. The general applies to everything but not everywhere. For the universal, there is no constraining bottle. A universal law applies everywhere but not to everything; it only applies to things that are capable of independent existence. According to my book, the universal laws of nature are expressible in a code, a code that is structurally identical to the genetic code. I call it the generic code. The universe, being an autonomously existing entity, is based on the generic code.
Second order logic and first order semantics sums up the epistemological structure of the traditional sciences including, of course, modern physics. This epistemological structure of second order logic and first order semantics determines the mindset of the modern sciences. It helps to explain what physics can see and, more importantly, what physics cannot see. Because of its epistemological structure, and not because of the limitations of any hadron collider or whatever, there is a huge swathe of reality that traditional physics cannot ever see or even comprehend. This is all due to the standpoint of its underlying epistemological paradigm and could go much further than just dark energy and dark matter.
This left side thinking of the traditional sciences is very good at seeing the needle in the haystack – once it somehow figures out where and how to look. However, it cannot see the haystack. The needle and the haystack are an example of figure and ground. There may be figure-ground reversal with first order semantics, but this simple left sided epistemological brain is unaware of it. It remains oblivious to the accompanying ground. It is only conscious of figure – of what is located there in front of its nose, wherever that may be. The left side perspective is a tunnel vision view of the world. There is always something missing in the left side perspective. The most common missing ingredient is the subject. Without a subject present, the whole notion of figure and ground makes no sense.
In the book, I draw parallels between the left hemisphere of the biological brain and the phenomenon called hemi-neglect or, more technically, left neglect dyslexia. As Behrmann et al. describe it, “for example, in humans, following a lesion to the right hemisphere, usually to parietal cortex, the individual may eat food from only the right side of the plate, dress only the right side of the body, and copy or draw only features on the ipsilesional right” (Behrmann et al., 2002). In contrast, patients with a lesion to the left hemisphere will generally not display hemi-neglect and will perceive a whole world.
Apparently, the patient loses the ability to manage the focus of attention and merely focuses on the default location – the side that the left hemisphere owns, the ipsilesional right. In the book, I claim that the traditional sciences, including mathematics, suffer from this kind of affliction at an epistemological level. Certainly, a very gifted physicist might be able to compensate for the fundamental limitations of the left side scientific paradigm and so push the boundaries. Such a scientist would resort to intuitively engaging the right side paradigm that I claim to be diametrically opposed to the left side version. I claim that, in mathematics, the great Henri Poincaré – “the last of the universalists” – had this kind of genius. In contrast, there was Hilbert, his famous rival. Hilbert led mathematics and geometry into the abstraction-dominated world so characteristic of modern mathematics, a world where the left side scientific paradigm is undisputed king.
According to my analysis, the right side paradigm would be free of abstraction and so be founded on only first order logic. The upside is that the semantics would be second order rather than first order. Many philosophers, such as Hegel, have trodden along this path, but the thinking remained intuitive, speculative, and never became a formal science. What I am claiming is that it is possible to construct a formal “right side” science based on first order logic and second order semantics.
The Big Birth
Thus, I am claiming that the modern sciences and mathematics are systemically half-blind. Even more provocatively, one could say that they are “half-brained” and only conceive of a half-world. (Reading Iain McGilchrist helps to explain what I am talking about here.) In no way do I challenge the incredible success of the traditional sciences, nor their utility. I only point out a huge expanse of reality of which they are irrevocably and systemically ignorant. In order to arrive at a whole world conception of reality, another radically different paradigm is called for – the paradigm of right side science, a science that incorporates second order semantics and abstraction-free first order logic.
From the perspective of right side science, everything changes and one starts to see a whole world. In this post, I compare left and right side thinking concerning the genesis of the universe. From the left side perspective, the universe springs into existence as a Big Bang of expanding but inert, dead mass and energy. The broader perspective of right side science incorporates second order semantics, where the Big Bang of matter and energy is only the needle in the haystack. What matters is the haystack that comes along with the needle. This is not just an injection of dead stuff into existence. This is the Big Birth of a highly sophisticated, self-regulating entity based on the same life principles as any other living entity. In order to be, the universe must involve more than a cloud of dead stuff. The universe lives, and comes along with the stuff that makes life possible. This involves a more subtle kind of stuff than the discrete particles detectable in hadron collisions.
The idea that the universe is a living entity is not new. The history of Western philosophy provides a good example with the Stoics. It was also a popular idea during the Enlightenment, entertained by both Newton and Leibniz, not to mention Spinoza. Newton even saw the Earth as a living, breathing, biosphere-like entity. He thought that the veins of rock visible in underground mines were part of the terrestrial circulatory system of earthly vapours. Leibniz took a more modern approach. He saw the possibility of a new science based on a geometry without number. The geometry would be based on a fundamental algebra of a few letters, applicable right across the biological and non-biological domains. In the following passage, with astonishing insight, he states the problem:
If it were completed in the way in which I think of it, one could carry out the description of a machine, no matter how complicated, in characters which would be merely the letters of the alphabet, and so provide the mind with a method of knowing the machine and all its parts, their motion and use, distinctly and easily without the use of any figures or models and without the need of imagination. Yet the figure would inevitably be present to the mind whenever one wishes to interpret the characters. One could also give exact descriptions of natural things by means of it, such, for example, as the structure of plants and animals. (Leibniz)
Leibniz claimed that there should be a universal language for coding the structure of “natural things”. We now know that Leibniz was correct. There is indeed such a universal language, the genetic code. If the physical universe we live in is also a “natural thing” then it should also be subject to the dictates of this same universal code. That is the story spelt out in my book.
Leibniz claimed that even more important than the descriptive powers of the algebra were its explanatory power and the simplicity involved. He argued that traditional explanations were too complicated. Scientific progress is blocked unless we can find a simpler way of explaining things. He used the example of explaining a rainbow:
Finally, I have no hope that we can get very far in physics until we have found some such method of abridgment to lighten its burden of imagination. For example, we see what a series of geometrical reasoning is necessary merely to explain the rainbow, one of the simplest effects of nature.
Leibniz called for a simple and simplifying algebra that would ease geometric reasoning. Once again, Leibniz proved very prophetic, but it took some time. To celebrate the bicentenary of Leibniz’s birth in Leipzig, a mathematical competition based on his famous “geometry without number” problem was organised by a local learned society. Encouraged by Möbius, Hermann Grassmann was the winner and, indeed, the only entrant in the competition. His submission described the foundations of his geometric calculus, devoid of coordinates and metric properties. This marked the beginning of a new approach to geometry. The new discipline would eventually be called Geometric Algebra (GA). Hamilton, for his part, contributed his geometric universalisation of complex numbers, the quaternions. Clifford integrated the algebras of Grassmann and Hamilton to produce the algebras that bear his name. The geometrisation of algebra started by Grassmann then fell into obscurity and was eclipsed by what I call the left side version of geometry. The left side interpretation of Grassmann’s directed number concept was developed by Gibbs and, independently, by Heaviside. This led to the coordinate-intensive linear algebra still dominant today. Whitehead, following on from Grassmann and Hamilton, developed Universal Algebra, a science of algebraic structures based on operations rather than relations, devoid of ordering and quantification over sets.
In recent times there has been a revival of interest in GA. David Hestenes has advanced the geometric algebra approach to provide new insight into a wide range of physical topics, from classical mechanics and electromagnetism to quantum mechanics and gauge theory. Hestenes claims that this is the universal algebra which unifies mathematics and physics. Certainly, the resulting geometry is simple and simplifying. In the form of Conformal Geometric Algebra, its prowess has been exploited in computer graphics. The complex graphics of any modern video game, more often than not, would probably have been developed on a GA platform.
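To give a flavour of why GA is regarded as simple and simplifying, here is a minimal, self-contained sketch of a two-dimensional geometric algebra in Python. It is my own illustration, not code from the book, from Hestenes, or from any particular GA library, and the class name MV and helper function rotate are purely hypothetical. The whole algebra reduces to a handful of product rules over the four basis elements 1, e1, e2 and e12, and rotation falls out of the sandwich product with no matrices or coordinate frames in sight:

import math

# Minimal 2D geometric algebra: a multivector is four coefficients
# over the basis {1, e1, e2, e12}, with e1*e1 = e2*e2 = 1.
class MV:
    def __init__(self, s=0.0, e1=0.0, e2=0.0, e12=0.0):
        self.c = (s, e1, e2, e12)

    def __mul__(self, other):
        # The geometric product, written out component by component.
        a0, a1, a2, a3 = self.c
        b0, b1, b2, b3 = other.c
        return MV(
            a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
            a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
            a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2 part
            a0*b3 + a3*b0 + a1*b2 - a2*b1,   # e12 (bivector) part
        )

    def reverse(self):
        # Reversion flips the sign of the bivector component.
        s, e1, e2, e12 = self.c
        return MV(s, e1, e2, -e12)

    def __repr__(self):
        return "MV(s=%.3f, e1=%.3f, e2=%.3f, e12=%.3f)" % self.c

def rotate(v, theta):
    # The rotor R = cos(theta/2) - sin(theta/2) e12 rotates a vector
    # by theta via the sandwich product R v ~R.
    R = MV(math.cos(theta/2), 0.0, 0.0, -math.sin(theta/2))
    return R * v * R.reverse()

e1, e2, e12 = MV(e1=1), MV(e2=1), MV(e12=1)
print(e12 * e12)              # the unit bivector squares to -1, like i
print(rotate(e1, math.pi/2))  # e1 rotated by 90 degrees lands on e2

In two dimensions the unit bivector e1e2 squares to -1 and so plays the role of the imaginary unit; in three dimensions the analogous even subalgebra reproduces Hamilton’s quaternions, which is one reason the approach lends itself so naturally to rotations in computer graphics.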
However, GA in its present form does not match up to the lofty ambitions of Leibniz. GA might have found its niche in many areas, but in no way does it “give exact descriptions of natural things by means of it, such, for example, as the structure of plants and animals.” In order for this to be the case, one would have to show that GA is directly related to the genetic code. No one has shown this – until now. By going back and renovating the ancient ideas of Parmenides, Empedocles, and the Stoic logician Chrysippus, a radically reworked version of GA can be directly matched to the genetic code. The code becomes truly universal. I report this in my book.
Is String Theory another Ptolemaic System?
Leibniz claimed that significant progress in physics was blocked by the lack of a simple algebraic tool that eased the “burden of imagination”. An ancient system that placed an enormous burden on the imagination was the incredibly complex Ptolemaic system explaining the movement of the heavenly bodies. It was based on the assumption that the Earth was the centre of the universe. Ptolemy was a highly gifted mathematician. His system actually had quite accurate predictive power and was studied and used by astronomers for many centuries. Not for the faint-hearted, it demanded considerable mathematical dexterity to master. With the advent of the Copernican revolution and Newton’s theory of gravitation, the whole Ptolemaic enterprise collapsed and gave way to a much simpler and more powerful system.
In the book I point out that modern String Theory might (or should) suffer the same fate. Unlike traditional left side science, String Theory is not empirical and aims at working out the structure of the elementary particles of physics from first principles, in some way or other. It is my opinion that a String Theory kind of science should be based on the right side scientific paradigm, not the traditional paradigm of the empirical sciences and axiomatic mathematics. However, this is not the case. String Theory is decidedly based on the left side paradigm. This means that it is dualist. One of the predictions of String Theory is that each elementary particle has a super-symmetric partner. No experimental evidence has ever been put forward that corroborates the existence of such partner pairing. The way I see it, the prediction is purely the consequence of the dualist assumptions of the underlying simplistic paradigm. One could even think of it as another manifestation of the hemi-neglect affliction.
String Theory, as it stands, is extremely complex. In this respect, it is truly the modern equivalent of a Ptolemaic system. One has to be an extremely clever mathematician to master the intricate mathematics of the subject. Just as the ancient Ptolemaic system was founded upon a naïve assumption, so it is for String Theory. String Theory is based on a naïve, dualist, Cartesian assumption concerning the nature of geometry. On the face of it, the assumption seems quite reasonable: geometric reality is dualist, being made up of two separate entities. On the one side, there is reality as space, the container for things. Contained in that space are the things that populate reality. In this context, String Theory has to grapple with a chicken and egg problem. The form and shape of things: does it come from the shape of space, from the shape of the thing, or perhaps from both? The problem is compounded by the hidden Cartesian assumptions about space itself. Ptolemy had it easy.
The position taken in my book is that a science of the universal shape of things demands a non-dualist epistemology. It must be a right side science. In this perspective, space and the thing in space are made of the same stuff. It is in this non-dualist paradigm that the simple and simplifying science of Leibniz can be achieved. The book provides an outline and the basic principles of such a science. In so doing, the geometric semantics of the generic cum genetic code are revealed. Here we find the simple four-letter code of nature applied universally across the board, to the biological and non-biological domains.
The vision unveiled by the book is so wide-reaching and spectacular that I have been rather timid in writing it up. One claim is that all of the known, as well as the unknown, elementary “particles” of nature can be geometrically constructed from the generic cum genetic code. The story seems so outlandish that I will end this post here and simply encourage the interested reader to critically appraise the book and its unusual message.