I've been obsessed with this question for quite a while now. I think it dates back to reading Wolfram's A New Kind of Science, especially the part where he goes into what sort of computational system mathematical proof might be, and in particular the picture on page 153 that shows how the classical iterative map from chaos theory is actually just a digit shift when you look at it in binary notation. So is chaos very simple, or terribly complicated? Well, it depends on how you write down the numbers involved. The idea that the notation you use to represent a number matters in a deep way really got under my skin.
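(For the curious: I don't have the book in front of me, but the map in question is, I believe, essentially the doubling map, x → 2x mod 1. Here's a quick sketch of my own, not Wolfram's code, of what the page is getting at: a rule that looks like genuine dynamics in decimal is nothing but "delete the first digit" once you write the numbers in binary.)

```python
# My own toy illustration (not taken from NKS): the doubling map
# x -> 2x mod 1 looks "chaotic" in decimal, but in binary each
# iteration just drops the leading digit and shifts the rest left.

def binary_digits(x, n=20):
    """First n binary digits of x in [0, 1)."""
    out = []
    for _ in range(n):
        x *= 2
        d = int(x)        # exactly the digit that "mod 1" throws away
        out.append(str(d))
        x -= d
    return "".join(out)

x = 0.637                 # arbitrary seed
for step in range(5):
    print(f"x = {x:.6f}   binary: 0.{binary_digits(x)}")
    x = (2 * x) % 1       # one step of the "chaotic" map
```

Run it and you can watch the binary string march to the left, one digit per iteration, while the decimal column bounces around looking suitably unpredictable.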
Unfortunately, it turns out to be surprisingly hard to find other people who are also obsessed with this question. Shocking, I know. But, today only, for your blog-perusing pleasure, I present Brian Rotman, mathematician, philosopher, and author of Ad Infinitum ... The Ghost in Turing's Machine: Taking God Out of Mathematics and Putting the Body Back In. An Essay in Corporeal Semiotics. This is a very interesting little book (though with a title so long that you have to wonder whether it's a cheeky joke) that takes issue with every known philosophical interpretation of mathematics, and tries to open up a new and more realistic way of thinking about just what this particular animal behavior pattern is all about.
First, Rotman dispenses with the age-old absurdity of the Platonist viewpoint -- mathematics as a language for describing pre-existing truths about eternal objects. As loopy as that sounds, I think most mathematicians still look at it this way. They're not inventing stuff, they think they're discovering it. To me, this account of the activity is so obviously theological that it's not even worth exploring. I mean, once you start asking how the apes get in touch with the essences the whole thing sinks faster than the Hindenburg.
Next, he goes on to dispense with Hilbert's reinterpretation of mathematics as a purely formal and computational enterprise devoid of all meaning. As if Gödel hadn't already managed that. His critique here is not the same as Gödel's, of course. He's not concerned with the fact that the computers will never make it to the "end" of mathematics; he's just wondering why on earth anyone would be interested in math if it merely consisted in taking a set of axioms and some rules of inference and grinding out tautological proofs.
Finally, while he has the most sympathy for it, he has a fundamental objection to the constructivist-intuitionist branch that Brouwer and Kronecker advanced under the flag of the latter's infamous dictum, "God made the integers; all the rest is the work of man." He likes their idea that mathematics has to be founded on procedures that could actually be carried out by mathematicians (and hence shares their problems with Cantor's transfinites), but he's not so thrilled with the idea that what counts as an acceptable "construction" is judged by some slightly mystical pre-verbal intuition rather than anything as concrete as a symbol-processing system. And he's especially not thrilled with that notion because it enables them to sweep under the rug the very thing their intuition most takes for granted: namely, the integers.
Ultimately then, that's what the book is about -- the constructivists should have been constructivist about the integers. Are the natural numbers really so natural, or are they just baggage left behind by some theological dogma? This is the question Rotman is interested in, and after a long introduction dealing with the philosophical impediments that have prevented us from even asking the question, and outlining his own stance -- short version: mammals do math with machines -- he gets down to work on a new theory of arithmetic.
So how do we make the integers then? This is pretty straightforward; we count. And then we add, which is just shorthand for counting and counting again, and then we multiply, which is adding and adding again, and exponentiate, and hyperexponentiate, and ... you get the idea. So we actually count twice. Once at the level of the integers directly, and once at this meta-level that keeps track of our counting operations. In both cases we just repeat the last step again.
And we presume that it doesn't change anything. We presume that the context of the repetition doesn't have any effect on the outcome. In other words, we assume that going from 5 to 6 is the same as going from 10^10^5 to 10^10^10^6. It's just adding zeroes, so to speak. We could continue on like this indefinitely. Or could we? I mean, if we start with a general philosophical acknowledgement that it's monkeys doing this mathematics (maybe with computers now) we immediately have to wonder both how far the monkeys and their machines can really count, as well as how far they can really meaningfully conceptualize. I mean, I get a stack overflow just trying to parse hyperexponentiation, and while my Mac may get a little further, I'm pretty sure I can blow its stack as well. So it seems all repetitions are not created equal after all.
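To see both layers of counting in one place, here is a little Python sketch (my own framing, not Rotman's) of the hyperoperation ladder: level 0 is the bare successor, and every higher level is defined as nothing but repetition of the level below it. The function name and the sample inputs are mine; the instructive part is how quickly "just repeat the last step again" stops being an innocent instruction.

```python
# A toy hyperoperation ladder, my own sketch: each level is literally
# "do the previous level over and over", bottoming out in the successor.
#   level 0: successor   level 1: addition   level 2: multiplication
#   level 3: exponentiation   level 4: tetration   ...

def hyper(n, a, b):
    if n == 0:
        return b + 1                 # counting: the bare successor
    if b == 0:
        return a if n == 1 else (0 if n == 2 else 1)
    return hyper(n - 1, a, hyper(n, a, b - 1))   # repeat the level below

print(hyper(1, 7, 5))   # 12   addition
print(hyper(2, 7, 5))   # 35   multiplication
print(hyper(3, 2, 8))   # 256  exponentiation
print(hyper(4, 2, 3))   # 16   tetration: 2^(2^2)

# hyper(4, 2, 4) is only 65536, but built out of successors like this it
# already blows Python's default recursion limit; hyper(5, 2, 4) asks for
# a power tower 65536 levels tall, which no machine will ever finish.
```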
So if we start from a mathematics situated in this universe and not mysteriously floating free of it, we have to ask ourselves what we actually mean when we say we count, putting down more zeros or more levels ad infinitum ... There has to be some, certainly very large, physical limit to any actual counting. That "..." is just a figure of speech. In real life, counting, like every other physical process, is going to be dissipative. It's going to burn up energy and produce entropy/information. You can't have it endlessly for free. The counting to infinity that mathematicians use in nearly every proof is really a sort of waking dream. No actual agent could ever carry on like that, but everyone knows what you mean when you imagine this ghost in the machine.
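Just to put a very crude number on that limit (this is my back-of-the-envelope, not something Rotman works out in these terms): Landauer's principle says erasing a bit of information costs at least kT·ln 2 of energy, so if you charge each counting step one bit's worth of erasure and spend the Sun's entire lifetime output on the project, you still top out somewhere around 10^64.

```python
# Back-of-the-envelope only: assumes one bit erased per counted step
# (crude) and the Landauer bound at room temperature.
import math

k_B = 1.38e-23                           # Boltzmann constant, J/K
T = 300                                  # room temperature, K
joules_per_step = k_B * T * math.log(2)  # ~2.9e-21 J, Landauer's minimum

sun_lifetime_output = 3.8e26 * 1e10 * 3.15e7   # luminosity (W) * ~10 Gyr in seconds, ~1.2e44 J

print(f"minimum energy per step: {joules_per_step:.1e} J")
print(f"largest count you can afford: {sun_lifetime_output / joules_per_step:.1e}")  # ~4e64
```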
And this of course is Rotman's point. He's not trying to "disprove" regular arithmetic. He starts from a general idea that mathematics is a language, and a language is fundamentally a way to generate an interesting inter-subjective game and not, primarily, to describe an objective world. From that perspective it is only necessary to point out that the rules we assume counting plays by are not the only ones. Regular arithmetic imagines some disembodied agent that could never exist in our physical world, and that counts forever and never loses track of whether they are hyperexponentiating or hyperhyper ... or ... is it a weekday? We know what you mean by that even if it is imaginary. But we could also imagine something else.
The situation is exactly analogous to the axiom of parallels in Euclidean geometry. For two millennia no one could imagine a geometry in which Euclid's parallel postulate failed. It was taken as both a logical axiom and a true description of real geometry. It took until the nineteenth century (Lobachevsky and Bolyai, and then Riemann in full generality) for people to realize that you could put together a coherent curved geometry that violated Euclid's axiom. And it turns out it was not just a logical possibility, but a better and more general description of reality (even if you wouldn't want to use it to build a bookshelf).
Rotman proposes a model for a non-Euclidean arithmetic, where counting is "locally flat", but, because of dissipation occurring in the real universe, starts to "curve" at very large numbers. I will spare you the definitions he gives for the terms in quotes. Suffice it to say that counting and numbers take on a very different appearance afterwards, though much like an M.C. Escher drawing, things look pretty normal in the middle. It's really interesting to see what happens when you break the stranglehold that simply counting forever has on our imagination. You see that a whole new system is not only logically possible, but you even start to wonder whether perhaps this model would be a better description of the reality of time. If Riemann's geometry was a more complete description of the structure of space, perhaps non-Euclidean arithmetic would help us investigate the structure of time. After all, what is counting but our image of the passing of time?
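This is emphatically not Rotman's construction, but if you want a familiar computational analogue of counting that is "locally flat" and then "curves", ordinary floating-point arithmetic already behaves this way: near the origin the successor does exactly what you expect, and far enough out it silently stops registering.

```python
# Not Rotman's model, just an everyday analogue: IEEE-754 doubles count
# "flatly" for a long while, then the successor operation stops working.
x = 10.0
print(x + 1 == x)      # False: down here, adding one still counts

y = 2.0 ** 53
print(y + 1 == y)      # True: adding one no longer changes the number
print((y + 2) - y)     # 2.0: the gap between neighbouring numbers has grown
```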
Anyhow, interesting stuff. The writing is slightly clause-heavy and academic -- I will chop off his finger the next time he reaches for the comma key -- but it's quite lucid and easy to follow. He also wrote a book about the history of zero, which might be interesting. I can predict the punchline of that one already though -- zero merged writing numbers and calculating with numbers.