
22 August 2011

The Blob

Last week, my six-year-old daughter hopped on a bike and took off. Well that was easy.

I’m pretty sure when I learned to ride a bike my dad had to exercise more patience than that. I don’t really remember, but let me just say with absolute confidence: there was blood. Probably some pants had to be thrown out. I don’t think my daughter even skinned a knee. It was ridiculous.

The same week, we all went to a YMCA summer camp site to play. They have something there called the Blob. It’s a huge inflated bag floating in the lake. One person (person 1) sits on the far end of it. Another person (person 2) jumps off a fifteen-foot tower onto the near end. When person 2 hits the Blob, person 1 is launched off the end and into the water. Cool idea.

We waited half an hour for it—there was a line, and about one in four kids goes up the tower and stands there, petrified, unable to make themselves jump and unable to give up their spot, for about five minutes. This gives people a lot of time to negotiate who will launch whom. Everybody wants to be launched by the biggest, heaviest guy in the pond. My daughter was the littlest; nobody wanted her to be right after them, so they all voluntarily let her go first. She got sorted to the front of the line way ahead of me.

We waited and waited. The last kid ahead of her finally chickened out.

Up the tower she went.

“Will she jump?” the guy behind me asked. I didn't know. She seemed nervous, but it was just nerves, I thought. She wasn't really afraid. She's not afraid of this sort of thing. At least I didn't think she was.

We didn’t have to wait long. She jumped right away. The next kid was maybe an eight-year-old boy. He jumped right away, too.

In hindsight, I should have done the math. My kid weighs about forty-five pounds. The boy said he weighed ninety. E = mgh, right? You can guess the Blob's efficiency by watching the other jumps. Maybe 20%, 30%. So h₁ = (m₂/m₁) × h₂ × 30%? And h₂ is… fifteen feet…
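
Filling in that arithmetic (the weights, the fifteen-foot tower, and the 30% efficiency guess are all from above; g cancels out of the energy balance):

    m_jumper, m_launched = 90.0, 45.0   # pounds; only the ratio matters
    h_jump = 15.0                       # feet, the height of the tower
    efficiency = 0.30                   # rough guess from watching the other launches

    # m_jumper * g * h_jump * efficiency = m_launched * g * h_launch
    h_launch = (m_jumper / m_launched) * h_jump * efficiency
    print(h_launch)                     # 9.0 feet, about what we saw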

Everyone gasped as she went flying. None of the other launches had been anything like this. Ten feet up she went, turned a slow back somersault in mid-air, and came down with a crash in the water, on the far side of the Blob where we couldn’t see. It was amazing.

I left the line and swam over to pick her up. She had landed on her face, so she cried a little. By the time we got to the beach, she was fine. You can imagine my feelings. I love that little girl.

21 January 2010

Infinity, part 2: Zeno's paradox

(An ongoing series. See part 1.)

In this capricious world nothing is more capricious than posthumous fame. One of the most notable victims of posterity's lack of judgement is the Eleatic Zeno. Having invented four arguments all immeasurably subtle and profound, the grossness of subsequent philosophers pronounced him to be a mere ingenious juggler, and his arguments to be one and all sophisms. After two thousand years of continual refutation, these sophisms were reinstated, and made the foundation of a mathematical renaissance...

—Bertrand Russell, The Principles of Mathematics (1903).

The previous post involved two different kinds of infinity. There's the infinite on, off, on, off... of Thomson's lamp. And there's the infinite division of time: one minute, then half a minute, then a quarter of a minute, etc.

Maybe your reaction to the paradox was, “Oh, that's impossible, there's no such thing as infinity in the real world.” Perhaps not. Perhaps once you get down to tiny enough fractions of a second, you see that light is emitted in tiny, discrete quanta, and to emit even one quantum of light each time the lamp turned on would require more than enough energy to burn it out.

But math refuses to make that excuse. Math deals with abstracts, ideals. Math must deal with infinity somehow. From the moment you start counting, it is always staring you in the face.

Shortly after I posted part 1, I ran across this old chat log. Oddly enough, it discusses the same two infinities, but in a different guise.

Eudoxus: Hey, are you there?
Plato: yeah
is this about that 0.999... = 1 thing, because I'm kind of busy
Eudoxus: no no
check this out, I ran across this paper about teaching math to children, and the example they were using was infinite series.
The question was, 1 - 1 + 1 - 1 + 1 - 1 + ... = ?
Care to guess?
Plato: 0
Eudoxus: Right, obviously
because all the terms cancel out
only there's another way to see it...
you start with 1, and then all the terms *after* that cancel out with each other. see?
So the answer is 1.
Contradictory.
Plato: hmm...
Eudoxus: You with me so far?
Plato: yup
Eudoxus: So here's the snippet from this paper that jumped out at me.
"It is important to point out that it is not enough to consider at the same time two conflicting statements in order to develop in pupils' minds the awareness of an inconsistency and the necessity of second thoughts (Schoenfeld, 1985): the perception of some mutually conflicting elements does not always imply the perception of the situation as a problematic one (Tirosh, 1990)."
This surprised me because it seems so obvious that inconsistency is a sign something is wrong.
Plato: yes
kids are dumb
from a contradiction, anything follows. Everyone knows that
Can I say something?
Eudoxus: shoot
Plato: this thing about the infinite series
Maybe this is dumb
but I don't see how you can get an answer
i mean it keeps going on and on
where would you put the equals sign?
Eudoxus: :)
Plato: which is a joke
Eudoxus: sure
Plato: but I think that, obviously, you are going to come up with weird answers when you start assuming that if an infinite series "stopped" and you could make an equation out of it, etc.
Eudoxus: But the answers work out just fine, and it's hard to avoid. I don't think you can do calculus, for example, without infinite sums.
Plato: that's why calculus is dumb
Eudoxus: Pfft. Calculus is going to be huge. I'm going to write a book about it, as soon as I get some technical issues resolved. I just hope no one else gets to print first.
Plato: you're all talk
Eudoxus: I am going to try and convince you that infinite sums can work, because if I'm wrong then all my work is contradictory and useless.
Have you heard of Zeno's paradox? The one with Achilles and the hare?
I mean, tortoise
Plato: Is that the half of half of half one?
Eudoxus: Yes.
Zeno says that for Achilles to catch the tortoise would require an infinite number of moves. Clearly impossible. Therefore motion must be an illusion--because there's no way to make sense of it.
Plato: Right. Well, it makes sense...if you assume that the dude is infinitely small...
I mean, in real life you wouldn't be able to do it
because at some point you would just be too big
to go that small a distance and have it mean anythign
anything
and zeno is such a jerk anyway
Eudoxus: Well, hang on.
Would you accept, arguendo, that very small distances exist, even if they are way too small for Achilles to see?
Plato: sigh, yes, Socrates
Eudoxus: :) Then suppose I have two marks on the ground, exactly 1 stadion apart.
Euclid could construct the line between them. And the midpoint. OK?
Plato: yes
Eudoxus: And as many points as you like, successively closer to point B. Right?
Plato: of course
Eudoxus: So you accept that there are (at least in principle) infinitely many points there, getting closer and closer to B?
Plato: yes, of course
Eudoxus: (thinking)
And between every point and its successor, there's some finite distance. That is, between A and the midpoint is 1/2 a stadion; between that point and the next is 1/4 stadion, and so on?
Plato: yes, a measurable distance, agree
Eudoxus: Now, none of the distances overlap. And taken together, they cover the entire line segment AB. Right?
Plato: yes
Eudoxus: So doesn't it make sense to say the sum of all these lengths is 1 stadion?
Plato: you couldn't take a sum
I mean, yes, together they do equal one stadion
But, um, you couldn't really measure each part and add them all up
Eudoxus: I couldn't construct all the points, either, as a practical matter. But that doesn't stop them from existing.
Plato: okay, true
So, yes, all of them together equal one stadion; I guess that's a sum
Eudoxus: I say it's a sum by analogy to the finite case. Other than a fear of infinity, I see no reason not to call this a sum, and say that 1/2 + 1/4 + 1/8 + 1/16 + ... = 1.
Plato: yes, yes, it's a sum
Eudoxus: Ah, then an infinite sum is possible.
Plato: See, here's the thing
You are being tricky
You are using infinity two different ways
On the one hand, infinity is just blah plus blah plus blah, etc.
On the other hand, you are saying blah/this finite thing + blah/this finite thing + blah/this finite thing...
Listen every finite thing in the world could be cut into infinite "parts"
Right?
Eudoxus: Sure.
Plato: Okay, but
that doesn't mean that you can just start with an infinite number of things and decide to multiply and add them and assume that in the end you'll get a finite thing
Eudoxus: Hmm.
I need to think about that.
Incidentally, what would you say about the sum 9/10 + 9/100 + 9/1000 + ...?
Plato: do you mean, an infinite series, the next being 9/10000?
Eudoxus: Yes
Plato: Well, that's the "other" sort of infinity
the type that is just an infinite series of numbers, added, that you are assuming will result in a finite sum
Eudoxus: I don't mean to "assume" that it will or won't. I want to find out if it will or not.
Plato: oh, okay
it won't
Eudoxus: No? But Euclid can also construct a point, call it C, that's exactly 90% of the way from A to B.
OK?
Plato: grr...
yes
Eudoxus: By design, AC is 9/10 of the whole.
What's left, CB, is the other 1/10, right?
Plato: yes
Eudoxus: So repeat with CB. You'll draw a point D that is very close to B.
CD is 9/100 of the whole. What's left, DB, is the other 1/100.
Plato: okay
Argh, so then, they equal a finite thing?
Eudoxus: You see where I'm going? You're constructing a picture of the statement "9/10 + 9/100 + 9/1000 + 9/10000 + ... = 1"
Plato: yes
but, okay, yes, I see that
Eudoxus: And what's another way to write 9/10?
Plato: no, I won't do it; because it is dumb
Eudoxus: (laugh)
when did you see it coming?
Plato: when you said "no no"
liar
Eudoxus: OK, the point of all that mess was of course that 0.999... = 1.
Plato: yes, I know
Eudoxus: I wanted to make an argument in a way that would appeal to a Platonist.
Plato: no name calling :)
Eudoxus: Hey, that's what you are. You should be proud. :)
Plato: I am, sort of. It's just that...okay, I still don't understand exactly. I mean, I get the proofs, but I still feel like we are adding apples and then adding oranges and then saying, voila, apples are oranges.
Eudoxus: Yes, yes, I know...
I'll think about it some more later...
You are right to say that not every infinite sum is as... easy to deal with as 0.999...
Plato: yes, exactly
But it's scary, isn't it
Eudoxus: ?
Plato: I mean, even just mathematically in general
Just the idea that you can get two contradictory answers to an equation
Eudoxus: Uh, yes, it's very troubling.
Plato: I mean, mathematics deals in absolutes
There's no room for contradictions, right?
From a contradiction, everything follows
Eudoxus: Yes.
Plato: Right, so if these infinity things are correct as is, what does that mean?
Eudoxus: Oh, you mean 1 - 1 + 1 - 1 + ...?
Plato: yes
it proves that 0=1, right?
Eudoxus: Well...
that sort of problem doesn't really arise in practice though.
Plato: o rly?
i can prove 0=1 and it doesn't affect your work? what kind of logic is this?
the consequences, man!
Eudoxus: Sorry, I have to go. I think someone's calling me.
Plato: what???
come back here you coward
Eudoxus is offline.

You might be relieved to know that the paradox has been resolved.

The cure was transformative.
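
(If you want to see the difference between the sums in the chat, here's a quick sketch: the partial sums of Zeno's series 1/2 + 1/4 + 1/8 + ... and of 9/10 + 9/100 + 9/1000 + ... creep up toward 1, which is exactly what "the sum is 1" means once convergence is defined, while the partial sums of 1 - 1 + 1 - 1 + ... never settle at all.)

    from fractions import Fraction

    def partial_sums(term, n):
        """The first n partial sums of the series term(1) + term(2) + ..."""
        total, out = Fraction(0), []
        for k in range(1, n + 1):
            total += term(k)
            out.append(total)
        return out

    # Zeno's series: 1/2, 3/4, 7/8, 15/16, 31/32, 63/64, ...
    print([str(s) for s in partial_sums(lambda k: Fraction(1, 2 ** k), 6)])
    # The 0.999... series: 9/10, 99/100, 999/1000, ...
    print([str(s) for s in partial_sums(lambda k: Fraction(9, 10 ** k), 6)])
    # The troublesome one: 1, 0, 1, 0, 1, 0, ...
    print([str(s) for s in partial_sums(lambda k: (-1) ** (k + 1), 6)])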

06 December 2009

Mathematicians in training

First off, if you haven't played Set, you really should, because you're just the sort of person who would love it. It's a clever idea elegantly executed, and it just happens to be great competitive fun. I taught the kids to play, and the four-year-old has a definite edge on the six-year-old. I couldn't be more pleased.

We had a bit of a drive yesterday, and along the way we gave the kids some analogies to puzzle over. You know the sort of thing: “Ice is to water as rock is to what?” This turns out to be engrossing and surprisingly fun. (I might try it with adults sometime. I sense opportunities for nerd humor.) Here the kids were not evenly matched at all. The four-year-old would get the easy ones (cow : moo :: pig : x) but would guess random related words on the harder ones, apparently with equal confidence. The six-year-old saw more clearly what the game was about, so he was able to bring his greater general knowledge to bear.

Why is this post titled “Mathematicians in training”? Well, Set is a transparently mathematical game. There are 81 cards because 81 is 3⁴. The deck is the Cartesian product of four three-element sets. They form some kind of algebraic structure with extraordinary symmetry (of a kind I don't really know anything about—it's not a group—such that I'm tempted to get completely sidetracked here). But the kicker here is, the gameplay itself is mathematical. As far as I can tell, the only good strategy is to try to prove there are no sets.

Analogies are just little homomorphisms, which is to say, structure-preserving transformations. The idea that deep sameness is more interesting than superficial differences is more important to mathematics than numbers.

On the surface it seems like analogies are less mathematical than Set. Appearances can be deceiving.

(P.S. Figured it out. The sets in Set are the cosets of cyclic subgroups of (Z/3Z)⁴. The symmetry I was referring to above was that after you erase the underlying group operation, there's no privileged element. There are isomorphisms on the deck of cards, preserving the sets, mapping any given card to any other given card.)
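
To make that concrete, here's a small sketch using my own encoding (nothing official): write each card as a vector in (Z/3Z)⁴, one coordinate per attribute. Three cards form a set exactly when each attribute is all-same or all-different across the three cards, which is the same as saying the three vectors sum to zero mod 3; and any two cards determine the unique third card that completes a set.

    from itertools import product

    DECK = list(product(range(3), repeat=4))   # 81 cards: the vectors of (Z/3Z)^4

    def is_set(a, b, c):
        """Each attribute all-same or all-different, i.e. the coordinates sum to 0 mod 3."""
        return all((x + y + z) % 3 == 0 for x, y, z in zip(a, b, c))

    def third_card(a, b):
        """The unique card completing a set with a and b."""
        return tuple((-x - y) % 3 for x, y in zip(a, b))

    a, b = (0, 1, 2, 0), (1, 1, 0, 2)
    print(third_card(a, b), is_set(a, b, third_card(a, b)))   # (2, 1, 1, 1) True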

14 October 2009

Milestone

I probably learned about variables from playing around with a Commodore 64 when I was about the age you are now. But I didn't see them used in mathematics for many years, until they were finally introduced, in about 7th grade, as a tool for solving problems. Take a problem, write down the equation, putting variables for the unknown quantities, and then you have something you can solve.

A little while ago I realized that this isn't the only way, or even the most important way, that variables are used in math. Variables are used to write laws.

The other day you were getting a shower, and I told you that letters could stand for numbers, that you can use letters to write rules about numbers. The letter could stand for any number, and the rule would always be true. I wrote in the condensation on the glass:

A + 0 =

Then I stopped and said, well, A plus zero equals what? You said zero. I said I didn't think that was right, because what if A was seven? Seven plus zero equals zero? So then you said A. And I couldn't be sure but I thought you really got it. That's right, I said. I'm sure you could tell I was very pleased. Actually I was surprised and excited.

Later I wrote some math pages for you to solve. The first one said,

Here is a rule:

0 < A

Do all the numbers follow this rule?

If there is a number that doesn't, that's called a counterexample. It means the rule is false.

In math, a true rule is always true, for all numbers.

The other one had some mathematical statements on it and asked you which ones were true.

You surprised me.

You got them all right. I asked you about the rule on the first page, 0 < A, and you had to look up what the < symbol meant, but then you told me right away that it was false, because zero isn't less than zero.

I was amazed. I asked your mother, “Did you see those pages J. did today? What does this mean?” She wasn't surprised. “It means first-graders can learn pre-algebra,” I said, insistent.

“Some can,” she said.

She is half right: you are special; you are bright; and you are interested. But I know there are millions of special, bright, curious kids like you in this country, and I think by and large their schools are selling them short. You sure are lucky you've got me, kid. But not as lucky as I am to have you.

02 July 2009

Lockhart's Lament

Lockhart's Lament (PDF, 25 pages) starts out like this:

Everyone knows that something is wrong. The politicians say, “we need higher standards.” The schools say, “we need more money and equipment.” Educators say one thing, and teachers say another. They are all wrong. The only people who understand what is going on are the ones most often blamed and least often heard: the students. They say, “math class is stupid and boring,” and they are right.

and ends up like this:

How sad that fifth-graders are taught to say “quadrilateral” instead of “four-sided shape”, but are never given a reason to use words like “conjecture”, and “counterexample”. ...

Mathematics is about problems, and problems must be made the focus of a student's mathematical life. Painful and creatively frustrating as it may be, students and their teachers should at all times be engaged in the process—having ideas, not having ideas, discovering patterns, making conjectures, constructing examples and counterexamples, devising arguments, and critiquing each other's work.

The author is a bit crazed, but that just makes it more fun to read. In the unlikely case that you somehow got here while thinking math is stupid and boring, or if you've ever found yourself teaching a stupid, boring math class, take a look.

P.S. As the previous post maybe suggests, I've only recently discovered how to learn math by making conjectures and trying stuff, which is what Lockhart recommends.

In unrelated news, J. is pretty sharp at finding lines of symmetry. I need to give him a circle to play with and see what he says. (evil chuckle)

P.P.S. I got this link from humph, who is also a one-of-a-kind teacher (but not crazed).

01 July 2009

The ring Z[i]

This is a little self-portrait, "The Artist Trying to Learn Abstract Algebra", probably of no interest to anyone else.

I read an introduction to rings (in Gallian, fifth edition, which I enthusiastically recommend). Now I'm trying to come up with some conjectures and prove or disprove them before I start on the exercises. (This book has great exercises, and doesn't bother teaching anything in the text that it can teach in an exercise.)

I figured maybe every ideal of the ring Z[i] is a principal ideal generated by some element of the ring. This morning I think I have the proof. It's a consequence of Z[i] being enough like the integers to support Euclid's algorithm.

That in turn is a consequence of Z[i] having something like integer division. You can define a metric M on Z[i], taking values in a well-ordered set, such that M(0) < M(a) where a is any other element; and for any a and nonzero b, there exist a quotient q and remainder r such that a = bq + r and M(r) < M(b). That the values of M are well-ordered implies that Euclid's algorithm terminates.

Z[i] also has something like prime and composite elements. For example, 5+i can be factored into (1-i)(2+3i). I wonder if these two properties are actually the same thing.
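
Here's a minimal sketch of that division-with-remainder and of Euclid's algorithm, assuming the metric M is the usual norm N(a + bi) = a² + b² (the post above doesn't pin down which metric it means). Rounding the exact quotient to the nearest Gaussian integer makes the remainder's norm at most half the divisor's.

    from fractions import Fraction

    def norm(z):
        a, b = z
        return a * a + b * b

    def gauss_divmod(a, b):
        """Return (q, r) with a = b*q + r and norm(r) < norm(b).
        Gaussian integers are (re, im) pairs of ints; b must be nonzero."""
        (a1, a2), (b1, b2) = a, b
        n = norm(b)
        # Exact quotient a/b = a * conj(b) / norm(b); round each part to the nearest integer.
        q1 = round(Fraction(a1 * b1 + a2 * b2, n))
        q2 = round(Fraction(a2 * b1 - a1 * b2, n))
        r = (a1 - (b1 * q1 - b2 * q2), a2 - (b1 * q2 + b2 * q1))
        return (q1, q2), r

    def gauss_gcd(a, b):
        """Euclid's algorithm in Z[i]; it terminates because norm(r) keeps strictly decreasing."""
        while b != (0, 0):
            a, b = b, gauss_divmod(a, b)[1]
        return a

    # 5 + i = (1 - i)(2 + 3i), so the gcd of 5 + i and 1 - i is 1 - i (up to a unit).
    print(gauss_gcd((5, 1), (1, -1)))   # (1, -1)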

I think the ideals of Z[i] generated by "prime" elements are prime ideals.

09 May 2009

Recently I learned...

  • Your browser uses the public suffix list to determine whether two web sites may share cookies. This is not very robust but better than the previous strategy.

  • If you take a long strip of paper, fold it in half as many times as you can, and unfold it, it'll make an approximation of the fractal shape called the Heighway dragon.

  • If you take two fractions, say 1/2 and 1/3, and add the numerators and denominators, you get the mediant, in this case 2/5. I don't know much about the mediant, but it is linked to Ford circles in a way I don't really understand, and I'm told mediants give a startling way of approximating the value of continued fractions. (There's a small sketch of the idea after this list.)

  • glibc's qsort only does an actual quicksort as a last resort. If there's enough memory, it does a merge sort.

    This is old news, and I kind of figured it was the case, but I never looked at the source code before.
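
About the mediant item above: one concrete thing mediants can do (my illustration, not necessarily the "startling way" referred to) is approximate a number by walking down the Stern-Brocot tree, taking the mediant of the current bounds at each step. For 1/φ the mediants that show up are ratios of consecutive Fibonacci numbers.

    from fractions import Fraction

    def mediant(a, b):
        """Add the numerators, add the denominators."""
        return Fraction(a.numerator + b.numerator, a.denominator + b.denominator)

    def approximate(x, steps=12):
        """Approximate 0 < x < 1 by repeatedly taking mediants (a walk in the Stern-Brocot tree)."""
        lo, hi = Fraction(0, 1), Fraction(1, 1)
        for _ in range(steps):
            m = mediant(lo, hi)
            if x < m:
                hi = m
            else:
                lo = m
            print(m, float(m))
        return m

    approximate(0.6180339887498949)   # 1/2, 2/3, 3/5, 5/8, 8/13, 13/21, ...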

And I was reminded that complex numbers are key to quantum mechanics, something I missed when I was writing about complex numbers a year or two ago.

We've been unpacking books that have been in boxes for a year and a half. It's like meeting old friends. I read the epic of Gilgamesh, the book of Joshua, and Six Easy Pieces. I love small books.

17 December 2008

The School Mathematics Project

JJ had me look at a set of old mathematics textbooks, and I found this.

4.1 Division and repeated subtraction

We can write 7 + 7 + 7 + 7 + 7 + 7 + 7 + 7 + 7 = 7 × 9 = 63.

(a) What is 63 - 7 - 7 - 7 - 7 - 7 - 7 - 7 - 7 - 7?

(b) What is 63 ÷ 7?

(c) Explain the connection between the last two questions.

(d) If you were to work out 65 - 7 - 7 - 7 - 7 - 7 - 7 - 7 - 7 - 7, what would you find? How would you give your answer?

4.2 Division of a whole number by a whole number

Example 11 (Method I)

If you were asked to work out 5489 ÷ 12 by finding out how many times you could subtract 12 from 5489, you wouldn't be very pleased!

5489
-12
5477
-12
5465
-12
5453
-12
5441
-12
5429
-12
5417

This is just the start. It would certainly take a long time. However, as you will have realized, there are quicker ways of doing this division.

(Method II)

12 ) 5489    Consider 5400. There are more than 400 (but less than 500) twelves in 5400. Let us subtract 400 of them all at once.
    -4800    (400 twelves)
      689    Now consider 680. There are more than 50 (but less than 60) twelves in 680. Subtract 50 of these all at once.
     -600    (50 twelves)
       89    Finally, we know that there are 7 twelves in 89 which, if we subtract them, leave us with a remainder of 5.
      -84    (7 twelves)
        5

So we have subtracted (400 + 50 + 7) twelves and have 5 left over.

5489 ÷ 12 = 457,   remainder 5.

If we were dividing in order to find the answer to a ‘fair shares’ question, we would write

5489 ÷ 12 = 457 5/12

You will probably have recognized this method. Why?

I'll stop there. What struck me as cool about this is that it takes long division, a complex procedure which most students learn by rote, and at once (a) explains why it works (b) makes it seem simple and obvious.
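
Here's a small sketch of the same idea in code, my own rendering rather than the book's: subtract the biggest convenient multiples of the divisor first (hundreds of twelves, then tens, then ones), counting as you go. It chips away 100 twelves at a time rather than 400 at once, but the spirit is Method II.

    def chunk_divide(dividend, divisor):
        """Long division as repeated subtraction of big chunks. Returns (quotient, remainder)."""
        quotient = 0
        chunk, scale = divisor, 1
        while chunk * 10 <= dividend:        # start with the largest power-of-ten multiple that fits
            chunk, scale = chunk * 10, scale * 10
        while scale >= 1:
            while dividend >= chunk:
                dividend -= chunk            # e.g. subtract 1200 (that is, 100 twelves) at once
                quotient += scale
            chunk, scale = chunk // 10, scale // 10
        return quotient, dividend

    print(chunk_divide(5489, 12))            # (457, 5)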

The example is from SMP Book C, published 1969 by Cambridge University Press. JJ has the whole series. They seem quite good, relative to what I recall from grade school. The approach is conversational with a lot of questions. Very few paragraphs are more than a few lines long. There are exercises but no “word problems”. The books are printed in black and red ink. There are no photographs or sidebars. The subject matter is richly mathematical: very little arithmetic, which must have been a separate curriculum; but in the first few books (hard to tell but they appear to be directed at students 12-15 years old) there are chapters about things like relations, directed graphs, symmetry, counting possibilities, why a slide rule works.

The SMP stands for School Mathematics Project, a British nonprofit. They're still making mathematics textbooks.

16 September 2007

The principle of explosion

Pop quiz:

1 - 1 + 1 - 1 + 1 - 1 + 1 - 1 + ... = ?

It's obviously 0, right? Or maybe it's 1. In the 17th and 18th centuries, everyone apparently thought the correct answer was ½ (and it wasn't because they were stupid back then: this includes people like Leibniz and Euler).

Eh, so math is inconsistent. So what?

It is important to point out that it is not enough to consider at the same time two conflicting statements in order to develop in pupils' minds the awareness of an inconsistency and the necessity of second thoughts (Schoenfeld, 1985): the perception of some mutually conflicting elements does not always imply the perception of the situation as a problematic one (Tirosh, 1990).

Infinite series: from history to mathematics education (PDF), Giorgio T. Bagni.

Huh.

Now, maybe this is because math doesn't make a whole lot of sense to most kids to begin with. But I think the main cause is that kids, like the rest of us, are used to things being inconsistent sometimes. And they live with it. I mean, what are you going to do?

Well, let me tell you something. In math, you can't live with a contradiction.

The principle of explosion is built into the fundamental rules of logic, rules that both mathematicians and ordinary people use to reason with. Ex falso sequitur quodlibet: from a contradiction, anything follows. Or as an old friend of mine used to say, after you swallow the first pill, the rest go down real easy.

In math, if you accept a single contradiction, the entire system comes crashing down around you.

(Now there's such a thing as paraconsistent logic, in which inconsistencies are not so destructive. But it's quite different from ordinary logic, and not many people are familiar with it.)

In the above case, mathematicians eventually discovered a formal notion of “convergence” and found that the sum 1 - 1 + 1 - 1 + ... does not converge. That is, there's no answer, just as there's no answer for the sum 1 + 2 + 3 + 4 + ..., and for the same reason: you can go on for as long as you like, and your numbers are never going to converge on some specific value.
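
A quick way to see it is to look at the partial sums, which is all "converge" is really about; the numbers below are the running totals after each term.

    def partial_sums(term, n):
        """The first n partial sums of the series term(1) + term(2) + ..."""
        total, out = 0, []
        for k in range(1, n + 1):
            total += term(k)
            out.append(total)
        return out

    print(partial_sums(lambda k: (-1) ** (k + 1), 8))   # [1, 0, 1, 0, 1, 0, 1, 0]: never settles
    print(partial_sums(lambda k: k, 8))                 # [1, 3, 6, 10, 15, 21, 28, 36]: runs off to infinity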

Is this a cop-out? It's a hard fact of life that some mathematical problems just don't have answers. The ancients considered 2 - 7 to be undefined, because the answer would be less than nothing, which was clearly nonsense. Today we have negative numbers, but other things, like division by zero, are still undefined. Given all that, maybe it's not so surprising if expressions that end in the innocent-looking “+ ...”, as if to say “oh don't mind me, I'm just a little infinite series, tra la”, sometimes fall into this category.

Floating

It's not until you study math a little bit that you realize just how awful floating-point arithmetic is. I mean, so it's a little inexact. So what? But the following are examples of statements that are absolutely true for integers, rationals, and reals, but not for floating-point numbers (even setting aside floating-point infinities and “not-a-number”s):

a - (a - b) = b

If b > 0, then a + b > a.

The associative law: (a + b) + c = a + (b + c)

The distributive law: a × (b + c) = (a × b) + (a × c)

Cancellation: If a × b = a × c and a ≠ 0, then b = c.

In short, if you've ever done any reasoning about floating-point numbers, you were probably wrong.
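
If you want to see it happen, here are the first three laws failing in Python (IEEE 754 double precision); the particular numbers are just convenient examples, nothing special about them.

    a, b = 1e16, 1.0
    print(a - (a - b) == b)     # False: computing a - b rounds the 1.0 away entirely
    print(a + b > a)            # False: a + b rounds right back to a
    print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))   # False: addition is not associative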

It's easy to come up with formulas that floating-point arithmetic gets seriously wrong, especially for numbers very close to zero.

The inexactness is hard to track in a useful way, so you never know just how bad the result is. Typically you just assume it's almost right until you find out it's wrong. The only surprising thing is that it works so often.

24 June 2007

Math links

A mathematician's apology (PDF), by G. H. Hardy. The early chapters are mostly grumbling about getting old. But then he starts talking about mathematics, with real poetry and love for the subject. Very nice.

Eudoxus. This ancient Greek anticipated Dedekind cuts and calculus. He also mapped the heavens and the earth. Yet no work of his survives--only frequent citations in the works of Euclid, Aristotle, Hipparchus, and more.

Triangle centers. Every triangle has a centroid. But oddly enough, that isn't the only center of a triangle. MathWorld has a picture showing over a thousand of them.

08 February 2007

Infinity, part 1: Thomson's lamp

Back in the 1950's, the famous electrical engineer James Thomson invented the ultimate strobe light. Instead of flashing at regular intervals, Thomson's lamp (as it was called) would start out flashing slowly and quickly speed up.

The lamp had a button on it. Pushing the button caused the lamp to flash for a total of two minutes, as follows. First the lamp turned itself on. It stayed on for half of the total two minutes (one minute). Then it turned itself off and stayed off for half of the remaining time (half a minute). Then it turned itself on for a quarter of a minute, then off for an eighth of a minute, and so on. Toward the end of the two minutes, it would have been blinking pretty quickly.

The troubling question was, after the two minutes passed, would the lamp be on or off? Thomson was so anxious about the philosophical consequences of his invention that he famously refused to press the button for decades. (The legend is that a physicist friend eventually talked him into it, but there are conflicting stories about epileptic seizures, electrical fires, and divine intervention—I can't make any sense of them.)

So: on or off? What do you think?

16 January 2007

Brouwer's shopping mall diorama theorem

This is a math post, but it also involves some audience participation. There's a crafts project. It may also require some driving. Ready?

Pick any closed, contiguous region of the universe—like, say, the nearest mall. Draw a map of it. Or you can make a diorama, if you're just that fond of the mall, or of making dioramas.

Go ahead. It doesn't have to be to scale.

While you're working, I'll say something profoundly obvious. The whole idea of a map, of course, is that every place in that part of the real world corresponds to exactly one spot on the map.

Done? Good. Now take the map (or model) and put it inside the closed region of space that it represents. That is, go to the mall. Brouwer's fixed-point theorem says that the map now has a fixed point: there's a point on the map that is actually at the very location that it represents.

This works no matter how large or small your map is. If your map is the size of the entire food court, and you take it there and spread it out on the floor, there will be a spot somewhere in the food court that exactly lines up with the corresponding spot on the map. Shift the map a little bit, and that spot won't line up anymore—but some other spot will. Always. You can turn the map around to face the wrong way. You can hold your 3-D model upside down. It doesn't matter. In fact, this works even if your map is not drawn to scale or if things are totally the wrong shape. There are only two requirements regarding accuracy. First, your map can't leave anything out. So if you forgot to draw the Banana Republic, you have to mentally squeeze it in between Orange Julius and The Icing where it belongs. Second, your map must be continuous. That is, any path someone might take from one point to another in the mall has to make a continuous path (without any “jumps”) on your map as well.

In the language of topology, any continuous function that maps a closed ball in Rⁿ into itself has a fixed point. I have no idea why this works. Amazing.
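
For what it's worth, the easy case can be computed. If the map shrinks everything (a contraction), then starting anywhere and repeatedly jumping to "the spot on the map that depicts where I'm standing" homes in on the fixed point. Brouwer's theorem needs only continuity, not shrinking, so this sketch with made-up numbers proves nothing, but it shows what the fixed point looks like.

    import math

    def to_map(x, y):
        """Where the food-court point (x, y) appears on a map lying on the floor:
        shrink by a factor of 100, rotate 30 degrees, shift to where the map happens to lie.
        (All the numbers here are invented for the example.)"""
        s, a = 0.01, math.radians(30)
        return (s * (x * math.cos(a) - y * math.sin(a)) + 12.0,
                s * (x * math.sin(a) + y * math.cos(a)) + 7.5)

    p = (0.0, 0.0)
    for _ in range(60):
        p = to_map(*p)           # converges because this particular map is a contraction

    print(p)                     # the fixed point
    print(to_map(*p))            # the map doesn't move it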

It may have occurred to you that there already are nice, large maps conveniently located throughout the mall. Brouwer's theorem applies to those maps, too. In fact, in honor of Brouwer, the fixed points of these maps are always clearly marked, usually with a red dot or an arrow. Next time you're in a mall, take a look.

11 January 2007

Fiction, meet science

At some point, Dartmouth College offered a semester course on Renaissance Math in Fiction and Drama. From the site:

This course explores scientific developments in Renaissance astronomy and their portrayal in literature past and present. By reading some of the writings by Copernicus, Galileo and the prolific Kepler, we will attempt to draw a portrait of scientific upheaval during that period. The science fiction of the Renaissance offers a window into the popular response to these developments, as do various commentaries of the time. Dramatic pieces both recent and of that period show the artistic reconstruction of scientific events, sometimes through a very modern lens.

“Science fiction of the Renaissance”? There's not a huge amount of this, as it turns out, but one amazing, atypical example is Johannes Kepler's Somnium, which was at once a fanciful journey to the moon and a serious thought experiment in support of Copernican heliocentrism. Wow.

05 January 2007

Perfect numbers

Perfect numbers are numbers that are equal to the sum of their proper divisors (their factors, not counting the number itself): 6 is perfect because its proper divisors are 1, 2, and 3, and 1 + 2 + 3 = 6. Likewise 28 = 1 + 2 + 4 + 7 + 14; and so on. So far, 44 perfect numbers are known.

Puzzle: Can you prove that if 2ⁿ - 1 is prime, then 2ⁿ⁻¹(2ⁿ - 1) is perfect?
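
(Not a proof, but here is a quick numerical check of the small cases, if you want to see the numbers before trying it.)

    def proper_divisor_sum(n):
        """Sum of the divisors of n, not counting n itself."""
        return sum(d for d in range(1, n) if n % d == 0)

    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

    # Whenever 2**n - 1 is prime, 2**(n - 1) * (2**n - 1) should be perfect.
    for n in range(2, 11):
        if is_prime(2 ** n - 1):
            candidate = 2 ** (n - 1) * (2 ** n - 1)
            print(n, candidate, proper_divisor_sum(candidate) == candidate)
    # 2 6 True / 3 28 True / 5 496 True / 7 8128 True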

planx_constant mentioned that little theorem to me over vacation. It was first proved by Euclid. Millennia later, Euler proved that all even perfect numbers are produced by this formula. But it is not known whether there are any odd perfect numbers. Most mathematicians seem to think there are none. Here's James Joseph Sylvester, writing in 1888:

...a prolonged meditation on the subject has satisfied me that the existence of any one such—its escape, so to say, from the complex web of conditions which hem it in on all sides—would be little short of a miracle.

Yet there is hope, and indeed the search is on.

A complex story, part 4

(See parts 1, 2, and 3.)

Everybody literally sees the world from a different point of view. Each person is standing in a different location and looking out in a different direction from everyone else. But all viewpoints share certain similarities. If you and I are near one another, we'll see the same events happen in the same order, and although we may differ in our use of the words “right” and “left”, if we're watching something from opposite sides, we'll at least agree on the distances between things. If I see two people holding hands, you'll never see them on separate sides of the street at the same time, no matter where you're standing. All the different viewpoints preserve certain observed properties: distances, angles, durations, causality, and so on.

Mathematically, we can write this in two equations. For each of us, every event has a measurable position in space (x, y, z) and time (t). If we put my observations on the left-hand side and yours on the right, they will match.

We agree on distances: x² + y² + z² = x'² + y'² + z'²

We agree on durations: t = t'

Even if I'm in a car doing eighty and you're sitting on the sidewalk enjoying an ice cream cone, we'll agree on the distances between and durations of any events we both happen to witness as I zoom by.

...Or so everyone thought. Don't get me wrong, this is a lovely picture. Mathematically, it's your basic three-dimensional Euclidean geometry, plus a separate dimension for time. All our viewpoints are identical except for a bit of spacial displacement and rotation. There's only one problem. This isn't how the universe really behaves.

1887 was the year of the famous Michelson-Morley experiment, which blew this nice, simple Newtonian view all to hell. For twenty years, confusion reigned. By 1905, a mere eyeblink in academic terms, physics had righted itself, now with a totally new model of space and time.

The new theory was called special relativity. It was built on brilliant new insights from Hendrik Lorentz, Henri Poincaré, and Albert Einstein. And it went something like this: Two observers traveling at incredible velocities (relative to one another) actually do not agree on distances, angles, durations, or even the relative time-order of events. But they will agree on something even more fundamental: the basic laws of nature, including laws of motion, causality, and—in particular—the speed of light.

This had the advantage of being, you know, consistent with experiment. But geometrically, it was awfully weird. It wrecked the two equations above. Individual viewpoints were not simple spacial rotations and translations of one another. They were, uh, Lorentz transformations. Yeah. It was two more years before geometry caught up with physics.

In 1907, Hermann Minkowski discovered a kind of geometry (a four-dimensional manifold) that exactly describes the spacetime of special relativity. That is, Minkowski space is the actual geometry of the universe around us, according to relativity. Minkowski's geometry succeeded by treating space and time as interrelated. For example, in Minkowski space:

We may not agree on the spatial distance between two events: x² + y² + z² ≠ x'² + y'² + z'²

We may not agree on the passage of time: t ≠ t'

But we will agree on a particular mathematical combination of the two: x² + y² + z² - (ct)² = x'² + y'² + z'² - (ct')²

(Here c is the speed of light.)

Now comes the controversial, beautiful part. Define a variable w as ict. We're going to use w as our time coordinate, instead of t. Then the last equation above becomes:

x² + y² + z² + w² = x'² + y'² + z'² + w'²

This looks a lot like our original equation for distance. And in fact this equation describes basic Euclidean geometry in four dimensions. Time becomes just another spacial dimension. All viewpoints are again simple rotations and translations of one another—not in three-dimensional space, but in four-dimensional spacetime.
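
Here's a small numerical sketch of both claims, with units where c = 1 and numbers picked arbitrarily: the combination x² + y² + z² - (ct)² comes out the same for two observers related by a Lorentz boost, and with w = ict the boost really is a rotation, just by an imaginary angle.

    import cmath, math

    c, v = 1.0, 0.6                               # speed of light 1; relative velocity 0.6c
    gamma = 1 / math.sqrt(1 - (v / c) ** 2)

    def boost(t, x):
        """The standard Lorentz boost along the x axis."""
        return gamma * (t - v * x / c ** 2), gamma * (x - v * t)

    t, x, y, z = 2.0, 1.0, 5.0, -3.0              # an arbitrary event
    t2, x2 = boost(t, x)                          # the same event, as the moving observer sees it

    print(x ** 2 + y ** 2 + z ** 2 - (c * t) ** 2)     # 31.0
    print(x2 ** 2 + y ** 2 + z ** 2 - (c * t2) ** 2)   # 31.0, up to rounding

    # With w = ict, the boost is a rotation by the imaginary angle i * phi:
    phi = math.atanh(v / c)
    w = 1j * c * t
    x3 = x * cmath.cos(1j * phi) + w * cmath.sin(1j * phi)
    w3 = -x * cmath.sin(1j * phi) + w * cmath.cos(1j * phi)
    print(x3, w3 / (1j * c))                      # matches (x2, t2)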

Here the role of the complex numbers is to provide a new way of looking at the geometry of the universe.

But... what does it all mean? Is time really an imaginary dimension? What does it mean for three dimensions to be real numbers and one to be an imaginary number? These questions are, in a way, the same questions RT asked me months ago, the questions that got me interested in telling this story. What are the imaginary numbers? Do they exist? Do they appear in nature? I don't think anyone really knows. Einstein found the ict trick interesting at least (he mentions it twice in his short book Relativity, which by the way I enthusiastically recommend), but some physicists think it's a red herring. Maybe we're just dressing the universe up to look more comfortable and familiar.

A complex story, part 3

(See parts 1 and 2.)

In the early 1800s, Joseph Fourier found that every periodic function is made up of (a possibly infinite series of) sine and cosine functions of various frequencies and magnitudes. Just add the right sine waves together and you'll get the desired function. Any function (well, nearly any; mathematicians spent much of that century pinning down exactly which). This collection of waves is called the Fourier series, and it would soon propel the complex numbers from the ivory tower of pure math onto the mad merry-go-round of technology.

Mathematicians used the Fourier series to shift difficult problems to an easier battleground, by transforming a complicated function into an infinite sum of very simple ones. This was the beginning of frequency-domain analysis. It was soon discovered that—thanks to Cotes's discovery—the Fourier series was much simpler if you used complex numbers. Other such transformations were discovered too, notably the Fourier transform and the Laplace transform. Both are based on the complex numbers.

Frequency-domain analysis was the killer app for complex numbers. And then came electricity. As it happens, most of electrical engineering would be practically impossible without frequency-domain analysis. Beginning problems in circuits—problems that in the time domain would require two or three semesters of college-level calculus to tackle—can be solved in the frequency domain with basic high-school algebra and a few complex numbers.
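
A tiny illustration of that last claim (my own example, with arbitrary component values): the frequency response of an RC low-pass filter is one line of complex arithmetic, H = 1/(1 + jωRC), where the time-domain treatment would mean solving a differential equation.

    import cmath, math

    R = 1000.0    # ohms
    C = 1e-6      # farads; cutoff frequency 1/(2*pi*R*C), about 159 Hz

    def response(freq_hz):
        """Gain and phase shift of the RC low-pass filter at a given frequency."""
        w = 2 * math.pi * freq_hz
        H = 1 / (1 + 1j * w * R * C)     # the whole circuit analysis is this one line
        return abs(H), math.degrees(cmath.phase(H))

    for f in (10, 100, 159, 1000, 10000):
        gain, phase = response(f)
        print(f, round(gain, 3), round(phase, 1))   # at 159 Hz: gain ~0.707, phase ~-45 degrees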

Fourier-related transforms are also essential to the compression of digital images, music, and video. So it's safe to say the complex numbers will be with us for a while yet.

There is just one more application of the complex numbers I want to talk about, by far the weirdest, probably the most controversial, and just maybe the most beautiful of them all.

(Concluded in part 4.)

24 December 2006

A complex story, part 2

(See part 1.)

Roger Cotes died of a sudden, violent fever in the summer of 1716. He was 33 years old. Isaac Newton is said to have remarked on Cotes's passing, “If he had lived we would have known something.”

Cotes left to the world a handful of unpublished math papers. Among them, this formula:

ln (cos x + i sin x) = ix

No such connection between trigonometry and logarithms was known at the time. Cotes's formula reveals a simple, fundamental connection here, and the connection runs through complex numbers. Even at the time this was a very nice result, but the realization of just how profound it was dawned slowly. It took a couple hundred years.
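
You can at least check the formula numerically (taking the principal branch of the logarithm, and a few arbitrary values of x):

    import cmath, math

    for x in (0.1, 1.0, 2.5):
        lhs = cmath.log(complex(math.cos(x), math.sin(x)))
        print(lhs, 1j * x)    # the two columns agree, up to rounding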

Around this time a totally unrelated question was starting to generate interest. There was a feeling that an equation like this...

x³ + x² - 3x = 4

...ought to have three solutions. And indeed it does, but the feeling was that more generally every polynomial equation of degree n ought to have n solutions, though not all of them would necessarily be distinct from one another. And this turned out to be true... if you counted complex solutions. This landmark result was finally proved in 1806. It's now called the fundamental theorem of algebra.

But complex numbers were still controversial until 1799, when something ironic happened. A land surveyor, Caspar Wessel, discovered that the complex numbers could be thought of as points (or vectors) on a plane. Each complex number a + bi corresponded to the point (a, b). Adding, subtracting, and multiplying complex numbers could be done with any flat surface, a compass, and a ruler.

This was astonishing, because it made the past two centuries of work involving complex numbers suddenly much easier to visualize, in a totally unexpected way. (Graph the solutions to the equation x¹³ = 1. You'll see thirteen points arranged in a perfect circle—or regular triskaidecagon, if you prefer—around 0. I don't know how to convey how unexpected and beautiful that is.) Ironic, too, because the underlying idea of plane coordinates was first explored by René Descartes, the same man who coined the derogatory term “imaginary number” back in 1637.

Cotes's formula had been independently discovered by Leonhard Euler, and with the discovery of the complex plane, its fame grew. The formula now bears Euler's name; and one particularly nice case (where x = π) is called Euler's identity:

e^(iπ) + 1 = 0

This is widely considered the most beautiful equation in mathematics. There's certainly something about it. It's as though all the biggest concepts in math came to lunch, and while they were all there together they posed for a photograph.

...So this is why complex numbers are important. It's a little matter of the most beautiful mathematical discovery of all time.

But incredibly enough, the story doesn't stop there either.

(Continued in part 3.)

A complex story, part 1

For RT, who seemed a little skeptical.

Once upon a time, there was an Italian mathematician named Niccolò Tartaglia.

Niccolò's claim to fame—well, his secondary claim to fame—is that he discovered a formula that you could use to solve any cubic equation. Well, maybe not any cubic equation. It worked for many cubic equations. Unfortunately, for some equations the formula gave results that involved the square root of a negative number. Which was clearly nonsense. But strangely, Tartaglia found that if he just pretended that negative numbers had square roots (numbers that weren't exactly real numbers but followed all the same rules of arithmetic), he could just plow ahead with the math and eventually all the oddities would cancel out, leaving ordinary real numbers: the correct solutions.
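
Here is the trick on the standard textbook example (not Tartaglia's own problem, just the classic illustration). For x³ = 15x + 4 the formula demands the square root of -121, which is exactly the "nonsense" in question; carry it along anyway and it cancels, leaving the perfectly real root 4.

    import cmath

    # Depressed cubic x^3 + p*x + q = 0:
    #   x = cbrt(-q/2 + sqrt((q/2)^2 + (p/3)^3)) + cbrt(-q/2 - sqrt((q/2)^2 + (p/3)^3)),
    # where the two cube roots are paired so that their product is -p/3.
    p, q = -15.0, -4.0                                # x^3 = 15x + 4
    disc = cmath.sqrt((q / 2) ** 2 + (p / 3) ** 3)    # sqrt(-121) = 11i
    u = (-q / 2 + disc) ** (1 / 3)                    # cube root of 2 + 11i, namely 2 + i
    v = (-q / 2 - disc) ** (1 / 3)                    # cube root of 2 - 11i, namely 2 - i
    print(u * v)    # (5+0j) = -p/3, so this is the right pairing
    print(u + v)    # (4+0j), and indeed 4^3 = 64 = 15*4 + 4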

(Tartaglia's primary claim to fame is that he spent a decade of his life ruthlessly destroying the career and reputation of his friend, Gerolamo Cardano, after Cardano revealed Tartaglia's secret formula to the world. This was a time when mathematics was the exclusive domain of paranoid madmen. Some were so secretive they managed to leave no surviving written work at all.)

From what I've read, Tartaglia apparently had no idea what he was doing, and nobody else could figure it out, either. It was as though there were a sort of mysterious shadow realm lurking behind the real numbers, and occasionally some errand would force you to travel through it, only to emerge (with a shudder of relief) back into the real numbers in the end. Nobody liked this. When René Descartes called these oddities the imaginary numbers, he meant it to sting.

The stigma persists. Most people hear a little about complex numbers in school, not enough to be comfortable with them or understand why people would think they exist (whatever that means) or why they might be useful.

Descartes probably figured a better method for solving cubic equations would eventually come along, and then the “imaginary” numbers could be quietly forgotten. What actually did happen turned out to be a lot more interesting.

(Continued in part 2.)

23 December 2006

Alice in Puzzle-Land

“How do I know for sure that I'm awake?” asked Alice. “Why can't it be that I'm now asleep and dreaming all this?”

“Ah, that's an interesting question and one quite difficult to answer!” replied the King. “I once had a long philosophical discussion with Humpty Dumpty about this. Do you know him?”

“Oh, yes!” replied Alice.

“Well, Humpty Dumpty is one of the keenest arguers I know—he can convince just about anyone of just about anything when he puts his mind to it! Anyway, he almost had me convinced that I had no valid reason to be sure that I was awake, but I outsmarted him! It took me about three hours, but I finally convinced him that I must be awake, and so he conceded that I had won the argument. And then—”

The King did not finish his sentence but stood lost in thought.

“And then what?” asked Alice.

“And then I woke up!” said the King, a bit sheepishly.

—Raymond Smullyan, Alice in Puzzle-Land. This is a fun book of logic puzzles ranging from cute to outrageously intricate. A fine gift for the mathematician on your list (though I hear The Annotated Alice is even better).