Isn’t “1+1” the definition of 2?
That assumes that 1 and 1 are the same thing. That they’re units which can be added/aggregated. And when they are that they always equal a singular value. And that value is 2.
It’s obvious, but the proof isn’t about stating the obvious. It’s about making explicit, I believe, what the concrete rules of the symbolism/language of math actually are.
This is what happens when the mathematicians spend too much time thinking without any practical applications. Madness!
The idea that something not practical is also not important is very sad to me. I think the least practical thing that humans do is by far the most important: trying to figure out what the fuck all this really means. We do it through art, religion, science, and… you guessed it, pure math. And I should include philosophy, I guess.
I sure wouldn’t want to live in a world without those! Except maybe religion.
We all know that math is just a weirdly specific branch of philosophy.
Physics is just a weirdly specific branch of math
Only recently
Just like they did with that stupid calculus that… checks notes… made possible all of the complex electronics used in technology today. Something having no practical applications currently doesn’t mean it never will.
I’d love to see the practical applications of someone taking 360 pages to justify that 1+1=2
The practical application isn’t the proof that 1+1=2. That’s just a side effect. The application was building a framework for proving mathematical statements. At the time the Principia was written, math wasn’t nearly as grounded in demonstrable facts and reason as it is today. Even though the Principia failed (for reasons Gödel would demonstrate some 20 years later), the idea that every proposition should be proven from as few and as simple axioms as possible prevailed.
Now if you’re asking: Why should we prove math? Then the answer is: All of physics.
The answer to the last question is even simpler and broader than that: math should be proven because all of science should be proven. That is what separates modern science from delusion and self-deception.
It depends on what you mean by well defined. At a fundamental level, we need to agree on basic definitions in order to communicate. Principia Mathematica aimed to set a formal logical foundation for all of mathematics, so it needed to be as rigid and unambiguous as possible. The proof that 1+1=2 is just slightly more verbose when using their language.
Not a math wizard here: wouldn’t either of the 1s stop being 1s if they were anything but exactly 1.0? And instead become 1.xxx or whatever?
In base 2 (binary), for example, the digits are 0 and 1. Counting up from 0 would look like 0, 1, 10, 11, 100, 101, 110, 111, 1000, 1001, 1010, 1011, 1100, 1101, 1110, 1111, 10000, etc.
In that case 1 + 1 = 10
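A minimal sketch in Python (my own illustration, not from the comments above): the quantity itself doesn’t change, only the numeral used to write it down depends on the base.

    # The sum is the same value either way; only the notation differs.
    total = 1 + 1
    print(total)               # 2 (decimal notation)
    print(format(total, "b"))  # 10 (the same value written in base 2)
    print(int("10", 2) == 2)   # True: binary "10" and decimal "2" name the same number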
1.xxx and 1.xxy are still 2 numbers, so 1+1=2.
Gottem.
Using the Peano axioms, which are often used as the basis for arithmetic, you first define a successor function, often denoted •’, and the number 0. The natural numbers (including 0) are then defined by repeated application of the successor function (of course, you also first need to define what equality is):
0 = 0
1 := 0’
2 := 1’ = 0’’
etc
Addition, denoted by •+•, is then recursively defined via
a + 0 = a
a + b’ = (a+b)’
which quickly gives you that 1+1=2. But that requires you to take these axioms for granted. Mathematicians proved it with fewer assumptions, but the proof got a tad verbose.
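Here’s a minimal sketch of that construction in Lean 4, purely for illustration (MyNat and add are names I made up; this is not the Principia’s system, just the Peano-style definitions above written out):

    -- A Peano-style natural number type: zero, plus a successor for every number.
    inductive MyNat where
      | zero : MyNat
      | succ : MyNat → MyNat

    -- Addition by recursion on the second argument,
    -- mirroring a + 0 = a and a + b' = (a + b)'.
    def add : MyNat → MyNat → MyNat
      | a, MyNat.zero   => a
      | a, MyNat.succ b => MyNat.succ (add a b)

    -- 1 + 1 = 2: both sides unfold to succ (succ zero), so rfl closes the proof.
    example : add (MyNat.succ MyNat.zero) (MyNat.succ MyNat.zero)
        = MyNat.succ (MyNat.succ MyNat.zero) := rfl

Because add unfolds by computation here, the “proof” is just the calculation the comment describes; the heavy lifting in Principia Mathematica was building everything, including these definitions, from pure logic.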
The “=” symbol defines an equivalence relation. So “1+1=2” is one definition of “2”, defining it as equivalent to the addition of 2 identical unit values.
2*1 also defines 2. As does any even quantity divided by half its value. 2 is also the successor to 1 (and predecessor to 3), if you base your system on counting (or anti-counting).
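A trivial check in Python (my own illustration) just to echo the point: these are all different descriptions that pick out the same value.

    # Different definitions/descriptions, one value.
    print(1 + 1)    # addition of two unit values
    print(2 * 1)    # multiplication
    print(4 // 2)   # an even quantity divided by half its value
    print(1 + 1 == 2 * 1 == 4 // 2)  # True: all equivalent under "="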
The youtuber Vihart has a video that whimsically explores the idea that numbers and operations can be looked at in different ways.
I’ll always upvote a ViHart video.