Post by Dan Christensen
Post by John Gabriel
Post by FredJeffries
https://www.quantamagazine.org/20150519-will-computers-redefine-the-roots-of-math/
"Set Theory and a Paradox
Set theory grew out of an impulse to put mathematics on an entirely rigorous footing -- a logical basis even more secure than numbers themselves. Set theory begins with the set containing nothing -- the null set -- which is used to define the number zero. The number 1 can then be built by defining a new set with one element -- the null set. The number 2 is the set that contains two elements -- the null set (0) and the set that contains the null set (1). In this way, each whole number can be defined as the set of sets that came before it."
I want to say that NOTHING is more secure than the Euclidean derivation, which I was the FIRST to understand in its entirety. No moron before me even came close to understanding the profound depth of what Euclid was attempting to accomplish. He failed because Ancient Greek is such a difficult language and because he was the first ever to attempt such a formulation. Euclid also has some issues with circular definitions. I have corrected all of this.
You STILL have a long way to go with your new system, JG. We are STILL waiting, for example, for your proofs that 2+2=4 and that 1 divided by 3 is 1/3. If your system cannot handle basic arithmetic like this, it will never handle calculus.
You need proof for mythmatics, but you don't need any proofs for 2+2=4 in real mathematics. While it is very easy to prove using my axioms of arithmetic, 2+2 is just the definition of 4. You can't grasp this because you are too stupid.
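For comparison, the usual Peano-style derivation of 2+2=4 (a sketch using the standard recursive definition of addition, a + 0 = a and a + s(b) = s(a + b), not JG's axioms):

  2 + 2 = 2 + s(s(0))     since 2 := s(s(0))
        = s(2 + s(0))     by a + s(b) = s(a + b)
        = s(s(2 + 0))     by a + s(b) = s(a + b)
        = s(s(2))         by a + 0 = a
        = s(3) = 4        since 3 := s(2) and 4 := s(3)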
Carry on reading Godel, who thinks that "recursion" makes a formal system strongly representable. Chuckle.
"The Representability Theorem
In any consistent formal system which contains Q:
A set (or relation) is strongly representable if and only if it is recursive;
A set (or relation) is weakly representable if and only if it is recursively enumerable."
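For readers following along: Q here is Robinson arithmetic, and the two notions are standardly defined as follows. A set S is strongly representable in a theory T if there is a formula \varphi(x) such that

  n \in S     implies  T \vdash \varphi(\bar{n}), and
  n \notin S  implies  T \vdash \neg\varphi(\bar{n});

S is weakly representable if  n \in S  iff  T \vdash \varphi(\bar{n}).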
What a load of Godel BS. Recursion is implemented by use of PUSH/POP instructions in most processor architectures. It makes a real-time system with synchronisation objects very unstable. Imagine two separate threads A and B calling a factorial() function that has been implemented recursively.
Thread A pushes its formal arguments (if any) onto the stack, then its return address, and then calls factorial(). Inside factorial(), local variables are pushed onto the stack as well for each call. These are preserved on the stack for A and B. At some instant during A's processing, B calls factorial(). Boom! So much for "recursive stability".
What I just described is an actual experience. What happened was that the recursive function would not compute properly: sometimes it worked and other times it crashed. I implemented it without recursion and it worked just fine. In fact, recursion is cleaner if the stack is not used; instead, use an array of structures as follows:
{ Function Identifier, Call number, args, local vars }
It's a little more processing but far more stable since B cannot pop A's return address or vice-versa. Also, neither thread can use the other's local variables.
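A minimal C sketch of that explicit-frame approach (the field names, the fixed depth of 64, and the single-function case are my own illustrative assumptions; the post above gives only the frame layout):

  #include <stdio.h>

  /* One explicit "call frame", following the layout given above:
     { Function Identifier, Call number, args, local vars } */
  struct frame {
      int func_id;        /* function identifier (1 = factorial) */
      int call_no;        /* call number: depth of the simulated call */
      unsigned long arg;  /* argument n for this simulated call */
      unsigned long acc;  /* local variable: running product */
  };

  /* Factorial computed with an explicit array of frames instead of
     the hardware stack: each iteration "pushes" one simulated call. */
  unsigned long factorial(unsigned long n)
  {
      struct frame frames[64];  /* private to this call, so threads A and B
                                   can never pop each other's frames */
      int depth = 0;

      frames[0] = (struct frame){ .func_id = 1, .call_no = 0, .arg = n, .acc = 1 };
      while (frames[depth].arg > 1) {
          struct frame next = frames[depth];  /* copy, then advance the "call" */
          next.call_no = depth + 1;
          next.acc *= next.arg;
          next.arg -= 1;
          frames[++depth] = next;
      }
      return frames[depth].acc;  /* acc holds n! once arg has reached 1 */
  }

  int main(void)
  {
      printf("10! = %lu\n", factorial(10));  /* prints 10! = 3628800 */
      return 0;
  }

Here each call gets its own frames array (it is a local variable), so in the two-thread scenario described above neither thread can pop the other's return information or touch its local variables.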
Non-recursive algorithms are far more stable. Goedel and all the recursion addicts were a bunch of effed up morons like you. "That's so Koool!" - the typical immature American/Canadian frame of mind. Tsk, tsk.
So much for Godel's "Representability Theorem".
Troll DC is still struggling to understand the following statement:
1/3 is the measured magnitude, but 1:3 is the ratio of the magnitudes being compared. Division has already taken place in 1/3. There is nothing more to do. It's a number. 1 ÷ 3 is an algorithm that does NOTHING in algebra. It has come to mean 1 divided by 3, but that is only possible using geometry.
The idiot will probably come back with something like 1/3 * 3 = 1, forgetting that the statement already contains a lot of assumptions that he hasn't proved.
Chuckle.
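For reference, the assumptions hiding in 1/3 * 3 = 1 are the usual field axioms. Spelled out as a sketch:

  \frac{1}{3} \cdot 3 = 3^{-1} \cdot 3 = 1

where \frac{1}{3} := 3^{-1} by definition and x^{-1} \cdot x = 1 is the multiplicative-inverse axiom (for x \neq 0).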
Post by Dan Christensen
Post by John Gabriel
NO computer would ever be able to accomplish what I have done.
Hint: If you cannot "computerize" your system, it is probably not a workable foundation for mathematics. To paraphrase Albert Einstein, if you can't "explain it" to a computer, you don't understand it yourself.
A computer cannot think, my stupid one! But yes, I can just see you talking to your code. What a psycho.