What is the most fundamental way of doing math?

And What is the most basic form of mathematics?

If multiplication is just repeated addition, subtraction is just addition (but with negative numbers), and addition is just counting, then can we do math only by counting, without using any modern techniques such as addition?

Or maybe logical reasoning is the most basic form of math?

Also, is there something even simpler than binary in terms of arithmetic?

geometrically

You mean categorically

>If multiplication is just addition
The multiplication of two positive numbers can result in a smaller number (e.g. 9 * 1/2), whereas the addition of two positive numbers will never result in a smaller number. So argueth Euler.

You could never hope to understand fundamentals with a brain like yours. You are still welcome to try reading a book on category theory.

(9 * 1/2)
=
(9 ÷ 2/1)


>You could never hope to understand fundamentals with a brain like yours.

You seem to have fallen into the same fate as me, then.

I always felt that if you take some directed multigraph as base category C, then use any other directed multigraph D and a functor
F : D --> C
and force some limit of F, or colimit of F, to exist (thereby possibly switching to a richer category C'), then whatever you get is a pretty natural thing.
E.g. if D is just two vertices and no arrows, then the limit/colimit will be the categorical product/coproduct. In a category C of sets, you end up with arithmetic.
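The two-vertices-no-arrows case can be sketched concretely. This is a minimal illustration in the category of finite sets, assuming the usual constructions: the limit of that diagram is the cartesian product and the colimit is the disjoint (tagged) union, and taking cardinalities recovers multiplication and addition. Function names here are ad hoc.

```python
def product(A, B):
    """Categorical product in Set: the cartesian product."""
    return {(a, b) for a in A for b in B}

def coproduct(A, B):
    """Categorical coproduct in Set: the disjoint (tagged) union."""
    return {(0, a) for a in A} | {(1, b) for b in B}

A = {1, 2, 3}
B = {'x', 'y'}
assert len(product(A, B)) == len(A) * len(B)    # cardinality multiplies
assert len(coproduct(A, B)) == len(A) + len(B)  # cardinality adds
```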

Fancy words aside, the Linear Logic people are the only notable ones I've ever seen who don't keep at least the axioms of constructive logic. I'd take constructive logic as your starting point.
Beyond that, just adjoin any type rules you wish and don't give it too much weight when it comes to the ontology of things. Historically, we (I mean humanity) need arithmetic, so let's consider it a valid and relevant theory - one that we can write down in constructive logic. This is what I consider the most fundamental base.

Philosophy =^)

Jokes aside, this actually is mathematical philosophy, isn't it? We can go far, as shows

the mathematekoi were just the more logically adaptable group of the cult of Pythagoras, who explored mathematical concepts not in geometry alone but also in music and a generalized notion of harmony in life

Study field theory. Everything we've ever constructed using the typical sense of multiplication and addition is built from specific cases of a field.

Also, in your pic, I've never heard of the first line referred to as P, but Q. Only P and Q. Probably insignificant.

What do Q and P stand for? I'm assuming it's used in calculus.

Help advance my knowledge

homotopies

>Also, in your pic, I've never heard of the first line referred to as P, but Q. Only P and Q. Probably insignificant.

Yeah, me too; "P but Q" sounds odd.

I'll study everything you guys are mentioning

Division is just subtraction

your move

There are many other logics weaker than intuitionistic logic that people use in foundations.

I'd argue for a logical pluralist approach. I do think category theory has a place there given its correspondences to type theory and formal logic, but I think your construction is too specific.

Subtraction is only (modified) addition. Complement method:

math-only-math.com/subtraction-by-2s-complement.html


In binary :

1100
-
1010

=

1100
+
0110

The sum is 10010; we drop the leading carry 1 and keep the remaining four bits.

=

0010
Your turn
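The two's-complement trick above can be sketched in a few lines. This is illustrative only; the 4-bit word size is an arbitrary choice for the example.

```python
BITS = 4
MASK = (1 << BITS) - 1  # 0b1111

def twos_complement_sub(a, b):
    """Subtract b from a using only addition and bit complement."""
    # ~b & MASK is the one's complement; adding 1 gives the two's complement.
    comp = ((~b) + 1) & MASK
    # Add and drop any carry out of the top bit (the leading 1).
    return (a + comp) & MASK

assert twos_complement_sub(0b1100, 0b1010) == 0b0010  # 12 - 10 = 2
```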

A pluralist approach to logic is surely not what OP understands by most basic and simplified.

And to me it's like stuff weaker than intuitionistic logic is just there to keep their PhD students busy

The computer is evidence in support of Russell's claim that mathematics is reducible to logic. I can perform all kinds of abstract mathematical operations on my computer but the computer running those operations can be built using only NAND gates. If it's a mathematical operation I can perform in software then that operation can be represented, and *is* being represented, equivalently in statements of pure logic.

Math simplifies to logic and logic simplifies to the NAND gate.
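The NAND-universality claim is easy to demonstrate in miniature. This is a sketch, not a claim about how any particular machine is wired: every Boolean connective, and hence a one-bit adder, built from NAND alone.

```python
def nand(a, b): return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """One bit of addition, entirely in NANDs: (sum, carry)."""
    return xor(a, b), and_(a, b)

# Sanity check against Python's built-in bitwise operators.
assert all(xor(a, b) == (a ^ b) for a in (0, 1) for b in (0, 1))
assert half_adder(1, 1) == (0, 1)
```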

Predicate logic captures models with infinite domain.
NAND gates get you basic Boolean algebra at most

What's with your picture? Am I the only one seeing its retardedness?

Man this thread is gud, lot of things to lookup

Yes? What is it that's retarded

the term for "directed multigraph" is "quiver", jsyk

logic does not fucking simplify to the NAND gate, truth tables are SEMANTICS, not syntax

I made your mom's pussy quiver last night

Not that poster, but if I'm being generous to them I'd say that their point is that all syntactic manipulations can be boiled down to crunching 0's and 1's, i.e. boolean logic, i.e. the NAND gate

Rigorous proofs.

Abstract algebra is algebra without numbers.

Lines 5 and 7 (greentexted below) don't make any sense to me. It's also odd to call the conjunctive 'but' in the first line.

>P is necessary for Q: Q -> P
I'd read 'Q -> P' as 'if Q then P'.

>P only if Q: P -> Q
I'd read 'P -> Q' as 'if P then Q'.

I realize that this is probably just a notation unknown to me in some logical system that I'm not familiar with, but I must say it's odd that it doesn't correspond with the notation of e.g. propositional calculus. Would you be so kind as to tell me the name of the logical system that uses the notation in OP's pic?

The reason these examples are given is to give an understanding of how other mathematicians might notate their connectives.
It's the same thing.
If you haven't had abstract algebra in college I'd suggest sitting in on a few lectures; professors allow it for free even if you aren't a student. Reading a book on it lets you get into this mindset, which isn't always healthy.

>logic does not fucking simplify to the NAND gate, truth tables are SEMANTICS, not syntax

I can run Wolfram's Mathematica on my computer and perform all kinds of mathematical operations but those operations are all reducible to statements of logic. They have to be in order that the operations be executed by the machine.

I can say what I want to say in an abstract and high-level mathematical language or I can say it by a collection of NAND gates arranged in a particular configuration. They are equivalent ways of expressing the same thing. There's no mathematical operation you can perform in any math software which can't also be expressed by a collection of NAND gates.

See also: computer built using only NAND gates: blog.kevtris.org/?p=62

I'm not that guy, but you're wrong due to the fact that we don't have a proper type theory upon which to found all mathematics yet. In other words, Mathematica and any other example you can think of will only be able to work with a proper subset of mathematics.

Logics yield type theories, which yield a special type of programming language called a theorem prover, upon which we can found all mathematics.

ITT a poster is making a spicy, reductionist claim involving math, computers, logic, and NAND gates. Some other posters are taking issue with this claim, referencing higher-level/philosophical issues.

I want to leave the dispute aside for one moment (and also the fact that we're on the Veeky Forums board), to reference a pertinent piece of philosophy: prop 6.etc from Wittgenstein's Tractatus, where he states (in his own notation) that such-and-such logical things can be totally characterized in terms of one operator-thingy, which wiki characterizes as a NOR-thingy:

en.wikipedia.org/wiki/Tractatus_Logico-Philosophicus#Proposition_6.N

I haven't done the exercise but I vaguely remember. But the user in this thread was talking about NAND thingys, and not NOR-thingys! Presumably, we come to no harm, and much the same thing (all you need is the one operation-thingy), exactly because per the Sheffer stroke wiki, NOR and NAND are "dual", whatever that means in this context:

en.wikipedia.org/wiki/Sheffer_stroke
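The duality the wiki mentions is quick to verify by hand: NOR, like NAND, is functionally complete. A minimal sketch building NOT, OR, and AND from NOR alone, checked against the built-in operators:

```python
def nor(a, b): return 0 if (a or b) else 1

def not_(a):    return nor(a, a)
def or_(a, b):  return not_(nor(a, b))
def and_(a, b): return nor(not_(a), not_(b))

for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
```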

I'm just noting these things, but this post implies a good four or five exercises which I haven't done yet.

Don't stop posting, i'm liking this thread

I also wish to point out that the OP's chart makes very confusing linguistic presentations of what logical connectives are supposed to connote, from a "typical" presentation.

Especially, the second line: "Either P or Q" is a common phrase-form used to indicate the XOR situation, as opposed to the OR situation. I concede that the natural language as-such on the LHS may indicate the OR operator, but my point is that the LHS gives me some grief.

I wonder how to interpret the first line, despite understanding AND. I invite someone to help me.

So if you think there is some mathematics that a computer (a collection of NAND gates) can't do, then it's probable that a human brain (a collection of random electrical signals) can't do it either.

It's cool to think that there might be fundamental truths of our universe that are unknowable to us. I wonder if designing a new type of computer will be necessary for the singularity. Or maybe NAND computers can become smart enough to design the new ones.

Do you think that we will even be able to understand the results from them? Would it just be like a complicated algorithm that just spits us the right result that means something, or would it just look like gibberish to us?

That is, of course, assuming that there is a field of math that our NAND computers can't handle. Seeing the problems that were recently solved (at least faster) by quantum computers, I can kind of believe it.

define [math]\pi \cdot \pi [/math] assuming multiplication is repeated addition

protip, you can't.

You're confused. It's in fact the same thing.

>"It rains" ==> "There are clouds in the sky"
Your reading is
>If it rains, then there are clouds in the sky
Their reading #1 is
>There being clouds in the sky is necessary for it to rain
Their reading #2 is
>it rains only if there are clouds in the sky

All of those express the same Boolean relation
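The claim that all three readings denote the same Boolean relation can be brute-forced over the four truth-value combinations. A small sketch (the phrasings of the two alternative readings are translated into Boolean form as the post describes):

```python
def implies(p, q):
    """Material conditional: false only when p is true and q is false."""
    return (not p) or q

for p in (False, True):
    for q in (False, True):
        # "P only if Q": P cannot hold without Q.
        only_if = not (p and not q)
        # "Q is necessary for P": if Q fails, P must fail.
        necessary = (not p) if not q else True
        assert implies(p, q) == only_if == necessary
```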

For one, as I said in , logic is more expressive than what we can (semantically, i.e. as something like a truth table with 0's and 1's in memory) implement on a computer today.
E.g. we can speak of the class of all finite fields [math] {\mathbb F}_p [/math], even if there are infinitely many of them. We can use predicate logic to express statements that hold for all ([math] \forall [/math]) finite fields, e.g. that they have a prime power number of elements, and then we have a language with infinitely many well-formed strings. Why would we take Boolean algebra as a base for logic, when it doesn't capture half the things we're interested in?
And besides, if you go away from Boolean algebras, e.g. to some
en.wikipedia.org/wiki/Heyting_algebra
you have less expressive logics (thus richer ones, in the sense that they are compatible with more theories) and then the classical truth tables are false (too restrictive)

You can do it in base 16.

1. write pi in base 16 using the formula for the nth digit of pi in base 16 (there exists such a formula, the Bailey-Borwein-Plouffe formula)
2. Expand pi*pi and collect like terms

All instances of multiplication are done to integers and therefore are just repeated addition.
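The "repeated addition" claim for integers is the one part that's trivially mechanizable. A minimal sketch for non-negative integers, using no built-in multiplication:

```python
def mul(a, b):
    """Multiply non-negative integers by repeated addition."""
    total = 0
    for _ in range(b):  # add a to itself b times
        total += a
    return total

assert mul(7, 6) == 42
assert mul(9, 0) == 0
```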

Let me try again, I think my original post was a bit too vague.

>Line 5: P is necessary for Q: Q -> P
First of all, 'if Q then P' doesn't mean that 'if P then Q' holds. Secondly, using conventional semantics, 'if Q then P' does not mean that Q is necessary for P, it only means that Q is sufficient for P. Lastly, it makes no sense how a consequent can be necessary for an antecedent, though maybe I'll have to cede this point due to the ambiguity of the language used.

>Line 7: P only if Q: P -> Q
This runs into the same antecedent/consequent problem, so the explanation would make much more sense if it was 'Q only if P', but again, 'if P then Q' doesn't make P necessary for Q, only sufficient, so the 'only if' statement is very odd.

I quickly checked the wiki for boolean algebra, and it looks to agree with me on these conventions, which I find odd, seeing as I thought this was just a difference in notation.

Let me make a slightly altered example from yours:

>It rains -> the ground is wet
My reading: If it rains, then the ground is wet
From your reading #1: The ground being wet is necessary for it to rain.

This is false. The ground does not need to be wet in order for it to rain.

From your reading #2: It rains only if the ground is wet.

Again, this is obviously false.

Also, you're confusing me by having the consequent being a necessary and sufficient condition for the antecedent, which makes little sense to me. I've only encountered cases in which the antecedent is sufficient and/or necessary for the consequent, not vice versa.

I am by no means even averagely well-versed in logic, so I hope you will take the time to explain my misunderstandings to me, I'd really like to know what I'm not getting.

>Lastly, it makes no sense how a consequent can be necessary for an antecedent, though maybe I'll have to cede this point due to the ambiguity of the language used.
Do you have any problem with the rains=>cloud example? A condition for the rain is that there is a cloud.
Or let's say n even=>n^2 even. A condition for n being even is that n^2 is even.

>From your reading #2: It rains only if the ground is wet.
>Again, this is obviously false.
>From your reading #1: The ground being wet is necessary for it to rain.
>This is false.

>The ground does not need to be wet in order for it to rain.
Yes it does. That's how the reading of logical sentences reduced to truth tables works:
>If it rains then the ground's wet. Thus if the ground's not wet, it can't be raining. The ground being wet is a necessity for it being rainy.
Your problem might be that there is no timely causal relation from wetness to rain, and that's just the problem of how our symbol games correspond to informal logic. There are then formal logics like
en.wikipedia.org/wiki/Relevance_logic
to resolve this.
It rules out pathological truths of classical logic, such as
>There is a thing, such that if it's a bird, everything is a bird
(which is provable, because either everything is a bird, or there's something which isn't, and then assuming the condition of the if-clause gives you a contradiction that lets you prove anything)
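The "bird" tautology can be checked by brute force over small finite domains and every possible "is a bird" predicate. A sketch (finite domains only, so this illustrates rather than proves the general classical theorem):

```python
from itertools import product

def classically_holds(n):
    """Over a domain of size n, check for every predicate 'bird' that
    some x satisfies: bird(x) -> everything is a bird."""
    for bird in product((False, True), repeat=n):
        all_birds = all(bird)
        if not any((not bird[x]) or all_birds for x in range(n)):
            return False
    return True

assert all(classically_holds(n) for n in (1, 2, 3))
```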
There's also
en.wikipedia.org/wiki/Doxastic_logic
and other modal logics to refine "real world truths", but only philosophers and computer scientists ever care - because first order logic and set theory happen to be strong enough to model any of those logics' semantics.
Formal logic is a plebeian deterministic attempt to capture "logic" and reason. It's not the same thing. Don't overestimate positivism.

yeah, k, but that wouldn't have made the answer more simple.

btw., after reading your answer it came to me that you could just define a quiver as the type [math] A \to (B\times B)[/math].
Any such function gives rise to a quiver and the other way around.
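That remark is easy to make concrete: a quiver as a map from edge names to (source, target) pairs of vertices. A tiny sketch with made-up edge names:

```python
quiver = {            # edges of a small quiver on vertices {0, 1}
    'e1': (0, 1),
    'e2': (0, 1),     # parallel edge: multigraphs allow this
    'loop': (1, 1),   # a loop at vertex 1
}

def source(e): return quiver[e][0]
def target(e): return quiver[e][1]

assert source('e2') == 0 and target('e2') == 1
```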

>those phrases
I'm not a native speaker, but isn't "either X or Y" equivalent to "X iff not Y", or "X or Y but not X and Y"?

With the logical or/either, both being true is permitted

Thanks for the answer. After some consideration, I guess at the end of the day I'm just too retarded to understand that the statements 'if p then q (p -> q)' and 'p only if q (p -> q)' are equivalent. My intuition told me that this wasn't the case, but after some consideration I have to backpedal on this one, at least mathematically.

For instance, I fail to see how claiming that 'if god is real then jesus is real' is exactly the same statement as 'god is only real if jesus is real'. To me, there is the possibility that god (the same one as used in both previous sentences, not some other god) might be real even if jesus isn't. I tried coming up with a mathematical statement to show that intuitively, the two definitions of the conditional are not equivalent, but I could conjure no such example.

You wouldn't happen to have any comments about my god example?

Yes, indeed it is, and that is why I asked. As far as I know, either or would allow only one of the options.

The answer is simply that informal-if and logic-if are different. Logic-if was coined for very specific purposes (such as to make formal proofs), not to model everything we mean when we talk about hypotheticals, which isn't to say that hypotheticals have no philosophical merit; they are simply different tools for different issues.

So in the context of formal logic there is no way of representing a conditional in which the antecedent's truth value is not dependent on the consequent?

classic formal logic is only worried about the truth of propositions. if you were to have a 'p philosophical-if q' you'd have to bring up real world shit beyond the truth values of p and q (like whether there is a causation link between p and q) and that'd complicate the logic.
also keep in mind that you can have any operator you want as long as you can make a truth table for it (like in pic related)

I assume the truth table can be completely arbitrary as long as you define it?
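Yes; a connective just is its truth table. A sketch defining an arbitrary, made-up binary operator from a four-entry table:

```python
table = {                    # an arbitrary invented connective
    (False, False): True,
    (False, True):  False,
    (True,  False): True,
    (True,  True):  True,
}

def weird_op(p, q):
    """Apply the connective by looking up its truth table."""
    return table[(p, q)]

assert weird_op(False, True) is False
assert weird_op(True, True) is True
```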

>For instance, I fail to see how claiming that 'if god is real then jesus is real' is exactly the same statement as 'god is only real if jesus is real'. To me, there is the possibility that god (the same one as used in both previous sentences, not some other god) might be real even if jesus isn't.
Let's assume (R) 'if god is real then jesus is real'.
We want to show 'god is only real if jesus is real'. Assume the contrary, i.e. god is real even though jesus isn't. Well then by (R) we have that jesus is real, which is in contradiction to our assumption.
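The argument can also be checked exhaustively: under R = (god -> jesus), the case "god real, jesus not real" never occurs. A sketch:

```python
def implies(p, q):
    """Material conditional."""
    return (not p) or q

# Collect any assignment where R holds but god is real without jesus.
violations = [
    (god, jesus)
    for god in (False, True)
    for jesus in (False, True)
    if implies(god, jesus) and (god and not jesus)
]
assert violations == []  # R rules out the contrary case
```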