Lambda calculus general

Why is lambda calculus so comfy bros

Other urls found in this thread:

google.co.uk/search?q=how to program
chris-taylor.github.io/blog/2013/02/10/the-algebra-of-algebraic-data-types/
arxiv.org/abs/0804.3434
csie.ntu.edu.tw
axiomsofchoice.org/set_theory
axiomsofchoice.org/intuitionistic_propositional_logic
cs.kent.ac.uk/people/staff/sjt/TTFP/ttfp.pdf
worrydream.com/AlligatorEggs/
coq.inria.fr/tutorial-nahas
twitter.com/SFWRedditGifs

looks stupid
like set theory

>look mom i'm doing it again

It gives an intuitive perspective for dealing with intuitionistic logic. It shows the Curry-Howard correspondence between proof theory and computability (proofs = programs). It's a great place to start with the whole
Logic ~ Type Theory ~ Category Theory
perspective on programming and theorem proving.

Set theory doesn't have shit on type theory.

>Set theory doesn't have shit on type theory.
>have shit on type theory
>type theory
Cancerous shit that was one of the biggest wrong turns CS has ever taken. It shouldn't even exist and is only a field of study because retards can't design compilers for shit and started worshipping archaic static compilation techniques.

TRIGGERED

pretentious bullshit. show me how set theory improves my clock rate. oh wait, you can't because you're a fucking formalism zombie who thinks "inventing" obscure notation, and non-constructive proofs counts for "contributing to science and mathematics". even saying "non-constructive proofs" makes me sick, that I actually have to distinguish between the two types because post-war academics insisted on taking the "discoveries" of schizophrenics seriously.

I cannot wait until this, the "elegant" meme, and "mathematicians are above calculating" die in a fire.

how do you write f(x) = a*x^2 +bx +c in lambda notation? What about a fourier transform?

wow dude
lambda x.a*x^2 +bx +c

google.co.uk/search?q=how to program

isn't it supposed to only use the 3 symbols
[math]\lambda x .[/math]

and ()
It's not supposed to only use x, it can use any identifier
for instance, assuming normal whitespace rules:

\x. add (add (mult (mult a x) x) (mult b x)) c
(\ is the lambda character)
This is proper lambda calculus, but it will be nicer if we add some syntax:

"add" and "mult" are free variables, (as are a b and c) so whatever meaning they have is up to whoever interprets the output.

In terms of actually expressing * and + in lambda calculus, it is not very pretty. You can use church encoding for natural numbers:

0 = \f.\x.x = \f x.x
(\f x. is a shorthand for \f.\x.)
successor
s = \n. \f x. f (n f x)

+ = \m n. \f x. m f (n f x)
* = \m n. \f. m (n f)

m + n = \f x. m f (n f x)
m * n = \f. m (n f)
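These encodings transliterate directly into Python lambdas (a sketch; the names `zero`, `suc`, `add`, `mul`, `to_int` are just illustrative):

```python
# Church numerals as Python lambdas: a numeral n is "apply f n times".
zero = lambda f: lambda x: x
suc  = lambda n: lambda f: lambda x: f(n(f)(x))         # successor
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul  = lambda m: lambda n: lambda f: m(n(f))

# Decode a numeral by counting with (+1) starting from 0.
to_int = lambda n: n(lambda k: k + 1)(0)
```

For instance `to_int(mul(suc(suc(zero)))(suc(suc(suc(zero)))))` counts 2*3 applications of the increment, giving 6.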

btw, operators are just shorthand for function application

a - b = (-) a b
You can make an operator associative, and give it a precedence/priority and make the syntax a lot nicer

\x. a*x*x + b*x + c
\x. (* (* a x) x) + (* b x) + (c)
\x. + (+ (* (* a x) x) (* b x)) c

[math] f(x) := a*x^2 +bx +c [/math]
is the same as
[math] f = \lambda x.\, a\,x^2 +b\,x +c [/math]
and
[math] \lambda x.\, a\,x^2 +b\,x +c [/math]
is the same as
[math] x \mapsto a\,x^2 +b\,x +c [/math]

[math] 3 \ :\ {\mathbb N} [/math]
[math] (\lambda x.\, ax^2 +bx +c)\ :\ {\mathbb N} \to {\mathbb N} [/math]
[math] (\lambda x.\, ax^2 +bx +c)\, 3 \ :\ {\mathbb N} [/math]

[math] (\lambda x.\, ax^2 +bx +c)\, 3 [/math]
reduces to
[math] a*3^2 + b*3 + c [/math]

The two equivalent models of computation, Turing machines and lambda calculus, emphasize and simplify two completely different aspects of the theory of computation: complexity vs. composability of algorithms.
What's valuable in dropping the f-declaration is that composite expressions have a form that's just the sum of their parts.

If you implement a type conversion [math] k: {\mathbb N} \to {\mathbb R} [/math] embedding e.g. the natural number 3 as the real 3 (between e and pi), then

[math] (\lambda x.\, ax^2 +bx +c) \circ k \ :\ {\mathbb N} \to {\mathbb R} [/math]

Now say you want to solve

[math] x^2=7 [/math]
which by adding [math] x^2 [/math] on both sides and dividing by [math] 2x [/math] is the same as
[math] x = \frac{1}{2} (x +7/x) [/math]

Making an initial guess and iterating this is the Babylonian method of computing [math] \sqrt{7} [/math]:

x=3?
[math] x = \frac{1}{2} (3 +7/3) = 2.666 [/math]
[math] x = \frac{1}{2} (2.666 +7/2.666) = 2.647 [/math]
[math] x = \frac{1}{2} (2.647 +7/2.647) = 2.646 [/math]

Passing [math] \lambda x.\, \frac{1}{2} (x +7/x) [/math] to the iteration function in pic related and letting that shit run for n seconds, or specifying to stop once the m'th digit stops altering, gives an arbitrary precision algorithm in one expression
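A sketch of that in Python; `iterate_until` is a hypothetical stand-in for the iteration function in the pic, here stopping once successive values agree to within eps rather than running for n seconds:

```python
# Repeat f from x until two successive values differ by less than eps.
def iterate_until(eps, f, x):
    while True:
        x_next = f(x)
        if abs(x_next - x) < eps:
            return x_next
        x = x_next

# Babylonian method for sqrt(7), starting from the guess 3.
sqrt7 = iterate_until(1e-12, lambda x: (x + 7 / x) / 2, 3.0)
```

The whole arbitrary-precision algorithm really is one expression: the lambda passed in plus a stopping criterion.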

You seem to be using typed lambda calculus

Surely Y can't have a type? (Unless you're using some weird type theory with infinite types)

You can take the N's and R's in the first half to be optional annotations, if you want to make it all fit into one framework. I wrote about Y to show there's more going on than notation.
Do you want to make a point here?

I was just curious if there was actually some type or framework I was missing

Does anybody know of any "naive type theory" books? Does anybody know of any math book that uses type theory instead of set theory as the background theory? I'm getting tired of seeing sets and set notation used everywhere.

You could also try Category Theory, but there are a lot of sets in it
There's a book called Homotopy Type Theory

Our setting is historically grown and the alternative doesn't bring that much to the table. It has its merits, but not that many. (And I say this really liking all of the computational and constructive stuff)

>You could also try Category Theory, but there are a lot of sets in it
Indeed, I have yet to see a category theory book that doesn't use set theory.

>There's a book called Homotopy Type Theory
This is a start. I wish there were more books that use type theory like this, but for more mundane topics, like differential/integral calculus, linear algebra, abstract algebra, topology, etc.

>the alternative doesn't bring that much to the table. It has its merits, but not that many.

Aren't the computational and constructive aspects bringing in A LOT? I mean, just look at the whole functional programming movement.

If you're into programming and ADT (algebraic data types), you can have an algebra of types which you can differentiate (differentiating a type gives you something called a zipper)

chris-taylor.github.io/blog/2013/02/10/the-algebra-of-algebraic-data-types/
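A quick taste of what that means: lists satisfy L(a) = 1 + a*L(a), and differentiating gives L'(a) = L(a)^2, so a one-hole context in a list is a pair of lists. That's exactly the classic list zipper, sketched here in Python (names are illustrative):

```python
# A list zipper: a focus element plus its one-hole context, i.e. the
# elements before it (stored reversed, nearest first) and after it.
def zipper(prefix, focus, suffix):
    return (prefix, focus, suffix)

def left(z):
    p, x, s = z
    return (p[1:], p[0], [x] + s) if p else None   # move focus one step left

def right(z):
    p, x, s = z
    return ([x] + p, s[0], s[1:]) if s else None   # move focus one step right
```

Moving the focus is O(1) precisely because the context is split into the two lists that the derivative predicts.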

Yes, I've read about that before. It's interesting, but it's not what I am complaining about.

It's relevant for programming and logic, yes!

But I was responding directly to his question about babby math in type theory instead of set theory.
Combinatorics, geometry, etc. etc. ... those subjects are written down using sets in all books, and re-writing them with types is not adding anything.

Algebra and anything that speaks of "homomorphism" (be it group homomorphisms in algebra proper or diffeomorphisms in differential geometry) introduced their own typing schemes last century and it goes by category theory.

The homotopy type theory book will define a notion of "set" and one of "category", but is itself not concerned with math as such, like combinatorics or calculus --- except homotopy theory of course

>re-writing them with types is not adding anything.

It does: it modernizes the presentation of these things. That alone is worth the effort. It was done when set theory became in vogue, so...

Yes okay, you're right.
E.g. I love the induction principle for N in Martin-Löf's theory, which at the same time really is the type specification of the recursion/iteration algorithm.
(i.e. Curry-Howard)

>The homotopy type theory book will define a notion of "set"

My understanding is that in HoTT, a set is a type T such that if a and b are values of type T, then there is only one proof of a = b (or something like that). What is not clear to me is how do you go about showing that a = b admits only one proof.

Be triggered all you want, but when will I ever see lambda calculus as a necessity for computational chemistry? To me this shit is like set theory in that it's just a little niche way of making mathematics "artistic", much like p-adic QM. Except that sets for the most part are actually useful, unlike whatever the hell you two jerk off to.

It's useful for implementing compilers of functional programming languages, which are becoming somewhat more popular recently.

>when will I ever see lambda calculus as a necessity for computational chemistry
Why would it ever be?

>when will I ever see lambda calculus as a necessity for computational chemistry?

If you're doing computational chemistry, you're probably using some programming language to carry out your computations, and that language is probably typed, and that means you're indirectly using concepts and ideas from lambda calculus.

By showing that any two proofs
p : (a=b)
q : (a=b)
are, in fact, the same. This means you must find a term s of the type p=q.

Depending on your background, I can try and clarify what that's about

In what way will that make your life any better ie, why do you care so much?

>In what way will that make your life any better?

How will getting people to stop wasting their time navel gazing and polishing their notation make my life better? Gee, I dunno. I guess I was under the impression that the more smart people there are working on important problems, the better, but maybe I was wrong. Maybe it's more important that you play with your formalism while your neighborhoods are falling apart and the world overpopulates. It sickens me when people brag about, and encourage intellectual escapism. Never, in any textbook I've read on set theory do I see it introduced as a tool which we can use to help ourselves and our neighbors. Always, do I see it sung high-praise that Mother Theresea isn't even worthy of. These people are parasites, not mathematicians. You are exactly the same as people who care more about interface than performance. Whether or not things "look" pretty, and not whether they work. Shallow, status-seeking formalism zombies whose only accomplishment in life is going to graduate school and "being trained in [insert some contrived bourbaki plagued field]". You know who cares that you're highly educated? No one. Try using your big brain to solve an actual fucking problem.

>Why do you care so much?

How can I not care? In what world do you live where you expect people to be apathetic to those in society who do not carry their weight and then have the gall to brag about it? How can this not infuriate you? I lose sleep every time I think about it. Negligence is worse than actively committing crimes, because at the very least we can publicly name one as a criminal.

>insert some contrived bourbaki plagued field
here's where I knew this was bait

>someone has an opinion I disagree with
>they must be baiting me

Yes, you caught me. The world does in fact revolve around you. We're all (me and my troll accomplices) out to get you. All I do is plan my posts *just for you* because you are just so important.

Pull your head out of your ass.

why are foundations-related threads the only threads where the smart people show up?

> tfw you actually went as far as to implementing your constructive proof in Coq and then extracting a working program from it

Sci tell me I can do it. My theorem is not much more complex than the intermediate value theorem.

Give me one reason why I should learn this retarded shit.

Lecture notes for all those interested.

https:// arxiv. org/abs/0804.3434

There are several different kinds of type theories, more so than there are set theories.

The kinds of type theory more commonly used nowadays are all based on Martin-Löf's intuitionistic type theory. I found his actual lecture notes to be far more straightforward than any presentation I've seen elsewhere.
www.csie.ntu.edu.tw/~b94087/ITT.pdf

Dependent type theories (like you see used in theorem provers, dependently typed programming languages, HOTT, etc...) build on intuitionistic type theory. So it will help to start with that.

The original type theory by Russell was actually different and meant to serve the same function as modern set theory. I saw a list of the axioms at one point and while it seems intuitive it also seems super similar to normal set theory, just somewhat more careful. I haven't seen any texts that develop this to its full extent but I would be interested in looking at one if you find it. You should know that in general these older type theories aren't really used anymore.

The simply typed lambda calculus is a kind of type theory as well, and you'll find that many programming languages have one as well. In particular you will typically find a correspondence between a programming language, a type theory, a logic, and a category (as in category theory). eg.
>Lambda Calculus ~ Intuitionistic Logic ~ Cartesian Closed Categories
This relationship is used in the development of modern programming languages in order to deal with the semantics and structure of the language.

Category Theory is also worth learning. Less so for foundational purposes and more so for the new perspective it offers you through its higher levels of abstraction.

My guess is that when it comes to other topics you're more likely to see beginners asking beginner questions as opposed to people arguing about the merits of a given approach.

On the other hand foundations carries with it a lot of strong opinions and has a lot of different perspectives stemming from many different fields. Knowledgeable people feel they have a vested interest in participating in these threads while they wouldn't even consider posting in a run of the mill "help me with X" thread.

arxiv.org/abs/0804.3434
csie.ntu.edu.tw
/~b94087/ITT.pdf

It kept detecting my post as spam until I added all those spaces.

Formal logic can be seen as being a part of/or being at the heart of any mathematical theory.
If you ask any 3rd year or above question on Veeky Forums, nobody invests the time to think about it. It's hardly better than StackExchange, really. There, questions which require work and are not just textbook knowledge of some user will also remain unanswered.

It's not likely that busy people are going to invest a few hours thinking about a question someone on Veeky Forums came up with.

What are a, b and c? How have you defined exponentiation, addition and multiplication?

Dude you are talking completely based on ignorance. Do you know about dependent typing? It lets you do shit you can't even dream of in other paradigms. Programming is math, and until people realize that they will keep on shitting out bags of features that pretend to be languages.

I would not recommend that anyone buy the HoTT book, though
The nightly PDF builds are free and are frequently updated with corrections

Any linear logic bros here? What are its true semantics?

Another way to put this is that Veeky Forums is an incredibly general forum so the odds go to zero of finding an expert in your-favorite-microsubfield-X.

How about giving an example?

>Programming is math

Glad I'm not the only one that thinks this.

>while your neighborhoods are falling apart and the world overpopulates
What?

People enjoy things like art, writing, mathematics, science and so on. I don't care about solving the "world's problems". I like mathematics and science fiction and a handful of other things. I enjoyed art before I started pursuing STEM.

>You know who cares that you're highly educated? No one. Try using your big brain to solve an actual fucking problem.

I do not give a shit what people think of me. I do not give a shit what people like YOU think of me. I do not want to become rich nor am I deluded enough to think I am capable of solving any problems in the world. I will find a way to make money and enjoy the things that make me happy. I'm not in it for any sort of bullshit prestige and I'm not in it for any sort of "change the world bullshit."

Me becoming a Mathematician or not will have absolutely no effect on your life. Not trying to become a Mathematician would make me depressed and deeply unsatisfied with myself. Fuck you for taking offence to that you piece of shit.

Well the natural numbers defined as in the HoTT book can be shown to be a set, for example.

If you understand the path picture for the identity type, it's clear that not all types X will be sets. The identity structure on X, e.g.
x=y
if x:X and y:X
may be one that has more content than a single term (always up to paths that are equal on a higher level)

I've pieced together some of my early notes on HoTT 4u.
(I've not come back to it since 2014, you folks are welcome to point out fuzzy shit in it)
It's all from this notebook somewhere:
axiomsofchoice.org/set_theory

If U is your type universe, you take the type N : U where e.g.
1 : N
2 : N
3 : N
are terms and define a function
+ : N->N->N
and thus
(1+2) : N
and thus
(1+2)=3 : U
(1+1)=3 : U
are types that may or may not have a term.
By the definition of how + works, you have 1+2 reducing to 3, and thus (1+2)=3 reducing to 3=3, which has a term, call it refl, by definition (reflexivity).
I.e. your compiler will agree to
refl : (1+2)=3
To show that N is a set is to show that no equality type on it ever has two or more different terms, i.e. if r is a generic term of e.g. (1+2)=3, then you must show that r=refl. Something being a set in this type theory may not be completely trivial to prove. In any case, to do that here completely formally you would need to write down the product types and so on.
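For a concrete feel of the refl part, here's the same phenomenon in Lean 4 syntax (a sketch; note that Lean's Prop is proof-irrelevant, so the set-ness step that HoTT has to prove comes for free there):

```lean
-- 1 + 2 reduces to 3, so bare reflexivity type-checks:
example : 1 + 2 = 3 := rfl

-- N is a set: any two terms r, s of an equality type on it are equal.
-- In Lean this holds definitionally; in HoTT it needs a proof, e.g. via
-- Hedberg's theorem from decidable equality of N.
example (a b : Nat) (r s : a = b) : r = s := rfl
```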

Those are a mighty lot of fancy words y'all are using here. Might you oversimplify it for us common folk rolling around in the dirt?

I'm not the guy you responded to, but I've tried to give a most basic intro to the ideas in pic related to
>It shows the Curry Howard correspondence between proof theory and computability (proofs = programs).
>It's a great place to start with the whole
>Logic ~ Type Theory ~ Category Theory

I also wrote some on
>It gives an intuitive perspective for dealing with intuitionistic logic.
but it's less polished
axiomsofchoice.org/intuitionistic_propositional_logic

(the point of the "even more examples" part is that those are different mathematical frameworks that in reality overlap heavily in syntax)

how do I into lambda calculus, as in what interpreter should I use and what should I read?

cs.kent.ac.uk/people/staff/sjt/TTFP/ttfp.pdf
(pic not necessarily related)

Read the arxiv link here
There are many different kinds of lambda calculus. Those notes cover the three simplest (untyped, typed, System F) versions upon which others are based.

Lambda calculus is a programming language in the sense that there are rules/procedures by which expressions can be converted into simpler expressions ("reduced"). Invoking these rules/procedures effectively acts like performing a computation.

You can find lambda calculus interpreters online that will take an expression as input and reduce it down to a simpler form. However these interpreters aren't really that useful as you won't typically use expressions that are very complicated while learning the theory and since raw lambda calculus doesn't really have strings or numbers (instead it has Church Numerals which are certain lambda expressions that "act" like natural numbers when combined with other lambda expressions that "act" like addition, subtraction, etc..).

Other programming languages include lambda expressions as a feature, however these programming languages are often based on somewhat more complex variants of lambda calculus. Either way, knowing how these basic versions work will get you most of the way to understanding what the more complex versions are capable of.

The main reasons for learning lambda calculus are moreso theoretical than applied (eg. stuff like programming language theory, computability theory, mathematical foundations, formal logic, type theory, category theory, etc...).

Reasons not to learn lambda calculus:
>I just want to make vidya games

thanks

>Reasons not to learn lambda calculus:
>>I just want to make vidya games

b-but what about the Alligator calculus?

worrydream.com/AlligatorEggs/

Thanks a lot. I'll go check it out

jesus christ

Is complex analysis understood from a computational perspective? After all, all implementations of the reals seem to be faux reals

anyone got good resources to learn lambda calc? please post.

ignore this i realised question already answered in thread

This is amazing!

what does this question mean? we understand R, analysis in R and analysis in R^n well and not because we can compute all the numbers in R or because we can visualize R^n

Is there something like a constructive or at least synthetic theory of analytic continuation?

>constructive or at least synthetic theory
I'm not asking what these terms mean but what do you think they mean?

a power series is completely characterized by its coefficients, which in turn are characterized from the derivative of the function.

sequences are easy to represent and compute

I'm speaking of a computational system that would be capable of carrying expressions like exp around (like mathematica does, but I'd want a function that doesn't just understand such general expressions, but is also capable of expressing e.g. differential geometry and topology)

Can you actually go through all the paths on C for a complicated analytic function by just re-evaluating power series in various spots? I guess my question is if e.g. the third coefficient of a series representation of a function on the fifth patch in C can be evaluated exactly (without knowledge of arbitrarily high coefficients on a previous patch)

The first point was maybe a bit fuzzy. I'm thinking about this because it seems that implementation of something like the real numbers in computational frameworks offers several approaches, and so I wonder about the state of the art of (complex) analysis in those systems

>tfw working out fun little math problems in Coq
so cozy

Do you have a link to a list of problems/exercises as an intro to coq?

Seems neat I would like to experiment with it a little.

I don't know of any off the top of my head. I found this: coq.inria.fr/tutorial-nahas

but it's more guided like a traditional tutorial than as a mere list of exercises (which may be a plus or minus depending on your preference).

A general roadmap I would suggest (once you've learned the basic syntax of Coq):

>basic propositional and first-order logic
example problems: any tautology you can think of. try to see if you can develop an intuition (no pun intended) as to which are constructive and which aren't, and if they aren't see if you can prove them using the law of excluded middle.

A fun example is the double-negated LEM (forall P : Prop, ~~(P \/ ~ P)) which is constructively valid, but has a somewhat tricky proof.
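That tricky proof is short enough to write out. Under Curry-Howard it is literally a continuation-style program; in Lean 4 syntax (a sketch, the Coq version is analogous):

```lean
-- ¬¬(P ∨ ¬P): from a refutation k of (P ∨ ¬P), derive absurdity.
-- Feed k the right branch; if a p : P ever shows up, feed k the left.
example (P : Prop) : ¬¬(P ∨ ¬P) :=
  fun k => k (Or.inr (fun p => k (Or.inl p)))
```

The trick is that k gets used twice, once on each disjunct, which is why no decision about P is ever actually needed.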

>basic type constructors (+ , *, ->)
example problems: define the standard isomorphisms between A+B and B+A, and prove that they are isomorphisms. do the same thing for the product. Try to also show associativity, etc.

>inductive types
example problems: try to prove as much arithmetic from the ground up that you can, i.e. prove associativity, commutativity, cancellativity, etc of addition and multiplication on natural numbers and whatever else you can think of using only lemmas you've previously proven.

for something a bit trickier, prove that there are no strictly descending functions from nat to nat.

you can also play around with some other inductive types. for example, define a function on finite lists which reverses them, and show that it is an involution. define a sum operation on finite lists of natural numbers and show that it is invariant under your reversing operation, etc.
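For reference, here are Python analogues of those list exercises (in Coq you would prove the properties; here we can only spot-check them):

```python
# Reverse by consing onto an accumulator, like a left fold.
def rev(xs):
    out = []
    for x in xs:
        out = [x] + out
    return out

# Properties to prove in Coq:
#   involution:     rev(rev(xs)) == xs
#   sum invariance: sum(rev(xs)) == sum(xs)
```

The point of the exercise is that in Coq these comments become theorems, typically by induction on xs with a helper lemma about the accumulator.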

>further
once you've gotten the hang of most of Coq (and assuming that you're still interested at this point) think of basic lemmas and theorems whenever they come to mind and try to state and prove them in Coq. If they're non-constructive, see if you can modify them in order to make them constructive.

You have 10 seconds to convince me to learn your memery.

There are two kinds of reals in constructive set theory, Dedekind reals and Cauchy reals. I imagine the distinction would extend to the complex numbers. In general there are finer distinctions that come up, but everything will probably work fine if you're careful.

Thanks bro, this is very helpful.

I'm sceptical. Say you got the coefficient list where every entry equals one. It converges to 1/(1-z) on the unit disk and continues as such e.g. to z_0 = 7+5i, around which you can develop another series. But it seems that to evaluate the coefficients you must plug into 1/(1-z), and not into the series, which has no limit at z_0

the series for the exponential is well defined without difficulty in constructive predicative topology, but it was done informally and nothing is published so far.

On the other hand, you can take plenty of locales as an exponent of another locale

I have no idea about continuation.

>To show that N is a set is to show that any equality type on it has never two or more different terms, i.e. if r is a generic term of e.g. (1+2)=3, then you must show that r=refl. Something being a set in this type theory may be not completely trivial to prove. In any case, to do that here completely formally you would need to write down the product types and so on.

Let's do it informally. Let a, b be terms of type N and let p, q be terms of type a = b. We want to show that p and q are the same term, in this case the term refl. I'm not sure how to proceed from here.

>Let's do it informally. Let a, b be terms of type N and let p, q be terms of type a = b. We want to show that p and q are the same term, in this case the term refl. I'm not sure how to proceed from here.

I don't see how the proofs p and q can be always be the same. For example, I can always take p and create a new proof p' that swaps two steps in the proof p or just adds and removes some useless steps in p.

How complicated it is to make a proof depends on the specification of the type, and then on what the compiler can do automatically. And, thus, when two proofs differ is determined by the system.
You basically say that two proofs can differ by some silly addition, so how can there be only one proof?
E.g.
>You may proof that (1+2)^2 equals 10-1 by reducing the left side to 9 and then the right to 9,
>or by reducing the left side to 9, proving Fermats little theorem and then going back to reducing the right hand side to 9!
But a proof corresponds to a term of a type, and those are specified exactly, and the equality is also just a type.

Consider the logical truth
(¬A ∧ ¬B) ⟹ ¬(A∨B)
>If 'A isn't true' as well as 'B isn't true', then 'either A or B is true' is false.

In terms of types, using the notation

A⟹B ≡ A → B (functions taking terms a:A to a term of B)
¬A ≡ A → 0 (functions from A into the empty type 0, which has no terms)
A ∧ B ≡ A × B (pairs (a,b) )
A ∨ B ≡ A + B (tagged terms such as left(a) or right(b))

this translates to the type

((A → 0) × (B → 0)) → (A+B) → 0

Look at OP's pic again, in particular the LAM rule. It reads
[math] \dfrac {x\,:\,X\, \vdash\, t\, : \, T } { (\lambda x. t)\, :\, X\to T } [/math]
"if you can get a term t of T, possibly using an x of X, you can get a function term of X→T written with the lambda notation"

To construct a term of X → T, corresponding to an implication, we can only follow this rule. This is what it means to get a term of this type
(remark: and for general equality, the rules are much harder - that's why Martin Löf only did it in the 70's. However, as an example, the equality for numbers, as you e.g. see in is much simpler)

The constructor for the A+B type goes like
"if you got either a:A or b:B, then you get a term of A+B, and it is known whether it's in the left or right slot and what it was"

(cont.)
That is A+B comes with two functions

left: A → A+B
right: B → A+B

so that e.g.
a:A ⊢ left(a) : A+B
b:B ⊢ right(b) : A+B

and A × B is dual to that: There are two functions

π1 : A × B → A
π2 : A × B → B

With this, we can prove
(¬A ∧ ¬B) ⟹ ¬(A∨B)
>If 'A isn't true' as well as 'B isn't true', then 'either A or B is true' is false.

as follows:

( λp. λc. if (c:A) do (π1(p))(c) else (π2(p))(c) ) : ((A → 0) × (B → 0)) → (A+B) → 0

The proposition (¬A ∧ ¬B) ⟹ ¬(A∨B) is true because we have a constructive argument, namely
λp. λc. if (c:A) do (π1(p))(c) else (π2(p))(c).

In words:
Assuming you have a reason (a pair p: ((A → 0) × (B → 0)) ) for both A and B being absurd, then assuming we have a reason to believe that either A or B holds (a term c:A+B), then in the case c shows A, consider how c and the reason π1(p) play together (evaluating π1(p) at c leads to absurdum), and in case c shows B, do the same with π2(p).
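The same term can be written in Lean 4 syntax for comparison (a sketch; the informal `if (c:A)` becomes a match on the two constructors of ∨):

```lean
example (A B : Prop) : (¬A ∧ ¬B) → ¬(A ∨ B) :=
  fun p c => match c with
    | Or.inl a => p.1 a   -- c shows A: use the first refutation π1(p)
    | Or.inr b => p.2 b   -- c shows B: use the second refutation π2(p)
```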
Pic related is an implementation of the pullback I did in Idris some months ago, just for fun. You may be able to read it even if you don't know pullbacks.

(cont.)

The compiler checks the type for the term "a_solution", but stops when it gets to the term "will_not_compile".

What does the type checker do here?

I define the pullback type

N x_{f,g} N

where
f : N -> N
is muliplication by 8 and
g: N -> N
is squaring.

To construct a term (this is like the rule in OPs pic) I, per definition, require that one passes two terms a:N and b:N, and moreover a proof that f(a)=g(b), which is what it means to be a pullback for such setty structures.

Now the completely naive "=" in Idris is a function mapping two terms a, b to a type a=b that only has one way of constructing a term. The rule is that
x=x
has a term Refl, i.e. reflexivity.

The point here is that the Idris compiler accepts
Refl : (3+4=7)
because the compiler already reduces 3+4 to the normal form 7.

And in turn, 8*3=5^2 fails to compile, and the expressive type system saved me from accidentally having a datum (3,5) used where it's not sensible to have one.
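The same check in Lean 4 syntax, for comparison:

```lean
-- 3 + 4 reduces to the normal form 7, so bare reflexivity is accepted:
example : 3 + 4 = 7 := rfl

-- `example : 8 * 3 = 5 ^ 2 := rfl` would be rejected at compile time:
-- 24 and 25 have different normal forms, so no Refl-style term exists.
```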

Homotopy type theory goes much further by having complicated general typing rules for =, and
p,q : a=b
let you consider the type p=q.
In general, you then have to do something called path induction, which is similar to induction n to n+1 in logic.
But that's just the general theory. If you set up particular types (i.e. homotopy types i.e. groupoids) then you may introduce rules to construct terms of a=b and p=q that are very simple.

If, in particular, you allow only one way of setting up a term of equality (like Refl), or if you can show from your theory/type that for p: a=b you can always find a term
f(p) : p=Refl
then you have shown there is essentially only one term of a=b, i.e. that p=q holds.

In my example in the pic above, N is a set because I've made equality so trivial as only to have Refl.
If you read the Z example with the Descartes meme pic, you'll see that the equality (path space) there has more structure. In fact, the path space is how Z is modeled and it thus has a countably infinite number of terms.

(remark: in the above implementation of the pullback, I require for any (a,b) that f(a)=g(b), and a proof of that. But I don't require a proof of uniqueness, i.e. a function u. That's of course also doable, but tiresome as fuck. With a potential general purpose dependently typed language, you ought to choose how strict you want your world. It's a trade-off between safety and practicality)

So in pic related I've written down some computational rules for equality as in Martin-Löf type theory.
You'd now want to find a map

[math] \prod_{p:a=_Nb } \prod_{q:a=_Nb } p=_{a=b}q [/math]

i.e. a map from [math] (a=_Nb)^2 [/math] to
[math] p=_{a=b}q [/math] and you'll be required to make use of properties of [math] =_N [/math], however you defined it.
It's done in the HoTT book

>https:// arxiv. org/abs/0804.3434

Thank you for making the post worth it like a few people always do.

>It's done in the HoTT book

I'll have to check. I guess I'm so used to doing proofs via tactics in Coq that I never get to see the underlying proof term in the background, so I don't know, when I refactor a proof, whether the proof term has changed or not. I'll play around and see if I can get two different proof terms for e.g. 1 + 2 + 3 = 6.

Since Coq and Idris both have a comparably simple notion of equality and we don't have any language that has implemented HoTT, it's hard to show how things differ.

As a remark, I pretty much lost interest in dependent types for lack of a practical language and lack of a large enough community.
That's not to say I don't love the idea, just that I literally see WWIII happening sooner