Lambda Calculus – A Simple, Practical Summary
Single-Step vs Multi-Step Reduction
→r means a single reduction step.
→→r means zero, one, or many reduction steps.
How Multi-Step Reduction Works
- If A →r B, then A →→r B (one step counts).
- A →→r A (zero steps).
- If A →→r B and B →→r C, then A →→r C (steps chain together).
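The three rules above can be sketched in Python by iterating a one-step function until no step applies (the step here is "subtract 1", matching the analogy used later in this summary; the function names are illustrative, not standard):

```python
def one_step(term):
    """One hypothetical reduction step: 'subtract 1', with 0 as the
    normal form (no further step applies)."""
    return term - 1 if term > 0 else None  # None means: no step possible

def multi_step(term, max_steps=1000):
    """Zero, one, or many steps: keep stepping until no step applies."""
    for _ in range(max_steps):
        nxt = one_step(term)
        if nxt is None:    # already a final result: zero further steps
            return term
        term = nxt         # chain the steps together
    raise RuntimeError("step budget exhausted (possible infinite reduction)")
```

For example, multi_step(3) performs the steps 3 → 2 → 1 → 0, while multi_step(0) returns 0 after zero steps, matching the "zero, one, or many" reading of →→r.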
Why Multi-Step Reduction Matters
Multi-step reduction lets us talk about full computations, not just single moves.
It helps answer:
- Does this expression eventually finish?
- Does it loop forever?
- Do different reduction paths reach the same result?
Python Analogy (Very Simple)
Imagine a reduction step is “subtract 1”.
def one_step_r(x):
    return x - 1  # one reduction step

def many_steps_r(x, steps):
    return x - steps  # many reduction steps
This mirrors the idea that:
- →r = one step
- →→r = zero, one, or many steps
Strong vs Weak Normalisation
- Weakly normalising: at least one reduction path reaches a final result (a normal form).
- Strongly normalising: every possible reduction path reaches a normal form.
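The difference shows up in programming as evaluation order. A rough Python sketch (loop and const are made-up illustrative names): if a function ignores an argument that loops forever, then the path that skips evaluating the argument finishes, while the path that evaluates it first does not.

```python
import sys
sys.setrecursionlimit(200)  # keep the deliberate infinite loop short

def loop():
    """A computation that never finishes (a non-terminating 'term')."""
    return loop()

def const(x):
    """Ignores its argument and returns a fixed result."""
    return "done"

# Eager path: evaluate the argument first -- this path never finishes.
# const(loop())  # would raise RecursionError (Python's stand-in for looping)

# Lazy path: pass the computation unevaluated -- this path finishes.
print(const(lambda: loop()))  # prints "done"
```

The overall computation is weakly normalising (the lazy path reaches a result) but not strongly normalising (the eager path loops).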
The Omega Term (Ω)
Definition:
Ω = (λx. x x) (λx. x x)
Meaning:
One reduction step turns Ω back into exactly Ω, so it reduces to itself forever and never reaches a final result.
Real-world analogy:
It behaves like an infinite loop in programming.
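Ω can be transliterated almost directly into Python (a sketch only: Python is eager, so "reduces forever" shows up as a RecursionError once the interpreter's stack limit is hit):

```python
import sys
sys.setrecursionlimit(100)  # cut the loop off quickly

# Omega = (lambda x. x x)(lambda x. x x), written in Python:
make_omega = lambda: (lambda x: x(x))(lambda x: x(x))

try:
    make_omega()  # x(x) calls itself with itself: never returns
except RecursionError:
    print("looped forever (cut off by Python's recursion limit)")
```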
The Omega-Sharp Term (Ω#)
Definition:
Ω# = λx. Ω
Meaning:
- Ω# is a function that ignores its input and always returns Ω.
- Since Ω never terminates, calling Ω# on anything also never terminates.
- Unlike Ω, Ω# itself is a lambda abstraction, so it is already in normal form.
Real-world analogy:
A function that looks harmless but always triggers an infinite loop when used.
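The contrast between Ω and Ω# can also be sketched in Python (illustrative only): defining the function terminates immediately, mirroring that Ω# is already in normal form, while calling it unleashes Ω.

```python
import sys
sys.setrecursionlimit(100)  # cut the loop off quickly

# Omega-sharp = lambda x. Omega -- a perfectly well-formed function value:
omega_sharp = lambda _x: (lambda x: x(x))(lambda x: x(x))

print(callable(omega_sharp))  # True: building the function terminates

try:
    omega_sharp(42)           # but any call evaluates Omega and loops
except RecursionError:
    print("calling it never terminates")
```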
Simple Analogy
Think of reduction like walking:
- →r = taking one step.
- →→r = taking as many steps as needed to reach the destination.