What are the most mindblowing things in mathematics?
(lemmy.world)
submitted 1 year ago* (last edited 1 year ago) by cll7793@lemmy.world to c/nostupidquestions@lemmy.world
What concepts or facts do you know from math that are mind-blowing, awesome, or simply fascinating?
Here are some I would like to share:
- Gödel's incompleteness theorems: In any consistent formal system powerful enough to describe arithmetic, there are true statements that can never be proved within that system, no matter how much time you put into it.
- Halting problem: It is impossible to write a program that can decide, for every input program, whether it loops forever or eventually finishes. (Undecidability)
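The classic proof is a diagonalization: feed any would-be halting decider a program built to do the opposite of whatever the decider predicts. Here is a runnable sketch of that argument; the names (`make_paradox`, `WouldLoopForever`) are my own, and the exception stands in for an actual infinite loop so the demo can finish.

```python
class WouldLoopForever(Exception):
    """Stands in for an actual infinite loop, so the demo terminates."""

def make_paradox(claimed_halts):
    """Build a program that does the opposite of whatever the
    would-be halting decider predicts about it."""
    def paradox():
        if claimed_halts(paradox):   # decider says "it halts"...
            raise WouldLoopForever   # ...so loop forever
        return "halted"              # decider says "it loops"... so halt
    return paradox

# Whatever the decider answers, it is wrong about its own paradox program:
optimist = make_paradox(lambda prog: True)    # predicts "halts"
try:
    optimist()
except WouldLoopForever:
    print("decider said 'halts', program loops: wrong")

pessimist = make_paradox(lambda prog: False)  # predicts "loops"
print(pessimist())                            # halts: wrong again
```

Since the decider was arbitrary, no program can decide halting for all inputs.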
The Busy Beaver function
Now this is the mind-blowing one. What is the largest finite number you know? Graham's Number? TREE(3)? TREE(TREE(3))? This one beats them easily.
- The Busy Beaver function Σ(n) grows faster than any computable function. Its values get so large that no algorithm can compute them in general, no matter how powerful the computer.
- In fact, knowing particular values would settle some of the hardest open problems in mathematics.
- Σ(1) = 1
- Σ(4) = 13
- Σ(6) > 10^10^10^10^10^10^10^10^10^10^10^10^10^10^10 (10s are stacked on each other)
- Σ(17) > Graham's Number
- Σ(27): There is a 27-state Turing machine that halts if and only if the Goldbach conjecture is false, so knowing Σ(27) would settle Goldbach.
- Σ(744): Likewise, a 744-state machine halts if and only if the Riemann hypothesis is false, so knowing Σ(744) would settle it.
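To make the definition concrete: Σ(n) is the most 1s an n-state, 2-symbol Turing machine can leave on an initially blank tape and still halt. Here is a minimal simulator (a sketch, not part of any of the sources above) run on the standard published 2-state champion, which shows Σ(2) = 4:

```python
# Transition table of the 2-state Busy Beaver champion.
# Key: (state, symbol read) -> (symbol to write, head move, next state).
# States are 'A'/'B'; 'H' is the halt state.
CHAMPION_2_STATE = {
    ('A', 0): (1, +1, 'B'),  # write 1, move right, go to B
    ('A', 1): (1, -1, 'B'),  # write 1, move left,  go to B
    ('B', 0): (1, -1, 'A'),  # write 1, move left,  go to A
    ('B', 1): (1, +1, 'H'),  # write 1, move right, halt
}

def run(machine, max_steps=1000):
    """Simulate on an initially blank tape; return (ones written, steps)."""
    tape = {}            # sparse tape: position -> symbol (blank = 0)
    pos, state = 0, 'A'
    for step in range(1, max_steps + 1):
        symbol, move, state = machine[(state, tape.get(pos, 0))]
        tape[pos] = symbol
        pos += move
        if state == 'H':
            return sum(tape.values()), step
    raise RuntimeError("did not halt within the step budget")

print(run(CHAMPION_2_STATE))  # -> (4, 6): four 1s written, halts in 6 steps
```

The catch is the `max_steps` budget: for large n there is no computable bound that tells you how long to wait before declaring "never halts", which is exactly why Σ is uncomputable.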
Sources:
- YouTube - The Busy Beaver function by Mutual Information
- YouTube - Gödel's incompleteness Theorem by Veritasium
- YouTube - Halting Problem by Computerphile
- YouTube - Graham's Number by Numberphile
- YouTube - TREE(3) by Numberphile
- Wikipedia - Gödel's incompleteness theorems
- Wikipedia - Halting Problem
- Wikipedia - Busy Beaver
- Wikipedia - Riemann hypothesis
- Wikipedia - Goldbach's conjecture
- Wikipedia - Millennium Prize Problems - $1,000,000 Reward for a solution
I find the logistic map to be fascinating. The logistic map is a simple equation that surprisingly appears everywhere in nature and social systems, and it is a great example of how complex behavior can emerge from a straightforward rule.

Imagine a population of creatures with limited resources that reproduce and compete for those resources. The logistic map describes how the population size changes from one generation to the next as a function of its current size, and it reveals fascinating patterns. When the population is small, it grows rapidly thanks to ample resources; as it approaches the carrying capacity, competition intensifies and growth slows. For low growth rates the population settles to a stable value, but for higher growth rates it never settles: it oscillates, period-doubles, and eventually becomes chaotic.

This echoes in various real-world scenarios, from the spread of epidemics to traffic jams and economic behavior. Chaotic maps like it have even been used to build pseudorandom number generators, since a deterministic computer can't produce truly random numbers on its own. Veritasium did a good video on it: https://www.youtube.com/watch?v=ovJcsL7vyrk
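The whole map is one line of code, x_{n+1} = r·x·(1 − x), which makes the contrast easy to see for yourself. A quick sketch (parameter values are just illustrative choices, not from the comment above):

```python
def logistic_orbit(r, x0, n):
    """Return the first n iterates of the logistic map x -> r*x*(1-x)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# r = 2.0: the population converges to the fixed point 1 - 1/r = 0.5
print(logistic_orbit(2.0, 0.3, 10))

# r = 3.9: the exact same rule, but the orbit bounces around chaotically
print(logistic_orbit(3.9, 0.3, 10))
```

Same formula, one changed parameter: the first orbit settles down, the second never does.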
I find it fascinating how it permeates nature in so many places. It's a universal constant, but one we can't easily observe.