And mathematicians divide by multiplying!
In formal definitions of arithmetic, division can be defined via multiplication. As a simplified example with the real numbers: a ÷ 2 is the same as a × 0.5, so if your axioms support multiplication you get division out of them for free (and this works for integers too, the definition is just a bit more involved).
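A toy sketch of that idea in Python (`divide` is a made-up helper, not a standard function), using exact rationals so that a × (1/b) is literal multiplication:

```python
from fractions import Fraction

def divide(a: Fraction, b: Fraction) -> Fraction:
    """Define division purely in terms of multiplication: a / b = a * (1/b)."""
    if b == 0:
        raise ZeroDivisionError("0 has no multiplicative inverse")
    inverse = Fraction(b.denominator, b.numerator)  # the multiplicative inverse of b
    return a * inverse

print(divide(Fraction(3), Fraction(2)))  # 3/2, same as 3 * 0.5
```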
Mathematicians also subtract by adding, by the same logic as division.
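Spelled out as an equation, just restating the same point:

```latex
a - b = a + (-b), \qquad \text{e.g. } 7 - 3 = 7 + (-3) = 4.
```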
a/b is the unique solution x to a = bx, if a solution exists. This definition works for the integers, rationals, reals, and complex numbers.
Defining a/b as a × (1/b) makes sense if you're learning arithmetic, but logically it's more contrived: you then need to define 1/b as the unique solution x to bx = 1, if one exists, which is essentially the first definition anyway.
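A minimal sketch of that first definition over the integers (`int_divide` is a hypothetical name; it uses only multiplication to search for the solution):

```python
def int_divide(a: int, b: int):
    """a / b as the unique integer x with a == b * x, or None if no solution exists."""
    if b == 0:
        return None
    # any integer solution x must satisfy |x| <= |a|, since |b| >= 1
    for x in range(-abs(a), abs(a) + 1):
        if b * x == a:
            return x
    return None

print(int_divide(12, 3))  # 4, because 12 == 3 * 4
print(int_divide(7, 2))   # None: 7/2 is undefined over the integers
```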
That's me, a degree-holding, full-time computer scientist, just learning arithmetic I guess.
Bonus question: what even is subtraction? I'm 99% sure it doesn't exist, since I've never used it; I only ever use addition.
Addition of the additive inverse.
Now you just replaced one incalculable thing with a different incalculable thing.
Eh?
Computers don't subtract, and you can't just add a negative: a computer can't interpret a negative number directly, it can only store a flag that the number is negative. You need a couple of addition tricks (two's complement) to subtract two numbers, so the computer only ever has to add. It's addition all the way down.
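A minimal sketch of that trick, assuming 8-bit words and two's complement (`subtract` is a made-up name):

```python
def subtract(a: int, b: int, bits: int = 8) -> int:
    """Compute a - b using only bit flips and addition (two's complement)."""
    mask = (1 << bits) - 1            # e.g. 0xFF for an 8-bit word
    neg_b = ((b ^ mask) + 1) & mask   # flip every bit of b, then add 1: that's -b
    return (a + neg_b) & mask         # "subtraction" is now a single addition

print(subtract(100, 42))  # 58
print(subtract(5, 9))     # 252, which is -4 wrapped into 8 unsigned bits
```

This is the same trick an ALU uses in hardware: one adder circuit handles both addition and subtraction.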
What does this have to do with computers?