Can you think of any now? (piefed.cdn.blahaj.zone)
[-] wer2@lemmy.zip 18 points 6 days ago

When I was in school, we were taught that vaccines work. /s

[-] yabai@lemmy.world 11 points 6 days ago

Oh I've got a good one. Learned in the American South. Supposedly the American Civil War was not fought over slavery, but over differing railroad track widths. Slavery was a minor detail, a scapegoat for the North to force the South to use its standard railroad width.

[-] prime_number_314159@lemmy.world 5 points 6 days ago

It's not just about slavery. There were also states' rights (to slavery), the economic disparity (turns out free men work harder than slaves?!), and a clash of religious ideals (people who interpreted the Bible as pro-slavery vs. people who believed benevolence requires abolition). There were even one or two spots where water usage rights and federal funding were in controversy.

[-] kunaltyagi@programming.dev 3 points 6 days ago* (last edited 6 days ago)

The American South (several institutions, not necessarily the ppl) will frame any minor issue as the root cause of the Civil War, except for the slavery issue.

After all, slavery and racism weren't that ingrained in society. If you look past the court cases, extrajudicial killings, lynchings, riots, coups, and massacres.

[-] Professorozone@lemmy.world 12 points 6 days ago

Sure, some are still taught. Like you can catch a cold from being in the cold.

[-] Alteon@lemmy.world 4 points 6 days ago

I always understood it as: your immune system gets weaker from being in the cold, which makes it easier for viruses and such to propagate in your body. We're constantly fighting off minor infections and disease, and thankfully our immune systems are pretty strong... cold does not help.

I'd say this one is sort of true...in the right context.

[-] Professorozone@lemmy.world 3 points 6 days ago

My wife likes to say that so she can keep believing that you can catch a "cold."

No cold virus. No cold.

[-] bitwolf@sh.itjust.works 3 points 6 days ago

I thought the opposite: that when you're cold, your body releases norepinephrine, which reinforces your immune system.

Which makes sense to me from personal experience. I like to run around in the snow in a t-shirt and shorts and embrace the cold. I very rarely catch colds, and I always thought it was genetics and not a product of environment.

[-] Srootus@sh.itjust.works 2 points 6 days ago

My mum says this all the time; I haven't the heart to correct her though.

[-] Adderbox76@lemmy.ca 10 points 6 days ago

When I graduated highschool, the idea that some dinosaurs had feathers and evolved into birds was still "fringe science".

[-] Srootus@sh.itjust.works 9 points 6 days ago

In my mock-GCSE year(s), my science teacher was so confident that blood was blue in the veins. I called her out on it, but she was adamant about it.

[-] MrSulu@lemmy.ml 6 points 6 days ago

"Yeah, but they ain't disproved my belief in the flat earth" (sarcasm because crappy day in work)

[-] Adalast@lemmy.world 2 points 5 days ago

Hope things get better man, or whatever idiot manager you have gets caught with his hand in the boss's daughter's cookie jar.

[-] MrSulu@lemmy.ml 1 points 5 days ago

Cheers mate. Much appreciated.

[-] MacNCheezus@lemmy.today 5 points 6 days ago

Unironically, that sounds like a great task for AI.

[-] SanguineBrah@lemmy.sdf.org 7 points 6 days ago

Great for automatically generating falsehoods; this is true.

[-] MacNCheezus@lemmy.today 1 points 5 days ago

Well, I let you be the judge. Here's a list of outdated facts that were commonly taught before the year 2000 but have since been updated, courtesy of ChatGPT:

Science / Space

Pluto is a planet.

Back then, Pluto was still the 9th planet. In 2006, the International Astronomical Union reclassified it as a “dwarf planet.”

The universe’s expansion was slowing down.

Many textbooks still suggested the universe might eventually collapse in a “Big Crunch.” In 1998, evidence of accelerating expansion was found, but it hadn’t fully filtered into school curricula by 2000.

Dinosaurs were cold-blooded and scaly.

In 2000, the “feathered dinosaur” revolution was just starting. Today, we know many theropods (including raptors) had feathers and were likely warm-blooded.

The continents “drift” slowly but are mostly stable now.

Continental drift was taught, but the understanding of plate tectonics was less developed in school-level detail. We now know tectonic activity reshapes Earth far more dynamically than was often taught.

Biology / Medicine

The human genome was incomplete.

In 2000, the Human Genome Project had just released its first draft. Many textbooks underestimated how complex genetics really is — for example, they suggested humans had ~100,000 genes, but it’s actually about 20,000.

Ulcers are caused by stress and spicy food.

That was the classic teaching. By the 1990s, scientists had already shown that ulcers are often caused by H. pylori bacteria, but the update wasn’t in most classrooms yet.

“Junk DNA” does nothing.

The idea that noncoding DNA was useless filler was common. Now we know much of it plays regulatory or structural roles.

History / Social Studies

The internet is a fad.

You may have heard skepticism about the internet being overhyped. Few predicted how deeply it would transform society in just two decades.

Christopher Columbus “discovered America.”

By 2000, it was still widely taught that Columbus “discovered” the New World, though evidence of Norse settlements (like at L’Anse aux Meadows) was already known — just not widely emphasized. Now, school curricula are far more likely to teach about Indigenous civilizations and earlier arrivals.

The Great Wall of China is the only man-made object visible from space.

This “factoid” was common in classrooms, but it’s false. The wall is not easily visible from orbit without aid, while cities, roads, and airports often are.

[-] degoogler@lemmy.zip 2 points 6 days ago

In an atom, the electrons orbit around the nucleus in the same manner as the planets orbit around the sun.

That's been debunked for many, many decades, but middle school still teaches this model. At least, I wasn't told back then how misleading and wrong it is; only in high school, right before graduating, did the physics teacher emphasize this misconception. I remember how mad she was about it lol. I have no clue how it's taught elsewhere.

[-] Adalast@lemmy.world 4 points 5 days ago* (last edited 5 days ago)

The Bohr model is at least a useful simplification of the atomic structure. What needs to be taught is that everything you learn before college and intensive, narrowly topical courses is simplified to the point of being incorrect, with the hope that you get enough of an intrinsic understanding of the concept that the less simplified explanation you get next will make sense. I say this because it will still be simplified to the point of being wrong, but it will be a step closer to the truth. This is the essence of education.

  • Elementary/middle school: ice is water that has frozen solid
  • HS: ice is water that has lost enough energy that the molecules form a crystalline lattice.
  • College: there are actually 19 or 20 kinds of water ice that have been verified, but as many as 74,963 might exist.
  • Post-collegiate: There may be 74,963 kinds of ice, but I know one ICE we should definitely eliminate from this world.
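The Bohr model's "useful simplification" point can be made concrete with a quick sketch: the model's orbits are physically wrong, yet its energy-level formula still predicts hydrogen's spectral lines well. This is an illustrative snippet, not from the thread; the constant and function names are mine.

```python
# Bohr-model hydrogen energy levels: wrong about orbits, right about energies.
# E_n = -13.6 eV / n^2, with n the principal quantum number.

RYDBERG_EV = 13.6057  # approximate hydrogen ionization energy in eV

def bohr_energy(n: int) -> float:
    """Energy of hydrogen level n in the Bohr model, in eV."""
    if n < 1:
        raise ValueError("principal quantum number must be >= 1")
    return -RYDBERG_EV / n**2

def transition_ev(n_from: int, n_to: int) -> float:
    """Photon energy released when an electron drops from n_from to n_to."""
    return bohr_energy(n_from) - bohr_energy(n_to)

# The n=3 -> n=2 transition (the H-alpha line) comes out near 1.89 eV,
# matching the observed red line at ~656 nm.
print(round(transition_ev(3, 2), 2))  # 1.89
```

That a discarded model still computes real spectra is exactly why it survives as a teaching step.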
[-] missfrizzle@discuss.tchncs.de 3 points 6 days ago* (last edited 6 days ago)

I was taught that serious academics favored Support Vector Machines over Neural Networks, which industry only loved because they didn't have proper education. oops...

also, Computer Vision was considered "AI-complete" and likely decades away. ImageNet dropped a couple of years after I graduated. though I guess it ended up being "AI-complete" in a way...

[-] bluemellophone@lemmy.world 2 points 6 days ago* (last edited 6 days ago)

Before AlexNet, SVMs were the best algorithms around. LeNet was the only comparable success case for NNs back then, and it was largely seen as exclusively limited to MNIST digits because deep networks were too hard to train. People used HOG+SVM, SIFT, SURF, ORB, older Haar / Viola-Jones features, template matching, random forests, Hough Transforms, sliding windows, deformable parts models… so many techniques that were made obsolete once the first deep networks became viable.

The problem is your schooling was correct at the time, but the march of research progress eventually saw 1) the creation of large, million-scale supervised datasets (ImageNet) and 2) larger / faster GPUs with more on-card memory.

It was taken as fact back in ~2010 that SVMs were superior to NNs in nearly every aspect.

Source: started a PhD on computer vision in 2012

[-] missfrizzle@discuss.tchncs.de 2 points 5 days ago

HOG and Hough transforms bring me back. honestly glad that I don't have to mess with them anymore though.

I always found SVMs a little shady because you had to pick a kernel. we spent time talking about the different kernels you could pick but they were all pretty small and/or contrived. I guess with NNs you pick the architecture/activation functions but there didn't seem to be an analogue in SVM land for "stack more layers and fatten the embeddings." though I was only an undergrad.

do you really think NNs won purely because of large datasets and GPU acceleration? I feel like those could have applied to SVMs too. I thought the real win was solving vanishing gradients with ReLU and expanding the number of layers, rather than throwing everything into a 3- or 5-layer MLP, preventing overfitting, making the gradient landscape less prone to bad local minima, and enabling hierarchical feature extraction to be learned organically.
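The kernel choice being discussed is easy to show in a few lines: a kernel computes an inner product in some feature space without ever building the features, and your only knobs are its handful of parameters. A minimal sketch (function names are mine, not any library's):

```python
import math

# k(x, z) = <phi(x), phi(z)> for some implicit feature map phi.
# Unlike a deep net's "stack more layers", an SVM's expressiveness
# lives almost entirely in this one choice of kernel.

def linear_kernel(x, z):
    # phi is the identity: plain dot product.
    return sum(a * b for a, b in zip(x, z))

def poly_kernel(x, z, degree=2, c=1.0):
    # phi maps to all monomials up to the given degree.
    return (linear_kernel(x, z) + c) ** degree

def rbf_kernel(x, z, gamma=0.5):
    # phi is infinite-dimensional; gamma controls locality.
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

x, z = [1.0, 0.0], [0.0, 1.0]
print(linear_kernel(x, z))         # 0.0: orthogonal, no linear similarity
print(rbf_kernel(x, x))            # 1.0: identical points
print(round(rbf_kernel(x, z), 3))  # 0.368: RBF still sees some similarity
```

The "shady" part is real: the Gram matrix over training points is all the learner ever sees, so two kernels with similar-looking formulas can induce very different decision boundaries.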

[-] bluemellophone@lemmy.world 2 points 5 days ago* (last edited 5 days ago)

No, you are correct. Hinton began researching ReLUs in 2010 and his students Alex Krizhevsky and Ilya Sutskever used it to train a much deeper network (AlexNet) to win the 2012 ILSVRC. The reason AlexNet was so groundbreaking was because it brought all of the gradient optimization improvements (SGD with momentum as popularized by Schmidhuber, and dropout), better activation functions (ReLU), a deeper network (8 layers), supervised training on very large datasets (necessary to learn good general-purpose convolutional kernels), and GPU acceleration into a single approach.

NNs, and specifically CNNs, won out because they were able to create more expressive and superior image feature representations over the hand-crafted features of competing algorithms. The proof was in the vastly better performance; it was a major jump at a time when performance on the ILSVRC was becoming saturated. Nobody was making nearly +10% improvements on that challenge back then. It blew everybody out of the water and made NNs and deep learning impossible to ignore.

Edit: to accentuate the point about datasets and GPUs, the original AlexNet developers really struggled to train their model on the GPUs available at the time. The model was too big and they had to split it across two GPUs to make it work. They were some of the first researchers to train large CNNs with GPUs. Without large datasets like the ILSVRC they would not have been able to train good deep hierarchical convolutions, and without better GPUs they wouldn’t have been able to make AlexNet sufficiently large or deep. Training AlexNet on CPU only for ILSVRC was out of the question, it would have taken months of full-tilt, nonstop compute for a single training run. It was more than these two things, as detailed above, but removing those two barriers really allowed CNNs and deep learning to take off. Much of the underlying NN and optimization theory had been around for decades.
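The vanishing-gradient point about ReLU can be demonstrated numerically: a deep network's gradient is a product of per-layer derivatives, sigmoid's derivative tops out at 0.25, and ReLU's is 1 on the active side. A toy sketch under those assumptions (no real network, just the chain-rule product):

```python
import math

# Backprop multiplies one derivative factor per layer. Sigmoid's
# derivative s(x)(1 - s(x)) peaks at 0.25 (at x = 0), so even its
# best case shrinks the gradient geometrically with depth; an active
# ReLU unit passes the gradient through unchanged.

def sigmoid_grad(x):
    s = 1.0 / (1.0 + math.exp(-x))
    return s * (1.0 - s)

def relu_grad(x):
    return 1.0 if x > 0 else 0.0

depth = 8  # AlexNet-scale depth
sig_product = 1.0
relu_product = 1.0
for _ in range(depth):
    sig_product *= sigmoid_grad(0.0)  # sigmoid's best case: exactly 0.25
    relu_product *= relu_grad(1.0)    # an active ReLU unit: exactly 1.0

print(sig_product)   # 0.25**8, about 1.5e-05: signal nearly gone
print(relu_product)  # 1.0: signal intact
```

Eight layers is mild; at the depths that came after AlexNet, the sigmoid product is effectively zero, which is why ReLU (plus the data and GPU factors above) was a package deal rather than a single cause.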

this post was submitted on 23 Sep 2025
1742 points (98.7% liked)

Science Memes
