Half as Hot
(mander.xyz)
But °C was mentioned in the units, and it's well understood that 0°C is a cold temperature for humans.
I'm not a fan of marketing doublespeak either, but I think the right scale and right terminology was used here. They cut the temperature in half, in Celsius, on the basis that 0°C is very cold.
That's where the physics comes in. If the temperature is "halved" in Celsius from 70° to 35°, but in your case the temperature starts at 100°, removing the same amount of energy would only bring the temperature down to about 65°, not 50°.
The specific cooling capacity of the cooler in question only "halves" the temperature if you start at one very specific point.
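A minimal sketch of that point, assuming the cooler removes a fixed amount of heat and the heat capacity is constant (so the drop in degrees is the same from any starting point):

```python
# A cooler that takes a CPU from 70 °C to 35 °C produces a 35-degree
# drop. Under the constant-heat-capacity assumption, the same drop
# applies from any starting temperature -- but it only looks like
# "halving" when the start happens to be 70 °C.

DROP_C = 70 - 35  # the 35-degree drop implied by the quoted figures

for start_c in (70, 100, 40):
    end_c = start_c - DROP_C
    print(f"{start_c} °C -> {end_c} °C (ratio {end_c / start_c:.2f})")
```

From 70° the ratio is exactly 0.5; from 100° you land at 65°, and from 40° you land at 5°, so the "half" framing falls apart everywhere except the one advertised starting point.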
My entire argument rests on the premise that 0°C is a rational start point for both C and F, but I concede that halving something doesn't explain absolute changes.
But Celsius isn't an absolute scale, so "halving" it is disingenuous. Your argument requires the consumer/reader to make a number of inferences or assumptions, which isn't a good method of communication in general. It is perfectly valid to say that the cooler took CPU temperatures from 70°C to 35°C.
Why not just say that? It's an impressive stat!
Scales exist for a reason. Cutting 70°C in half on an absolute scale (kelvin) gives roughly -101.6°C. But let's assume somehow everyone is on the same page and that anything below 0°C should just be ignored in this specific scenario and not any other (confusing, right?). Saying the temperature was cut in half is still confusing! Half from where? Did it go from 20°C to 10°C? From 80°C to 40°C? It just doesn't mean anything, and as said before, I would argue just stating the numbers is more impressive and informative.
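The kelvin arithmetic behind that number, as a quick sketch (0°C is an arbitrary zero, so a literal "half" has to be taken on the absolute scale):

```python
# "Half of 70 °C" computed on the absolute (kelvin) scale,
# since the Celsius zero point is arbitrary.

ZERO_C_IN_K = 273.15

def half_in_kelvin(temp_c: float) -> float:
    """Return the Celsius temperature whose absolute (kelvin)
    value is half that of temp_c."""
    return (temp_c + ZERO_C_IN_K) / 2 - ZERO_C_IN_K

print(half_in_kelvin(70.0))  # about -101.6 °C, nowhere near 35 °C
```

So a cooler that literally halved the CPU's temperature would leave it colder than dry ice, which is why the marketing claim only works if the reader silently substitutes "subtracted 35 Celsius degrees" for "halved".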
I agree that the numbers should just speak for themselves
I'd argue here that no one would make this leap or mental calculation; most people would just divide X by 2 and gauge the resulting Y based on their familiarity with the weather.
They still have to make these inferences to understand whether 70 to 35 is a remarkable feat.
If it's 30 / 2 = 15, people would think "Huh, 15 is pretty cool compared to room temperature (~20ish), that's significant." If it's 90 / 2 = 45, people would think "Huh, both 90 and 45 are pretty hot, but it seems like a meaningful reduction nonetheless."
I dunno, maybe I'm overdefending this
All I can say is that in my professional career, where I have to write technical reports and summarize technical information, I would never represent it that way, and I would be concerned if a colleague, customer, or supplier did it, even when communicating to a non-technical audience. I would also call out my employer or management if they ever tried to change the representation of the data to something like this.
That could say more about me than anything else, but that's where I am at.