[-] Lugh@futurology.today 17 points 8 months ago

Silicon computing is starting to run up against hard limits when it comes to energy usage. Bitcoin mining is currently using 2% of the USA's energy. Data centers are projected to consume a third of Ireland's electricity output by 2026.

However, it seems next-generation solutions are on the horizon, and this is one of them. Transitioning computing to energy-efficient new technologies is another front in the war to slow climate change.

[-] NocturnalMorning@lemmy.world 5 points 8 months ago

The solution is right in front of us. Stop burning fossil fuels. We could do it tomorrow, but we don't want to because it would lower people's quality of life and make billionaires less rich, heavy emphasis on the latter.

[-] FartsWithAnAccent@lemmy.world 12 points 8 months ago* (last edited 8 months ago)

Efficiency still either burns less fossil fuel or gets more out of renewables; it helps either way.

[-] exocrinous@lemm.ee 1 points 8 months ago

It wouldn't lower people's quality of life. People who live in medium density neighbourhoods and ride bikes to work have better lives. They sleep better because it's quiet, breathe better because there's less pollution, don't need to go to the gym, and get plenty of sun. Plus, no road rage. If you take public transit, you can read a book on the train. What I'm describing is the way life is supposed to be.

[-] A_A@lemmy.world 3 points 8 months ago* (last edited 8 months ago)

Original source (free access):
https://onlinelibrary.wiley.com/doi/10.1002/advs.202303835
So, if I read it correctly, they do not modify the fiber, so no training information is stored in the fiber.
They do not have light that can learn by itself either ... instead, they notice that a very reproducible noise pattern is created, and they train a machine outside of the optical fiber to recognize which parts of that noise can be interpreted as information ... all of this is in fact very power costly ... ~~and is likely to remain so~~.
Edit: I removed my last statement because I don't want to start pointless bickering.

[-] Lugh@futurology.today 2 points 8 months ago

all of this is in fact very power costly, … and is likely to remain so.

I'm not sure how you arrived at that conclusion. The direct quotes from the actual researchers say the opposite.

[-] A_A@lemmy.world 1 points 8 months ago

and is likely to remain so.

Well, in fact I don't care at all about that last statement of mine. So, if that's all you disagree with in my reading of the article, then that's fair enough for me.

[-] BluesF@lemmy.world 2 points 8 months ago

It's significantly less computationally costly, however, because you only need to train and run a small linear output transformation rather than a full nonlinear neural network.
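
Roughly, the idea looks something like this (a toy sketch, not the paper's code: the fiber is stood in for by a fixed random nonlinear transform, and all names, dimensions, and data here are made-up assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_inputs, n_speckle = 1000, 64, 512

# Stand-in for the fiber: a fixed random projection plus a nonlinearity.
# In the paper this step happens optically and is never trained or modified.
W_fiber = rng.normal(size=(n_inputs, n_speckle))

def fiber_response(x):
    """Simulated 'speckle' intensity pattern for input x (illustrative only)."""
    return np.abs(np.tanh(x @ W_fiber))

# Synthetic data for the digital readout.
X = rng.normal(size=(n_samples, n_inputs))
y = rng.integers(0, 10, size=n_samples)   # 10 fake classes
Y = np.eye(10)[y]                         # one-hot targets

# Record the fiber's output for each input (the "reproducible noise pattern").
H = fiber_response(X)

# The only trained component: a linear readout solved in closed form
# (ridge regression) -- no backpropagation through the fiber is needed.
lam = 1e-3
W_out = np.linalg.solve(H.T @ H + lam * np.eye(n_speckle), H.T @ Y)

pred = (H @ W_out).argmax(axis=1)
print("train accuracy:", (pred == y).mean())
```

The point of the sketch is just the cost comparison: the expensive nonlinear mixing happens in the fiber for "free", and the digital side only fits one linear layer, which can even be solved in closed form instead of with iterative gradient descent.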
