[-] Hawk@lemmynsfw.com 6 points 2 years ago

They were running inference with a CNN on a mobile device? I have no idea, but that would be costly, battery-wise at least.

[-] didnt_readit@sh.itjust.works 1 points 2 years ago* (last edited 2 years ago)

They’ve been doing ML locally on devices for like a decade. Since way before all the AI hype. They’ve had dedicated ML inference cores in their chips for a long time too which helps the battery life situation.

[-] Hawk@lemmynsfw.com 1 points 2 years ago

It can't quite be a decade; a decade ago we'd only just gotten VGG. But sure, broad strokes, they've been doing local stuff, cool.
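The battery-cost question above can be made concrete with a rough back-of-envelope sketch. All the numbers here are illustrative assumptions (a VGG-16-scale workload of roughly 15.5 GMACs per image, and plausible-order-of-magnitude energy-per-MAC figures for a mobile CPU versus a dedicated inference core), not measurements of any particular device:

```python
# Back-of-envelope energy cost of CNN inference on a phone.
# Every constant below is an illustrative assumption, not a measurement.

GMAC_PER_INFERENCE = 15.5e9  # ~VGG-16 multiply-accumulates per image (assumption)
PJ_PER_MAC_CPU = 70.0        # assumed picojoules per MAC on a mobile CPU
PJ_PER_MAC_NPU = 2.0         # assumed picojoules per MAC on a dedicated ML core
BATTERY_WH = 12.0            # assumed phone battery capacity in watt-hours

def joules_per_inference(pj_per_mac: float) -> float:
    """Energy for one forward pass, in joules (1 pJ = 1e-12 J)."""
    return GMAC_PER_INFERENCE * pj_per_mac * 1e-12

def inferences_per_battery(pj_per_mac: float) -> float:
    """How many forward passes one full battery charge could fund."""
    battery_joules = BATTERY_WH * 3600  # Wh -> J
    return battery_joules / joules_per_inference(pj_per_mac)

# Under these assumptions, a CPU forward pass costs about 1 J, while a
# dedicated inference core costs a few hundredths of a joule, so the
# NPU stretches a charge by well over an order of magnitude.
print(joules_per_inference(PJ_PER_MAC_CPU))
print(inferences_per_battery(PJ_PER_MAC_NPU) / inferences_per_battery(PJ_PER_MAC_CPU))
```

The point of the sketch is only the ratio: dedicated inference hardware spends far less energy per MAC than a general-purpose CPU, which is why on-device ML doesn't have to wreck battery life.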

this post was submitted on 29 Mar 2024
1025 points (98.0% liked)

Curated Tumblr

For preserving the least toxic and most culturally relevant Tumblr heritage posts.
