submitted 6 months ago* (last edited 6 months ago) by ekZepp@lemmy.world to c/technology@lemmy.world

Update: https://www.bleepingcomputer.com/news/microsoft/microsoft-outage-affects-bing-copilot-duckduckgo-and-chatgpt-internet-search/

It's also important to note that ChatGPT internet search and DuckDuckGo are experiencing similar issues because they use the Bing API.

UPDATE 2

(screenshot, 2024-05-23 21:06)

[-] joneskind@lemmy.world 9 points 6 months ago

That’s a nice hobby

I would suggest installing a local instance of an LLM (Mistral or Llama 3, for example) to widen your sources of information. And go straight to Wikipedia instead of "googling" it, if you don't already.

Anyway, I didn't know about Kagi, so I might take my own advice and give it a try.

[-] peopleproblems@lemmy.world 8 points 6 months ago

How big are they, and what do I need to use them well?

[-] vale@sh.itjust.works 3 points 6 months ago* (last edited 6 months ago)

Take a look at Ollama.ai and just follow the installation instructions. A decent GPU is recommended, and the models are around 10 GB each, IIRC.
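For reference, getting started on Linux is roughly this (a sketch based on Ollama's published install script and CLI; exact model names and sizes change over time):

```shell
# Install Ollama (Linux one-liner; macOS/Windows have installers on ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Download and chat with a model locally.
# llama3 at the default quantization is a few GB on disk.
ollama run llama3
```

After the first `ollama run`, the model is cached locally, so subsequent runs start without re-downloading.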

[-] joneskind@lemmy.world 2 points 6 months ago* (last edited 6 months ago)

Most 7B-8B models run just fine at 4-bit quantization and won't use more than 4 or 5 GB of VRAM.

The only metric that really matters is the amount of VRAM, because the model must be loaded into VRAM for fast inference.

You could use the CPU and system RAM instead, but it's painfully slow.
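As a rough back-of-the-envelope check of those numbers (a sketch, not an exact formula; real usage also depends on context length, KV cache, and runtime overhead, so the flat overhead figure here is a guess):

```python
def estimate_vram_gb(params_billions: float, bits_per_weight: int,
                     overhead_gb: float = 1.0) -> float:
    """Rough VRAM estimate: weight storage plus a flat allowance
    for KV cache and activations."""
    # 1B parameters at 8 bits per weight = 1 GB of weights
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb + overhead_gb

# An 8B model in 4-bit quant: 4 GB of weights + ~1 GB overhead
print(estimate_vram_gb(8, 4))  # 5.0
print(estimate_vram_gb(7, 4))  # 4.5
```

That lands right in the "4 or 5 GB of VRAM" range mentioned above for 7B-8B models in 4-bit quants.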

If you got an Apple Silicon Mac it could be even simpler.

[-] veniasilente@lemm.ee 2 points 6 months ago

I have an Intel Celeron Mobile laptop with iGPU and, I think, 256MB VRAM. How many bs does that get me for the LLM?

~~Only half-joking. That's my still-functional old daily driver, now serving as a homelab.~~

[-] joneskind@lemmy.world 2 points 6 months ago

Well, I've got good news and bad news.

The bad news is you won't do shit with that, my dear friend.

The good news is that you won't need it because the duck is back.

this post was submitted on 23 May 2024
878 points (97.7% liked)