
In June, the U.S. National Archives and Records Administration (NARA) gave employees a presentation and tech demo called “AI-mazing Tech-venture” in which Google’s Gemini AI was presented as a tool archives employees could use to “enhance productivity.” During a demo, the AI was queried with questions about the John F. Kennedy assassination, according to a copy of the presentation obtained by 404 Media using a public records request.  

In December, NARA plans to launch a public-facing AI-powered chatbot called “Archie AI,” 404 Media has learned. “The National Archives has big plans for AI,” a NARA spokesperson told 404 Media. “It’s going to be essential to how we conduct our work, how we scale our services for Americans who want to be able to access our records from anywhere, anytime, and how we ensure that we are ready to care for the records being created today and in the future.”

Employee chat logs from the presentation show that National Archives employees are concerned about the prospect of AI tools being used in archiving, a practice inherently concerned with accurately recording history.

One worker who attended the presentation told 404 Media, “I suspect they're going to introduce it to the workplace. I'm just a person who works there and hates AI bullshit.”

wizardbeard@lemmy.dbzer0.com | 24 points | 1 month ago (last edited 1 month ago)

Scanning text is OCR, and OCR has never needed modern LLMs integrated into it to achieve amazing results.
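For illustration, this is roughly what that already looks like with off-the-shelf tooling. A minimal sketch, assuming Tesseract and the pytesseract wrapper are installed; the file name is a made-up placeholder, not anything from the article:

```python
# Minimal sketch: classic OCR with Tesseract via pytesseract -- no LLM anywhere.
# "scanned_page.png" is a placeholder file name for illustration only.
from PIL import Image
import pytesseract

# image_to_string runs Tesseract on the image and returns the extracted plain text.
text = pytesseract.image_to_string(Image.open("scanned_page.png"))
print(text)
```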

Automated tagging gets closer, but there is a metric shit ton that can be done in that regard using incredibly simple tools that don't use an egregious amount of energy or hallucinate.
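As a rough idea of what those simple tools can be, even plain keyword matching already produces useful tags. A sketch only; the tag vocabulary and sample text below are invented for illustration, not anything NARA actually uses:

```python
# Minimal sketch: rule-based auto-tagging by keyword matching -- cheap, deterministic,
# and incapable of hallucinating. The tag vocabulary here is purely illustrative.
TAG_KEYWORDS = {
    "military": ["regiment", "enlistment", "draft card"],
    "immigration": ["naturalization", "passenger list", "port of entry"],
    "presidential": ["executive order", "white house", "inauguration"],
}

def tag_document(text: str) -> list[str]:
    """Return every tag whose keywords appear anywhere in the document text."""
    lowered = text.lower()
    return [
        tag for tag, keywords in TAG_KEYWORDS.items()
        if any(keyword in lowered for keyword in keywords)
    ]

print(tag_document("Passenger list and naturalization papers, port of entry: New York."))
# -> ['immigration']
```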

There is no way in hell that they aren't already doing these things. The best use cases for LLMs at NARA are edge cases of things mostly covered by existing tech.

And you and I both know this is going to give Google exclusive access to National Archives data. New training data that isn't tainted by potentially being LLM output is an insanely valuable commodity now that the hype is dying down and algorithmic advances are slowing.
