[-] zappy@lemmy.ca 3 points 1 year ago

I hear this from Americans a lot; here everything is pretty much online nowadays (although a friend of mine had her identity stolen, so she has to go in person, which is her biggest complaint about the whole thing)

[-] zappy@lemmy.ca 3 points 1 year ago

The problem isn't the memory capacity; even though the LLM can store the information, it's about prioritization/weighting. For example, if I tell ChatGPT not to include a word (say, apple) in its responses, then ask it some questions, and then ask which fruit-based pies are popular, it will tend to pick the "better" answer of including apple pie rather than follow the rule I gave it a while ago about not using the word apple. We do want decaying weights on memory, because most of the time old information isn't as relevant, but it's one of those things that needs optimization. Imo we're going to get to the point where the optimal parameters for maximizing "usefulness" to the average user are different enough from what's needed to pass someone intentionally testing the AI. Mostly because we know from other AI (like Siri) that people don't actually need that much context saved to find them helpful. A toy sketch of the decaying-weights idea is below.
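To be clear, this isn't how any particular model actually weights its context; it's just a minimal sketch of the trade-off, with a made-up `half_life` parameter. Under exponential decay, an instruction from 20 turns ago (the "no apple" rule) carries a fraction of the weight of the newest message:

```python
def decayed_weights(num_messages: int, half_life: float = 10.0) -> list[float]:
    """Weight each message by 0.5 ** (age / half_life); the newest message gets 1.0."""
    return [0.5 ** ((num_messages - 1 - i) / half_life) for i in range(num_messages)]

weights = decayed_weights(21)
print(f"rule from 20 turns ago: {weights[0]:.2f}")   # 0.25
print(f"newest message:         {weights[-1]:.2f}")  # 1.00
```

With these (arbitrary) numbers the old rule still exists in memory; it just loses every tie-break against fresher, conflicting context, which matches the apple-pie behaviour above.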

[-] zappy@lemmy.ca 3 points 1 year ago

Last time I talked about this with the other TAs, we came to the conclusion that most of the decent papers were close to the max word count or above it (I don't think the students were really treating it as a max, more like a target). Like, 50% of the word count really wasn't enough to actually complete the assignment.

[-] zappy@lemmy.ca 3 points 1 year ago* (last edited 1 year ago)

Over-enthusiastic English teachers... and Skynet (cue dramatic music)

[-] zappy@lemmy.ca 16 points 1 year ago

So I'm a researcher in this field, and you're not wrong, there is a load of hype. The area that's been getting the most attention lately is specifically generative machine learning techniques. The techniques are not exactly new (some date back to the 80s/90s) and they aren't actually that good at learning, by which I mean they need a lot of data and computation time to get good results; two things that have gotten easier to access recently. However, such a complex system isn't always a requirement. Even ELIZA, a chatbot made back in 1966, produces responses surprisingly similar to those of some therapy chatbots today without using any machine learning. You should try it and see for yourself; I've seen people fooled by it, and the code is really simple (there's a sketch of the idea below). Also, people think things like Kalman filters are "smart", but that's just straightforward math, so I guess the conclusion is that people have biased opinions.
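If you want a feel for how little is going on under the hood, here's a stripped-down sketch of the ELIZA-style technique. This isn't Weizenbaum's original script, just a few regex rules mapped to canned reflections; the patterns and replies are invented for illustration:

```python
import re

# ELIZA-style rules: a regex pattern and a reflection template (\1 echoes the capture).
RULES = [
    (r"i need (.*)",  r"Why do you need \1?"),
    (r"i am (.*)",    r"How long have you been \1?"),
    (r"my (\w+)",     r"Tell me more about your \1."),
    (r".*",           "Please, go on."),  # catch-all fallback
]

def respond(text: str) -> str:
    text = text.lower().strip(" .!?")
    for pattern, template in RULES:
        match = re.search(pattern, text)
        if match:
            return match.expand(template)
    return "Please, go on."

print(respond("I need a vacation."))  # Why do you need a vacation?
print(respond("I am feeling stuck"))  # How long have you been feeling stuck?
```

No model, no training data, just string substitution, and it already reads a lot like some "therapy" bots.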

[-] zappy@lemmy.ca 3 points 1 year ago

That's true. Also, at some point the human will go "that's too much work, I'm not going to answer that," but the AI will always try to give you its best response. Like, I could look up the unicode characters you're using, but I'd never actually take the time to do that.
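(For anyone who does want to take the time, Python's standard `unicodedata` module makes the lookup trivial; the sample string here is a placeholder, not the actual characters from the thread:)

```python
import unicodedata

# Print the codepoint and official Unicode name of each character.
for ch in "ĸåŧ":  # placeholder string for illustration
    print(f"U+{ord(ch):04X} {unicodedata.name(ch, 'UNKNOWN')}")
```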

[-] zappy@lemmy.ca 4 points 1 year ago

First years have max word counts now, not minimums. That's more of a high school thing.

[-] zappy@lemmy.ca 15 points 1 year ago

Generally: a very short-term memory span, so have longer conversations, as in more messages. An inability to recognize concepts/nonsense. Hardcoded safeguards. An extremely consistent (and typically correct) writing style. The use of the Oxford comma always makes me suspicious ;)
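The Oxford-comma tell is mostly a joke, but it's trivial to turn into a toy heuristic (a regex guess, not a real detector):

```python
import re

def oxford_comma_count(text: str) -> int:
    """Count "A, B, and C"-style lists: a comma directly before "and"/"or"."""
    return len(re.findall(r",\s+(?:and|or)\b", text, flags=re.IGNORECASE))

print(oxford_comma_count("It is fast, reliable, and consistent."))  # 1
```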

[-] zappy@lemmy.ca 4 points 1 year ago

Soap bar bags are the superior option; they save you money too.

[-] zappy@lemmy.ca 5 points 1 year ago

I use soap bar bags... I can't figure out if that qualifies as barehanded or not

[-] zappy@lemmy.ca 20 points 1 year ago

I'm glad I'm not the only one who was wondering what on earth OP was talking about

[-] zappy@lemmy.ca 8 points 1 year ago

Laser eye surgery
