[-] pglpm@lemmy.ca 35 points 1 year ago* (last edited 1 year ago)

This image/report itself doesn't make much sense – it was probably generated by chatGPT itself.

  1. "What makes your job exposed to GPT?" – OK I expect a list of possible answers:
    • "Low wages": OK, having a low wage makes my job exposed to GPT.
    • "Manufacturing": OK, manufacturing makes my job exposed to GPT. ...No wait, what does that mean?? You mean if my job is about manufacturing, then it's exposed to GPT? OK but then shouldn't this be listed under the next question, "What jobs are exposed to GPT?"?
    • ...
    • "Jobs requiring low formal education": what?! The question was "what makes your job exposed to GPT?". From this answer I get that "jobs requiring low formal education make my job exposed to GPT". Or I get that who/whatever wrote this knows no syntax or semantics. OK, sorry, you meant "If your job requires low formal education, then it's exposed to GPT". But then shouldn't this answer also be listed under the next question??
  1. "What jobs are exposed to GPT?"
    • "Athletes". Well, "athletes" semantically speaking is not a job; maybe "athletics" is a job. But who gives a shirt about semantics? there's chatGPT today after all.
    • The same with the rest. "Stonemasonry" is a job, "stonemasons" are the people who do that job. At least the question could have been "Which job categories are exposed to GPT?".
    • "Pile driver operators": this very specific job category is thankfully Low Exposure. "What if I'm a pavement operator instead?" – sorry, you're out of luck then.
    • "High exposure: Mathematicians". Mmm... wait, wait. Didn't you say that "Science skills" and "Critical thinking skills" were "Low Exposure", in the previous question?

  

Icanhazcheezeburger? 🤣

(Just to be clear, I'm not making fun of people who do any of the specialized, difficult, and often risky jobs mentioned above. I'm making fun of the fact that the infographic is so randomly and inexplicably specific in some points.)

[-] Urist@lemmy.ml 18 points 1 year ago

I've seen GPT struggling with pretty basic maths and "abstract" tasks such as making the letters add up in an anagram. Math requires insight that a language model cannot possess. I don't really get why people like infographics so much. The format usually just distracts from the data presented, which is convenient given that the data is usually garbage too.
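For what it's worth, checking whether the letters of an anagram actually add up is trivial to do deterministically. A rough illustrative sketch in Python (the helper name is just made up for the example):

```python
# Minimal sketch: two strings are anagrams exactly when their letter
# multisets match -- trivial for a program, yet language models often
# miscount the letters.
from collections import Counter

def is_anagram(a: str, b: str) -> bool:
    """Compare letter counts, ignoring case and non-letter characters."""
    letters = lambda s: Counter(c for c in s.lower() if c.isalpha())
    return letters(a) == letters(b)

print(is_anagram("listen", "silent"))                    # True
print(is_anagram("eleven plus two", "twelve plus one"))  # True
print(is_anagram("gpt", "got"))                          # False
```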

[-] pglpm@lemmy.ca 4 points 1 year ago

Math requires insight that a language model cannot possess

Amen to that! Good maths & science teachers have struggled for decades (if not centuries) to make students understand what they're doing, rather than simply give answers based on some words or symbols they see in questions [there are also bad teachers who promote this instead]. Because on closer inspection such answers always collapse. And now comes chatGPT, which does exactly that – and collapses in the same way – and gets glorified.

Amen to what you say on infographic content as well 😂
