you're misunderstanding the analogy. this is a fundamental problem with analogies.
'fundamental' would mean it applies to good analogies, not just wrong ones
it does. an analogy is not 1:1 and if it was it wouldn't be an analogy, it would just be the thing.
That's getting a bit too vague for me.
How is giving people a chatbot like giving paraplegics pedals? Nobody has explained it yet.
I understand that adding a chatbot to a product doesn't make it more practical, given that you can even find a toaster with AI. Also, a design made with ChatGPT instead of research can result in absurd crap, hallucinating not only in chats.