

I’m a little defeatist about it. I saw with my own two eyes how a junior asked ChatGPT how to insert something into an std::unordered_map. I told them about cppreference. The little shit tells me, “Sorry unc, ChatGPT is objectively more efficient.” I almost blew a fucking gasket, mainly cuz I’m not that goddamn old. I don’t care how much you try to convince me that LLMs are efficient: there is no shot they are more efficient than opening a static page with all the info you would ever need, and that’s before you even get to energy efficiency. Utility aside, the damage we have dealt to developing minds is irreversible. We have convinced them that thought is optional. This is gonna bite us in the ass. Hard.
Damn, I forgot about the teaching aspect of programming. Must be hard. I can’t blame students for taking shortcuts when they’re almost assuredly swamped with other classwork and sleep-deprived, but still. This is where my defeatist comment comes in, because I genuinely think LLMs are here to stay. Like autocomplete, but dumber. Just gotta have students recognize when ChatGPT hallucinates solutions, I guess.