I dunno, it should be able to at least summarize existing knowledge. It does surprisingly well with programming.
You do realize that the majority of code produced is pretty crap? The equivalent of blocks glued together with hot snot? With the reliability of soap bubbles?
It was pretty impressive with my MacGyver project.
I was not referring to only GPT engines, but to human code writers as well.
Sure, it looks impressive, because that is exactly what gets copied and used the most.
Popular ≠ high quality. Remember, GPT engines base their 'knowledge' on what humans have already produced. They have no facility to determine 'quality', so the GPT trainers use some kind of weighting scheme based on links; a qualified popularity, if you will. This yields "impressive" results, but there is no real guarantee of the underlying quality at all. My problem is that most code written by humans or generated by various toolkits is at best "pretty", not of good quality (reliability, robustness against unexpected inputs, efficiency, etc.).
Think of shiny gadgets from China with lots of blinky leds, but with their actual innards being the Cheapest-Backalley-Mart quality.
Lots of people think that stuff is impressive, too.
The root of the issue is best exemplified by my pet peeve. Pick up any C book, and one of the exercises will be to read all files in a directory. The "correct" solution it shows will use opendir(), readdir(), and closedir(). This is utter crap, because 99% of such examples fail to handle the case where the directory contents change during the scan. Another exercise will extend that to scanning all files in a directory tree. That will be similar, with string operations to handle path name construction. That is even worse, because it fails utterly if subtrees are moved or renamed during the scan. The only reason it is shown this way is that it was the way it was done in 1989. (I'll omit a rant here, involving BSD, the Single Unix Specification, POSIX, the C standard, and what happened with the C11 standard and Microsoft. Let's just say that I'm quite happy that
C2x/C23 seems to treat C11 mostly as a fork (including removing Annex K), and to return to the approach used in the standards developed from C89 to C99: adding features that 'end users', i.e. programmers, need and have asked compiler developers for, instead of having the committee dictate new features.)
The correct solution on all systems except Microsoft Windows is to use scandir(), scandirat(), glob(), wordexp(), and/or nftw() (all built into POSIXy standard C libraries and most BSDs), or the fts_*() family of functions available on all BSD variants and Linux. Because these are part of the base C library on all operating systems except Windows, they are expected to handle the aforementioned issues correctly (but if you test, remember that time-based race windows will still exist; just checking when a specific function returns using a wall clock does not mean the actual filesystem change has propagated to be visible to all processes).
On Windows, you also use the above, but with custom implementations that fulfill the synchronicity requirements you have. One can write their own, or use any of the freely-licensed implementations –– noting that the straightforward opendir()/readdir()/closedir() ones are heuristic, not deterministic, when directory tree operations occur during scanning.
I often see extremely experienced C programmers describe a neat "solution" using straightforward opendir()/readdir()/closedir() without any additional logic to handle concurrent modifications as "impressive", too. :'(
Now, I do realize I am in a very tiny minority here, because almost all humans are used to the fact that the software they use is pretty crappy in absolute terms. Against that baseline, something that seems to work just fine certainly looks impressive. This is sufficient even for commercial software, and anything above that is, in my personal experience, declared excessive perfectionism. I like to write code and build systems that work without issues for years on end, and that, when an issue does occur (even just a hardware one that is normally ignored, say a delayed write error at file close()/fclose() time), let me know, so I can decide for myself what to do.