I have to say I was impressed by an AI feature for the first time last week.
I was documenting an IT infrastructure architecture using a tool called Notion.
I'd written the text and wanted to add a diagram, but couldn't find how to start the diagramming widget.
While looking around for help, I saw a prompt to press the spacebar for AI.
So I pressed spacebar and typed "draw architecture diagram", expecting some generic boxes and lines to appear.
But no, it drew my diagram, including three scenarios I'd written into the text.
It had depicted the layers top-down, whereas I wanted them bottom-up, but otherwise it was spot-on.
(I think my very structured descriptions had enabled it to infer the componentry.)
OTOH, I have a colleague who is using ChatGPT to help him write Bash scripts; mostly they're OK, if a bit quirky sometimes.
Well, there was one script that wasn't working; it took us a while to figure out what it was even doing, and it was still erroring.
After a bit of reading man pages and comparing Mac vs. Linux behaviour, I spotted the issue: the script was supplying an empty string ("") in place of an optional argument.
Deleting that fixed it.
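A minimal sketch of the bug pattern (not the colleague's actual script; the `greet` function and its arguments are made up for illustration). The point is that a quoted empty string "" is still an argument, so a script that branches on argument count behaves differently than when the argument is omitted entirely:

```shell
#!/bin/sh
# Hypothetical example: an optional second argument handled via "$#".
greet() {
  # "$#" counts the arguments actually passed; a quoted empty string ""
  # still counts as one, so the two-arg branch runs with $2 empty.
  if [ "$#" -ge 2 ]; then
    echo "two args: name=$1 extra=$2"
  else
    echo "one arg: name=$1"
  fi
}

greet "world"      # -> one arg: name=world
greet "world" ""   # -> two args: name=world extra=
```

Deleting the "" (rather than just leaving it empty) is what makes the call take the intended one-argument path.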
So, it got a prototype in place, but it (a) overcomplicated the solution and (b) introduced an error.
Summary: AI can be genuinely impressive, but you always have to keep an eye on it and never accept its output verbatim.