As you get more and more exposed to GPTs, chat and otherwise, to midjourney illustrations, etc., etc., you start noticing some things.
The text generator produces text that flows very well and in certain ways is well-written, but it has a kind of flat alien voice.
The image generator makes images that look very polished but broadly gives you a similar feeling of otherness. It is more striking here because you can often immediately tell that someone has generated their illustration by AI, but with text it’s more of a creeping feeling.
This is a (perhaps) growing list of signs that your friends, family, colleagues, or students may be secretly using generative AI.
Friend and colleague Marko pointed this out: Nobody uses the word “pivotal,” but ChatGPT does so constantly. Even the most mundane things are pivotal to ChatGPT. Probabilistically we express this as P(“pivotal” | ChatGPT) ≫ P(“pivotal” | human).
Many years ago someone described to me one of the local journalists as “a man of many superlatives.” So also is ChatGPT.
Images generated by AI often display a striking lack of focus and intentionality. I have enough to say about this that I put it in a separate note: (202403071104).
In my limited experience this lack of focus/intentionality also happens for text: it is very hard to make the AI stick to a coherent whole for anything requiring more than a handful of paragraphs.
Everyone knows GPT companies love to steal; it makes them a lot of money. Don’t follow this link: This is literally a trap for stupid llm crawlers, do not click it. I’m serious. There is no information there. It’s just random text designed for being stolen. It goes in a loop; once you’re in there, no link points back out.
They call it an AI tarpit. I was thinking about making my own, but some hero on github is just running a server that you can point an unused domain at. I guess you don’t even have to point a domain there; just add a link straight to that IP address.
I don’t know if it helps but it’s practically free for me to put this link here.
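If you do want to roll your own instead of linking to someone else’s, the core idea is tiny. Here is a toy sketch (my own guess at how such a thing works, not the github project mentioned above): every path serves plausible-looking junk text plus links that only lead deeper into the maze, never out. The word list and page sizes are made up for illustration.

```python
# Toy AI-tarpit page generator: junk text with inward-pointing links only.
import hashlib
import random

# Filler vocabulary chosen purely for the joke; any word list works.
WORDS = ["pivotal", "tapestry", "delve", "synergy", "moreover",
         "landscape", "robust", "paradigm", "leverage", "holistic"]

def tarpit_page(path: str, n_words: int = 60, n_links: int = 5) -> str:
    # Seed the RNG from the path so each URL serves a stable page,
    # which makes the maze look like real content to a crawler.
    seed = int.from_bytes(hashlib.sha256(path.encode()).digest()[:8], "big")
    rng = random.Random(seed)
    body = " ".join(rng.choice(WORDS) for _ in range(n_words))
    # Every link targets another generated path: there is no way out.
    links = "".join(
        f'<a href="/{rng.getrandbits(32):08x}">more</a> ' for _ in range(n_links)
    )
    return f"<html><body><p>{body}</p>{links}</body></html>"
```

Wire `tarpit_page` into any HTTP handler (e.g. `http.server` from the standard library) and a crawler that follows one of the links just keeps generating more of the same.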
this file last touched 2025.11.15