You willingly give them key points of information about yourself, directly or indirectly. They then read those signals and use your own information against you to convince you they have answers. And they are often wrong, but you walk around repeating their “insights” as if they are true.
Starting off with an insult, nice.
“claimed” ??? As if I’m making up a story because I’m being paid by Sam Altman?
“Trusting their future to hallucinating”
My friend in sales support engineering, of course, didn’t just copy whatever ChatGPT wrote. He proofread it and fixed it. It still saved hours compared to starting from nothing and then still needing to proofread.
I SAID IT WON’T THINK FOR YOU.