43 sats \ 0 replies \ @k00b 4 May \ on: I'd rather read the prompt tech
This sums up how I feel about the majority of model outputs. I generally read because I want to see someone's thinking, and LLMs are still pretty bad at exposing theirs. I read because I'm trying to learn how to think about the thing being written about. I'm not reading because there's a shortage of things to read and I need extra words wrapped around facts.
That isn't to say the output is valueless. It seems very useful for augmenting one's thinking and making connections one couldn't make alone. The majority of it just lacks most of what I value in writing.