pull down to refresh

At least, your guides get read.
hahaha you make me laugh :) do you think so? Sincerely, several of them were just drafts. When I was testing something new, I like to write down the steps I did, screenshots sometimes, taking notes of details, to remember me how I did it. Bitcoin apps are not so simple under the hood as many think and you need to understand all the functionalities in order to truly understand how to use them. We are opening new charters here.
So after I wrote drafts and notes, I took them and change them a bit and convert them into guides for noobs and other bitcoiners. So literally I wrote the guides for myself in the first place. Indeed I learned a lot just by writing them.
I see popping up new guides but many are just shitGPT and copy/paste sections from others, barely can see a truly PoW guide that was worked hard. I like for example the Parman and Minibolt guides. Are truly PoW.
I've read a few. But mostly when I have a specific problem. It's like when I had to use chantools. One is only interested in such a tool when a problem has already happened.
AI won't work for these new charters as you call them. It'll work for things that everyone knows how to do. My guess is that some ChatGPT answers on these topics have been trained on your guides... whether you like it or not~~
reply
btw... Oliver just released a new version for chantools https://github.com/lightninglabs/chantools/tree/v0.13.6
some ChatGPT answers on these topics have been trained on your guides.
indeed: #874938
I wonder if is possible that every time a shitGPT is using my content, they pay me sats.
shitGPT MUST PAY, not the noobs
reply
This made me wonder if there is a way to protect your data from being scraped. I found this, but it relies on the bots voluntarily complying.
The day those AI agents can fulfill LN invoices, it'll be good to update your Robots.txt file with a prompt and a static invoice to send you some sats~~
I'll keep it in mind next time I need Chantools...
reply
it'll be good to update your Robots.txt file
Great idea! I did that with my private self-hosted email server, replying to all non-contacts with a NDR message saying to pay a LN invoice first. LN indeed keep away the spammers.
... until the shitGPT agents will learn how to use autonomously LN LOL
reply