Nice little article on why letting AI models crawl your site is (mostly) good.
102 sats \ 0 replies \ @optimism 18h
I'm using a crawl project as a vibe-coding test scenario for testing model and tooling capabilities.
Every LLM has thus far created this:
    this.userAgents = config.userAgents || [
      'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36',
      'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36',
      'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36'
    ];
So I'm quite sure that the percentages mentioned in their tracker are... off.
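The pattern these generated crawlers follow can be sketched like this: pick a random desktop-browser user agent per request, so the traffic shows up in analytics as ordinary Chrome visits. (`fetchPage` is a hypothetical helper, not from the generated code.)

```javascript
// Minimal sketch of the user-agent rotation the generated crawlers use.
// The UA strings are the same spoofed Chrome identifiers shown above.
const userAgents = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36',
  'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36'
];

// Pick a random spoofed browser identity for each request.
function randomUserAgent() {
  return userAgents[Math.floor(Math.random() * userAgents.length)];
}

// Hypothetical usage: every request goes out looking like a real browser,
// which is exactly why crawler-traffic percentages end up undercounted.
async function fetchPage(url) {
  const res = await fetch(url, { headers: { 'User-Agent': randomUserAgent() } });
  return res.text();
}
```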
0 sats \ 0 replies \ @Tony 6h
This is indeed interesting. I'm researching this myself, as I figured people will increasingly use chatbots for search. I even got a few visits from ChatGPT on my educational bitcoin website (I use Umami analytics to preserve visitors' privacy, but still have tools to improve conversion).
Just a couple of days ago I found out it's good practice - for both regular web crawlers and chatbots to consider your content - to add an invisible JSON block with structured page info. It should have the title, description, author, and other info. Because this block is structured, crawlers understand it much better and are more likely to suggest your content to readers.
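This kind of invisible block is commonly done with schema.org JSON-LD in the page's `<head>`; a minimal sketch, with all values as placeholders rather than from any real site:

```html
<!-- Invisible to visitors, but parsed by crawlers and chatbots. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What Is Bitcoin?",
  "description": "A beginner-friendly introduction to how Bitcoin works.",
  "author": { "@type": "Person", "name": "Tony" },
  "datePublished": "2024-01-15"
}
</script>
```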