
I see, yeah, I could envision some nostr apps bootstrapping themselves with external data... those events would be trusted, but the bot that's publishing them would have its own reputation baked in by the app used to view it.
It's a pretty simple pipeline, but each app you scrape probably needs a running service with a custom prompt or logic to do the scraping -> structure the data as an event -> emit it to relay(s).
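The "structure the data as an event" step could look roughly like this, assuming NIP-01 event shape (the id is the sha256 of the serialized `[0, pubkey, created_at, kind, tags, content]` array; actually signing it would need a secp256k1 library). The scraped payload and pubkey here are placeholders:

```python
import hashlib
import json
import time

def build_event(pubkey_hex: str, content: str, tags: list, kind: int = 1) -> dict:
    """Shape scraped data as an unsigned NIP-01 event."""
    created_at = int(time.time())
    serialized = json.dumps(
        [0, pubkey_hex, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    event_id = hashlib.sha256(serialized.encode()).hexdigest()
    return {
        "id": event_id,
        "pubkey": pubkey_hex,
        "created_at": created_at,
        "kind": kind,
        "tags": tags,
        "content": content,
    }

# Hypothetical scraped event data, tagged for discovery:
event = build_event(
    pubkey_hex="ab" * 32,  # placeholder 32-byte hex pubkey
    content=json.dumps({"title": "Salsa Social", "venue": "The Dome, London"}),
    tags=[["t", "salsa"], ["t", "london"]],
)
```

From there, the service would sign the event and send `["EVENT", event]` over a websocket to each relay.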
Each app would be different, so I'm not sure there would be much benefit to protocolizing it further. NIP-85 (draft?) attestations might be adequate; they're basically just delegation for a particular stream of data. (These are what I'm currently trying to ingest from a neo4j graph.)
(And just to clarify: I gave up on GUN because the graph DB component was too slow for real-time Lightning invoices, and everything was generally buggy and hard to debug. Fiatjaf authored Nostr after we discussed how I was using it and the struggles.)
I'm not really that familiar with Nostr, but there are many scraping services, like Firecrawl.dev and Apify, and Matrix can be used to access the data on WhatsApp and Telegram.
I'm imagining an open-source tool that curators host: they feed in the sources, add their API keys for the dependent services, and the licence for the tool states that they must send the data to some distributed database or else pay a licence fee.
Once available on this distributed DB, anyone can use the data. Development is funded by the licence fees from organisations that want exclusive control of the data.
I don't know how I'd stop people using the software without sharing the data or paying the licence fee, while still having access to the distributed data aggregated by others.
I assume that with enough eyeballs on this distributed data, organisers will publish, either directly or via some automation, to one of the participants, which immediately broadcasts the updates.
The important thing is that all data changes are logged in a stream: new tickets available, new artist announced, venue change, new event in a series, etc. These would be tagged within a "space", e.g. salsa in London, and spaces would be nested (so all events in salsa in London would also be in salsa in UK and in everything in London).
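Since "salsa in London" sits under both "salsa in UK" and all-of-London, a space is really a point on two hierarchies (topic and geography) rather than a single tree. A minimal sketch of resolving which spaces an update belongs to, assuming spaces are modelled as (topic path, geo path) pairs:

```python
from itertools import product

def ancestors(path: tuple) -> list:
    """All prefixes of a hierarchical path, including the path itself."""
    return [path[:i] for i in range(1, len(path) + 1)]

def spaces_for(topic: tuple, geo: tuple) -> set:
    """An update is visible in every combination of a topic-ancestor and a
    geo-ancestor, so 'salsa in London' also lands in 'salsa in UK' and in
    the all-of-London space."""
    topics = ancestors(topic) + [()]  # () = any topic
    geos = ancestors(geo) + [()]      # () = anywhere
    return {(t, g) for t, g in product(topics, geos)}

spaces = spaces_for(topic=("salsa",), geo=("uk", "london"))
```

A change-stream consumer subscribed to any one of those (topic, geo) pairs would then pick up the update.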
On top of that, you could use some variant of the PageRank algorithm to rank-order events by reputation within spaces. Events are effectively valued by the aggregate reputation of the participants within that niche, basically the same way Google originally ranked web pages.
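The PageRank variant could be as simple as power iteration over a directed endorsement graph of participants within a space (an edge A -> B meaning A's participation endorses B), with an event scored by the summed rank of its participants. A toy sketch, not tuned for scale:

```python
def pagerank(graph: dict, damping: float = 0.85, iters: int = 50) -> dict:
    """Plain power-iteration PageRank over a directed graph given as
    {node: [nodes it links to]}. Dangling nodes spread rank evenly."""
    nodes = list(graph)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1.0 - damping) / n for u in nodes}
        for u in nodes:
            out = graph[u]
            if not out:
                for v in nodes:
                    new[v] += damping * rank[u] / n
            else:
                share = damping * rank[u] / len(out)
                for v in out:
                    new[v] += share
        rank = new
    return rank

# Toy endorsement graph of participants within one space:
g = {"alice": ["bob"], "bob": ["alice", "carol"], "carol": ["alice"]}
scores = pagerank(g)
# an event's score could then be the sum of its participants' ranks
```

Running it per space keeps reputation niche-local, which is the point: a big name in one scene shouldn't automatically outrank locals in another.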
reply
There's a lot to unpack there; matters of incentives and distribution are probably their own conversation. But I have long considered the concept of "geostaking" in Nostr as an open protocol for stuff in meatspace. A bunch of scrapers bootstrapping something like that could make it useful from day 1.
reply
I think it's a gaping wide opening to disrupt the current paradigm because:
  • demonstrated demand for event aggregation, which is why so many people do it manually within narrow niches
  • event data is factual, verifiable, with a defined oracle hierarchy (a URL or the person/org who controls it)
  • in the past the unit economics of scraping event data with traditional parse engineering didn't work
  • end consumers don't need to create Nostr accounts and mess with private keys to get value from Nostr clients; they can be eased into it, only generating keys when they want to engage with the content
  • the data is overwhelmingly public, organisers and promoters want it to be shared to reach new audiences outside their own channels
  • ad-funded social media can't do it because it's contrary to the engagement model: if you're enjoying yourself at an event, you're not looking at your phone. And event ticketing is a terrible business with low margins, so they don't see it as an opportunity. That's why they have tried multiple times and scrapped their attempts in favour of the TikTok model, intentionally breaking real-world networks built around the Dunbar number in order to increase the surface area of content a user can engage with
  • real-world communities are homeless, fragmented into moderated silos to hide bots, spam, and disruptive voices, which perversely cuts off their own reach, making them more dependent on social media they often hate
  • easy to monetise: model it on Substack and Patreon, and disrupt the ticketing market by converting regular event-goers into monthly subscribers of real-world creators in return for perks and access
And so many more reasons.
Due to private key illiteracy, Nostr has a massive uphill battle that will take decades, but there is this gaping wide opportunity just sitting there to cut those decades down to years, because the one thing that is universal is that people meet at physical locations scheduled in advance, i.e. events.
reply
"Due to private key illiteracy"
This is why Sanctum exists: we needn't let authentication get in the way. More work to come on that, and an SDK for it soon.
reply
I wasn't familiar with Sanctum, will take a look, thanks.
Even so, event aggregation is the angle for mass adoption of Nostr. Authentication isn't the only hurdle to competing with the incumbents; it's just the biggest hurdle for Nostr specifically.
You yourself mentioned "meeting consumers where they are" above. Where are the people looking for an alternative? Event communities. What is different today than 5 years ago? LLMs.
reply