You are right, I was hasty in saying that no social network supports them.
Let's say that those who introduced them did so later, and perhaps in premium contexts (aimed at content producers); most likely, because it is a centralized system, there is active monitoring (done by people or by an AI) of when and how a post is changed, to see whether the change is just a minor update or addition, or distorts the meaning of the post itself, including in relation to the feedback received so far.
All of this is clearly not possible on Nostr, since it is permissionless and decentralized, so tradeoffs are needed, and user expectations should take those limitations into account. It helps to understand that the people working on the protocol usually act with a broad technical view rather than just picking and choosing for personal taste, and they may also be trying to be prudent while managing priorities. Of course, it would be important for such views to be explained to all user groups, which is no small task.
3 sats \ 9 replies \ @ek OP 11h
most likely, because it is a centralized system, there is active monitoring (done by people or by an AI) of when and how a post is changed, to see whether the change is just a minor update or addition, or distorts the meaning of the post itself, including in relation to the feedback received so far.
You’re saying these centralised services are auditing every single edit the moment I hit save to check whether the edit "is just a minor update or addition, or distorts the meaning of the post itself, including in relation to the feedback received so far", or to revert the edit later, and that’s what prevents abuse?
I think you’re underestimating how complicated that would be and overestimating how much they care about something they have no legal interest in. This argument is a stretch.
I am pretty sure that if I write something on GitHub and then edit it to say the opposite, nothing will happen.
They are auditing the content but not the context of edits.
142 sats \ 8 replies \ @dtonon 11h
Yes, that's what I'm saying: I think this control is done by measuring the potential negative effects of an update. It is not difficult for an AI to analyze the text, and it would only be done when the context requires it (high exposure and many responses/reactions already recorded), assessing the underlying risks.
Btw, when you hit save, you think that the post has been updated, but often this doesn't happen in real time; it's a background process that perhaps also includes this sort of check.
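To make the idea concrete, here is a rough sketch of what such a context-gated check could look like; every name and threshold here is invented, I have no idea how these platforms actually implement it:
```
// Hypothetical background job that runs after an edit is saved.
// Only posts with high exposure get the expensive semantic comparison.
interface EditJob {
  postId: string;
  oldText: string;
  newText: string;
  views: number;
  replies: number;
  reactions: number;
}

async function reviewEdit(job: EditJob): Promise<"publish" | "hold_for_review"> {
  const highExposure = job.views > 10_000 || job.replies + job.reactions > 100;
  if (!highExposure) return "publish"; // cheap path: most edits are never analyzed

  // Expensive path: ask a model whether the edit changes the meaning in a way
  // that invalidates the feedback already collected.
  const drift = await semanticDrift(job.oldText, job.newText);
  return drift > 0.7 ? "hold_for_review" : "publish";
}

// Stand-in for whatever classifier/LLM call a platform might use
// (0 = same meaning, 1 = opposite meaning).
async function semanticDrift(oldText: string, newText: string): Promise<number> {
  return oldText === newText ? 0 : 0.5; // stub
}
```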
GitHub is not a social network; it is a forum with a very technical target audience, in which the chances of a manipulation having profound effects at the social level are minimal, and thus the benefits of the edit function clearly outweigh the possible damage caused by a malicious edit.
My argument is that edits overcomplicate things, especially at this time of development. If you wrote something wrong, simply delete the note, or comment on/annotate it. Maybe it's not the best solution for everyone, but I think it's a reasonable tradeoff.
3 sats \ 7 replies \ @ek OP 10h
Btw, when you hit save, you think that the post has been updated, but often this doesn't happen in real time; it's a background process that perhaps also includes this sort of check.
True, but I still think it's a stretch that this context analysis exists and is what prevents abuse. The costs vs benefits don't make sense to me. Yes, it's not difficult as in "run some software" but it still requires significant resources at scale. We're talking about indiscriminate edit context analysis.
GitHub is not a social network; it is a forum with a very technical target audience, in which the chances of a manipulation having profound effects at the social level are minimal, and thus the benefits of the edit function clearly outweigh the possible damage caused by a malicious edit.
It was just an example; my argument would also apply to Reddit. But "profound effects at the social level" is a good point! Since Reddit is mostly pseudonymous, it doesn't have this effect the way Twitter does. Mastodon, Facebook and Threads might apply though (and Twitter Premium).
My argument is that edits overcomplicate things, especially at this time of development. If you wrote something wrong, simply delete the note, or comment on/annotate it. Maybe it's not the best solution for everyone, but I think it's a reasonable tradeoff.
I share your concerns, but I am more concerned about nostr staying a niche because we ignore user expectations. However, I agree, it might be too early, and I definitely wouldn't say the lack of edits is a deal breaker for nostr, lol. Maybe we can agree on this:
But maybe delaying events is indeed a good compromise between UX and complexity.
?
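For what it's worth, something like "delaying events" could even be done entirely client-side; a minimal sketch (all names here are made up, this isn't part of any NIP):
```
// Client-side publish queue: hold a signed note for a grace period so the
// author can still replace it before anything reaches the relays.
type SignedEvent = { id: string; content: string; sig: string }; // simplified stand-in for a Nostr event

const GRACE_MS = 10 * 60 * 1000; // e.g. 10 minutes, like the SN edit window
const pending = new Map<string, { event: SignedEvent; timer: ReturnType<typeof setTimeout> }>();

function queuePublish(event: SignedEvent, broadcast: (e: SignedEvent) => void) {
  const timer = setTimeout(() => {
    pending.delete(event.id);
    broadcast(event); // only now does the note become immutable for the network
  }, GRACE_MS);
  pending.set(event.id, { event, timer });
}

// "Editing" within the window just means dropping the queued event
// and queueing a re-signed replacement.
function replacePending(oldId: string, newEvent: SignedEvent, broadcast: (e: SignedEvent) => void): boolean {
  const entry = pending.get(oldId);
  if (!entry) return false; // too late, it's already out
  clearTimeout(entry.timer);
  pending.delete(oldId);
  queuePublish(newEvent, broadcast);
  return true;
}
```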
Thanks for the discussion, you raised good points I wasn't aware of.
1142 sats \ 1 reply \ @dtonon 8h
In fact, maybe I overestimated these big tech circuses:
5 sats \ 0 replies \ @ek OP 8h
That's a great example, thank you! I will reevaluate my opinion. I thought loss of reputation would be enough to prevent abuse but I am no longer sure.
I also think that nowadays, the damage is usually already done before people can clarify what happened. I am not sure if nostr fixes people being sensationalists :/
375 sats \ 4 replies \ @dtonon 10h
The costs vs benefits don't make sense to me.
It's a supposition; I don't have first-hand information. But big tech social networks face huge social and legal pressures, they already have complex moderation systems in place, and I wouldn't be surprised if they use them in some other contexts.
I am more concerned about nostr staying a niche because we ignore user expectations
I worry about this too; it is not an easy journey. And that is why it is very important that these discussions happen, so as many points of view as possible get shared. Thank you for your thoughtful feedback.
Maybe we can agree on this: But maybe delaying events is indeed a good compromise between UX and complexity.
Sure, in my reply to you here yesterday I pointed out this exact alternative :)
Staying on topic: why did SN decide to limit edits to 10 minutes and not leave them free? I suppose it was a reasoned choice; it could provide additional insight. Btw, it's a very good compromise that unfortunately we cannot apply on Nostr.
19 sats \ 3 replies \ @ek OP 10h
Sure, in my reply to you here yesterday I pointed out this exact alternative :)
Oh, right, haha
why did SN decide to limit edits to 10 minutes and not leave them free?
This was added before I joined iirc, and I would guess the timer exists because we don't have edit history yet (/cc @k00b). I think if we had that, maybe we wouldn't need the timer anymore? But I have to admit, I also like the "set in stone" nature of it, and maybe we're too used to it now. Has the 10min edit timer become a big part of SN?
Or maybe we could allow a 10min window for "invisible edits" and after that show history. Backwards-compatible user expectations, lol
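Roughly what I mean, as a sketch (hypothetical schema, not how SN actually stores items):
```
// Hybrid model: silent overwrite inside the window, visible history after it.
interface Item {
  id: number;
  text: string;
  createdAt: Date;
  history: { text: string; editedAt: Date }[]; // what readers would see as "edited"
}

const EDIT_WINDOW_MS = 10 * 60 * 1000;

function applyEdit(item: Item, newText: string, now = new Date()): Item {
  const withinWindow = now.getTime() - item.createdAt.getTime() <= EDIT_WINDOW_MS;
  if (withinWindow) {
    // "invisible edit": just replace the content, no trace kept
    return { ...item, text: newText };
  }
  // after the window: keep the previous version so readers can see what changed
  return {
    ...item,
    text: newText,
    history: [...item.history, { text: item.text, editedAt: now }],
  };
}
```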
10 sats \ 2 replies \ @dtonon 10h
Do we have history now, just for the 10 min edit window? I have used this edit feature; it's handy, it gives you the freedom to write quickly knowing that you can correct yourself, but it also forces careful follow-up because you cannot exceed the time limit.
6 sats \ 1 reply \ @ek OP 10h
No, there is no edit history at all. We literally overwrite the content in the database on edits (like we do for deletes).
0 sats \ 0 replies \ @dtonon 10h
Ah ok, sorry I misread your comment. Thank you for the discussion!