We found in our review that your app includes user-generated content but does not have all the required precautions. Apps with user-generated content must take specific steps to moderate content and prevent abusive behavior.
Next Steps
To resolve this issue, please revise your app to implement the following precautions:
  • A method for filtering objectionable content
  • A mechanism for users to flag objectionable content
  • The developer must act on objectionable content reports within 24 hours by removing the content and ejecting the user who provided the offending content
App Store Guideline 1.2 - Safety > User Generated Content https://developer.apple.com/forums/thread/688227
All of those are impossible. Except for #1, but even that can only be done on the client side. jb55 should just add a “report” function that sends the “report” into the ether lol.
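For illustration, #1 on the client side is conceptually just a filter pass over incoming notes before rendering. Rough sketch only; the word list and event shape are placeholders, not anything from the actual Damus codebase:

```python
# Rough sketch of client-side filtering of nostr text notes (kind 1).
# BLOCKED_WORDS and the event dicts are illustrative placeholders.

BLOCKED_WORDS = {"spamcoin", "freemoney"}  # hypothetical word list

def is_objectionable(event: dict) -> bool:
    """Return True if a kind-1 note's content matches the word list."""
    if event.get("kind") != 1:
        return False
    content = event.get("content", "").lower()
    return any(word in content for word in BLOCKED_WORDS)

def filter_feed(events: list[dict]) -> list[dict]:
    """Drop objectionable notes before rendering the timeline."""
    return [e for e in events if not is_objectionable(e)]
```

The catch, as noted above, is that this only hides content locally; nothing is actually removed from the network.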
reply
According to jb55, the challenge is actually #3
Apple should submit a NIP in github. 😅
On a serious note, I think banning users is possible. If memory serves me right, some relay implementations have a configurable blacklist. Also, theoretically, relay operators should have the ability to remove “objectionable content” from their database.
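Relay implementations each have their own config formats, but the idea is just a rejection check at write time plus operator-side deletion. A toy sketch (the banned pubkey is made up):

```python
# Toy sketch of a relay-side pubkey blacklist and content removal.
# Real relays have their own config/storage; this only shows the concept.

BANNED_PUBKEYS = {"deadbeef" * 8}  # hypothetical banned hex pubkey

class RelayStore:
    def __init__(self):
        self.events: dict[str, dict] = {}  # event id -> event

    def accept(self, event: dict) -> bool:
        """Reject writes from banned pubkeys; store everything else."""
        if event["pubkey"] in BANNED_PUBKEYS:
            return False
        self.events[event["id"]] = event
        return True

    def remove(self, event_id: str) -> bool:
        """Operator-side removal of objectionable content."""
        return self.events.pop(event_id, None) is not None
```

Of course this only bans a user from one relay; the same pubkey can still publish everywhere else.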
Damus is just a browser.
reply
#3 is possible but not in a desirable way.
The content comes from the relays. Damus ships with default relays, one of them controlled by jb55, so to pass the review Damus would need to ship with jb55's relay enabled and abiding by Apple's moderation policies, with the other "open" relays disabled by default and only manually enabled by the user after accepting a warning.
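That setup is basically a relay list where only the moderated default is on and open relays are gated behind a warning. A rough sketch, with example relay URLs:

```python
# Sketch of shipping one moderated default relay enabled, with open
# relays present but off until the user opts in. URLs are examples only.

from dataclasses import dataclass

@dataclass
class Relay:
    url: str
    enabled: bool
    requires_opt_in: bool  # user must accept a warning to enable

DEFAULT_RELAYS = [
    Relay("wss://relay.damus.io", enabled=True, requires_opt_in=False),   # moderated default
    Relay("wss://example-open-relay.net", enabled=False, requires_opt_in=True),
]

def enable_relay(relay: Relay, user_accepted_warning: bool) -> bool:
    """Open relays can only be enabled after the user accepts a warning."""
    if relay.requires_opt_in and not user_accepted_warning:
        return False
    relay.enabled = True
    return True
```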
reply
214 sats \ 1 reply \ @pi 3 Jan 2023
That won't work; I'm sure they understand nostr well enough and will never be fooled by this.
A compromise is needed, because nostr does so many things very differently, and these rules were written for an obsolete centralised model.
reply
It's not fooling Apple, because the default relay would be subject to Apple's moderation rules, which puts the additional burden of moderating it on jb55.
reply