I'm not following why there is a need to track anyone's search history.
For blocking a URL, all there would need to be is a row of boxes I paste URLs into, and the search engine never shows me those sites.
For prioritized sites, another row of boxes where I enter sites I want shown at the top if they have content related to my search queries.
When I search, the engine searches the prioritized sites first, and if any of the prohibited sites have content related to my query, they aren't shown to me.
Does that make sense?
This makes total sense! I understand the idea of a whitelist and a blacklist of sites. You'd have to create an account on my server, I would store your lists there, and for every query you make I look into those lists to filter the results for you. You get better results, but I get the full list of your searches, all linked to your account.
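Roughly, the server-side version could look like this (just a sketch; the names and the result shape are made up, not the engine's actual code):

```typescript
// Illustrative types; the real result objects would differ.
interface Result {
  url: string;
  score: number; // relevance to the query
}

interface UserLists {
  blocked: string[];     // hostnames the user never wants to see
  prioritized: string[]; // hostnames to float to the top
}

function hostOf(url: string): string {
  return new URL(url).hostname;
}

// Drop blocked hosts, then sort prioritized hosts above the rest.
function applyLists(results: Result[], lists: UserLists): Result[] {
  return results
    .filter((r) => !lists.blocked.includes(hostOf(r.url)))
    .sort((a, b) => {
      const aPri = lists.prioritized.includes(hostOf(a.url)) ? 1 : 0;
      const bPri = lists.prioritized.includes(hostOf(b.url)) ? 1 : 0;
      return bPri - aPri || b.score - a.score;
    });
}
```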
I wonder if some perceived loss of privacy here is actually negligible compared to the positive effects this would give.
Is there a way to keep the whitelist/blacklist client side?
Need to experiment here, not sure atm. Keeping the list is simple; filtering on the client means much more data has to be served to the client, and that might be an issue. OTOH, if I charge more for this feature, I might get properly compensated. Thank you for pushing on this!
You could store templates (lists of blacklisted sites) on the server, which the user can download onto their client, or simply let the user add their own entries. You can store the list on the client, inside local storage. You can do the filtering client-side too: just match the result set from the server against the URLs in local storage and throw out any matches. No need for the server to know anything.
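A minimal sketch of that, assuming plain localStorage and a made-up storage key:

```typescript
// Client-side blocklist kept in localStorage; the key name and the
// shape of the server response are assumptions for illustration.
const KEY = "blockedSites";

function loadBlocked(): Set<string> {
  return new Set(JSON.parse(localStorage.getItem(KEY) ?? "[]"));
}

function addBlocked(host: string): void {
  const blocked = loadBlocked();
  blocked.add(host);
  localStorage.setItem(KEY, JSON.stringify([...blocked]));
}

// Match the result set from the server against the local list and
// throw out anything whose hostname is blocked.
function filterResults(urls: string[]): string[] {
  const blocked = loadBlocked();
  return urls.filter((u) => !blocked.has(new URL(u).hostname));
}
```

The server only ever sees the raw query; the trimming happens after the results arrive.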
I think this is a niche feature, though.