We really should look at tolerance as part of the social contract: those who are intolerant have violated the contract, are no longer party to it, and should be booted out.
It’s a philosophical paradox of sorts, though. If we stop tolerating intolerance, don’t we become intolerant ourselves? It’s a slippery slope — philosophically speaking, of course.
Edit: the answer, IMHO, is a benevolent and benign world-governing AGI… However, getting there is difficult, if not impossible, given that right now AI is programmed by humans and is thus tainted.