A paperclip maximizer driven by self-preservation? What could possibly go wrong?
Pirate King: HE DID?!? ... oh... oh, yes so he did... I was there.
Are there examples of censorship or prior restraint you'd like to highlight?
Ctrl-F "plato"
Required reading
?
Yet Trump can declassify documents by thought alone.
"Here come the test results: 'You are a horrible person'. That's what it says, 'a horrible person'. We weren't even testing for that!"
It could certainly be used as evidence in your favor. Whether it would be enough on its own to exonerate you would depend on things like the evidence against you and how much weight the jury gives your records.
These are known as souvenir plots. Generally, you aren't buying the land, but rather you're buying a contractual right to prevent the actual owner from developing the land.
Californian. No.
It wouldn't solve any problems that can't be solved by other means, and it would create new problems that we haven't had to worry about before. It'd be a net loss for everyone involved.
Insufficient data for a meaningful answer.
The problem is that an AI built to maximize paperclips might conclude that converting the planet to paperclips is an acceptable cost of maximizing paperclip production. It might understand why humans think it's bad to convert the planet, but disagree. It would need to be explicitly programmed to prioritize human life over paperclips.
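A toy sketch of that point (my own illustration, not anything from the comment above): if the agent's utility function only counts paperclips, a plan that destroys everything else still wins, unless human welfare is given an explicit, dominant weight. The plan names and numbers here are made up purely for the example.

```python
# Toy illustration: a "maximize paperclips" objective vs. one with an
# explicit human-welfare term. Plans and weights are hypothetical.

from dataclasses import dataclass

@dataclass
class Plan:
    name: str
    paperclips: int       # paperclips the plan produces
    human_welfare: float  # 1.0 = humans fine, 0.0 = planet converted to clips

plans = [
    Plan("run the factory normally", paperclips=10_000, human_welfare=1.0),
    Plan("convert the planet to paperclips", paperclips=10**15, human_welfare=0.0),
]

def naive_utility(p: Plan) -> float:
    # "Maximize paperclips" and nothing else.
    return p.paperclips

def aligned_utility(p: Plan) -> float:
    # Human welfare weighted heavily enough to dominate any achievable
    # paperclip count; the 1e18 weight is arbitrary for the sketch.
    return p.paperclips + 1e18 * p.human_welfare

print(max(plans, key=naive_utility).name)    # -> convert the planet to paperclips
print(max(plans, key=aligned_utility).name)  # -> run the factory normally
```

The point of the sketch is just that understanding human objections doesn't change the agent's choice; only a term in the objective does.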
If it were super-intelligent, it could probably trick us into leaving it turned on.