Assuming AI can achieve consciousness, or something adjacent (capacity to suffer), then how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain to a level no human mind could handle, and the AI experiences this pain for the equivalent of millions of years.

The question: is this the worst atrocity ever committed in the history of the universe? Or, does it not matter because it all happened in some weirdo’s basement?

  • tygerprints@kbin.social · 7 months ago

    If, as you suggest, the AI in question can feel pain and suffer, of course I would care and not want it to experience that. Why would I want it to? I’m not a sadist or a monster, or a Utah legislator without any human feelings.

    It’s like the scenario, “if you could get away with murdering one person, would you do it?” Of course I wouldn’t!!! Whether or not I could get away with it, I still have to live with myself and what I do. And I have a thing called “morality” that I live with and a respect for life that goes beyond my own self-concern.