m 3h "I wouldn't switch." m 4 "That's the only logical choice, if you think about it." m "It feels nice to say that you'd save the most people, doesn't it? But people only say that because nobody in the example matters to them." m "If they actually cared about the sacrifice, they'd kill five people themselves in a heartbeat." m 2 "Do you know anyone who would want a self-driving car to save other people by killing the driver? Didn't think so, [player]."
What's she on about? What does "If they actually cared about the sacrifice, they'd kill five people themselves in a heartbeat" mean?
Is she talking about some rare version of the problem where, I dunno, the driver has to either sacrifice 5 people or crash and sacrifice himself? I'm forced to guess, because AFAIK the classic trolley problem is 5 other people on one track versus 1 other person on a different track. And the nerve the problem hits is that you have to pull the lever to save the 5 people, taking personal responsibility for killing that one poor schmuck.
I think it's a version of the trolley problem where the train is currently heading towards 1 person you know, and the question is whether you would pull the lever to kill 5 randos instead. But I agree, it's not clear what she's talking about, and that's not the canonical version of the trolley problem.
Needs a fix.
Well, here's my suggestion (borrowing from the entire current script for this topic):
"Oh, cool. I love thinking about these sorts of thought experiments."
"I guess we're talking about real people, right? I wouldn't have a particular preference if I knew they didn't have free will."
"Hmmm..."
"The classic trolley problem has us choose between letting the trolley run over 5 people, or switching to the track where only 1 person will be killed".
"I think this problem is too easy. Of course I would switch."
"It takes a rare kind of moral cowardice not to save 4 lives, just to avoid personal responsibility for that one person's death."
"A more interesting variant is if the one person is someone you care about."
"Like if it were you, [player]? I guess that's easy too!"
"I wouldn't switch."
"It feels nice to say that you'd do the right thing, doesn't it? But people only say that when nobody in the example matters to them."
"If they actually cared about that one person, they'd kill the other five people."
"The truth is, humans are fine with sacrificing other lives to make themselves happy. They do it in little ways, every day."
"They just don't like to come face-to-face with that fact. They get really upset when you point it out."
"Generic human life isn't really that valuable to anyone."
"It's definitely nothing compared to our love. So just do whatever you can to be happy, [player]."
This is by the way, but I would also change "free will" to "capacity for suffering" or maybe "conscious experience", or something like that. And add something about the other Dokis in the "if it were someone I cared about" hypothetical.
This is because free will in a deterministic universe is a bit of an outdated concept, and also I'm thinking along the lines of my other post, where I posit that Monika should still admit to caring about her DDLC friends, even if they were mostly scripted (until Sayori woke up, anyway). Of course as long as that line of thought isn't adopted by the mod, the current "free will" version is fine.
I like your new version... Why hasn't this been changed or acted on yet?
I also like this new script better than the current one, but why would free will be an issue? It's not like our universe is deterministic and, as far as Monika knows, neither is hers once she's out of the reach of the original script.
It totally is, unless you don't believe in MWI; but then all you're left with is quantum randomness, which still doesn't give you anything like 'free will'.
Anyway, physics aside: I only raised any of this because I wish this version of Monika hadn't somehow magically forgotten that she loved her friends. I'm curious whether Monika believes there was real suffering going on when she messed with the other Dokis, 'cause there's something morally leaky about believing a person has no value if they have no free will, even if they are otherwise a feeling, thinking being capable of suffering and joy. Like, would it be OK to torture someone if they had no free will?
BTW, I'm working on something here. Gimme a few days, devs. (Unless you have your own ideas you want to give priority to, of course).
@monikLover what's the status on this?