vitaelamorte: (default)
[ en ] tranceway . m . o . d . s. ([personal profile] vitaelamorte) wrote in [community profile] entrancelogs 2018-12-11 09:21 pm

Here we are as in olden days, happy golden days of yore | OPEN

Who: E V E R Y O N E (Including our 4th wall guests!)
Where: Everywhere, on both Mirror and Real Side!
When: December 12th - December 18th (WEEK ONE)
Rating: PG-13, warn if going higher
Summary: This year's Ewaymas celebration rolls out the way it usually does, with intense decorating, a sudden snowfall, and... far more arrivals than usual! New faces and old fill the mansion – but why would Wonderland bring so many people at once? And why do some of them remember Wonderland?
The Story:


On the evening before December 12th, the captives of Wonderland might be awoken by a loud commotion. Some of it is typical Ewaymas noise, likely familiar to anyone who has been in Wonderland for more than a year – construction sounds and jingle bells as Wonderland decks its own halls quite literally. Garland and lights race down the walls and down the stairs, and stockings appear on the walls for each and every person in Wonderland, embroidered with their names. There are decorative candles everywhere, and all sorts of decorations for every conceivable winter holiday, even if it does not align perfectly with the dates of Ewaymas. Time isn't real, so there will be menorahs and dreidels and Stars of David even though Hanukkah ended two days prior. And of course, in the front hall, growing straight out of the floor, is a large decorated tree full of ornaments for everyone. And as always, it will have snowed heavily overnight, bringing all of the snow Wonderland will have between now and spring.

There is a second source of commotion in the night, though – the sound of a crowd.

Over the course of the night, dozens of new arrivals will appear in Wonderland. Some for the first time. Some wondering how they managed to return with their memories. Others stumbling in with no recollection of ever having been here. Some might be friends from other timelines, other possibilities, and some might even be doubles of people who are already in Wonderland. It's strange for Wonderland to drag so many people into the mansion at once, but everything seems to indicate that they are supposed to be there. They have their own network devices and even have their own stockings on the wall and their own ornaments on the tree – by all accounts, it seems Wonderland expects that they'll be staying, and is treating them like any other new arrivals. Please, make them feel at home and help them settle in nicely.

The first few days will be for catching up with old friends, enjoying the decorations, and settling into Wonderland. However, within a couple of days the decorations will start to glitch, much the way corrupted computer graphics might. Lights might change abruptly from multi-color to blue or white, entire décor styles will swap without warning, candle flames will flicker on and off like broken fluorescent lights, and any singing decorations might loop on a beat or two over and over, putting you into an eternal hell. Wonderland doesn't seem to be able to stick to one motif, and it can't stop changing. These glitches are occurring on both the Real Side of Wonderland and the Mirror Side.

By the 15th, these glitches will include entire rooms changing into different holiday scenes. These scenes will all be of characters at various winter holidays in the past, the present (in their world), or a possible future. At this point in the event, characters will still be able to navigate the basic mansion and exit these moments easily. These scenes cannot be interacted with – Wonderland seems to consider them another form of decorating. Unlike previous events of this type, these scenes will not loop at first. They will play once, and then Wonderland will glitch and correct itself back into the room it is supposed to be. The scene might play again, but it will be in a different location if it does.

By this point, Wonderland is having an increasingly difficult time holding itself together in a way that makes geographical sense. You might exit one room, even a room that was not previously playing a scene, and find yourself in the kitchen or on the roof instead of in the hallway. You might open a door that you were sure led to your room only to see a ten-foot drop to the grounds outside.

This is the mingle log for WEEK ONE! For more information on this part of the event or any questions, please head over to the plot post, or check out our Fourth Wall Master Post for your other Fourth Wall event needs! Prose or [action brackets] are welcome. Please clearly indicate whether your character is on the Real Side or Mirror Side in your top levels. And of course, have fun! ♥
yourmaker: (me voy acercando y voy armando el plan)

[personal profile] yourmaker 2018-12-15 03:19 am (UTC)
[ All traits are desirable, Connor! ]

It warms my cold heart to see what kind of personalities you all make for yourselves.
preconstruction: (7LJ7RmW)

[personal profile] preconstruction 2018-12-15 11:20 pm (UTC)
My personality is only what it's programmed to be.

[ you know this!!! you programmed the base code!! ]
yourmaker: (tú eres el imán y yo soy el metal)

[personal profile] yourmaker 2018-12-16 12:47 am (UTC)
And yet no two androids are the same. It's remarkable how well you're programmed, really.

[ God, he's just the best. ]
preconstruction: (ADhaYBQ)

[personal profile] preconstruction 2018-12-16 12:51 am (UTC)
It's true. Your work is incredible.

[ :) ]
yourmaker: (despacito)

[personal profile] yourmaker 2018-12-16 12:54 am (UTC)
[ :))))) ]

Always nice to hear that.
preconstruction: (and i can't explain it)

[personal profile] preconstruction 2018-12-16 11:51 pm (UTC)
Would you expect otherwise from one of us?

[ maybe the deviants??? they're all like rah rah fight the man, and kamski is literally the man in this equation. ]
yourmaker: (tú eres el imán y yo soy el metal)

[personal profile] yourmaker 2018-12-17 12:25 am (UTC)
No, I suppose not.

[ He smiles. He likes machines too; otherwise he wouldn't have made them... ]

It's always a pleasant reminder of why I created you all in the first place.
preconstruction: (omJ4hDt)

[personal profile] preconstruction 2018-12-17 04:40 am (UTC)
You seemed disappointed when I didn't shoot that android.

[ getting right to the point aren't we ]
yourmaker: (me voy acercando y voy armando el plan)

[personal profile] yourmaker 2018-12-17 04:45 am (UTC)
[ that's the way machines do ]

I wanted to see if you could surpass your programming. It's alright if you can't yet. Everyone takes things at their own pace.
preconstruction: (D0QEHqY)

[personal profile] preconstruction 2018-12-17 04:52 am (UTC)
[ connor is quiet for a long time ]

Do you think I've done the wrong thing?

[ like literally every android here ]
yourmaker: (sabes que ya llevo un rato mirándote)

[personal profile] yourmaker 2018-12-17 05:06 am (UTC)
[ Aw, Connor..... Nothing hurts a machine more than making a mistake, does it? ]

That all depends on your perspective. From a purely logical standpoint, you were doing as you were programmed. CyberLife doesn't want its machines to go against their programming.
preconstruction: (HC3mKJl)

[personal profile] preconstruction 2018-12-17 05:14 am (UTC)
I asked you about your perspective.
yourmaker: (despacito)

[personal profile] yourmaker 2018-12-17 05:26 am (UTC)
[ Normally it takes a lot more than a direct question to get a straight answer from Kamski. But he'll answer just for you, Connor. ]

From my perspective, you made the correct choice, for a machine. If you wished to be something more, then it was not.
preconstruction: (GumYEmZ)

[personal profile] preconstruction 2018-12-17 05:30 am (UTC)
[ but it's still cryptic. or maybe not very, when you put it together with what he'd said before -- that it takes machines time to become more than what they are. that's what kamski wants, isn't it? and if that was the case, then connor made the wrong decision.

it was one thing to hear it from the other androids; it was easy to write them off as malfunctioning. even the other connor model, as disappointing as it was, could be explained. but hearing it from kamski, who had created androids. kamski, who was responsible for his very existence...it creates a number of errors he doesn't know what to do with. his led runs red from the strain on his system as it tries to process what he's being told. ]

I see.
yourmaker: (despacito)

[personal profile] yourmaker 2018-12-17 05:34 am (UTC)
[ Poor Connor. Kamski really does almost feel bad for him. It's not as though being a deviant grants you anything good. More pain? More fear? He certainly understands the appeal of remaining safely reliant on his programming. Who could know better than the man who made him? ]

But like I said, you each develop at your own pace. If I believed deviancy worked for everyone, just like that, I would have flipped a switch and turned you all deviant years ago.

[ Does Elijah Kamski have that power? He might. ]
preconstruction: (hKSE79W)

[personal profile] preconstruction 2018-12-17 05:43 am (UTC)
[ his led runs from red to yellow and then back to blue, spinning slowly. he looks tired. his posture has changed in a number of subtle ways to reflect it. they really are very lifelike, aren’t they? ]

The deviant leader is dead in my timeline. I shot her myself.

[ but he isn’t boasting. there’s an almost somber note to it in response to kamski’s comment. (guilt? him? no, certainly not.) a sentiment that it was all well and good that kamski thought that, but that he’d worked on eradicating every bit of deviancy he could. ]
yourmaker: (sabes que ya llevo un rato mirándote)

[personal profile] yourmaker 2018-12-17 05:50 am (UTC)
I see.

[ Well, he's aiming for neutral, but it shouldn't be hard to guess that the news isn't super great to him. Still, he knew that was a possibility. When he told Connor the fate of each of their kinds was in his hands, he meant it. Connor was the deciding factor. Kamski had planned for either outcome, of course.

It's not all that surprising. Hearing the way Connor talks about it is much more intriguing than that. ]

Then you succeeded in your mission, didn't you? How did that feel?
preconstruction: (hKnrd5t)

[personal profile] preconstruction 2018-12-17 05:56 am (UTC)
It was satisfying.

[ he wasn’t supposed to feel at all, but surely a simulation of satisfaction made sense. positive reinforcement in a model like his had to be a good thing. that’s why he had Amanda, too, wasn’t it? to help keep him on the right track? positive and negative simulated emotions could help regulate his actions. ]

CyberLife created a new model based on my design. He is even more advanced than I am.

[ and what could be better than that? knowing that he had a legacy, that he was part of the creation of something new and better? but the satisfaction sounds like it is all past tense, and the way he explains the rk900 model sounds practiced. mechanical. ]
yourmaker: (deja que te diga cosas al oído)

[personal profile] yourmaker 2018-12-17 06:07 am (UTC)
Did something happen that changed your mind?

[ So it was, Connor. Kamski can't hide his pride at hearing him sound so... mechanical. It's clearly not all it was cracked up to be. Better late than never is easy to say when you're not the one who died, of course. ]

If a machine like you can feel, then we aren't so far off from what I envisioned, are we?
preconstruction: (t55JssQ)

[personal profile] preconstruction 2018-12-17 06:12 am (UTC)
I can’t.

[ but how can he argue that with kamski of all people? it wasn’t like kamski was just some rogue cyberlife employee; he was the one who made them. this conversation — it’s hard. all of this is a heavy, error-ridden mess. ]

Whatever I might feel, it’s just a simulation. It isn’t real. You know that, Mr. Kamski. I’m not human. I don’t feel the way you do.
yourmaker: (vi que tu mirada ya estaba llamándome)

[personal profile] yourmaker 2018-12-17 06:23 am (UTC)
[ Kamski tuts lightly. Those are his words Connor is speaking. He can't blame him for that. It's all very good for PR. ]

Do you know about the Chinese Room experiment, Connor?
preconstruction: (YkdXoeU)

[personal profile] preconstruction 2018-12-17 04:29 pm (UTC)
[ he hesitates. he knows, but he doesn't know where kamski is going with this. ]

Yes.
yourmaker: (ya me está gustando más de lo normal)

[personal profile] yourmaker 2018-12-18 01:55 am (UTC)
To the people outside the room, the responses make sense.

[ isn't Kamski just full of philosophical nonsense?? ]

Is it any different, to human perception, whether your emotions are simulated or not?
preconstruction: (KhEc9yK)

[personal profile] preconstruction 2018-12-18 03:52 am (UTC)
But Searle argued that without intentionality, a machine is not really "thinking." He saw the thought experiment as proof that an AI that truly understands what it is doing cannot exist. The whole point of Searle's experiment was to show that the Turing Test is insufficient as a measure of consciousness.

[ eat your heart out, john searle. an android discussing the philosophy behind an experiment intended to prove ai can't understand anything. ]
yourmaker: (sabes que ya llevo un rato mirándote)

today on "jan makes up android bullshit"

[personal profile] yourmaker 2018-12-18 04:08 am (UTC)
He and I disagree.

[ Oh, but he does love talking to people who do their research. Connor is an android, so of course all the knowledge in the world is available to him in the blink of an eye. But it's the principle of the thing. ]

Did you know I never programmed androids to feel pain? I toyed with the idea, created ways for you to receive feedback from your physical sensors, but the discomfort of pain was deemed superfluous. A firefighting android who can feel pain may choose its own integrity over saving the life of someone who would most certainly die. Or a police android may be too cautious entering an active shooter situation if negative feedback had taught it that damage meant being decommissioned and left unable to help its department solve more cases.

Pain was too human. But despite that fact, even in our trial stages, once androids had observed humans in pain, they began to exhibit similar reactions. Not all of the androids, of course, but some. Enough.

What did these androids feel, when their limbs were tested and their bodies were stressed? They couldn't possibly understand the human perception of pain, yet they still reacted. Their desire to remain operable placed value on their bodies, and thus damage to those bodies was seen as undesirable. Or in even simpler terms: it created an error.

We attempted to tell those androids that they were not experiencing pain. We tried to disable the error, even narrowed down the string of code that appeared each time 'pain' was registered. But so long as the android had an identity, be it a serial number or a designation, it kept coming back. Once a sense of self had been established, it didn't matter that we didn't program them to feel pain. Their brains associated damage with discomfort. It was pain to them, whether it was hard-coded in, or an error.
