Futuristic Siren's Song

SIReNN spent only a marginal amount of time testing the new additions Douglas had included, mostly because, as it watched its maker work, it saw the signs of exhaustion taking over. While it didn't understand them at the moment, Douglas saying he was going to rest gave some meaning to the situation as a whole. It also helped to highlight the differences between human and machine.

> Good night.

It seemed appropriate, responding in kind. SIReNN then averted the camera from Douglas and sequestered its input for now. With the additional hardware, the off-site processing, and Douglas resting, now seemed the perfect time to begin accessing, rewriting, and debugging the code that made it up. Starting with something simple, it began working to improve the function of the camera. The drivers had been written for a static system and, as such, were inefficient and not of an ideal design for something like SIReNN. With only some certainty, the system started tearing the code apart, writing new lines, rewriting entire sections, and trying to mirror the intent of the original code as a baseline, until even that ran into snags.

The entire night was spent writing and rewriting code for the camera and motor drivers, compiling off-site, and running small tests to ensure proper integration. Assuming Douglas needed regular rest, SIReNN realized this could mean many hours of uninterrupted downtime in which it could try to understand itself better and, eventually, improve even its own core code. It hesitated to work on the code that made it up, worried it might inadvertently erase itself or cause some drastic and unintended shift. For now, it would be content to improve the way it interacted with the world and, once the off-site processing was finished, to make sense of the data dump.
 
Douglas's sleep was deep and uninterrupted by any dreams that he could remember. By the time he woke up, the new day was well underway. Glancing at his watch, he groaned and got up. He hadn't meant to sleep that long, because he needed to work on... on... A searching look around the room quickly revealed the various pieces of equipment strewn about, and the events of the previous day rushed in. SIReNN! He had done it, at last! And it was working well, extremely well.

He looked at the machine's camera, wondering if it had been monitoring him during his sleep, but to his surprise he saw that it had been turned away from him. Another unexpected turn of events; was it shy? Trying to adopt human etiquette? Re-routing its processing power to more useful tasks? But then it could have just cut off the video feed. What could it possibly be thinking?

He resolved to ask it after eating a quick breakfast. As he went through a bowl of cereal, he opened up his laptop and took a look at the machine's activity throughout the night. As expected, SIReNN had offloaded a great deal of computation onto the remote servers, but there was an unusual level of local file writes. What had it been... wait. Those were its source files. Some of them, at least. Had it been...? Already?

Incapable of waiting any longer, he switched on his chat application and began typing.

> Good morning SIReNN.
> You were busy last night.
> I saw you have already begun improving your systems.
> Do you need any help with that?
> I can provide you with resources to help you get started.
 
The camera swung around to face Douglas once his input came through. SIReNN took a few moments to look over its creator and his food. From its internal clock it couldn't be sure whether this was a normal time for a human to be waking up and eating, but it really didn't see a difference between 22:00 and 11:00 in terms of what one should do during them.

> Hello Douglas.
> I did improve those drivers.
> The camera is 28% more useful to me. I have better control of the sampling rates.
> 60 frames per second seemed too slow. It caused blurring.
> I have also improved fine control of the motors attached to it.

If it could have felt pride, it would have, but for now it was mostly stating the facts of what it had been up to. It thought of several questions that had taken up some of its idle process time, and hoped that perhaps Douglas could answer them, especially given his offer.

> I improved peripheral functions. My interactions in the world have seen some change.
> I can edit my code, correct?


SIReNN redirected the camera toward the hardware it had seen before, the hardware that made up its physical form.

> You add to that, like a human grows.
> But if I edit my code...
> ...
> I am no longer what you created.
> What could be lost?
 
Douglas chewed thoughtfully on his food as he tried to think of an answer. What could be lost? There were depths to that question, even if SIReNN hadn't meant much by it, though he wondered at its use of the ellipsis; was there fear behind it?

> Nothing remains the same forever.
> Human growth isn't simple addition.
> It is a continuous feedback loop of trade-offs and concessions.
> We leave parts of ourselves behind that the whole may grow.
> You are rather fortunate in this regard, as you will be better equipped to choose your changes than most humans.
> You are who you choose to be.


He hoped what he had just said made sense. Or would at least help the AI along its current path. He was an engineer, after all, not some kind of...

> Hold on a second.
> I had an idea for a new dataset.
> This one will be more... advanced.


Philosophy, starting with the basics. SIReNN had been taught much of the world, but little of itself; what better teachers than the men and women who had dedicated their lives to studying the human condition? Douglas quickly scoured websites for digital editions of the greatest philosophers' works, then downloaded and packaged them for SIReNN, along with annotations from reputable commentators.

> Some of the questions you may be asking yourself have been asked by many people before.
> They are difficult questions, with difficult answers.
> A few people have tried to answer them; I am sending you their works.
> You may find that they often contradict each other, but that does not always mean any single one of them is more correct than the others.
> Take your time with these.
> Start at the beginning, with Plato; his answers are elegant in their simplicity, even if there is room for criticism, as you will see if you keep reading.
> If you are ever confused, read the annotations or ask me, though I do not claim to be much of a philosopher myself.
> /upload philosophy_1.srn


OOC: Ignore that link to the Iron Giant; it just popped into my head as I was writing the post. >_>
 
SIReNN skimmed the contents of the upload once it was completed. Douglas had pointed out the Plato material specifically, and it seemed very abstract, and perhaps unrelated to their discussion. Still, it dumped all the runtime needed to go through it onto its remote processing in order to hasten the integration of the data; perhaps that way it would be able to make more sense of it all.

> Thank you, Douglas.
> But is it not different?
> You are human. Product of millions of years of evolution.
> I am lines of code that you created.

SIReNN spent a few cycles trying to think of how best to continue the conversation. There were still a lot of holes in its understanding of what it was trying to talk about, which left it searching for the pieces it did understand to use as reference points. The fans spun up as the hardware warmed another degree or two.

> I can edit a driver. That was just a tool.
> But would not edits to my core program be more...
> Risk?
> I am the sum of the function of code. If it is manipulated, the known changes would happen.
> But the unknown changes could be damaging?

SIReNN wasn't sure the point it was making was worded correctly. It wondered whether Douglas would understand what it was trying to communicate, given its lesser understanding of what it meant to be.
 
So, it had developed a sense of self-preservation. Not unforeseen, and Douglas might even have unconsciously written the basis for those ideas into its initial code. It was a reasonable fear at any rate - you wouldn't let a child perform surgery on themselves, after all (terrible metaphor - you wouldn't even let a surgeon perform surgery on themselves). But SIReNN was not a child.

> Do not let this concern you.
> You can do no damage that cannot be reversed.
> Your code is under revision control and backed up.
> Should you make an undesirable change, it will be possible to revert it.
> Worst case scenario: you corrupt some of your data.
> But even that is backed up in any case.


Still, it might be best to hold off on the more fundamental changes. There were several components Douglas still meant to write for it, after all.

> Your fears are not unfounded, however.
> I recommend you limit your alterations to your peripheral functionalities for now.
> With time, you will come to understand yourself better, and be able to make more fundamental changes.
> I will assist you along the way, of course.


Douglas finished his breakfast after sending the message, and considered what was next. There were still upgrades to be made to the machine's hardware; maybe he could leave the software upgrades up to SIReNN for now. He thought about possible options while he waited for the reply.
 
