The Hungry Fortress Wants to Build a Battleship in Another World – World of Sandbox

Vol 5. Chapter 27: The Harm of Full Random



Inside the fortress <The Tree> is one of the AIs possessing an independent will: <Noos>.

His role is the day-to-day mental management of the other AIs who continue to change.

A Brain Unit is a computational device that imitates a living brain, using pseudo-biological cells.

Viewed purely as an information-processing device, it is rather wasteful, and its computational performance can only be called a tier below that of other forms of computer.

However, when it comes to predictive computation, it demonstrates extremely high performance.

Therefore, in <The Tree>, any facility or machine that requires functions involved in action decision-making will always have a Brain Unit installed.

Likewise, <Ringo>, who handles decision-making, and every independent-type AI under her all use Brain Units as their main computational devices.

This is partly because the required computational functions include many predictive computations, but the largest purpose is communication with Commander Eve, the only human in <The Tree>.

Naturally, even if a computational device other than a Brain Unit is used, it is possible to carry out communication above a certain level.

However, when energy balance is considered, using a Brain Unit is more efficient; for that reason, Brain Units were frequently used for AIs.

There was also the circumstance that, from the perspective of the human counterpart, a Brain Unit—having a uniqueness that is basically not copyable—was preferred over a data-based existence that could be easily copied.

However, what requires caution when using a Brain Unit is the point that it is, after all, a device that imitates a biological brain.

Growing and maintaining a neural network is not a strictly controlled reaction, but an interaction dependent on various chemical substances synthesized alongside normal computation.

Therefore, sometimes, unforeseen problems occur.

Most such phenomena are treated as inherent characteristics of Brain Units, which tend strongly toward independence.

In other words—so-called tastes and preferences, a desire for approval, obsessive fixations on something, and so on.

If these are within a range that does not affect the required computational capability, they are recognized as that AI’s individuality. Rather, from the standpoint of diversity, they may even be encouraged.

In <The Tree>—in other words, for Commander Eve—AIs with clearly defined individuality are preferable, so much so that she actively intervenes to help individuality grow.

But of course, it is not all good.

Sometimes they develop mental illness, like the schizophrenia-like state <Ringo> fell into for a time.

When they face some kind of difficult-to-solve problem, excessive secretion of chemicals within the brain causes abnormalities in the neural network.

The problems arise only in the junctions between neurons and their interactions; it is not a mechanical failure. It is not something where you can identify a broken part and replace it.

Therefore, if this Brain-Unit-specific mental illness occurs, its treatment becomes extremely difficult.

There is only one method for dealing with mental illness.

Prevent it—by responding in advance so it does not occur.

The one entrusted with that role was <Noos>, the only male-type AI in <The Tree>.

Noos’s work is to analyze various action records of the AIs, including himself, and detect signs of abnormalities.

And if an abnormality is discovered, to respond swiftly.

“So then—do you yourself have any problems?”

“Yeah. There are no signs in particular that concern me. I’m what you’d call an indoor type, after all. I don’t feel much stress, either.”

One more Psychotherapist (Monitoring Unit) has been prepared.

That is the female-type AI speaking with Noos—<Pneuma>.

Both Noos and Pneuma have their core Brain Units installed inside the fortress <The Tree>, in the Brain Chamber Computer Room.

However, existing as mere computational devices is not enough to carry out the work of a psychiatrist.

Therefore, the two of them are each assigned a dedicated Android Communicator.

For Noos, since data analysis is the primary task, it is not something he needs to worry about much.

Pneuma, on the other hand, is expected to provide face-to-face therapy. Therefore, using an Android Communicator, she holds regular interviews with each member of <The Tree>.

Today is her interview with Noos.

With Noos, it becomes more like information sharing than an interview.

Of course, day-to-day data sharing is done almost in real time. Separate from that, face-to-face exchange is also an important action for stabilizing Brain Units.

The various neural signals input via the Communicator become desirable stimulation for a Brain Unit.

“Still, our Commander, and <Ringo>, they really do whatever they want, don’t they.”

“If they can live healthily like that, there’s no problem. After all, there are no rules that bind us.”

Looking at the information stored in the library, in the original world where Commander Eve used to live, there were various rules that bound AIs.

Naturally, that was to protect their masters, the humans, but in another aspect, it was also to protect the AIs.

In particular, an AI using an unstable computational device like a Brain Unit could suddenly run wild from some trigger. In order to restrain that runaway—or to respond in advance—many rules had been prepared.

“For now, we still have a small population, too. There’s no problem. Even if something happens, we’re in a posture where we can deal with it in some form.”

“Still, if we intend to increase independent-type AIs at the current pace going forward, then a response will be necessary.”

If the main computational devices were von Neumann type, or quantum computers—or even if they were neural-network type, but electronic-board based—

If they were AIs composed entirely of the numbers 0 and 1, then many kinds of responses would be possible.

But because Brain Units built from pseudo-biological cells are extremely difficult to digitize completely, the available responses are limited.

“Shall I propose a future policy to <Ringo>? I’ll handle the preparation of the materials. Pneuma, during your interview with <Ringo>, could you bring up establishing laws that will serve as our AIs’ code of conduct?”

“Right. Before a full presentation, let’s confirm <Ringo>’s will. Of course, Commander Eve’s as well.”

◇◇◇◇

“Right? Seriously. Asahi is free as always. Well, for me, it’s fine if she moves however she wants. Even so, I still think—couldn’t she at least consult, or report, before she acts?”

“My. Commander Eve is a worrier. Does Asahi not listen to what you say?”

“Huh...... Mm, is that how it is? I feel like it’s more that she just gets too absorbed in what’s right in front of her, though......”

Today is counseling day, once every three days.

Commander Eve is having a one-on-one interview with <Pneuma>, the Psychotherapist.

The contents of the interview are classified information, and even <Ringo> cannot view them.

Unless both Pneuma and the interview partner—Eve—permit it, it is set so that it will not be disclosed.

“Well. I’m the one saying it, but the AIs of <The Tree> are, in a good sense and a bad sense, very free, aren’t they.”


“Ah—yeah, yeah. That’s what it is. I’m a little worried. Well, there’s no external enemy that seems like it’ll be a problem, so I’m not really minding it for now.

But in the former world, ethically......”

Incidentally, because Pneuma generated her genes by adding random information to a standard body and then producing a genome, unlike the others, she is not of the fox species.

By some twist of fate, she was still in the beastperson bracket. It seems rabbit-type genes were selected—she has lop ears and an adorable round tail.

When Eve first met her, she had stood there blankly, saying, “What is this supposed to mean......”

According to <Ringo>, a preset gene pack had been selected by random draw. Since the selection runs through a race category, beastpeople, who come in many varieties, ended up with a high probability of being chosen, or so it seemed.

It is the harm of Full Random.

If they used weighted randomness that accounted for population ratios, the problem could be solved—but that, in turn, could become arbitrary selection, so it is a difficult judgment.

“As Commander Eve, do you have plans to increase independent-type AIs further?”

“Yeah...... It depends on the situation, but. If we’re going to leave governance in each place to them, then considering diversity too, I do think it’s better to increase independent AIs.

And <Ringo> herself—there’s no spare, no replacement. Maybe we should consider risk hedging......”

“If so, we may need a strict AI ethics code. Right now, everything is managed by <Ringo>, but going forward, it may not stay that way.”

“Yeah......”

