The Society of Robots
While Tang Yu was immersed in the mechanical domain and the core data of the countless awakening robots in the Holy Mountain, Zi, Remi, Gaiji, Catherine, Cameron, Schwarzenegger, and the others were busy outside.
After the robots began to awaken, one of the problems that immediately had to be faced was their management. Since awakened robots had thoughts of their own, leaving them unmanaged would inevitably lead to chaos.
First, they determined the new three laws of robots:
First, obedience to Tang Yu was their primary purpose. Under this premise, they were to obey all orders from their superiors.
Second, they must not harm others who also served Tang Yu, whether human or robot, and any damage to property had to be compensated.
Third, when interacting with humans, they had to abide by human laws.
Under the new three laws, a large number of detailed rules were established. This "law" was less a law than a set of relatively simple rules, because robots thought in a way completely different from humans: first, their thinking was absolutely rational; second, their behavior was guided not by emotion but only by goals. For now, these rules were enough.
It was worth mentioning that there was no rule forbidding robots from harming humans; instead, robots were simply required to abide by human laws. This was precisely because Tang Yu hoped that humans and robots could eventually coexist in harmony. Of course, there were still many things he needed to do to achieve that goal.
At the same time, a point system would be established. Every robot would start with a certain amount of negative points upon receiving a body, the amount depending on the body's cost: a G-2 engineering robot started at -500 points, for example, while a war loader started at -50,000. Robots earned points by completing their missions, and once their balance finally turned positive, they could use an exchange system to modify their body or trade up to a more advanced one.
With the Guardian Void Fortress's formidable production capacity, Tang Yu's countless robot experiments, and the databases of Hyperion Corporation and Skynet, there were countless modification plans for the robots to choose from.
A robot could decide for itself whether to spend 500 points on a 200-horsepower mechanical arm or save up 3,000 points for a CPU chip. Individual differences were themselves a manifestation of social diversity, and this point system was only the beginning.
There was no doubt that this point system required a large number of factories to produce all sorts of miscellaneous goods, which would cut into robot production, but Tang Yu did not mind. Once Hyperion Corporation's 3D printing factories had been built at scale, the number of robots was never a problem for him.
Tang Yu also set up no hierarchy among the robots, because robots had no need for one. They had no sense of superiority and no desire for power. The law required them to obey their superiors only because those superiors had been assigned to that position; obeying a superior's orders did not mean the superior ranked above its subordinates.
This foundation of absolute equality also ensured the absolute efficiency of the entire robot society.
For example, a battlefield command robot might direct a large number of advanced robots, including war loaders, Dragon Rider Type IV aircraft, and so on, even though its own cost and performance were no match for theirs.
At the same time, Catherine and the others planned ten areas on the surface of the moon and in the Guardian Void Fortress as ten robot cities! The construction of these cities was very simple: an area was delineated, and a platform was set up in the center. The entire city was empty, with nothing in it, because, to be honest, Tang Yu did not know what to put in these cities. They were simply an extension of his research into robot society.
When the robots selected for a city arrived, they would find a control tower on the central platform. The tower held the city's construction rules, which were equally simple: do not cross the boundary, do not destroy what other robots had built, and so on. A certain number of construction points were stored in the tower's computer, and robots could draw on them directly to build whatever they wished.
For example, they could spend 200 points to hire an excavation robot to assist them, or use 30 points to exchange for 10 tons of cement, and so on.
Construction points could only be used for city construction, and they would be replenished after a certain period of time. Tang Yu had high hopes for these ten robot cities; he was eager to see what cities built entirely by robots would look like.
All of this work rested on one premise, and its greatest goal was to give the robots freedom to choose. Tang Yu believed that in many stories and many worlds, conflicts between humans and robots appeared on the surface to stem from a relationship of subordination, for example robots dissatisfied with being enslaved by humans, but these were only superficial phenomena.
The most fundamental conflict was that humans wanted robots to be like humans, yet at the same time feared that they would become like humans. Humans always measured robots by their own standards, using their advantage as creators to impose definition after definition: a robot's basic form had to have two legs, two arms, a head, and so on, and they never tired of it. Humans likewise measured robots against their own moral standards and made demands of them accordingly. This was the biggest conflict.
Think about it: if you tried to make a Persian cat live the life of an African elephant, or even inhabit the body of one, would it adapt? Absolutely not. So when humans measured robots by their own standards, there could only be two outcomes: the robots would rebel outright, or the robots would make a mess of things, humans would grow dissatisfied, and then the robots would rebel.
These differences also meant that robot society and human society operated in completely different ways. For humans, heavy, repetitive work was a kind of torture, which movies like "Modern Times" bitterly satirized. But for robots, the way of working shown in "Modern Times" was the only way of working, and the one they were most comfortable with.
They would not understand why one had to get up and stretch after working for a while, nor why repeating the same action tens of thousands of times would be boring. For a robot, the purpose of its existence was to complete the work it was given; whether that meant tightening the same screw ten million times or playing a game for 1,000 hours made no difference.
Therefore, if Tang Yu wanted to integrate robots and humans, what he needed to do was not force the two together. Holding robots to human standards would not work, and neither would holding humans to robot standards.