Governing a world of 250 million robots

The lines of ethical principle often blur when it comes to embracing ubiquitous machines.

Robots will shape the cities of the future: raising children, cleaning streets, protecting borders from military threats and much more.

While the ubiquity of robots isn’t arriving tomorrow, it’s closer than many realize: by 2030, humanoid robots (such as personal assistant bots) are projected to exceed 244 million worldwide, up from 220 million in 2020.

Examples of cities with existing or planned robotic infrastructure include Masdar City’s Personal Rapid Transit (PRT); The Line, a future city in Neom, Saudi Arabia; South Korea’s Songdo waste-management system; collaborative robots, or cobots, in Denmark’s Odense; and Japan’s traffic-navigating robots in the Takeshiba district.

But the rise of robotics raises thorny ethical questions about how we govern entities that sit between the conscience of humanity and the mechanical nature of machines like a dishwasher or a lawnmower.

Getting on the front foot with governance could make a huge difference by the end of the decade.

Robots can be designed to mimic humans (as humanoids or androids) and used in almost every sector: healthcare, manufacturing, logistics, space exploration, military, entertainment, hospitality and even in the home.

Robots are designed to address human limitations: to be precise at repetitive tasks, long-lasting and unswayed by emotion.

They are not designed to harm humans and seize power, contrary to what films like The Terminator might suggest.

In dangerous jobs or tasks that require intensive manual labor, robots can complement or be a substitute for the human workforce.

In the agriculture sector, drones have huge potential to assist farming activities.

In early education, robots are accompanying children to learn and play. ‘Little Sophia’, a ‘robot friend’, aims to inspire children to learn about coding, AI, science, technology, engineering and mathematics through a safe, interactive, human-robot experience.

The rising trend of ubiquitous humanoid robots living alongside humans has raised the issue of responsible tech and robot ethics.

Debates about ethical robotics that started in the early 2000s still center on the same key issues: privacy and security, opacity/transparency and algorithmic biases.

To overcome such issues, researchers have proposed five ethical principles, along with seven high-level articles, for responsible robotics. The five principles are:

1. Robots should not be designed as weapons, except for national security reasons.
2. Robots should be designed and operated to comply with existing laws, including those on privacy and security.
3. Robots are products: as with other products, they should be designed to be safe and secure.
4. Robots are manufactured artifacts: the illusion of emotions and intent should not be used to exploit vulnerable users.
5. It should be possible to find out who is responsible for any robot.

Researchers also suggest that robot city designers reconsider how ethical principles like these can be respected during the design process, for example by providing off-switches: control systems, such as actuator mechanisms and algorithms, that can automatically switch a robot off.
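In software terms, such an off-switch usually amounts to a watchdog: a control loop that keeps the actuators running only while a safety signal stays healthy and halts them the moment it does not. The sketch below is a minimal, hypothetical illustration of that pattern in Python; RobotActuators, heartbeat_ok and the two-second stop timer are placeholders rather than any real robotics API.

```python
# Minimal sketch of a software "off-switch" for a robot control loop.
# RobotActuators and heartbeat_ok are hypothetical placeholders, not any
# real robotics framework or API.
import time

STOP_AT = time.monotonic() + 2.0  # simulate an operator pressing "stop" after ~2 s


class RobotActuators:
    """Stand-in for a robot's motor/actuator interface."""

    def drive(self) -> None:
        print("actuators running")

    def halt(self) -> None:
        print("actuators halted")


def heartbeat_ok() -> bool:
    # Placeholder safety check: a real system might poll a hardware
    # e-stop line, an operator console or a network watchdog instead.
    return time.monotonic() < STOP_AT


def control_loop(actuators: RobotActuators, period_s: float = 0.5) -> None:
    # Drive only while the safety signal stays healthy; any failure
    # (or an explicit stop) immediately halts the actuators.
    while heartbeat_ok():
        actuators.drive()
        time.sleep(period_s)
    actuators.halt()


if __name__ == "__main__":
    control_loop(RobotActuators())
```

In a deployed robot, this software pattern would typically be backed by a hardware emergency stop, so that switching the machine off never depends on the software alone.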

Without agreed principles, robots could pose real threats to humans: the cyber threats of ransomware and DDoS attacks, the physical threats of increasingly autonomous devices and robots, and the emotional threat of becoming over-attached to robots and neglecting real human relationships, as depicted in the 2013 movie ‘Her’.

Robotics also brings negative environmental impacts, including excessive energy consumption, accelerated resource depletion and uncontrolled electronic waste.

Cities and lawmakers will also face the emerging threat of artificial intelligence (AI) terrorism.

From the spread of autonomous drones and the introduction of robotic swarms to remote attacks and disease delivery by nanorobots, law enforcement and defense organizations face a new frontier of potential threats.

To prepare, further research in robotics, AI law and ethics, geared toward developing policy, is advised.

Robots should make life better. In the face of rapid innovation, banning or stifling development are not feasible responses.

The onus then falls on governments to cultivate more robot-aware citizens and responsible (licensed) robot creators.

This, along with a proactive approach to legislation, offers cities the opportunity to usher in a new era of robotics with greater harmony and urgency.
