Of, For, and By the People: The Legal Lacuna of Synthetic Persons

Conferring legal personhood on purely synthetic entities is a very real legal possibility, one presently under consideration by the European Union. We show here that such legislative action would be morally unnecessary and legally troublesome. While AI legal personhood may have emotional or economic appeal, so do many superficially desirable hazards against which the law…

Legal and ethical issues in telemedicine and robotics

Modern medical concerns with telemedicine and robotics practiced across national or other jurisdictional boundaries engage the historical, complex area of law called conflict of laws. An initial concern is whether a practitioner licensed only in jurisdiction A who treats a patient in jurisdiction B violates B's laws. Further concerns are whether a practitioner in A…

What Should We Want From a Robot Ethic?

There are at least three things we might mean by ethics in robotics: the ethical systems built into robots, the ethics of people who design and use robots, and the ethics of how people treat robots. This paper argues that the best approach to robot ethics is one which addresses all three of these, and…

Robots of Just War: A Legal Perspective

In order to present a comprehensive framework of what is at stake in the growing use of robot soldiers, the paper focuses on: (1) the different impact of robots on legal systems, e.g., contractual obligations and tort liability; (2) how robots affect crucial notions such as causality, predictability and human culpability in criminal law; and, finally, (3)…

The responsibility gap: Ascribing responsibility for the actions of learning automata

Traditionally, the manufacturer/operator of a machine is held (morally and legally) responsible for the consequences of its operation. Autonomous, learning machines based on neural networks, genetic algorithms and agent architectures create a new situation in which the manufacturer/operator is, in principle, no longer capable of predicting the machine's future behaviour, and thus cannot…