Sunday, July 12, 2009

Ethics of the future: self-copies

Just as the future of science and technology is rife with legal opportunities and possibilities for psychological study, so it is with ethical issues. One interesting example is the case of individuals having multiple copies of themselves, either embodied or digital.

1. Can I self-copy?
The first issue is how different societies will set norms and legal standards for having copies. The least controversial first level would be keeping a backup copy of one's mindfiles for emergency and archival purposes, much like computer backups at present: people take pictures and videos of their experiences, so why not of their minds? At the other extreme, the most liberal societies might allow all manner of digital and embodied copies. The notion of regulating copies suggests an interesting precedent: at present, the creation of children is largely unregulated on a global basis.

2. When and where can I run my self-copy(ies)?
A second issue is, given copies, under what circumstances they can and should be run. A daily backup is quite different from unleashing hundreds of embodied copies of oneself. Physically embodied copies would consume resources just as any other person does, and there would likely be stiff initial regulations, since a national population doubling or trebling overnight would be a serious shock to society. There is also the difficulty of quickly obtaining and assembling the resources required for a full human copy, even given potential advances in home 3D printing of human tissue and organs by then.

Digital copies are the more obvious avenue for running self-copies and could be much more challenging to regulate. In the early days, the size and processing requirements of uncompressed mindfiles would likely be so large that a runtime environment would not be available on any home machine or network, but would instead require a supercomputer.

3. Am I a copy?
A third interesting problem is whether it would be moral for copies to know that they are copies, along with the related legal issues of memory redaction explored in Wright's "Golden Age" trilogy. Depending on how interaction between originals and copies is organized, it may not matter in practice; psychologically, for both originals and copies, it may matter a great deal. The original may 'own' the copies, or the copies may have self-determination rights. In the case of an embodied copy, it is hard to argue against full personhood, while a digital instance somehow seems to have fewer rights, although it may come to be that shutting down an instance of a digital mind, even with a recent full memory backup and integration, is just as wrong as physical homicide.

Interesting ethical issues could arise for originals and copies alike as to what to share with the others; should horrifying experiences be edited out, as characters in Brin's "Kiln People" sometimes do? There would be both benefits and costs to experiencing the death of a self-copy, for example. It would not seem ethical to make self-copies explicitly for scientific research, garnering information from their deaths, but it does seem fully ethical to have multiple self-copies with different lifestyles, some healthier and some less so, in order to a) investigate whether a healthy lifestyle matters and b) selfishly share exciting experiences from the less risk-averse copies back with the longer-lived, healthier copy.

Indeed, in the new medical era of a systemic understanding of health and disease where n=1, what better controls could there be than copies of yourself! However, epigenetic changes and post-translational modifications may be much harder to equalize across copies than memories and experiences.

The issue of the definition of a life arises: some people may want only the abridged meta-message or take-away from experiences, which is indeed one of the great potential benefits of multiple copies, while others may wish to preserve the full resolution of all experiences. A standard could accommodate both, with the summary being the routine information transfer and the full detail archived for on-demand access.
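The two-tier standard described above, routine summary transfer with full-resolution detail archived for on-demand retrieval, could be sketched as a simple data structure. This is purely illustrative; all class and method names (`ExperienceRecord`, `ExperienceArchive`, `routine_transfer`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ExperienceRecord:
    """One experience, kept at full resolution with an abridged summary."""
    timestamp: str
    summary: str        # abridged meta-message, shared routinely
    detail: bytes = b""  # full-resolution record, archived

class ExperienceArchive:
    """Two-tier store: summaries move routinely, details only on demand."""
    def __init__(self):
        self._records = {}

    def store(self, record_id, record):
        self._records[record_id] = record

    def routine_transfer(self):
        # A routine sync between copies carries only the summaries.
        return {rid: rec.summary for rid, rec in self._records.items()}

    def fetch_detail(self, record_id):
        # Full resolution remains available on request.
        return self._records[record_id].detail

archive = ExperienceArchive()
archive.store("e1", ExperienceRecord("2009-07-12", "met an old friend",
                                     b"<full sensory log>"))
print(archive.routine_transfer())  # only the summary is transferred
```

The design choice mirrors the text: the cheap, abridged channel is the default, while the expensive full record is never discarded, only deferred.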

4. What can I do with my self-copies?
Societies might attempt to establish checks and balances to prevent originals from selling copies of themselves or others into slavery for economic gain, as dystopically portrayed in Ballantyne's "Capacity". Especially in a potential realm of digital minds, there are many future challenges in determining and enforcing rights.

The 'AI abdication' defense is the argument that societies advanced enough to run self-copies would also have other advances in use, such as some sort of consciousness sensor that identifies existing and emerging sentient beings and looks after their well-being, a beneficent policing. There are numerous problems with this defense: such a sensor may be technically infeasible, humans might not agree to use it, and a caregiving AI could be hacked, among other issues. However, technology does not advance in a vacuum, and society generally matures around its technologies, so some detriment-balancing counter-initiatives would likely exist.

For example, would it be moral to create sub-sentient beings as sex slaves or personal assistants? This may be an improvement over the current situation, but it is not devoid of moral issues. At some point, as more about consciousness is characterized and defined, a list of intelligence stratifications and capabilities could become a standard societal tool, including at minimum animals, humans and AIs. A future world with many different levels of sentience seems quite possible.
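The stratification list imagined above could take the form of an ordered scale mapped to rights tiers. The levels, their ordering, and the rights mapping below are all assumptions for illustration, not anything proposed in the text.

```python
from enum import IntEnum

class SentienceLevel(IntEnum):
    """Illustrative sentience strata; categories and ordering are assumptions."""
    NON_SENTIENT = 0   # simple tools and scripts
    SUB_SENTIENT = 1   # purpose-built assistants
    ANIMAL = 2
    HUMAN = 3          # includes embodied and digital human copies
    SUPERHUMAN_AI = 4

def minimum_rights(level):
    """Toy mapping from sentience level to a rights tier (hypothetical policy)."""
    if level >= SentienceLevel.HUMAN:
        return "full personhood"
    if level >= SentienceLevel.ANIMAL:
        return "welfare protections"
    if level >= SentienceLevel.SUB_SENTIENT:
        return "humane-treatment review"
    return "no special protections"

print(minimum_rights(SentienceLevel.SUB_SENTIENT))
```

An ordered enum makes the key policy question explicit: rights attach to thresholds on the scale, so everything turns on where a given being is placed.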