===Ethical and legal implications===

The process of developing emulation technology raises ethical issues related to [[animal welfare]] and [[artificial consciousness]].<ref name=SandbergEthics2014>{{cite journal |first=Anders |last=Sandberg |title=Ethics of brain emulations |journal=Journal of Experimental & Theoretical Artificial Intelligence |date=14 April 2014 |volume=26 |issue=3 |pages=439–457 |doi=10.1080/0952813X.2014.895113 |s2cid=14545074}}</ref> The neuroscience required to develop brain emulation would require animal experimentation, first on invertebrates and then on small mammals, before moving on to humans. In some cases the animals would only need to be euthanized in order to extract, slice, and scan their brains, but in others, behavioral and ''in vivo'' measures would be required, which might cause pain to living animals.<ref name=SandbergEthics2014 />

In addition, the resulting animal emulations themselves might suffer, depending on one's views about consciousness.<ref name=SandbergEthics2014 /> Bancroft argues for the plausibility of consciousness in brain simulations on the basis of the "[[Qualia#David Chalmers|fading qualia]]" thought experiment of [[David Chalmers]]. He then concludes: "If, as I argue above, a sufficiently detailed computational simulation of the brain is potentially operationally equivalent to an organic brain, it follows that we must consider extending protections against suffering to simulations."<ref name=Bancroft2013>{{cite journal |first=Tyler D. |last=Bancroft |title=Ethical Aspects of Computational Neuroscience |journal=Neuroethics |date=Aug 2013 |volume=6 |issue=2 |pages=415–418 |doi=10.1007/s12152-012-9163-7 |s2cid=145511899 |issn=1874-5504}}</ref> Chalmers himself has argued that such virtual realities would be genuine realities.<ref>{{cite book |last=Chalmers |first=David |author-link=David Chalmers |date=2022 |title=Reality+: Virtual Worlds and the Problems of Philosophy |url=https://wwnorton.com/books/reality |location=New York |publisher=W. W. Norton & Company |isbn=9780393635805}}</ref> However, if mind uploading occurs and the uploads are not conscious, there may be a significant opportunity cost. In the book ''[[Superintelligence: Paths, Dangers, Strategies|Superintelligence]]'', [[Nick Bostrom]] expresses concern that we could build a "Disneyland without children."<ref name="bostrom2014">{{cite book |last=Bostrom |first=Nick |title=Superintelligence: Paths, Dangers, Strategies |date=2014 |publisher=Oxford University Press |isbn=978-0199678112 |location=Oxford, England}}</ref>

It might help reduce emulation suffering to develop virtual equivalents of anaesthesia, as well as to omit processing related to pain and/or consciousness. However, some experiments might require a fully functioning and suffering animal emulation. Animal emulations might also suffer by accident, due to flaws in the emulation and a lack of insight into which parts of their brains are suffering.<ref name=SandbergEthics2014 /> Questions also arise regarding the moral status of partial brain emulations, as well as of neuromorphic emulations that draw inspiration from biological brains but are built somewhat differently.<ref name=Bancroft2013 />

Brain emulations could be erased by computer viruses or malware, without the need to destroy the underlying hardware. This may make assassination easier than for physical humans. The attacker might also take the emulation's computing power for its own use.<ref name=EckersleySandberg2013>{{cite journal |first1=Peter |last1=Eckersley |first2=Anders |last2=Sandberg |title=Is Brain Emulation Dangerous? |journal=Journal of Artificial General Intelligence |date=Dec 2013 |volume=4 |issue=3 |pages=170–194 |doi=10.2478/jagi-2013-0011 |issn=1946-0163 |bibcode=2013JAGI....4..170E |doi-access=free}}</ref>

Many questions arise regarding the legal personhood of emulations.<ref name=Muzyka2013 /> Would they be given the rights of biological humans? If a person makes an emulated copy of themselves and then dies, does the emulation inherit their property and official positions? Could the emulation ask to "pull the plug" when its biological version was terminally ill or in a coma? Would it help to treat emulations as adolescents for a few years so that the biological creator would maintain temporary control? Would criminal emulations receive the death penalty, or would they be given forced data modification as a form of "rehabilitation"? Could an upload have marriage and child-care rights?<ref name=Muzyka2013>{{cite journal |first=Kamil |last=Muzyka |title=The Outline of Personhood Law Regarding Artificial Intelligences and Emulated Human Entities |journal=Journal of Artificial General Intelligence |date=Dec 2013 |volume=4 |issue=3 |pages=164–169 |doi=10.2478/jagi-2013-0010 |issn=1946-0163 |bibcode=2013JAGI....4..164M |doi-access=free}}</ref>

If simulated minds become a reality and are granted rights of their own, it may be difficult to ensure the protection of "digital human rights". For example, social science researchers might be tempted to secretly subject simulated minds, or whole isolated societies of simulated minds, to controlled experiments in which many copies of the same mind are exposed (serially or simultaneously) to different test conditions.{{citation needed|date=June 2014}}

Research led by cognitive scientist Michael Laakasuo has shown that attitudes towards mind uploading are predicted by an individual's belief in an afterlife; the existence of mind uploading technology may threaten religious and spiritual notions of immortality and divinity.<ref name="laakasuo2022">{{cite journal |first1=Michael |last1=Laakasuo |first2=Jukka |last2=Sundvall |first3=Marianna |last3=Drosinou |display-authors=3 |author4=Ivar Hannikainen |author5=Anton Kunnari |author6=Kathryn B. Francis |author7=Jussi Palomäki |date=2023 |title=Would you exchange your soul for immortality? – Existential Meaning and Afterlife Beliefs Predict Mind Upload Approval |journal=Frontiers in Psychology |volume=14 |doi=10.3389/fpsyg.2023.1254846 |doi-access=free |pmid=38162973 |pmc=10757642}}</ref>