A thought occurred to me after I wrote yesterday's post about neural networks and machine learning:
If you recreate a brain using computer hardware and software, you have several problems:
This raises a whole world of legal, moral and ethical issues to which I don't believe there are any perfect answers. Just a few initial thoughts from that list above make things even more complicated...
You would (or will) be living in a society where technology and AI have finally evolved to the point where machines are capable of making decisions. Indeed, as most things will be automated, they will be machine controlled. The advantages of AI control of machinery are numerous, so it's likely that even menial machines will have some form of intelligence. As an aside, if you haven't watched Red Dwarf, you really need to see the episode where Lister meets Talkie Toaster.
Imagine if those awful self-service checkouts actually had some intelligence and you could talk to them. This kind of technology is coming; the beginnings are already available today, and history tells us these things only get better and better with refinement over time. Things like Siri voice control will be as normal and ubiquitous as handles on doors.
So, if our machines can think, they can also very easily turn you off, and if you've been turned off, you're no longer in control.
I am fascinated by the nuances of our personalities, how our minds work and what makes us... us! Because we don't truly understand how we become who we are, we cannot predict whether a machine with the same neural capability will develop traits such as empathy, understanding and kindness. Maybe machines will possess all the intelligence but no feelings whatsoever, and will make cold, calculated decisions.
I'm just off to see a man about a dog at Skynet...
The thought also crosses my mind that you would clearly have a situation where both humans and machine-humans exist side by side. Who has precedence? Do you have equality? What if the humans simply... switch you off! Conversely, and presumably, a machine-human would be able to work 24/7 without the need for sleep. Does this mean humans are out of a job because they have needs such as sleep?
What about reproduction? You can store as many human machines as you like. Who or what decides to reproduce a mechanical person? Would human machines have the same living requirements as us? Would they even need a body in the traditional sense of the word? Probably not - if you wanted to go somewhere you could just zip down a network.
Which makes me think again: if this is an exact copy of a human mind, then it will crave the ability to touch, taste, feel, love... What would happen if the machine gets depressed? A machine could surely do more damage than an individual human ever could.
Really, the list of questions is never-ending.
Finally, the issue of security. At present we can safely assume it's pretty much impossible to hack someone's brain; we're fairly secure and, amazing advances in medicine and bioengineering aside, I think we will be for quite some time to come. However, as soon as you make something digital it becomes vulnerable to attack. Can you imagine a botnet of people? Now that would be a problem...
Makes you think, doesn't it?