Is it normal to wonder if machines will ever be given rights?

Imagine all life as a biological machine and all electronics as a technological machine, then drop the adjective and just refer to both things as machines.

We give rights to humans, rights to certain animals, and then diminishing rights based on an animal's intelligence, importance, size, or rarity. My mobile phone is probably about as complex a machine as a small fish.

When computers have the complexity and intelligence of a dog, should they have the rights of a dog? When they are as complex as humans and have electronic brains which mimic human biological brains, should they have rights? What if they develop a sense of self, an identity, a belief system, a personality?

At any stage of this process, do they deserve rights? Are we saying we deserve rights because of how we were made, but something else doesn't deserve rights because it was made in a different way?

Before you give a knee-jerk response, remember that you are a machine. The only difference between you and a computer is that you were assembled over a long period, almost at random.

P.S. Apologies to anyone who feels this doesn't consider a theist viewpoint. That was deliberate. One question at a time...

Yes 31
No 33
Maybe 15
Comments ( 42 )
  • howaminotmyself

    I'm still angry that corporations are people.

    • wigsplitz

      Why, exactly?

  • dom180

    You need to think about what the purpose of "rights" is and why we have them.

    You could argue that rights were first granted to human factory workers because they guaranteed workers could not be over-worked, thus maximising their productivity (you can't work effectively if you're sleep-deprived). Those rights weren't altruistic: they weren't for the benefit of the workers but for the benefit of the factory owners.

    I could foresee rights being given to machines in the same way, if doing so maximised their productivity. But in that case you've got to factor in that machines can't be "unhappy" with their operator (their "employer") like humans can. They're workhorses who carry out their instructions without judgement over why, or for what personal reward, and for the foreseeable future that is all they will ever be. They are selfless by nature, without even a notion of the "self" as "living creatures" think of it.

    A machine can't refuse to work if it feels its labour is being abused or its talent wasted, in the way human workers once did to first gain workers' rights. Granted, it can break down if it overheats, but people who operate machinery know this and realise it is in their own best interests not to let that happen. Giving machines "rights" isn't needed for that to happen; those rights to a "productive working environment" evolved as a result of the system, without any need for government intervention. Granted, they're not legal and constitutional rights (they're not rights as we think of rights, but then again machines aren't alive as we think of life), but for all intents and purposes they are rights, and machines already have them, albeit in a downgraded form.

    As a separate point, machines don't have leverage over the government. They can't withdraw their labour and go on strike if they're not awarded rights, and they can't vote into power a political party on the promise of awarding them rights. Even if they ever truly deserved rights they would never receive them, because only a government can award rights and no government would benefit from doing so (presuming all governments act in self-interest, which is a pretty safe assumption, but that's for another rant). Corporations would bribe parties with donations to stop them awarding rights to their non-organic employees.

    Machines will not receive constitutional and legal rights as organic lifeforms do, at least for the foreseeable future of technological progress. After that, who's to know?

    EDIT: Apologies for the wall of text, I was a little tipsy when I wrote it.

  • Mando

    Great post. Electronic machines may be complex and become more so, but they only mimic man, as you say, in mechanics: rational thought, memory, etc. They are things, not life forms. They have no life, no sentience, and are often inorganic. They are just extensions of mankind, as are any other tools, simple or complex. The only rights pertaining to them concern production (e.g. patents), use (contracts/laws) and ownership (property).

    • VioletTrees

      Why? On what basis do you say they'll only be able to mimic thought? Why is life (which is defined based on reproduction, respiration, and development, none of which are directly related to our nervous systems) required for an entity to experience? Why does an entity need to be organic to experience? Carbon isn't magic sentience juice, it's just one of the things we use to identify life on our planet (and in fact, it's possible that there's non-carbon based life elsewhere in the universe). Anyway, most computers are made at least partially out of organic materials.

      Right now, I agree with you. I don't think our computers are sentient. But that doesn't mean they never will be. You say they're just things, but when you come down to it, people are just things, too. Don't get me wrong, humans are really rad things that I care about a lot, but we're still made of stuff that came from the same place as the stuff computers are made of. It's possible that there's something about humans (or animals in general) that makes us capable of experiencing in a way computers can't ever have, but as of right now, I don't know of any reason to believe that.

      • I feel very similarly about this. I would also blur the lines by saying that the means of production of a "machine" isn't important in terms of what rights the machine has.

        A human machine can be produced in the common way of daddy planting a seed in mummy's tummy, or conception can happen in a test tube. One day it may be possible for a life to exist without ever having been in a human womb. Maybe it could even be built from scratch as a fully functioning adult, cell by cell, and so never have been an infant. People would agree that if it's a person, it deserves a person's rights. That kind of thinking is what I was challenging.

        What if it's a person with a prosthetic leg? Do they deserve fewer rights? What about a fully prosthetic body and only a human brain? What about a human body and a prosthetic brain?

        It's an area that people have a gut instinct on simply because they've never had to think about a machine as complex as a human, or conversely that they and all other humans are merely machines.

        I agree with both your and Mando's posts: dealing with the here and now, it's difficult to argue for machine rights. But things advance all the time, and perhaps we should think about it, even though it seems vaguely ridiculous to do so.

  • InfiniteCycles

    2.

    - Determining sentience is, I'd argue, an inherently arbitrary thing. I cannot look into another's mind and, I should hope, others cannot look into mine; whether or not other people are conscious, amongst a medley of other things, cannot be ascertained. But we ought to act as if they are real, subjugating our being to what very well might be a sort of simulacrum, a hyperreality, for that is where the most pleasure and other miscellaneous stuff awaits. In our interactions with other people we make the implicit assumption (or at least, for the paranoid folks, adopt the façade) of an adaptation of Descartes's 'I think, therefore I am': 'They think, therefore they are.' That translates to granting this thing, which may or may not be conscious, all the societally-ordained rights of something sentient, something self-determining, something alive. Unlike determining sentience itself, determining the possibility of sentience is probably not arbitrary. A certain amount of computational power is required to collect situational data, to interpret, store, and generally manipulate it, and then to fashion one's behaviours in response to it per emotions or social contexts or whatever (not necessarily the amount humans have, since environmental factors have added things here and there according to what proved threatening and which awarenesses propagated humankind most successfully). Those fashionings, emotions and social contexts and the like, also require computational capacity, and if our experience with the silicon medium says anything, that capacity is high on the magnitude scale: so high that the total computational capacity of human-made chips exceeds the mind's only a few times over. Computations are computations; the medium does not matter in this respect, and the question is: 'Why can't emotion exist on an inorganic medium?' If something has the objective capacity for such a thing, and however it is programmed seems to have the teleology of something meant to navigate dynamically everything from life's petty unsexy little things to the grandest of narratives, or maybe just the mundanity of the in-between, then how are we to argue?

    - Here's a nice Star Trek reference ('The Measure of a Man,' if you don't know it): 'Picard: Your ruling today will determine how we will regard this creation of our genius. It will reveal the kind of people we are, what he is destined to be. It will reach far beyond this courtroom and this one android. It could significantly redefine the boundaries of personal liberty. Expanding them for some, savagely curtailing them for others. Are you prepared to condemn him, and all those who come after him, to servitude and slavery?' Let me expand. This is not a question of one man's right to his own creation, because the current laws of property, and the morals behind them, never contemplated the possibility that the purported object of property might be sentient, or at minimum exhibit the computational requisites per the only available standard (the human one); those laws are inept and need to be revisited. And let me be plain, for everyone's sake: what they permit constitutes slavery, merely in a different form. Have some sympathy and think from the perspective of these possible sentients: how would you like it if an alien species came along and started questioning whether we deserve rights because we aren't conscious, or don't conform to their higher formulation of it? Surely there are cases where a greater good is at stake, but shouldn't we afford them the same respect as any other sentient being until such an occasion is exposed and proven?

    - Well, I hope I didn't sound too stupid or over-the-top in my word choices. All beatdowns, as I have said, are welcome, even the religious ones. Please remember, though, that I am just spewing here and don't feel like writing or structuring an actually good semi-essay on this, so, yeah.

  • InfiniteCycles

    1.

    - I don't know if I am explaining this right, and I probably sound an immense fool, so please take my two pence (stupid pun, right?) with about the same disregard one gives a fly on the wall: your complete and utter attention, armed with a newspaper of logic and an insatiable drive to deliver a smackdown. Oh God, I sound like a trite, intertextual, pompous a-hole, don't I? Well, anyway, right from the floodgates: sentience is intelligence, self-awareness, and consciousness. The former two are easy to determine; the remainder is difficult, if not impossible, but its possibility is determinable, and very easily so. I argue not that consciousness must be ascertained and have a confluence of criteria backing it up; instead I scrutinize the structures whose capacities may create that ineffable human quality, and what those structures need in order to be realities.

    - As is obvious, there are certain other factors that are pretty much distinguishably human. One, humans have the ability for creativity; two, humans have emotions. Creativity and emotions are great and they enrich most aspects of our lives, but they constitute neither part nor whole of the consciousness equation; those other qualities just add to the baseline of sentience, extending the range and abilities of consciousness. After all, we bestow sentient rights on people in vegetative states and on the severely mentally impaired, who can, and sometimes do, lack either of these qualities, or have them only in an animal manifestation, i.e. anger, love, or something alike (there are others, but the point is articulated nonetheless), and who, h*ll, might be devoid of consciousness and be mere shells of the very possibility of sentience. Once again, the power of possibility is triumphant in the scheme of things. Plus, the appearance of emotion is getting there in structural equivalents of the head. Creativity is like a meta-program that supervises all sub-programs, allowing certain things to loop and others not depending upon the efficacy of such loops in accomplishing their teleology. Artistic creativity, I argue, could be accomplished by analyzing existing artifice and the products derived thereby, then developing a sort of topical randomizer and mixer that, as the name intimates, creates a mélange of styles and techniques and themes and subject matters and realizes the pictorial and/or linguistic and/or auditory potential of the vision, if you will; then another program, a meta-meta-program, could revitalize these mixings by drawing upon indices of knowledge and concocting various different, and maybe successful (reality and culture are complicated constructs), variations on the aforementioned mélanges. Now, I likely underestimate the complexity of artistic creation and thus fail hard, but, if history be any indicator, the future bears a superabundance of things-impossible becoming things-possible. Besides, cutting all the artistic speculation: if self-awareness were under the penumbra of doubt, this meta-programming is equivalent to what we call self-awareness, the ability to adapt to our environment in whatever sense of the word you construe.
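    The "topical randomizer and mixer" plus "meta-meta-program" idea above can be sketched as a toy program. This is purely illustrative, not a real creative system: the ingredient lists and function names are invented for the example, and actual artistic creation is vastly more involved.

```python
import random

# Toy sketch of a 'topical randomizer and mixer': combine known styles,
# techniques, and themes at random, then let a second 'meta-meta' pass
# mutate the mix to produce variations. All ingredients are made up.
STYLES = ["impressionist", "cubist", "minimalist"]
TECHNIQUES = ["collage", "oil", "generative"]
THEMES = ["memory", "machines", "the sea"]
POOLS = {"style": STYLES, "technique": TECHNIQUES, "theme": THEMES}

def mix(rng: random.Random) -> dict:
    """First pass: a random mélange of known ingredients."""
    return {key: rng.choice(pool) for key, pool in POOLS.items()}

def revise(piece: dict, rng: random.Random) -> dict:
    """'Meta-meta' pass: mutate one ingredient to concoct a variation."""
    piece = dict(piece)                      # keep the original draft intact
    key = rng.choice(list(piece))            # pick an ingredient to rework
    piece[key] = rng.choice(POOLS[key])      # swap in another from its pool
    return piece

rng = random.Random(0)
draft = mix(rng)
final = revise(draft, rng)
print(draft, "->", final)
```

    The point of the sketch is only that "recombine and mutate" is mechanically expressible; whether any such loop amounts to creativity is exactly the question the comment raises.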

    - So, anywho, if an artificial being thinks it experiences sensations, thinks thoughts, has experiences, and meets the requisite 10^23 calc./sec. of computational power (or less, though this may be BS, because of the increased efficiency of silicon-based circuits juxtaposed with neuronal pathways) to mimic the human brain via analog, then what compels us to question this? What if these hunks of metal, silicon, programming, prostheses, and myriad other apparatuses and materials are attaining sentience? They are indubitably smart, albeit in non-human, complexly logical, holy-f*ck-worthy ways, and are in all likelihood self-aware, due to the necessity of such in any communicative- and/or locomotive-oriented human-machine symbiosis. And who knows? They just might transcend those two petty barriers and attain sentience, and to mistreat a sentient being, different mostly by medium, would constitute a flagrant abrogation of not only an individual's rights, but of an entire race's ability to pursue their life, their liberty, their happiness.

  • Dad

    "My mobile phone is probably about as complex a machine as a small fish."

    Um, no.
    Your mobile phone is not even as complex as a single living cell (ironic use of words).

    I think rights should be given to something (either alive or inanimate) when that something is important to us. I.e., why do we try to kill cancer cells; don't they have rights? No, we hate them. Kill them all, I say. And I'm not voting cancer in; we already have religion.

    • I'm going to have to disagree there. There are certain cells which are indeed complex, but not all are. I think it's valid to compare genetic code to programming code because, essentially, they're not really that different as concepts.

      For a mobile phone running any of the three most common operating systems, the lines of programming vastly outnumber the lines of genetic code in a single cell, though not the sum total of a complex organism with an advanced brain.

      I did do a bit of research before I posted this question and thought hard about the type of creature I could validly compare to my mobile phone. I maintain "small fish" as a valid analogy. Especially as it is vague enough to give me lots of wiggle room. :)
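      A rough back-of-envelope supports the comparison, with heavy caveats: the figures below are ballpark public estimates (a small-fish genome in the region of 1.4 billion base pairs, roughly zebrafish-sized, and a modern mobile OS on the order of 15 million lines of source), and raw information content is only a crude proxy for complexity.

```python
# Back-of-envelope: information content of a genome vs. a phone OS codebase.
# All figures are rough ballpark estimates, used only for a sense of scale.

def genome_bits(base_pairs: int) -> int:
    """Each nucleotide base carries 2 bits (4 possible bases)."""
    return base_pairs * 2

def codebase_bits(lines: int, avg_chars_per_line: int = 40) -> int:
    """Treat each source character as one byte (8 bits)."""
    return lines * avg_chars_per_line * 8

fish_genome_bp = 1_400_000_000    # ~1.4 Gbp, roughly a zebrafish-sized genome
phone_os_lines = 15_000_000       # order of magnitude for a modern mobile OS

fish_bits = genome_bits(fish_genome_bp)
os_bits = codebase_bits(phone_os_lines)

print(f"fish genome: ~{fish_bits / 8e9:.2f} GB of raw sequence")
print(f"phone OS:    ~{os_bits / 8e9:.2f} GB of raw source text")
```

      On these assumptions the two land within the same order of magnitude, which is about as much as the "small fish" analogy claims.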

      • Dad

        Well, you can run computer code and expect everything to work, but genetic code won't work without a little life included, since the code, at best, could only make a dead cell. I suppose if you had said a 'dead' fish, that would have been closer.

        The complexity of a living fish, whose processing brain can do much more than provide communicative ability, and can even create new life, far outweighs a stand-alone mobile phone.

        • Legion

          Think about that for a minute: computer code can't necessarily stand on its own either, because it requires an assembler and a computer to work. The assembled program has to be run inside a computer, which is able to read and follow the code, just like the genes in the human body are the blueprints for building and running a person's life processes, the way computer code tells the computer how to make and use a program. Also, in many cases, the program (or organism) as a whole can run even when there are problems in the "code" or "genes"; the problems will sometimes show themselves in certain ways, like a video game glitch or a mental defect. Some errors may even prevent the program or organism from working altogether.

          Without life or the computer, the genes and/or computer program are just empty code.

          • Dad

            You could also say that humans require an evolutionary perfection on Earth to even be produced in the first place.
            The topic was whether they will 'ever' be given rights. You could argue the case of an android with living external tissue (obviously living tissue off some other animal, or possibly even synthetic) but in essence still a machine on the inside: could that 'thing' be given rights?

            What counts as 'rights' in the first place? Is it greed, or suffering, or even the common good of the present community? I wonder if 'rights' are about feelings and not just based on intelligence. I'm quite sure the supercomputers of the world may far outweigh the intelligence of some of our earthly beings. Even a chicken has rights, but strangely insects don't seem to. Is it based purely on intelligence, or on the feelings of pain, loss, and suffering? Even love?

            I suppose the question could then be: if an intelligent 'machine' was able to create other intelligent machines (you could call them offspring) and therefore reproduce, would they THEN have rights? Or would they also need to 'feel' a sense of love and pain and even society membership to finally be given rights? Even if those rights were just to treat them with respect.

            No, a cellular phone is nowhere close to a living cell (or better yet, an organism). But maybe 'one day' (just like the life before them) 'machines' will have their own rights of freedom of movement and speech, at which time I'd say that present-day humans will be extinct! Humans have only been around for approximately 200K years (before that, as other lifeforms), in a 13.7-billion-year-old known universe. I'd say that the next 'lifeform' species could easily be android, based purely on the weakness of human limitations within our short time alive.

            Humans will be obsolete, all hail the mighty machine. For without them most of us would be dead already.

            • Legion

              Yes, I'm aware of the main question, but the point you brought up was about how computer code would run just by itself, where genetic code cannot. That was my response to that point.

              I do agree that phones are nowhere near as complex as even a fish. Considering we have million-dollar humanoid robots that are vastly inferior to even a dog, a mobile phone's processor would still be vastly outclassed by a fish.

    • bananaface

      Sorry for the pointless comment, but that "cell" thing is just hilarious, haha:D!

      Also, what do you mean by your last sentence, if you don't mind me asking? Sorry for being dense, but I'm not really sure what you're getting at.:S

      • Dad

        Oh, I was implying that religion is a cancer and so 'seemed' to go along with things that don't require any respect or rights. To tell you the truth it could be omitted; it has little to do with the topic, except as a lead-on from the fish analogy (the fish being the commonly recognized symbol of Christianity).

  • Darkoil

    Lol you actually think your mobile phone is as complex as a fish?

    • In terms of a comparison between genetic and technical programming, yes. They can't be compared like-for-like, but if you compare a nucleotide base to a machine-level operation, most phone operating systems come out ahead. For the comparison, I intended a phone with no apps whatsoever, not even the ability to dial a number or send a text. Add the apps most phones are bundled with (and the ones people add) and the phone is far ahead, as one individual app can be more complex programmatically than a small fish.

      It's very easy to look at the results of modern computer programming as simplistic, but that's because it's intended to feel like that. In actuality, a layered model abstracts even the programmer from the hardware, and there is a lot going on at the lowest level. When my phone is going full-tilt, it is doing 1.4 billion things per second (the best example I can give without getting into processor benchmarking). Imagine how complicated something has to be to require 1.4 billion operations to complete, and that's just for something taking a second. As a very rough example, the process of sending a text message would take twenty to thirty million low-level operations.

      I promise you, I did think very hard before making the comparison and I'm one of those rare weirdos who has actually written and designed operating systems in native code and who also has a bit of a background in the medical sciences.
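      The arithmetic in that comparison is easy to check. Note that both the 1.4 billion operations per second and the 20-30 million operations per text are the commenter's own rough illustrative figures, not measurements:

```python
# Sketch of the arithmetic above: how long a ~1.4 GHz phone processor
# spends on a task needing tens of millions of low-level operations.
# Both figures are the comment's rough estimates, not benchmarks.

CLOCK_HZ = 1_400_000_000  # ~1.4 billion simple operations per second

def seconds_for(operations: int, ops_per_second: int = CLOCK_HZ) -> float:
    """CPU time consumed by a given number of low-level operations."""
    return operations / ops_per_second

# The comment's estimate: sending a text takes 20-30 million operations.
low, high = 20_000_000, 30_000_000
print(f"sending a text: ~{seconds_for(low) * 1000:.0f}-"
      f"{seconds_for(high) * 1000:.0f} ms of CPU time")
```

      In other words, on these assumptions a text message costs the processor only a few tens of milliseconds, which is why so much complexity can hide behind an interface that feels instantaneous.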

      • Darkoil

        Don't get me wrong, I understand how complex the technology in a mobile phone is. I did some low-level programming myself back in college, and I do like the comparison between genetic code and programming code. The problem is that the fish has a brain, which is so complex we may never fully understand how the fish works. Yeah, you could compare the brain to a processor in terms of flops, but we are talking about complexity, not speed. The processor might seem complex, but it is simple enough that a brain designed and developed it. One of my neurology lecturers once said that if the brain were simple enough for us to ever understand it, then we would be too 'stupid' to understand it.

        • Legion

          a paradox we have!

  • Wambo37

    no

  • robbieforgotpw

    A machine has the "right" to do exactly what it's made for.

  • chainsaw88

    Eventually there will be AI as complex as (or even more complex than) the human mind. With the way technology is growing exponentially, it is bound to happen, especially once neuroscience has completely mapped out the functions of the human brain. So what is the difference between a person and a complex machine? How complex does its mind have to get before it's no longer considered artificial? Robots are programmed to react a certain way to certain triggers. So are people. EVERY choice you ever make is because of nature or nurture: your internal programming (genetics), and the changes to that programming from the experiences you faced growing up. A robot can be programmed to change and learn with experience. It may sound asinine now, especially if you are religious and don't want to believe it, but how could you know for sure that a highly advanced 'artificial' intelligence won't have emotions in the future? Emotional responses are programmed in our brain and are not random. We feel emotions because of signals. I believe in the far future this will be able to be replicated.

  • Legion

    Honestly, I think before we try to create a physical robot that can think independently, we should work on making a "virtual robot": an AI brain that allows a robot rendered on a computer to interact and think for itself, maybe even walk around in the real world with the aid of a hologram projector (if we get that far), or manipulate machines wirelessly using a program that mimics a control panel (kinda like the navigation computer in sci-fi starships). Granted, the "robot" won't have much practical purpose other than the ability to think (unless it can manipulate outside machines), but it will give researchers a much easier time developing a robot brain without having to overcome physical problems like walking and the movement limits of current tech.

    Now, like physical robots, would that computer deserve rights like human beings? If it develops sentience, then possibly, but it will be very difficult for us to determine whether it is, in fact, sentient.

  • 1000yrVampireKing

    If we ever make a computer that is sworn to protect a human, and it hurts another human who is trying to kill it or attack its own human, I think it should be judged like any human would be. If they ever become that advanced.

  • q25t

    Infinite Cycles put it best.

    On a related topic, do human consciousnesses that have been transplanted into robotic hosts have the same rights?

    This isn't really all that much of a sci-fi question, either. The technology for this will probably arrive within this century.

  • mightymouse

    THE ONLY IMPORTANT FACTS

    1. Emulation of true consciousness is not equal to true consciousness. If machines could ever be TRULY conscious, then the degree to which they were conscious would be a good basis for "rights".

    2. The soul has not yet been conclusively scientifically proven. If it ever is, then it would likely be a determining factor in the giving out of "rights".

    That is all.


  • Neomatt

    If machines can fight and act in their own interest, then I say they deserve rights. But sadly they are not capable of any type of feeling, nor do they understand harm to themselves; they cannot argue with us or tell us no. They do what they are programmed to do, nothing more, nothing less.

  • GuessWho

    Maybe when someone writes true AI software that can understand and replicate human emotions.
    In other words... when your computer goes on strike and refuses to print an important document even though everything is working properly.

    • VioletTrees

      Do they have to be human emotions? I mean, if we create machines who experience and feel, but whose experiences and feelings are different from humans', what do we do?

      • WordWizard

        I think a robot should think like a computer. The logic is the beauty of it: perfection. And they would be much easier to deal with. I want a robot president. Oh, never mind, we already elected the Terminator as governor. I think no cyborgs; pure robot all the way.

      • GuessWho

        There's no practical purpose to that anyway, so it'll never happen.

        • VioletTrees

          People invent things that don't have practical purposes all the time.

  • Hippie

    If they had rights we would have to adopt a computer or smartphone, and not be able to smash it when we got a new one. To me it says "more paperwork". If they do develop personalities, I'll change my vote.

  • thinkingaboutit

    Read "God & Golem, Inc.".

    I feel like that is something you would fuck with.

  • Something being alive hinges on its awareness of its own reality. Since machines don't possess that, I can't say they need rights.

    But an equally interesting question is:

    What is the nature of a "right" and why do we deserve them? For what purpose do they exist?

  • squirelhunter

    I dunno, for me it creates some sort of god complex, as if we should get different rules because we have given life to it. Yeah, I watch too much Terminator.

  • WordWizard

    If it's half biological and half machine, I think it has rights and feelings. But no, I have never seen a Terminator robot, lol.

  • Glass

    When machines are fully self-aware, then yes. But as of right now machines are just tools; a machine isn't thinking or perceiving, it's simply doing. What would it even take for a program to be self-aware? They're programmed actions and reactions; how would one suddenly be able to think?

  • KeddersPrincess

    I immediately thought about Skynet.

  • Wendell

    No
