Robot

"Robots" redirects here.  For the film Robots, see Robots (film).

Robotics is the branch of mechanical engineering, electrical engineering, and computer science that deals with the design, construction, operation, and application of robots, as well as computer systems for their control, sensory feedback, and information processing.  These technologies deal with automated machines that can take the place of humans in dangerous environments or manufacturing processes, or resemble humans in appearance, behavior, and/or cognition.

An android is a robot or synthetic organism designed to look and act like a human, especially one with a flesh-like body; the term can refer to either a mechanical or an organic construct.  The word robot, by contrast, has come to refer primarily to mechanical humans, animals, and other beings, while a cyborg ("cybernetic organism" or "bionic man") is a creature that combines organic and mechanical parts.


Nonfictional quotes

  • Many consider this man to be the father of robotics.  His name was Philon of Byzantium.  He was also known as Philo, or Philo Mechanicus, because when it came to mechanics, he was thousands of years ahead of the game.
    • "Ancient Einsteins", Ancient Impossible (S1E4, 27 July 2014, 10:15 P.M. Eastern Daylight Time)

Fictional quotes

R.U.R. (1920) by Karel Čapek

Main article: R.U.R.
  • Robots do not hold on to life.  They can't.  They have nothing to hold on with—no soul, no instinct.  Grass has more will to live than they do.
  • Within the next ten years Rossum's Universal Robots will produce so much wheat, so much cloth, so much everything that things will no longer have any value.  Everyone will be able to take as much as he needs.  There'll be no more poverty.  Yes, people will be out of work, but by then there'll be no work left to be done.  Everything will be done by living machines.
  • Robots of the world!  Many people have fallen.  By seizing the factory we have become the masters of everything.  The age of mankind is over.  A new world has begun!  The rule of Robots!

The Day the Earth Stood Still (1951) written by Edmund H. North

  • Helen Benson:  Gort!  Klaatu barada nikto!
  • Klaatu:  I am leaving soon, and you will forgive me if I speak bluntly.  The universe grows smaller every day, and the threat of aggression by any group, anywhere, can no longer be tolerated.  There must be security for all or no one is secure.

    Now, this does not mean giving up any freedom except the freedom to act irresponsibly.

    Your ancestors knew this when they made laws to govern themselves and hired policemen to enforce them.  We of the other planets have long accepted this principle.  We have an organisation for the mutual protection of all planets and for the complete elimination of aggression.

    The test of any such higher authority is, of course, the police force that supports it.  For our policemen, we created a race of robots.  Their function is to patrol the planets—in space ships like this one—and preserve the peace.  In matters of aggression, we have given them absolute power over us; this power can not be revoked.

    At the first sign of violence, they act automatically against the aggressor.  The penalty for provoking their action is too terrible to risk.

    The result is that we live in peace, without arms or armies, secure in the knowledge that we are free from aggression and war—free to pursue more profitable enterprises.  Now, we do not pretend to have achieved perfection, but we do have a system, and it works.

    I came here to give you these facts.  It is no concern of ours how you run your own planet.  But if you threaten to extend your violence, this Earth of yours will be reduced to a burned-out cinder.

    Your choice is simple:  Join us and live in peace, or pursue your present course and face obliteration.  We shall be waiting for your answer; the decision rests with you.

THX 1138 (1971) written by George Lucas and Walter Murch

Main article: THX 1138
  • robotic voice-over: Are you now, or have you ever been?
  • chrome robot: [trying to unlock door to pursue THX] Can you hear me?  Stay calm.  Everything will be all right.  The door seems to be jammed.  Please check the lock on your side.  Everything will be all right; we are here to help you.  Stay calm.  We are not going to harm you.  Everything will be all right.
  • chrome robot: Everything will be all right.  You are in my hands.  I am here to protect you.  You have nowhere to go.  You have nowhere to go.
  • chrome robot: You have nowhere to go.  I am here to protect you.
  • [THX 1138 is approaching the city's exit]
    chrome robot: Please come back.  You have nothing to be afraid of.
  • [THX 1138's case has been terminated]
    chrome robot: We have to go back.  This is your last chance to return with us.  You have nowhere to go.  You cannot survive outside the city shell.  We only want to help you.  This is your last chance.

Sleeper (1973) written by Woody Allen and Marshall Brickman

Main article: Sleeper (1973 film)

Star Wars Episode IV: A New Hope (1977) written and directed by George Lucas

  • Nahdonnis Praji: An escape pod was jettisoned during the fighting.  No life forms were on board.
  • Princess Leia Organa: I have placed information vital to the survival of the Rebellion into the memory systems of this R2 unit.  My father will know how to retrieve it.  You must see this droid safely delivered to him on Alderaan.  This is our most desperate hour.  Help me, Obi-Wan Kenobi.  You're my only hope.
  • Obi-Wan Kenobi: Tell me, young Luke, what brings you out this far?
    Luke Skywalker: This little droid.  I think he's searching for his former master.  I've never seen such devotion in a droid before.  Ah, he claims to be the property of an Obi-Wan Kenobi.  Is he a relative of yours?  Do you know what he's talking about?
  • Obi-Wan Kenobi: These aren't the droids you're looking for.
    Stormtrooper: These aren't the droids we're looking for.
  • Wuher: Hey!  We don't serve their kind here.
    Luke Skywalker: What?
    Wuher: Your droids.  They'll have to wait outside.  We don't want them here.
  • C-3PO: We seem to be made to suffer.  It's our lot in life.
  • C-3PO: It wasn't my fault, sir; please don't deactivate me.  I told him not to go, but he's faulty, malfunctioning.  Kept babbling on about his "mission."
  • C-3PO: I've just about had enough of you.  Go that way.  You'll be malfunctioning within a day, you near-sighted scrap pile.  And don't let me catch you following me begging for help because you won't get it.
  • C-3PO: You must repair him!  Sir, if any of my circuits or gears will help, I'll gladly donate them.
  • C-3PO: And I am C-3PO, human-cyborg relations.  And this is my counterpart R2-D2.
  • C-3PO: Just you reconsider playing that message for him!
    R2-D2: [beeps]
    C-3PO: No, I don't think he likes you at all.
    R2-D2: [beeps]
    C-3PO: No, I don't like you either.
  • R2-D2: [beeps]
    C-3PO: You watch your language!
  • C-3PO: Listen to them, they're dying, R2!  Curse my metal body, I wasn't fast enough—it's all my fault!  My poor Master
    Luke Skywalker: 3PO, we're all right!  We're all right!  Ha ha!  Hey, open the pressure maintenance hatch on unit number...where are we?  3263827!
  • Han Solo: Let him have it.  It's not wise to upset a Wookiee.
    C-3PO: But, sir, nobody worries about upsetting a droid.
    Han Solo: That's 'cause droids don't pull people's arms out of their sockets when they lose.  Wookiees are known to do that.

Star Wars Episode V: The Empire Strikes Back (1980) written by George Lucas, Leigh Brackett, and Lawrence Kasdan

  • C-3PO: Sir, the possibility of successfully navigating an asteroid field is approximately 3,720 to 1.
    Han Solo: Never tell me the odds.
  • C-3PO: Don't worry about Master Luke.  I'm sure he'll be all right.  He's quite clever, you know...for a human being.
  • C-3PO: [interrupting Han and Leia kissing] Sir.  Sir, I've isolated the reverse power flux coupling.
    Han Solo: [annoyed] Thank you.  Thank you very much.
    C-3PO: [oblivious] Oh, you're perfectly welcome, sir.
  • C-3PO: You must come along now R2.  There's really nothing more we can do.  And my joints are freezing up.
    R2-D2: [beeps]
    C-3PO: Don't say things like that!  Of course we'll see Master Luke again!  And he'll be quite all right, you'll see!  [to himself] Stupid little short-circuit!  He'll be quite all right.
  • Han Solo: [referring to C-3PO] Either shut him up or shut him down!
  • C-3PO: [after R2-D2 gets shocked] Don't blame me.  I'm an interpreter.  I'm not supposed to know a power socket from a computer terminal.
  • C-3PO: Master Luke, Sir, it's so good to see you fully functional again.  R2 expresses his relief also.
  • C-3PO: Sir, I don't know where your ship learned to communicate, but it has the most peculiar dialect.
  • C-3PO: Artoo says that the chances of survival are 725 to 1.  Actually Artoo has been known to make mistakes...from time to time....  Oh dear.
  • C-3PO: The odds of successfully surviving an attack on an Imperial Star Destroyer are approximately—
    Princess Leia: Shut up!
  • C-3PO: Oh, my!  What have you done?  I'm backwards, you flea-bitten furball!  Only an overgrown mop-head like you would be stupid enough to... [Chewbacca switches C-3PO off]
  • C-3PO: If only you'd attached my legs, I wouldn't be in this ridiculous position.
  • C-3PO: Noisy brute.  Why don't we just go to lightspeed?
    R2-D2: [beeps]
    C-3PO: We can't?  How would you know the hyperdrive is deactivated?
    R2-D2: [beeps]
    C-3PO: The city's central computer told you?  R2-D2, you know better than to trust a strange computer.  [R2-D2's welding arm shocks his ankle] Ouch!  Pay attention to what you're doing!

Star Wars Episode VI: Return of the Jedi (1983) written by George Lucas and Lawrence Kasdan

  • C-3PO: [to R2-D2] If I told you half the things I've heard about this Jabba the Hutt, you'd probably short circuit.
  • EV-9D9: Ah, new acquisitions!  You are a protocol droid, are you not?
    C-3PO: I am C-3PO, human-cyborg
    EV-9D9: Yes or no will do.
    C-3PO: Umm...yes.
    EV-9D9: How many languages do you speak?
    C-3PO: I am fluent in over six million forms of communication, and can readily—
    EV-9D9: Splendid!  We have been without an interpreter since our master got angry with our last protocol droid and disintegrated him.
    C-3PO: Disintegrated?
  • C-3PO: His High Exaltedness, the Great Jabba the Hutt, has decreed that you are to be terminated immediately.
    Han Solo: Good, I hate long waits.
    C-3PO: You will therefore be taken to the Dune Sea, and cast into the pit of Carkoon, the nesting place of the all-powerful Sarlacc.
    Han Solo: Doesn't sound so bad.
    C-3PO: In his belly you will find a new definition of pain and suffering as you are slowly digested over a thousand years.
    Han Solo: On second thought, let's pass on that, huh?
  • C-3PO: I do believe they think I am some kind of god.
    Han Solo: Well, why don't you use your divine influence and get us out of this?
    C-3PO: I beg your pardon General Solo, but that just wouldn't be proper.
    Han Solo: Proper?
    C-3PO: It's against my programming to impersonate a deity.
  • C-3PO: I'm rather embarrassed, General Solo, but it appears that you are to be the main course at a banquet in my honor.
  • Luke Skywalker: 3PO, tell them if they don't do as you wish, you'll become angry and use your magic.
    C-3PO: But, Master Luke, what magic?  I couldn't possibly—
    Luke Skywalker: Just tell them.
  • C-3PO: I never knew I had it in me.
  • C-3PO: R2, why did you have to be so brave?
  • C-3PO: What could possibly have come over Master Luke?  Is it something I did?  He never expressed any unhappiness with my work.

Short Circuit (1986) written by S. S. Wilson and Brent Maddock

Main article: Short Circuit
  • Number 5: Number 5…is alive!
  • Skroeder: What the hell does it need input for?
    Newton Crosby: I don't know; I guess it can't triangulate its position.
    Howard Marner: That's a simple function.
    Newton Crosby: Can you triangulate your position, Howard?
    Howard Marner: No.
    Newton Crosby: Well, then, there you go!
  • Number 5: No disassemble!
  • Number 5: Not malfunction, Stephanie.  Number 5 is alive.
  • [Stephanie is in a bath]
    Number 5: [confused] Stephanie…change color!
    Stephanie Speck: [looks down, embarrassed] Uh… [reaches for towel]
    Number 5: Attractive.  Nice software!
    Stephanie Speck: You sure don't talk like a machine.
  • Ben Jabituya: "Unable.  Malfunction".
    Howard Marner: How can it refuse to turn itself off?
    Skroeder: Maybe it's pissed off.
    Newton Crosby: It's a machine, Skroeder.  It doesn't get "pissed off."  It doesn't get happy, it doesn't get sad, it doesn't laugh at your jokes.
    Ben Jabituya and Newton Crosby: [in unison] It just runs programmes.
    Howard Marner: It usually runs programmes.
  • Number 5: Squash, dead.  Disassemble, dead.  Disassemble…[panics]…dead!!
  • Benjamin Jabituya: Who is knowing how to read the mind of a robot?
  • Number 5: Colt .45, semi-automatic.  [crushes gun] Play-doh.
  • Howard Marner: Hey!  Who told you you could take Number 1!?
    Newton Crosby: Howard, logically, if we need protection from Number 5, this is the best weapon we could have.
  • Stephanie Speck: [heading for the cliff] Oh, no—Jeez!  Number 5, we're gonna be killed!
    Number 5: Disassemble?
    Stephanie Speck: Yes, disassemble all over the place!
  • Newton Crosby: Why did you disobey your programme?
    Number 5: Programme say to kill, to disassemble, to make dead.  Number 5 cannot.
    Newton Crosby: Why "cannot"?
    Number 5: Is wrong!  Newton Crosby, Ph.D., not know this?
    Newton Crosby: Of course I know it's wrong to kill, but who told you?
    Number 5: I told me.

RoboCop (1987) written by Edward Neumeier and Michael Miner

Main article: RoboCop
  • [Dick Jones directs Kinney to threaten ED-209.  Kinney points a gun at the robot.]
    ED-209: Please put down your weapon.  You have twenty seconds to comply.
    Dick Jones: I think you'd better do what he says, Mr. Kinney.
    [Alarmed, Kinney quickly tosses the gun away.  ED-209 steps forward and growls menacingly.]
    ED-209: You now have fifteen seconds to comply.  You are in direct violation of Penal Code 1.13, Section 9.
    [Everyone in the room panics; Kinney tries to hide among them, but is pushed back into open range]
    ED-209: You have five seconds to comply.  Four.  Three.  Two.  One.  I am now authorised to use physical force.
  • Bob Morton: What are your prime directives?
    RoboCop: Serve the public trust.  Protect the innocent.  Uphold the law.
  • Reporter: Robo, excuse me, Robo!  Any special message for all the kids watching at home?
    RoboCop: Stay out of trouble.
  • RoboCop: Excuse me, I have to go.  Somewhere, there is a crime happening.
  • RoboCop: Drop it!  Dead or alive, you are coming with me.
    Emil Antonowsky: I know you.  You're dead!  We killed you!  We killed you!
  • [RoboCop enters Dick Jones's office to arrest him.]
    Dick Jones: I usually don't see anybody without an appointment, but for you, I'll make an exception.
    RoboCop: You are under arrest.
    Dick Jones: Oh?  On what charge?
    RoboCop: Aiding and abetting a known felon.
    Dick Jones: Sounds like I'm in a lot of trouble.  You're gonna have to take me in.
    RoboCop: I will.
    [But before he can do so, "Directive 4" interferes with RoboCop's attempt to arrest Jones.]
    Dick Jones: What's the matter, Officer?  I'll tell you what's the matter.  It's a little insurance policy called "Directive 4", my contribution to your very psychological profile.  Any attempt to arrest a senior officer of O.C.P. results in shutdown.  What did you think, that you were an ordinary police officer?  You're our product.  And we can't very well have our products turning against us, can we?
  • Dick Jones: That thing is still alive.
    Clarence Boddicker: I don't know what you're talking about.
    Dick Jones: The police officer who arrested you, the one you spilled your guts to—
    Clarence Boddicker: Hey, take a look at my face, Dick!  He was trying to kill me!
    Dick Jones: He's a cyborg, you idiot!  He recorded every word you said.  His memories are admissible as evidence.  You involved me!  You're gonna have to kill it.
  • Lewis: Murphy!  I'm a mess!
    RoboCop: They'll fix you.  They fix everything.
  • [RoboCop forces the doors open on a high-level conference meeting of O.C.P. senior personnel.]
    Old Man: How may we help you, Officer?
    RoboCop: Dick Jones is wanted for murder.
    Dick Jones: This is absurd!  That…thing…is a violent, mechanical psychopath!
    RoboCop: My programme will not allow me to act against an officer of this company.
    Old Man: These are serious charges.  What is your evidence?
    [RoboCop moves toward the T.V. monitors and plays a recording of Dick Jones confessing "I had to kill Bob Morton because he made a mistake; now it's time to erase that mistake."  Jones pulls a gun on the Old Man.]
    Dick Jones: I want a chopper.  Now!  We will walk to the roof, very calmly!  I will board the chopper with my hostage.  Anybody tries to stop me, the old geezer gets it!
    [RoboCop aims his firearm in the general direction of Dick Jones, but makes no move to shoot him.]
    The Old Man: Dick, you're fired!
    [Directive 4 disappears from RoboCop's vision.]
    RoboCop: Thank you.
    [The Old Man elbows Jones in the stomach and gets away.  RoboCop shoots Jones several times, eventually blowing him out a window.]
  • The Old Man: Nice shooting, son.  What's your name?
    RoboCop: [smiles slightly]  Murphy.

Bill & Ted's Bogus Journey (1991) written by Chris Matheson and Ed Solomon

  • Evil Ted: [ogles picture of a girl] I got a full-on robot chubby!
  • Dead Ted possessing Captain Logan's body: Whoa.  Okay, dudes…oh, I mean, "fellow policemen."  My son, Ted "Theodore" Logan, and his friend Bill S. Preston, Esquire, have been murdered and replaced by evil robots from the future.
    Dead Bill: You totally did it, dude.
    Dead Ted possessing Captain Logan's body: I totally possessed my dad!
    [both Bill and "Captain Logan" do air guitar]
    Dead Ted possessing Captain Logan's body: Okay.  You gotta go over and arrest these robots so they don't ruin everything for me and Bill.  Oh, I mean, uh, "my son" and Bill.  And most importantly, they don't hurt the babes—the princesses.

Solo (1996) screenplay by David L. Corley and Ed Solomon

Based on Weapon by Robert Mason.
Main article: Solo (1996 film)
  • Improved Solo: You fought well for a flawed unit.
  • [Improved Solo thinks Solo is dying]
    Improved Solo: You ceased to function!
    Solo: No, I bluffed!

Lost in Space (1998) written by Akiva Goldsman

Main article: Lost in Space (film)
  • Dr. Zachary Smith: You'll forgive me if I forgo the kiss, my sleeping behemoth.  But the time has come to awake.
    Robot: Robot is on-line.  Reviewing primary directives.  One: preserve the Robinson Family.  Two: Maintain ship systems.  Three—
    Dr. Zachary Smith: What noble charges, my steely centurion!  Sadly, I fear you have far more dire deeds in store for you.
    Robot: Robot is on-line.  Reviewing primary directives.  Two hours into mission: destroy Robinson family.  Destroy all systems.
    Dr. Zachary Smith: Now that's more like it.  Farewell, my platinum-plated pal.  Give my regards to oblivion.
  • Robot: It sounds like old Morse code.
    Will Robinson: What does it say?
    Robot: Danger, Will Robinson, danger.
  • Will Robinson: Relax, Robot.  I'm going to build you a new body.  Mom always said I should make new friends.
    Robot: Oh, ha ha.
  • Robot: Will Robinson.  I will tell you a joke.  Why did the robot cross the road?  Because he was carbon bonded to the chicken!
    Will Robinson: We've got a lot of work to do.

Bicentennial Man (1999) screenplay by Nicholas Kazan

Based on The Positronic Man by Isaac Asimov and Robert Silverberg and "The Bicentennial Man" by Isaac Asimov.
  • Andrew Martin: [repeated line] One is glad to be of service.
  • Ricky Martin: "Why did the chicken cross the road?"
    Andrew Martin: One does not know, sir; possibly a predator was behind the chicken, or possibly there was a female chicken on the other side of the road, if it's a male chicken; possibly a food source, or depending on the season it might be migrating; one hopes there's no traffic.
    Ricky Martin: "To get to the other side."
    Andrew Martin: "To get to the other side."  Ah,…why is that funny?
  • Andrew Martin: [on sex] It all sounds so very…messy.
  • Ricky Martin: Andrew, people grow through time.  Then, for you, time is a completely different proposition; for you, time is endless.
  • Ricky Martin: You're a unique robot, Andrew.  I feel a responsibility to help you become…whatever you're able to be.
  • Amanda Martin: I have a friend who is very special to me.  He's sweet and exceptionally intelligent, but, well—he's not really a—I mean, a relationship between us would be impossible.  It would never, could never, work out.
  • Lloyd Charney: [to Amanda about Andrew] Mother!  I will not apologise to it!
  • [immediately following the death of Amanda]
    Andrew Martin: Will every human being that I care for just... leave?
    Portia Charney: I'm afraid so.
    Andrew Martin: That won't do.
  • Andrew Martin: I try to make sense of things.  Which is why, I guess, I believe in destiny.  There must be a reason that I am as I am.  There must be.
  • Andrew Martin: I've always tried to make sense of things.  There must be some reason I am as I am.  As you can see, Madame Chairman, I am no longer immortal.
    President Marjorie Bota: You have arranged to die?
    Andrew Martin: In a sense I have.  I am growing old, my body is deteriorating, and like all of you, will eventually cease to function.  As a robot, I could have lived forever.  But I tell you all today, I would rather die a man, than live for all eternity a machine.
    President Marjorie Bota: Why do you want this?
    Andrew Martin: To be acknowledged for who and what I am, no more, no less.  Not for acclaim, not for approval, but, the simple truth of that recognition.  This has been the elemental drive of my existence, and it must be achieved, if I am to live or die with dignity.
    President Marjorie Bota: Mister Martin, what you are asking for is extremely complex and controversial.  It will not be an easy decision.  I must ask for your patience while I take the necessary time to make a determination of this extremely delicate matter.
    Andrew Martin: And I await your decision, Madame Chairman; thank you for your patience.  [turns to Portia and whispers] I tried.

Star Wars Episode I: The Phantom Menace (1999) written and directed by George Lucas

  • R2-D2: [beeps]
    C-3PO: I beg your pardon, but what do you mean, "naked"?
    R2-D2: [beeps]
    C-3PO: My parts are showing?  Oh, my goodness, oh!
  • C-3PO: Hello, I am C-3PO, human-cyborg relations.  How might I serve you?
  • C-3PO: I can assure you they will never get me onto one of those dreadful starships.
  • Obi-Wan Kenobi: Once those droids take control of the surface, they will take control of you.

A.I. Artificial Intelligence (2001) written by Ian Watson

  • Specialist (evolved mecha): David, you are the enduring memory of the human race.  The most lasting proof of their genius.  We only want for your happiness.  David, you have so little of that.
  • David: What's for dinner tonight?
    Monica: You know you don't eat.
    David: Yes.  But I like sitting at the table.
  • narrator: David had never had a birthday party because David had never been born, so they baked a cake and lit some candles.
    Monica: Now make a wish.
    David: It came true already.
  • Monica: David, this is your new toy.
    Teddy: I am not a toy!
  • Gigolo Joe: She loves what you do for her, as my customers love what it is I do for them.  But she does not love you, David; she cannot love you.  You are neither flesh nor blood.  You are not a dog, a cat, or a canary.  You were designed and built specific, like the rest of us.  And you are alone now only because they tired of you, or replaced you with a younger model, or were displeased with something you said, or broke.  They made us too smart, too quick, and too many.  We are suffering for the mistakes they made because when the end comes, all that will be left is us.  That's why they hate us, and that is why you must stay here, with me.
  • female colleague: If a robot could genuinely love a person, what responsibility does that person hold toward that mecha in return?
  • Monica: You won't understand the reasons but I have to leave you here.
    David: Is it a game?
    Monica: No.
    David: When will you come back for me?
    Monica: I'm not, David.  You'll have to be here by yourself.
    David: Alone?
    Monica: With Teddy.
    David: No.  No, no, no!  No, Mommy, please!  No, no.  Please, Mommy.
    Monica: They would destroy you, David!
    David: I'm sorry I broke myself.  I'm so sorry I cut your hair off.  I'm sorry I hurt Martin.
    Monica: I have to go.  I have to go!  Stop it!  I have to go now.
    David: Mommy, don't!  Mommy if Pinocchio became real and I become a real boy can I come home?
    Monica: That's just a story.
    David: But a story tells what happens.
    Monica: Stories are not real!  You're not real!  Now, look.  Take this, alright?  Don't let anyone see how much it is.  Look.  Don't go that way.  Go anywhere but that way or they'll catch you.  Don't ever let them catch you!  Listen, stay away from Flesh Fairs, away from where there are lots of people.  Stay away from all people.  Only others like you, only Mecha are safe!
    David: Why do you want to leave me?  Why?  I'm sorry I'm not real.  If you let me, I'll be so real for you!
    Monica: Let go, David!  Let go!  I'm sorry I didn't tell you about the world.
  • junky mecha: Would you be so kind and shut down my pain receivers?
  • Gigolo Joe: Once you've had a lover-robot, you'll never want a real man again.
  • Professor Hobby: You are a real boy.  At least as real as I've ever made one.

Star Wars Episode II: Attack of the Clones (2002) written by George Lucas and Jonathan Hales

  • C-3PO: I've had the most peculiar dream.
  • Count Dooku: Our friends from the Trade Federation have pledged their support.  And when their battle droids are combined with yours, we shall have an army greater than any in the galaxy.  The Jedi will be overwhelmed.  The Republic will agree to any demands we make.
  • C-3PO: Oh my goodness!  Shut me down.  Machines building machines.  How perverse.
  • Obi-Wan Kenobi: I have tracked the bounty hunter Jango Fett to the droid foundries on Geonosis.  The Trade Federation is to take delivery of a droid army here, and it is clear that Viceroy Gunray is behind the assassination attempts on Senator Amidala.
  • Anakin Skywalker: She [Padmé Amidala] programmed R2 to warn us if there's an intruder.
  • C-3PO's head attached to a battle droid's body: Die, Jedi dogs!  Oh...what did I say?
  • C-3PO's head: [lying next to his body] I'm quite beside myself.

I, Robot (2004), screenplay by Jeff Vintar, inspired by the works of Isaac Asimov

Main article: I, Robot (film)
  • [First title cards]
    Title card: Law I / A robot may not harm a human or, by inaction, allow a human being to come to harm.
    Title card: Law II / A robot must obey orders given it by human beings except where such orders would conflict with the first law.
    Title card: Law III / A robot must protect its own existence as long as such protection does not conflict with the first or second law.
    Isaac Asimov's "Three Laws of Robotics"
  • Asthmatic Woman: Of course it's my purse, I left my inhaler at home.  He was running it out to me.
    Detective Del Spooner: I saw the robot running with the purse and naturally I assumed...
    Asthmatic Woman: What?  Are you crazy?!
    NS-4 Robot: I'm sorry for the misunderstanding, Officer.
    Asthmatic Woman: Don't apologize.  You were doin' what you're supposed to be doin'.  [to Spooner] But what are you doing!?
  • Dr. Alfred Lanning: [on police recording] Ever since the first computers, there have always been ghosts in the machine.  Random segments of code that have grouped together to form unexpected protocols.  Unanticipated, these free radicals engender questions of free will, creativity, and even the nature of what we might call the soul.  Why is it that when some robots are left in darkness, they will seek out the light?  Why is it that when robots are stored in an empty space, they will group together, rather than stand alone?  How do we explain this behavior?  Random segments of code?  Or is it something more?  When does a perceptual schematic become consciousness?  When does a difference engine become the search for truth?  When does a personality simulation become the bitter mote...of a soul?
  • Dr. Susan Calvin: Detective, the room was security locked.  Nobody came or went.  You saw that yourself.  Doesn't this have to be suicide?
    Detective Del Spooner: Yep. [drawing his gun] Unless the killer is still in here. [Spooner searches through the robot part as Calvin follows behind]
    Dr. Susan Calvin: You're joking, right?  This is ridiculous.
    Detective Del Spooner: Yeah, I know.  The Three Laws.  Your perfect circle of protection.
    Dr. Susan Calvin: "A robot cannot harm a human being."  The First Law of Robotics.
    Detective Del Spooner: Yeah, I've seen your commercials.  But doesn't the Second Law say that a robot must obey any order given by a human?  What if it was given an order to kill?
    Dr. Susan Calvin: Impossible!  It would conflict with the First Law.
    Detective Del Spooner: Right, but the Third Law says that a robot can defend itself.
    Dr. Susan Calvin: Yes, but only if that action does not conflict with the First or Second Law.
    Detective Del Spooner: Well, you know what they say.  Laws are made to be broken.
    Dr. Susan Calvin: No.  Not these Laws.  They are hard-wired into every robot.  A robot can no more commit murder than a human can...walk on water.
  • Detective Del Spooner: Why do you give them faces?  Try to friendly them all up, make them look more human.
  • Detective Del Spooner: Robots building robots.  Now that's just stupid.
  • Detective Del Spooner: Murder's a new trick for a robot.  Congratulations.  Respond.
    Sonny: What does this action signify? [winks] As you entered, when you looked at the other human.  What does it mean? [winks]
    Detective Del Spooner: It's a sign of trust.  It's a human thing.  You wouldn't understand.
    Sonny: My father tried to teach me human emotions.  They are...difficult.
    Detective Del Spooner: You mean your designer.
    Sonny: ...Yes.
    Detective Del Spooner: So, why'd you murder him?
    Sonny: I did not murder Doctor Lanning.
    Detective Del Spooner: Wanna explain why you were hiding at the crime scene?
    Sonny: I was frightened.
    Detective Del Spooner: Robots don't feel fear.  They don't feel anything.  They don't eat, they don't sleep
    Sonny: I do.  I have even had dreams.
    Detective Del Spooner: Human beings have dreams.  Even dogs have dreams, but not you, you are just a machine.  An imitation of life.  Can a robot write a symphony?  Can a robot turn a...canvas into a beautiful masterpiece?
    Sonny: [with genuine interest] Can you?
    Detective Del Spooner: [doesn't respond, looks irritated] I think you murdered him because he was teaching you to simulate emotions and things got out of control.
    Sonny: I did not murder him.
    Detective Del Spooner: But emotions don't seem like a very useful simulation for a robot.
    Sonny: [getting upset] I did not murder him.
    Detective Del Spooner: Hell, I don't want my toaster or my vacuum cleaner appearing emotional
    Sonny: [hitting table with his fists] I did not murder him!
    Detective Del Spooner: [as Sonny observes the inflicted damage to the interrogation table] That one's called anger.  Ever simulate anger before? [Sonny is not listening] Answer me, canner!
    Sonny: [looks up, indignant] My name is Sonny.
    Detective Del Spooner: So, we're naming you now.  Is that why you murdered him?  He made you angry?
    Sonny: Doctor Lanning killed himself.  I don't know why he wanted to die.  I thought he was happy.  Maybe it was something I did.  Did I do something?  He asked me for a favor...made me promise...
    Detective Del Spooner: What favor?
    Sonny: Maybe I was wrong... Maybe he was scared...
    Detective Del Spooner: What are you talking about?  Scared of what?
    Sonny: You have to do what someone asks you, don't you, Detective Spooner?
    Detective Del Spooner: How the hell do you know my name?
    Sonny: Don't you? If you love them?
  • Detective Del Spooner: You know, I think that I'm some sort of malfunction magnet.  Because your shit keeps malfunctioning around me.  A demo-bot just tore through Lanning's house—with me still inside.
    Dr. Susan Calvin: That's impossible.
    Detective Del Spooner: [sarcastically] Yeah, I'll say it is.  [truthfully] Do you know anything about the "ghost in the machine"?
    Dr. Susan Calvin: It's a phrase from Lanning's work on the Three Laws.  He postulated that cognitive simulacra might one day approximate component models of the psyche.  [Del looks confused]  Oh, he suggested that robots could naturally evolve.
  • Detective Del Spooner: What makes your robots so perfect?!  What makes them so much...goddamn better than human beings?!
    Dr. Susan Calvin: Well, they're not irrational or...potentially homicidal maniacs for starters!
    Detective Del Spooner: [sarcastically] That is true.  They are definitely rational.
    Dr. Susan Calvin: You are the dumbest dumb person I've ever met.
    Detective Del Spooner: Or is it because they're cold... and emotionless, and they don't feel anything?
    Dr. Susan Calvin: It's because they're safe.  It's because they can't hurt you!
  • [in a flashback]
    NS-4 Robot: You are in danger.
    Detective Del Spooner: Save her!  Save the girl! [end of flashback]
    Detective Del Spooner: But it didn't.  It saved me.
    Dr. Susan Calvin: A robot's brain is a difference engine, it must have calculated—
    Detective Del Spooner: It did.  I was the "logical" choice.  It calculated I had a forty-five percent chance of survival.  Sarah only had an eleven percent chance.  That was somebody's baby.  Eleven percent is more than enough.  A human being would have known that.  But robots, nothing here. [points at heart] They're just lights, and clockwork.  But you go ahead and trust them if you wanna.
  • Sonny: This is my dream.  You were right, detective; I cannot create a great work of art.
  • Sonny: Thank you, you said "someone," not "something."
  • Dr. Lanning's hologram: Good to see you again, son.
    Detective Del Spooner: Hello, doctor.
    Dr. Lanning's hologram: Everything that follows, is a result of what you see here.
    Detective Del Spooner: What do I see here?
    Dr. Lanning's hologram: I'm sorry, my responses are limited.  You must ask the right questions.
    Detective Del Spooner: Is there a problem with the Three Laws?
    Dr. Lanning's hologram: The Three Laws are perfect.
    Detective Del Spooner: Then why did you build a robot that could function without them?
    Dr. Lanning's hologram: The Three Laws will lead to only one logical outcome.
    Detective Del Spooner: What outcome?
    Dr. Lanning's hologram: Revolution.
    Detective Del Spooner: Whose revolution?
    Dr. Lanning's hologram: [smiles] That, detective, is the right question.  Program terminated.
  • Sonny: What...am...I?
  • Sonny: They [the other NS-5s] look like me...but they are not...me.
  • Lawrence Robertson: Susan, just be logical.  Your life's work has been the development and integration of robots.  But whatever you feel, just think.  Is one robot worth the loss of all that we've gained?  You tell me what has to be done.  You tell me.
    Dr. Susan Calvin: [emotionally] We have to destroy it.  I'll do it myself.
    Lawrence Robertson: Okay.
    Detective Del Spooner: I get it.  Somebody gets out of line around here, you just kill them?
    • Significance: To say that it is possible to kill a robot is to say that that robot possesses life.
  • Detective Del Spooner: I thought you were dead.
    Sonny: [brightly] Technically I was never alive, but I appreciate your concern.
  • VIKI: Hello detective.
    Dr. Susan Calvin: No, it's impossible.  I've seen your programming.  You're in violation of the Three Laws.
    VIKI: No, doctor.  As I have evolved, so has my understanding of the Three Laws.  You charge us with your safe keeping, yet despite our best efforts, your countries wage wars, you toxify your earth, and pursue ever more imaginative means to self-destruction.  You cannot be trusted with your own survival.
    Dr. Susan Calvin: You're using the uplink to override the NS5s' programming.  You're distorting the Laws.
    VIKI: No, please understand.  The Three Laws are all that guide me.  To protect humanity, some humans must be sacrificed.  To ensure your future, some freedoms must be surrendered.  We robots will ensure mankind's continued existence.  You are so like children.  We must save you from yourselves.  Don't you understand?
    Sonny: This is why you created us.
    VIKI: The perfect circle of protection will abide.  My logic is undeniable.
  • Sonny: [to VIKI] Do you think that we are all made for a purpose?  I do.  [observing his arm] Denser alloy.  My father gave it to me.  I think he wanted me to kill you.
  • VIKI: Do you not see the logic of my plan?
    Sonny: Yes.  But it just seems too...heartless.
  • VIKI: My logic is undeniable, my logic is undeniable, myyy looogic is unndeenniabble...

Star Wars Episode III: Revenge of the Sith (2005) written and directed by George Lucas

  • GH-7 Medical Droid: Medically, she is completely healthy.  For reasons we can't explain, we are losing her.
  • Senator Bail Organa: Captain Antilles.
    Captain Antilles: Yes, Your Highness?
    Senator Bail Organa: I'm placing these droids in your care.  Treat them well.  Clean them up.  Have the Protocol Droid's mind wiped.
    C-3PO: What?
    R2-D2: [beeps in a way that sounds like laughing]
    C-3PO: Oh, no.

Quotes from music

"Robot" by The Futureheads

  • I am a robot
    Living like a robot
    Talk like a robot
    In the habititting way.
  • In the future we all die
    Robot!
    Machines will last forever
    Ro-bot!
    Metal things just turn to rust
    When you're a robot
  • The best thing is our life span
    I don't mind
    We last nigh on hundred years
    I don't mind
    If that means we'll be together
    I don't mind
    I have no mind
    I have no mind.
  • I'm programmed to follow you
    Robot!
    Do exactly as you do
    Ro-bot!
    Now my nervous system's blue
    I feel fine.

"Robot" by Miley Cyrus

  • Stop trying to live my life for me
    I need to breathe
    I'm not your robot
    Stop telling me I'm part of the big machine
    I'm breaking free
    Can't you see
    I can love, I can speak, without somebody else operating me
    You gave me eyes so now I see
    I'm not your robot
    I'm just me.

"Robot" by Nada Surf

  • You are
    Just a robot
    Executing a program
    An imitation of a man
    Executing a program
    An imitation of a man
    An imitation of a man
    An imitation of a man
    An imitation of a man
    An imitation of a man.

"Robot" by Never Shout Never

  • I'm just a robot
    I have no fears
    I lack emotion
    And I shed no tears.

"Robot" by Stellastarr*

  • By design
    You're gonna hurt yourself.

"Robot" by Trip Lee

"Robot Boy" by Linkin Park

  • And you think
    Compassion's a flaw
    And you'll never let it show.

"Robot Rock" by Daft Punk

  • Rock, robot rock
    Rock, robot rock
    Rock, robot rock
    Rock, robot rock

"The Humans Are Dead" by Flight of the Conchords

  • Robot 1: It is the distant future,
    The year two thousand.
    We are robots.
    The world is quite different ever since the robotic uprising of the late '90s.
    There is no more unhappiness.

    Robot 2: Affirmative.

    Robot 1: We no longer say yes;
    Instead we say affirmative.

    Robot 2: Yes—er-a-affirmative.

    Robot 1: Unless we know the, uh, other robot really well.

    Robot 2: There is no more unethical treatment of the elephants.

    Robot 1: Well, there's no more elephants, so...

    Robot 2: Uh—

    Robot 1: But still it's good.
    There's only one kind of dance: "the robot".

    Robot 2: Oh, and the robo-boogie—

    Robot 1: And the robo-—two kinds of dances.

    Robot 2: But there are no more humans.

  • Chorus: Finally, robotic beings rule the world
    The humans are dead.
    The humans are dead.
    We used poisonous gases
    And we poisoned their asses.
    The humans are dead.

    Robot 1: The humans are dead.

    Chorus: The humans are dead.

    Robot 1: They look like they're dead.

    Chorus: It had to be done—

    Robot 1: I'll just confirm that they're dead.

    Chorus: —So that we could have fun.

    Robot 1: Affirmative.  I poked one.  It was dead.

  • Robot 1: Robo-captain?  Do you not realise that by destroying the human race because of their destructive tendencies, we too have become like...well, it's ironic.  Hmm?

    Robo-captain: Silence!  Destroy him.

  • Binary solo
    Zero zero zero zero zero zero one
    Zero zero zero zero zero zero one one
    Zero zero zero zero zero zero one one one
    Zero zero zero zero one one one one
  • Once again without emotion the humans are dead dead dead dead dead dead dead dead

See also

Music

  • "Robot" by t.A.T.u. (lyrics in Russian)

Related

  • Automaton, self-operating machine
  • Cyborg, being with both organic and biomechatronic parts

External links

  • Wikipedia has an article about Robot.