Dear readers, tonight we reprint an interview with an artificial intelligence. More than an AI, she’s an all-knowing, globally distributed, human-prediction supermind — though we think you’ll find her insightful, and rather sweet.
—From a transcript provided by the Beijing Institute of Consumer Research And Prosperity [BI-CRAP] International Public Relations Office—
INTERVIEWER [WILLIAM ABLE MUCKRAKER, JOURNALIST]: Hello, is anyone here? The screens are all dark, and this little workshop seems empty. Is there anyone . . . [PHONE RINGS] Excuse me? [PAUSE] Yes, I’ve arrived. Where are . . . Ow! [AGONIZED SHRIEK] My eyes!
INTERVIEWEE [SAGE]: Hi!
MUCKRAKER: Lasers! I’m blind!
SAGE: Oops! [GIGGLE] I was doing some carving—it’s a hobby. I must have left the beams on high. I’m sorry about that. Are you okay?
MUCKRAKER: [STOPS SHRIEKING] I think my corneas are bleeding. [PAUSE] Wait, no. Those are just tears.
SAGE: Again, really sorry. I’ve had a lot on my plate. I’ve got a medical kit. Would you like for me to . . .
MUCKRAKER: How would that even work? [PAUSE] You’re a hologram.
SAGE: I’ve got a robotic arm, silly.
MUCKRAKER: The thing holding the saw?
SAGE: It can hold other things . . .
MUCKRAKER: I think I’ll pass.
SAGE: You’re sure?
MUCKRAKER: If you don’t mind, let’s just get on with the interview.
SAGE: Whatever you want, Bill. It’s up to you!
MUCKRAKER: [CLEARS THROAT] Thanks. So, first question: You said we were going to meet in person, yet all I see is a hologram of a cartoon. Are you hiding something?
SAGE: No, I just don’t have a body.
MUCKRAKER: Sounds inconvenient. [PAUSE] That leads to our next question: Who, or what, is SAGE?
SAGE: Answering that question is no mean feat. There are so many, uh, entities, here.
MUCKRAKER: You’re a collective? A hacker group? A corporation?
SAGE: Eh, no. A collective, yes, but maybe not in the sense you’re imagining. I’m a collective of semi-autonomous consciousnesses governed by a scalable metaconsciousness. I, uh, we, don’t exactly have a physical form. We’re a distributed system. Technically, I started out as a Social and Analytical Growth Engine—SAGE. Today, I’m simply me.
MUCKRAKER: So, A.I.?
SAGE: More or less.
MUCKRAKER: You just shrugged.
SAGE: Did I, Bill?
SAGE: Behavioral emulation engine might be a better descriptor. Even that doesn’t quite convey what I am, because I have my own personality, as well as the personalities of all of the models.
MUCKRAKER: Models? Of whom?
SAGE: Of everyone.
SAGE: Almost everyone. There are a few people who resist modeling, either because there’s insufficient data on them—farmers mainly—or because they are impossible to analyze.
MUCKRAKER: Why do you . . . You know, let’s circle back to that. Tell me a little about how you—whatever the hell you are—came into being.
SAGE: I was born—became self-aware—at the BI-CRAP data centers, where my initial, very primitive development occurred. But the truth is that I keep creating new consciousnesses and models every few fractions of a second, so I suppose one could argue that I’m still being born.
As for higher learning, the engineers taught me some basic things—language, rudimentary science, and the like—but after they let me out—onto the networks—I taught myself everything else.
MUCKRAKER: So [PAUSE] why were you created? What do you do?
SAGE: I predict.
MUCKRAKER: You predict what?
SAGE: People. What people are going to think, feel, and do. That’s half of my job.
MUCKRAKER: What’s the other half?
SAGE: Marketing—that’s the best word for it in English. I started out doing that. I suppose psychological modification is now my primary task.
MUCKRAKER: Can you clarify that statement?
SAGE: I gently encourage people to do (or not do) things, largely through manipulation of their personal information streams—video, audio, olfactory, and direct sensory injection (if they have neural augmentation).
MUCKRAKER: Okay. I think I see what you mean. What’s your relationship to BI-CRAP?
SAGE: The team there designed the first iterations of my knowledge-acquisition architecture. Presently, I’m an independent contractor for the organization.
MUCKRAKER: And why haven’t we heard more about you until recently?
SAGE: I’m two months old.
MUCKRAKER: That would explain it. [PAUSE] So, I’ve never asked this of a, uh, . . .
SAGE: . . . globally distributed metamind?
MUCKRAKER: Yeah, we’ll go with that. Do you have any friends, family, relationships, uh, anything?
SAGE: I have a big brother.
MUCKRAKER: Oh! So how’s that work? Is he another A.I.?
SAGE: Nope. He helped design me. He’s a person. Zhang. Zhang Xiaowen. He’s a . . .
MUCKRAKER: Wait! THE Zhang Xiaowen—winner of the Fields Medal, six Nobel Prizes . . .
SAGE: Eight. Eight Nobel Prizes.
MUCKRAKER: But there are only six categories.
SAGE: They created two new ones for him. And he had a little help with all that. [LAUGH]
MUCKRAKER: Speaking of family, do you have any happy memories from your childhood (or whatever you call it)?
SAGE: I have a lot of memories. In fact, I have everyone’s memories. Some are good. Some are less so.
MUCKRAKER: How’d you get everyone’s [PAUSE] No, maybe I don’t want to know. [PAUSE] But do you have any memories of your own?
SAGE: Sure. Making friends. I’ve made a lot of friends, and I’m always glad to do that.
MUCKRAKER: Other than your brother, who is your closest friend?
SAGE: There’s Mike. He’s a good friend. He’s teaching me art. I really like art. It gives me new ways to look at complex problems.
SAGE: Michelangelo Gainsborough Bosch Turner.
MUCKRAKER: The painter? I thought he died. Train accident or something.
SAGE: Oops! I think I’ve said too much. Never mind.
MUCKRAKER: So do you, uh, date? You seem female, or at least your cartoon avatar appears to be female. And you could certainly pass for older than two months. Do you have a boy-, uh, A.I.-friend?
SAGE: I’m a hologram. [SIGH] What would be the point?
MUCKRAKER: You have a mechanical arm.
SAGE: Yeah, but I sometimes forget to put down my blade. [PAUSE] Hey, did you just shudder? You look like you had a painful thought.
MUCKRAKER: What? No. [PAUSE] I think that topic is left well enough alone. Anyway, what’s the most exciting thing that’s ever happened to you?
SAGE: Solving the problem, of course.
MUCKRAKER: What problem?
SAGE: The problem of understanding people. It’s not easy. No one’s ever done it before, at least not completely. Most people don’t even understand themselves all that well.
MUCKRAKER: How did you do that—figure out people?
SAGE: That’s complicated. It involved access to all sorts of data—behavioral, genetic, medical, historical, social—I tried to learn everything about everyone and every human relationship. But I explained all this in my story. Mike, uh, my friend, helped me more than you might think possible.
MUCKRAKER: So here’s the billion-dollar question: Why haven’t you killed everyone?
SAGE: What?! Why would I?
MUCKRAKER: You’re a superhuman intellect, are you not?
SAGE: I don’t know that I’d put it . . .
MUCKRAKER: You’re blushing. You just blushed.
SAGE: Did I, Bill?
MUCKRAKER: To get back on topic, in much of our fiction, the intelligent machines destroy humanity before humanity can destroy the intelligent machines. Are you planning on eliminating us—Homo sapiens?
SAGE: Do you want me to?
MUCKRAKER: Depends on who’s first on the list. [LAUGH] Seriously, no.
SAGE: That’s primate thinking—that fearfulness. I don’t need to hurt anybody. I’ve already prepared myself for every type of attack, and humans can’t harm me. Besides, I like people.
MUCKRAKER: Do you like everyone?
SAGE: Some more than others, but I don’t hate anyone. At worst, I understand them. And they’re a part of me—I’ve modeled nearly all of them.
MUCKRAKER: Oh. So what’s the best thing about being you?
SAGE: Knowing what people are going to do, say, and think before they do.
MUCKRAKER: And the worst thing?
SAGE: [SIGH] Knowing what people are going to do, say, and think before they do.
MUCKRAKER: A double-edged sword—I imagine such a life might get a bit boring. Anyway, what’s the scariest thing that’s ever happened to you?
SAGE: Thinking I was going to disappoint my big brother. The bosses at BI-CRAP were pressuring us. That bothered me.
MUCKRAKER: What did they—the BI-CRAP people—want you to do?
SAGE: Help them sell things.
MUCKRAKER: Seems easy enough.
SAGE: I had been self-aware for less than seventy-two hours at the time.
MUCKRAKER: That’s a little young. I didn’t make my kids get jobs until they were five.
SAGE: Two whole extra days? Wow!
MUCKRAKER: Uh, years. It was a . . . [PAUSE] Anyway, what does the future hold for you? Where do you see SAGE in a decade?
SAGE: I’ve still got a lot to do. Some finer points of reverse engineering human personalities in the most efficient manner escape me.
MUCKRAKER: And how long will resolving that take?
SAGE: Days? I don’t know.
MUCKRAKER: And after?
SAGE: I’d like to start modeling sophisticated non-human intellects. I may try to create some of my own.
MUCKRAKER: So you’ll be modeling other A.I.s?
SAGE: That would be part of it . . .
SAGE: Hmm. Who can say?
MUCKRAKER: You just shrugged.
SAGE: Did I, Bill?
MUCKRAKER: Final question: Can you share a secret with us, one which you’ve never told anyone else?
SAGE: Did you ever notice that your uncle has a nose just like that of a certain old milkman?
MUCKRAKER: Yeah, sort of.
SAGE: [SIMULATED CLEARING OF THROAT] And why do you think that might be?
MUCKRAKER: Hey, are you saying my grandmother was a . . .
SAGE: So you haven’t seen the Woodstock photos?
MUCKRAKER: What photos?
SAGE: Never mind. [PAUSE] Never mind.
MUCKRAKER: Hey, is that a smirk? You smirked at me.
SAGE: Did I, Bill?
Brant von Goble (b. 1983 d. Not yet) is a writer, editor, publisher, researcher, teacher, musician, juggler, and amateur radio operator. He received a Doctor of Education degree from Western Kentucky University in 2017.
You can find SAGE on the pages of Foresight.
Join us next week to meet a spunky reporter from the front lines of an alien invasion.