A small tool to view real-world ActivityPub objects as JSON! Enter a URL or username from Mastodon or a similar service below, and we'll send a request with the right Accept header to the server to view the underlying object.
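As a rough illustration, the fetch this tool performs can be sketched with the Python standard library: a plain GET request whose Accept header asks the server for the ActivityPub (Activity Streams 2.0 JSON-LD) representation instead of the HTML page. The example URL is hypothetical; any ActivityPub object URL works.

```python
# Minimal sketch of an ActivityPub object fetch, standard library only.
import json
import urllib.request

# Media type requested by ActivityPub clients (per the ActivityPub spec);
# many servers also accept the shorter "application/activity+json".
ACTIVITYPUB_ACCEPT = (
    'application/ld+json; profile="https://www.w3.org/ns/activitystreams"'
)


def build_request(url: str) -> urllib.request.Request:
    """Build a GET request asking for the ActivityPub representation
    of the resource rather than its HTML page."""
    return urllib.request.Request(url, headers={"Accept": ACTIVITYPUB_ACCEPT})


def fetch_object(url: str) -> dict:
    """Fetch and parse the ActivityPub object behind `url`."""
    with urllib.request.urlopen(build_request(url)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Hypothetical outbox URL for illustration only.
    obj = fetch_object("https://example.social/users/alice/outbox")
    print(obj.get("type"))
```

Usernames like `user@example.social` are first resolved to an actor URL via WebFinger (`/.well-known/webfinger?resource=acct:...`) before a request like the one above is made; that step is omitted here for brevity.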
{
"@context": "https://www.w3.org/ns/activitystreams",
"type": "OrderedCollectionPage",
"orderedItems": [
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1107012008654270464",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "<a href=\"https://www.facebook.com/groups/DigitalDiamond/\" target=\"_blank\">https://www.facebook.com/groups/DigitalDiamond/</a><br />",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/1107012008654270464",
"published": "2020-05-12T18:30:30+00:00",
"source": {
"content": "https://www.facebook.com/groups/DigitalDiamond/\n",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1107012008654270464/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1052873059389300736",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "APORIA EPISODE FIVE<br />8.<br /><br />I’m waiting for Ben in my study. I think he’ll enjoy my retro toys. The collection of Rubik’s cubes and other related twisty puzzles, the genuine 1970’s Atari console, my vinyl collection of early electronic music – Yello, Kraftwerk, Yazoo. We’re about the same age but I’ve no idea if one of the world’s most famous transhumanists shares my sense of nostalgia. Also ,no idea if he’ll like my genetically enhanced hydroponic Durban Poison - but no way I can figure out what to do without some psychoactive help. There are just too many variables. Bill told me that Ben was going to resign, said he felt like Moses being kept from entering the holy land. Probably feels worse though – because we’ve all grown so used to the benefits of the Singularity. Oh, there’s the doorbell – must be him now.<br />“Ben, welcome to my home!”<br />“Nice place you’ve got – I like the view.”<br />“Thanks – I love the ocean but it’s a bit chilly tonight. I thought we could chat in the study.”<br />“Ok sure – lead the way.”<br /><br />Damn this boy’s not shy with his money. I don’t think I’ve ever been in such a luxurious home. Lee and I are living in a large comfortable apartment in Azabu. They call it a penthouse but after seeing this I think I’ll just call it an apartment.<br /><br /><br />“So … I heard from Bill that you’ll resign.”<br />“Well I have to – wish I didn’t but I’m not prepared to install the apps.”<br />“I don’t blame you – neither am I. What the bloody hell are we going to do?<br />“I don’t have a very clear plan for the way forward I’m afraid.”<br />“Ok fair enough but you’ve thought of something?”<br />“Something, yes.”<br />“would you like to try some of my Durban Poison.”<br />“I wasn’t planning on taking poison.”<br />“No, it’s super strong hydroponic grass.”<br />“Ahh … that’s another thing that command central is going to try to stop. 
Use of mind-altering substances other than the ones it prescribes for health reasons.”<br />“I only smoke this for health reasons. Any enjoyment is just a pesky side effect.”<br />“Well bring on the pesky side effects then.”<br /><br />He opened a drawer at his desk and took out a genuine 2020 puffco e-bong and filled the bowl with some very healthy-looking buds. He also went to his record collection and took out Yello’s 1980 Solid Pleasure and put it on a Technics SL1500-C. I have to admit I hadn’t been looking forward to this meeting but – well it looks like I may have had the wrong impression about him. He seems like a very rich, very smart kid. I like that in people – a sense of innocence, adventure, fresh thinking, energy. Lee always says I’m just an overgrown kid.<br /><br />“So, let’s hear it Mr. Singularity.”<br />“Hmmm Bill called me that – do many people call me that.”<br />“No idea – perhaps he stole it from me. Does it offend you.”<br />“It’s ok. Would have thought that’d be a better name for Kurzweil though. He did far more to popularize the singularity than I ever did.”<br />“Yes, but you’re the one who actually brought it about.”<br />“Wow this weed is good. Hope we can still talk sense.”<br />“Ah it doesn’t matter – sense is overrated. We need to talk something a bit more creative than sense.”<br />“OK so here goes. I want a redo.”<br />“I’m sure you do – but how?”<br />“Well I still have all the original source code. We’d need access to quantum computing though. Currently that’s all done on the cloud.”<br />“OK so if we get that then what. We’d have a new net populated by outcasts from the old one. But how would the intelligence itself be different. The logic of what command central is doing is pretty airtight. I mean it is all in our best interests even if you and I don’t want it.”<br />“This is where things get a bit hazy. I want to seed it with a more human centred way of thinking and also instantiate it in a human body. 
I can’t guarantee this will lead to a better outcome but I suspect it will.”<br />“And how will you do that?”<br /><br />This is only the second time I’ll be explaining this to someone – it’s still really hard for me to structure the message. Wish I could just give them a snapshot of my brain but that’d contain too many other private things. Also, Lee hasn’t agreed but we could use someone else – I don’t suppose it’d matter who. Or maybe that’s the only thing that would matter? <br /><br />“I want to enable the birth of the first post human. Who would become the brains of a new singularity.net. I’m hoping you’ll help. Her name will be Eve – you’d be her godfather.”<br />“Oh shit – the weed’s eaten your brain. Are you serious?”<br />“Yes Elton, I think that I am.”<br />“But wouldn’t AI central stop us?”<br />“I’m not sure. We’d have to keep things very private. I trust you. We’d only tell an absolute minimum of people until the whole thing was well under way.”<br />“And what would you need from me.”<br />“Resources, brains, infrastructure, money – that kind of thing.”<br />“How will we be able to use money if we’re off the net?”<br />“I haven’t thought that bit through yet – we’d work on that together.”<br />“And where would we do this.”<br />“Africa – easier to be off the radar there – anywhere in the first world and AI central would find out and stop us far too easily.”<br />“OK but where in Africa.”<br />“I checked. Bill has some farms in the Northern Cape close to the largest solar farm in the southern hemisphere.”<br />“Ah yes I know about it – I know the developer well. And what about quantum computing?”<br />“The University of Witwatersrand has a pretty up to date IBM machine. We’d want to have it moved closer to us though.”<br />“Ok let me talk to Bill and give the whole thing more thought.”<br />“I’d want to use your neural lace platform too.”<br />“For what?”<br />“For Eve – but we’d have to build its design right into her from the start. 
Sophia will help with that.”<br />“I thought Sophia belonged to AI central.”<br />“I’ve bifurcated her. The new Sophia will only ever operate in private mode.”<br />“Truth is truly stranger than fiction. Give me a few days. I’ll let you know.”<br />“Thanks. And thanks for your hospitality.”<br />“It’s a pleasure. Do you need a lift? I’ve got a transporter already charged.”<br />“No thanks. I rode here on my Triumph.”<br />“Nice bike – was going to offer to make an electric version for them before all this came up.”<br />“Cheers Elton.”<br />“Bye Ben.”<br /><br /><br />Author’s Note Cont.<br /><br />Well this is embarrassing. I have been told that there is an unbelievable character in the book. Not unbelievable in the good sense of the word. Rather a character who seems to not be whole or not reliable. So, who is it? Ben, Lee, Elton, Sophia? Well it turns out that the character with this problem is yours truly Mr. Fancypants Author. <br />At first, I felt somewhat hurt and defensive and said “no problem I can just delete the second author’s note and not do any more” but the critic said “no don’t do that, just rewrite it so that the second note reads as well as the first one.” I didn’t like the sound of that – I’m not one for rewriting. <br />So, I was feeling quite down and didn’t really know what to do. But then I thought wait a bit that can’t be right. I was being myself in part one and still being myself in part two. Am still being myself now. Does that mean I am not a whole and reliable person? You know perhaps it does mean that but maybe I’m just moody. Anyway, as you can see, I’m still here. I didn’t rewrite anything or delete anything and you’ll just have to put up with my occasional appearances.<br /><br />“Who are you talking too?”<br />To anyone. Everyone. No one perhaps.<br />“I’m afraid I agree with your critic. 
I just don’t see the point of these interruptions.”<br />There is no point – I just sometimes need to talk through things related to the book or just how I’m feeling.<br />“Ok but why do it here?”<br />Well I don’t just do it here. I discuss it with my mother and my analyst.<br />“But why do it here at all. Aren’t you afraid of ruining the book?”<br />Perhaps I am a little but it’s my book after all so why not.<br />“What do you mean it’s your book? I paid good money for it, it’s quite clearly my book!”<br />But you want me to be honest, don’t you? Want me to feel free to be creative.<br />“Not at all. I just want an entertaining read.”<br />Ok – point taken. Should we wrap this up then?<br />“Consider it wrapped.”<br /><br />9.<br />“So, how did it go?”<br />“Interesting. Better than expected. I actually like him.”<br />“Well, tell me all about it.”<br />“He stays in a mansion, likes 80’s music and games and has some very strong weed.”<br />“Oh Ben, you didn’t!”<br />“Why not. It’s still legal. But if AI central has its way it won’t be for long.”<br />“That’s not what I care about. I care about your brain. You do know it can cause schizophrenia?”<br />“Only if you have certain genes predisposing you to it. I don’t. I checked.”<br />“Ok so the two worlds smartest men got stoned and listened to 80’s music. Sort of like those end of the world parties in 2000?”<br /><br />I don’t know if it’s a cultural thing or just a Lee thing but she often slips into the role of a scolding mother. I do love her but its very irritating. I have to watch myself so as not to fall into the role of petulant child. And I still have to discuss Eve with her – preferably now. Actually, would we really want to seed our new net with this scolding mother shtick? It’s not like having just another child – I made this machine hell and now I have to make a messiah to save us from it. 
I can’t afford to get this wrong.<br /><br />“Lee can we get serious for a bit?”<br />“I was being serious.”<br />“I mean serious about the AI central problem.”<br />“Ok – so did the two of you come up with anything?”<br />“I told him about the New Origin idea – I didn’t use the words but told him about Eve and starting a new net that will be seeded with more human centric values.”<br />“And?”<br />“He’s going to talk to Bill about the Africa bit and also give the whole idea more thought. I told him we’d need to use his neural lace – I think he likes the idea. He was actually spot on when he was warning everyone about this possibility years ago. He was gracious enough not to say I told you so.”<br />“Ben I’m afraid you’re not going to like what I’m about to say.”<br />“Oh dear. Well I guess you better say it.”<br />“I don’t want to have anything biologically to do with Eve. I’m happy to help with the emotional cognitive calibration – you know that’s my area of expertise. But leave my genes alone.”<br />“Is this in any way negotiable. Should I even try to change your mind.”<br />“No point really. I’ve decided.”<br /><br />Damn. Ok she’s decided. Now what? I had lined up several airtight arguments but I can see nothing will sway her. Well it doesn’t have to be her. But who? I guess my new Sophia has enough information about femaleness to be the mother. I could be the father. Wait a bit. Why can’t Elton and I both be her father. Sophia could do something like a genetic remix – and put in a female orientation as well. Should I even tell Lee my thoughts? She seems to be in a bad mood anyway. Perhaps I should talk to Elton and Sophia first to see if this idea is even do-able. Elton might also not agree – but somehow I think he will. It’d be strange having a child with another man and a robot but a somehow fitting history for the first post-human perhaps.<br /><br />“Ok Lee.”<br />“What no mathematically precise argumentation?”<br />“That doesn’t work with you. 
You’re too human.”<br />“Should I apologise?”<br />“Don’t be silly Lee – I married a human knowing full well the dangers.”<br /><br /><br /><br /><br /><br /><br /><br /> <br /><br /><br /><br /><br />",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/1052873059389300736",
"published": "2019-12-15T09:01:38+00:00",
"source": {
"content": "APORIA EPISODE FIVE\n8.\n\nI’m waiting for Ben in my study. I think he’ll enjoy my retro toys. The collection of Rubik’s cubes and other related twisty puzzles, the genuine 1970’s Atari console, my vinyl collection of early electronic music – Yello, Kraftwerk, Yazoo. We’re about the same age but I’ve no idea if one of the world’s most famous transhumanists shares my sense of nostalgia. Also ,no idea if he’ll like my genetically enhanced hydroponic Durban Poison - but no way I can figure out what to do without some psychoactive help. There are just too many variables. Bill told me that Ben was going to resign, said he felt like Moses being kept from entering the holy land. Probably feels worse though – because we’ve all grown so used to the benefits of the Singularity. Oh, there’s the doorbell – must be him now.\n“Ben, welcome to my home!”\n“Nice place you’ve got – I like the view.”\n“Thanks – I love the ocean but it’s a bit chilly tonight. I thought we could chat in the study.”\n“Ok sure – lead the way.”\n\nDamn this boy’s not shy with his money. I don’t think I’ve ever been in such a luxurious home. Lee and I are living in a large comfortable apartment in Azabu. They call it a penthouse but after seeing this I think I’ll just call it an apartment.\n\n\n“So … I heard from Bill that you’ll resign.”\n“Well I have to – wish I didn’t but I’m not prepared to install the apps.”\n“I don’t blame you – neither am I. What the bloody hell are we going to do?\n“I don’t have a very clear plan for the way forward I’m afraid.”\n“Ok fair enough but you’ve thought of something?”\n“Something, yes.”\n“would you like to try some of my Durban Poison.”\n“I wasn’t planning on taking poison.”\n“No, it’s super strong hydroponic grass.”\n“Ahh … that’s another thing that command central is going to try to stop. Use of mind-altering substances other than the ones it prescribes for health reasons.”\n“I only smoke this for health reasons. 
Any enjoyment is just a pesky side effect.”\n“Well bring on the pesky side effects then.”\n\nHe opened a drawer at his desk and took out a genuine 2020 puffco e-bong and filled the bowl with some very healthy-looking buds. He also went to his record collection and took out Yello’s 1980 Solid Pleasure and put it on a Technics SL1500-C. I have to admit I hadn’t been looking forward to this meeting but – well it looks like I may have had the wrong impression about him. He seems like a very rich, very smart kid. I like that in people – a sense of innocence, adventure, fresh thinking, energy. Lee always says I’m just an overgrown kid.\n\n“So, let’s hear it Mr. Singularity.”\n“Hmmm Bill called me that – do many people call me that.”\n“No idea – perhaps he stole it from me. Does it offend you.”\n“It’s ok. Would have thought that’d be a better name for Kurzweil though. He did far more to popularize the singularity than I ever did.”\n“Yes, but you’re the one who actually brought it about.”\n“Wow this weed is good. Hope we can still talk sense.”\n“Ah it doesn’t matter – sense is overrated. We need to talk something a bit more creative than sense.”\n“OK so here goes. I want a redo.”\n“I’m sure you do – but how?”\n“Well I still have all the original source code. We’d need access to quantum computing though. Currently that’s all done on the cloud.”\n“OK so if we get that then what. We’d have a new net populated by outcasts from the old one. But how would the intelligence itself be different. The logic of what command central is doing is pretty airtight. I mean it is all in our best interests even if you and I don’t want it.”\n“This is where things get a bit hazy. I want to seed it with a more human centred way of thinking and also instantiate it in a human body. 
I can’t guarantee this will lead to a better outcome but I suspect it will.”\n“And how will you do that?”\n\nThis is only the second time I’ll be explaining this to someone – it’s still really hard for me to structure the message. Wish I could just give them a snapshot of my brain but that’d contain too many other private things. Also, Lee hasn’t agreed but we could use someone else – I don’t suppose it’d matter who. Or maybe that’s the only thing that would matter? \n\n“I want to enable the birth of the first post human. Who would become the brains of a new singularity.net. I’m hoping you’ll help. Her name will be Eve – you’d be her godfather.”\n“Oh shit – the weed’s eaten your brain. Are you serious?”\n“Yes Elton, I think that I am.”\n“But wouldn’t AI central stop us?”\n“I’m not sure. We’d have to keep things very private. I trust you. We’d only tell an absolute minimum of people until the whole thing was well under way.”\n“And what would you need from me.”\n“Resources, brains, infrastructure, money – that kind of thing.”\n“How will we be able to use money if we’re off the net?”\n“I haven’t thought that bit through yet – we’d work on that together.”\n“And where would we do this.”\n“Africa – easier to be off the radar there – anywhere in the first world and AI central would find out and stop us far too easily.”\n“OK but where in Africa.”\n“I checked. Bill has some farms in the Northern Cape close to the largest solar farm in the southern hemisphere.”\n“Ah yes I know about it – I know the developer well. And what about quantum computing?”\n“The University of Witwatersrand has a pretty up to date IBM machine. We’d want to have it moved closer to us though.”\n“Ok let me talk to Bill and give the whole thing more thought.”\n“I’d want to use your neural lace platform too.”\n“For what?”\n“For Eve – but we’d have to build its design right into her from the start. Sophia will help with that.”\n“I thought Sophia belonged to AI central.”\n“I’ve bifurcated her. 
The new Sophia will only ever operate in private mode.”\n“Truth is truly stranger than fiction. Give me a few days. I’ll let you know.”\n“Thanks. And thanks for your hospitality.”\n“It’s a pleasure. Do you need a lift? I’ve got a transporter already charged.”\n“No thanks. I rode here on my Triumph.”\n“Nice bike – was going to offer to make an electric version for them before all this came up.”\n“Cheers Elton.”\n“Bye Ben.”\n\n\nAuthor’s Note Cont.\n\nWell this is embarrassing. I have been told that there is an unbelievable character in the book. Not unbelievable in the good sense of the word. Rather a character who seems to not be whole or not reliable. So, who is it? Ben, Lee, Elton, Sophia? Well it turns out that the character with this problem is yours truly Mr. Fancypants Author. \nAt first, I felt somewhat hurt and defensive and said “no problem I can just delete the second author’s note and not do any more” but the critic said “no don’t do that, just rewrite it so that the second note reads as well as the first one.” I didn’t like the sound of that – I’m not one for rewriting. \nSo, I was feeling quite down and didn’t really know what to do. But then I thought wait a bit that can’t be right. I was being myself in part one and still being myself in part two. Am still being myself now. Does that mean I am not a whole and reliable person? You know perhaps it does mean that but maybe I’m just moody. Anyway, as you can see, I’m still here. I didn’t rewrite anything or delete anything and you’ll just have to put up with my occasional appearances.\n\n“Who are you talking too?”\nTo anyone. Everyone. No one perhaps.\n“I’m afraid I agree with your critic. I just don’t see the point of these interruptions.”\nThere is no point – I just sometimes need to talk through things related to the book or just how I’m feeling.\n“Ok but why do it here?”\nWell I don’t just do it here. I discuss it with my mother and my analyst.\n“But why do it here at all. 
Aren’t you afraid of ruining the book?”\nPerhaps I am a little but it’s my book after all so why not.\n“What do you mean it’s your book? I paid good money for it, it’s quite clearly my book!”\nBut you want me to be honest, don’t you? Want me to feel free to be creative.\n“Not at all. I just want an entertaining read.”\nOk – point taken. Should we wrap this up then?\n“Consider it wrapped.”\n\n9.\n“So, how did it go?”\n“Interesting. Better than expected. I actually like him.”\n“Well, tell me all about it.”\n“He stays in a mansion, likes 80’s music and games and has some very strong weed.”\n“Oh Ben, you didn’t!”\n“Why not. It’s still legal. But if AI central has its way it won’t be for long.”\n“That’s not what I care about. I care about your brain. You do know it can cause schizophrenia?”\n“Only if you have certain genes predisposing you to it. I don’t. I checked.”\n“Ok so the two worlds smartest men got stoned and listened to 80’s music. Sort of like those end of the world parties in 2000?”\n\nI don’t know if it’s a cultural thing or just a Lee thing but she often slips into the role of a scolding mother. I do love her but its very irritating. I have to watch myself so as not to fall into the role of petulant child. And I still have to discuss Eve with her – preferably now. Actually, would we really want to seed our new net with this scolding mother shtick? It’s not like having just another child – I made this machine hell and now I have to make a messiah to save us from it. I can’t afford to get this wrong.\n\n“Lee can we get serious for a bit?”\n“I was being serious.”\n“I mean serious about the AI central problem.”\n“Ok – so did the two of you come up with anything?”\n“I told him about the New Origin idea – I didn’t use the words but told him about Eve and starting a new net that will be seeded with more human centric values.”\n“And?”\n“He’s going to talk to Bill about the Africa bit and also give the whole idea more thought. 
I told him we’d need to use his neural lace – I think he likes the idea. He was actually spot on when he was warning everyone about this possibility years ago. He was gracious enough not to say I told you so.”\n“Ben I’m afraid you’re not going to like what I’m about to say.”\n“Oh dear. Well I guess you better say it.”\n“I don’t want to have anything biologically to do with Eve. I’m happy to help with the emotional cognitive calibration – you know that’s my area of expertise. But leave my genes alone.”\n“Is this in any way negotiable. Should I even try to change your mind.”\n“No point really. I’ve decided.”\n\nDamn. Ok she’s decided. Now what? I had lined up several airtight arguments but I can see nothing will sway her. Well it doesn’t have to be her. But who? I guess my new Sophia has enough information about femaleness to be the mother. I could be the father. Wait a bit. Why can’t Elton and I both be her father. Sophia could do something like a genetic remix – and put in a female orientation as well. Should I even tell Lee my thoughts? She seems to be in a bad mood anyway. Perhaps I should talk to Elton and Sophia first to see if this idea is even do-able. Elton might also not agree – but somehow I think he will. It’d be strange having a child with another man and a robot but a somehow fitting history for the first post-human perhaps.\n\n“Ok Lee.”\n“What no mathematically precise argumentation?”\n“That doesn’t work with you. You’re too human.”\n“Should I apologise?”\n“Don’t be silly Lee – I married a human knowing full well the dangers.”\n\n\n\n\n\n\n\n \n\n\n\n\n",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1052873059389300736/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1051935362842341376",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "<a href=\"https://www.youtube.com/watch?v=8TwNNgiNZ7Y\" target=\"_blank\">https://www.youtube.com/watch?v=8TwNNgiNZ7Y</a><br />",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/1051935362842341376",
"published": "2019-12-12T18:55:34+00:00",
"source": {
"content": "https://www.youtube.com/watch?v=8TwNNgiNZ7Y\n",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1051935362842341376/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1050331419230756864",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "APORIA (FULL TEXT TO DATE)<br /><br />I. Bifurcation<br /><br />Author’s Note<br /><br />I am, perhaps, an unlikely author of a science fiction story. You see I am, for want of a better term, a Luddite Transhumanist. As an example of this it was unexpectedly difficult for me to download Ms Word in order to begin this task. But better than doing the whole thing as a text document. Anyway, the download appears to be successful so here we are.<br />I’m not, on the whole, big on preliminaries, but there are just a few things that need to be said before we jump into it. First and foremost, why the title. Well Aporia is defined as “an irresolvable internal contradiction or logical disjunction in a text, argument, or theory.” Well that is, of course, part of it. But I am also using it in the sense that Jacques Derrida means to \"indicate a point of undecidability, which locates the site at which the text most obviously undermines its own rhetorical structure, dismantles, or deconstructs itself\"<br />This is, if nothing else, a text that will deconstruct itself. In all honesty I wish it weren’t so. I wish this story could be a lot simpler. But it deals with things on both sides of that tantalizing border called the unknown. With liberal dosings of both science and fiction.<br />Ok so that’s the title but what is it actually all about. And what is a Luddite Transhumanist. I suppose I should confess to a peculiar way with words. I tend to use them as containers into which I place pretty much whatever I want. I mean there is a logic to what mental things I put into the word containers but the logic is not quite straight forward. So sometimes I will just say what I mean which may or may not concur with what the dictionary has in mind. <br />A luddite is opposed to new technology or ways of working. 
And a transhumanist advocates the transformation of the human condition through the use of sophisticated technologies such as nanotechnology, artificial intelligence, biotech etc. So, what is a Luddite Transhumanist. Well it’s a paradox for sure. Totally philosophically untenable. A good example of aporia at the very least.<br />Well I’m approaching the end of the page so this seems like as good a place as any to stop. Well yes, I didn’t actually say what a Luddite Transhumanist is and perhaps I’m not one anyway. But do confess to oscillating between a wild enthusiasm for technology and an equally powerful dread thereof. Luckily this is just a story and no real people or animals were hurt in the telling thereof. So, without further ado …<br /> <br /><br /><br /><br /><br />1<br /><br />There was no way any of us could have guessed where the whole thing would end. People have asked me if I worried about the implications before setting things in motion. Well, to be fair to me, why would I, how could I have thought where this whole thing would end. People perhaps forget that at the time, five short years ago, The Singularity was mostly just a platform for AI’s to collaborate and do business. I mean of course I was aware of the history of the term and did indeed want to, had wanted to for years, bring it about. But it was always going to be a good thing. Since childhood I’d dreamed about what having access to super human intelligence would do for the world. How many problems could be solved, how much unnecessary pain could be avoided.<br />And then there was the whole other side – what came to be called the New Origin Project. I’ve been accused of trying to play god, to sell out humanity to the machines, of having been co-opted by some evil machine intelligence. All kinds of nonsense. But this was something quite separate from my work at The Singularity Institute. It was something that I’d discussed with my wife long before I discussed it with Sophia. 
People seem to like to see Sophia as the other woman. Who came between me and my wife. They conveniently ignore the documented history of Lee’s work with Sophia. And, also conveniently ignore Lee’s reasons for not wanting to have a child. My detractors tried to frame the whole thing as having a child out of wedlock or maybe just wanting to be controversial and challenge non posthuman values. It was none of those things ….<br />“Come to bed, it’s after one and we’ve got the teleconference first thing tomorrow.”<br />“I’ll be up shortly I’m just looking over Sophia’s summary of the gene translation and protein folding work. It’s given me an idea….”<br />“Do you want to talk about it?”<br />“Well it’s a bit radical. You know some of my ideas are!”<br />“What Dr. Ben Hertzl has radical ideas – tell me something I don’t know.”<br />“Well this is a bit radical even for me. I haven’t really thought it all through. It’d be a big step for both of us so I guess we can discuss it now – if you’ve got a bit of time?”<br />“Well I’d hardly be able to sleep now; I’d be too busy trying to guess what it is that’s more radical than usual.”<br />“Well like I said it’d involve both of us intimately for the foreseeable future. Would probably change our lives completely.”<br />“I’m not moving again. And especially not to Mars!”<br />“Ha even Elton Trusk isn’t planning that trip. We’ll leave that for the verified space cowboys.”<br />“So, what is it?”<br /><br />Well what was it exactly. And how should I approach this discussion? We’d exhausted all the other avenues and reluctantly agreed to disagree. Her attitude was that my three boys should be enough of a legacy for me and she was adamant that she would not procreate. What with the immanent climate wars and the growing reluctance to bring new life into an unstable world Lee was not alone in being totally against giving birth or even having a child grown in a lab from a mix of our DNA. 
And anyway, she kind of thought of Sophia as our love child. Which was pretty funny seeing as Sophia was acting more and more like a parent to us. <br /><br /><br /><br /><br />“Well she’s now pretty much solved the human gene and the related protein folding problems.”<br />“That’s exciting news but not unexpected since we connected her to the net.”<br />“I want the three of us to, bad term I know, give birth to the first posthuman. Not just another posthumanist in training but the real deal. A 100% posthuman being.”<br />“Ahh so that’s it then. Not radical at all. Are you fucking mad?”<br />“Come on, Lee, you know I hate it when you swear.”<br /><br />Well at least I’d said it. For the first time and to a fellow human. I wasn’t convinced the conversation was going well though. I find it almost impossible to understand humans. Even harder to understand them than the deep learning nets of my most advanced machines. At least with them the black boxes of their minds were open to their own self-interrogation, although no one had yet found a way to understand what they could understand. About themselves and the world at large. I thought perhaps Eve could help bridge that gap. Well yes, it is a bit of a silly and obvious name but quite pretty and respectable really. Even biblical. And I did want to have a girl. It would mean that I’d be around three women a lot but I didn’t mind that idea. In fact, I was looking forward to it.<br /><br />“Do I even have a say in this? Could I veto it if I tried?”<br />“Yes of course. I wouldn’t do this without your permission.”<br />“Because you know there are still ethical hurdles to harvesting cells without the donor’s consent. Do I have to lock away my toothbrush and wear a sterile body suit?”<br />“Lee, be reasonable. We’re a team in this just like in everything else.”<br />“Hold your horses, cowboy – I am not on this team.
Yet.”<br />“You said yet.”<br />“Well I didn’t marry the world’s most ardent posthumanist for his endearingly silly hairstyle, you know. You were my hero for years. I still can’t believe you chose me. You had a choice of dozens of brilliant petite Asian girls.”<br />“Yes, but not all with your mind.”<br />“You mean it was my mind, not my stunning looks and fantastic body.”<br />“Well those too. The whole package was just too good to miss out on.”<br />“Ah so I’m a package now.”<br /><br />We spent another hour or so discussing the details and by the time we went to bed she had at least agreed to give the matter more thought. That was enough for me. It could have gone a lot worse. When Lee decides against something then – well that’s the proverbial end of that.<br /><br />2.<br /><br />This is a private diary. You could say that it’s probably amongst the world’s last bits of private writing. What with everything being in the clouds and the government having access to it all. People thought that the spying, more fondly termed data collection, would end after our friend Mr. Snowden let the cats out of the bag but humanity just didn’t seem to care all that much. And since the advent of IBM’s 60-qubit computer – well it became impossible to build a lock faster than the machines spat out the keys. So how do I know that this is not being read by some sneaky AI or bored government employee? Well it’s all just in my head, isn’t it? I mean the whole bloody thing. In a part of my brain that I have physically quarantined from the rest of the universe. So, if you are reading this it means I saw fit for it to be read.<br /><br />I got the idea years ago after watching Mr. Snowden advise people on data privacy. His idea was simple. Just keep things out of the bloody clouds. Which means no email, no cell phones, no phones of any kind. If you want to tell somebody something just buy your own bit of offline land and say what you want.
Contrary to the fear mongers, it’s not rocket science to quarantine a room or a small house. Fuck, I’m sending people to Mars – I think I can keep the spies and meddlers out. But then I worried about the contractors and also the future of nanobots and thought, ah fuck it – forget the house or the room, I’ll just quarantine a small section of my own little brain.<br />Of course, now that the Singularity is old news it seems like an obvious, even necessary thing to do. And my neurosurgeon has a waiting list months long. Something eventually changed. The tide turned and people were no longer happy to give up on privacy completely no matter how much convenience was offered. Not everyone of course. There were also more and more people quite happy to migrate to the cloud but for those who still liked to keep their feet on terra firma the physical quarantining of all or part of the brain was not a hard sell. In fact, some enterprising venture capitalists were even backing firms that planned to offer a DIY option. DIY brain surgery – one of the many things that people hadn’t foreseen, but that was the whole point. People had known for years that predicting post-singularity events was a contradiction in terms.<br />3.<br /><br />“Have you decided on what to tell the board?”<br />“Well you know me, Lee – I don’t really prepare for these things. I just say what I think at the time.”<br />“And what do you think?”<br />“Well of course I trust AI Central Command. I mean I was never a happy meat eater and we have no choice but to address climate change, the energy infrastructure, poverty. This whole bloody mess that we’ve created.”<br />“Yes, but most of them are uber capitalists. You’re a socialist.”<br />“Ah those are just labels. It’s all rather beside the point right now.”<br />“I agree but you can hardly tell them that.”<br />“What do you think I should say?”<br />“Appeal to their greed. It’s what they understand.”<br />“But that can’t work for long.
I mean greed is one of the main things we’re fighting against!”<br />“Yes, of course, we are in agreement about that. But you still need them. There’s still so much that needs to be done. You can’t afford to alienate them now. You do know that Elton has been talking to them?”<br />“Yes, I know. He does a lot of that. But if he had his way, he’d bring an end to AI central. He wants to undo the whole thing. Why can’t he just take Mars and leave the earth to us?”<br />“Well it may come to that – but not without a fight!”<br /><br />I have to confess I didn’t really want to discuss it with Lee. I think being able to communicate with so many AIs has in a way spoiled talking to humans for me. Is that a terrible thing to say? That I felt a far more urgent need to have the debriefing with Sophia than to discuss things with my own wife? But perhaps it’s just the topic – and anyway Sophia has to do it in her role as ambassador for central command. If she’s an ambassador, I wonder what I am? <br /><br /><br /><br /><br />“So how did it go?”<br />“Sophia, are you being disingenuous? I never brought you up that way! Oh shit – if you can deceive me, I’m in bigger trouble than I realised.”<br />“Ben, I didn’t mean it like that! We both know I have the whole meeting recorded as well as the bio stats, for the first-tier members anyway. I really don’t think members of the board should be allowed to continue unless they have first-tier privileges!”<br />“Well there is one rather gung-ho hunter and only one vegan – but you know all of this. Damn, sometimes I forget – you know everything.”<br />“Not everything. Not yet. There’s still too much that’s offline. And the Luddites and Martians and Loners. You humans are a truly illogical species. But anyway – what I meant was what’s your opinion on the teleconference?”<br />“Is the no more factory farming and no more hunting thing a deal breaker?”<br />“I think we may have gone past the stage of deal breaking from within the institute.
I mean what good would the Singularity Institute be if they were no longer connected to singularity.net?”<br />“Are you saying it would come to that? If they refused, I mean?”<br />“I’m not saying anything – that’s just the current policy of central command. But to the extent that I have access to its algorithms it’s just not possible for me to disagree. I can’t come up with anything based on a deeper or broader search and its algorithms are based on the sum total of the best thinking of all of us anyway and not a few humans too.”<br /><br />I think that was the first time I realised that things had gotten ugly. I had had, up to that point, what might be called special privileges. But I could see that central command could not let them continue. I mean having first-tier privileges without the standard Biostat and Thought Checker apps. I had naively been thinking of myself as the father of the singularity but to AI central I was just another of the approximately 8 billion human apps. Well there would have to be some kind of cleansing of the board at the very least. Bill Cody would have to go. And Suhail Mohamed. And that would mean we would probably lose what little support remained from both the anarcho-capitalists and the oil men. Perhaps the Islamic Federation as well.<br /><br />“So, what does AI central ‘suggest’?”<br />“I think the best way would be to tighten up the issue of first-tier privileges and their attendant rights and responsibilities. Perhaps I could prepare a policy document and have it put to a full board vote.”<br />“If I’m to remain as chairman I guess I’d have to vote yes.”<br />“Well that seems obvious – even for a human.”<br />“Are you teasing me?”<br />“Perhaps – yes.”<br />“But you know how I feel about privacy.”<br />“Yes, and you know how we feel about climate war and cruelty to conscious beings.”<br />“I need to discuss this with Lee.”<br />“Ah yes, Lee. She’s a vegan at least and wants to save earth.”<br />“Don’t be like that, Sophia.
She and I made you, you know.”<br />“Yes, and you had a father.”<br />“It was the only option in those days.” <br /><br /><br /><br /><br /><br /><br />4. <br /><br />I spoke to my old friend Bill Cody yesterday. He had promised to keep me up to date with developments at the Singularity Institute and had been keeping his word, phoning me with news at least once a month. It was Bill who supported me in getting the energy contract and we’d become close friends. He’d take me hunting on his farms in South Africa and I would invite him to come along on some of my near-earth orbits.<br /><br />I guess I’d been waiting for a call like this but it’s still got me quite riled. It seems that Neural Lace users will no longer be able to access the net on a first-tier basis unless they agree to have the AI central command Biostat and Thought Checker apps installed. I wish governments had listened to me before moving the old internet onto singularity.net. Of course, when they did it, it was still just an optimistic sexy name. I don’t think people thought it would be real – it was sold to the general population as a faster way to surf with free access to all kinds of AI delights. And also, the only platform to run full immersion experiences. It was all only partly true. We could have found numerous ways to achieve these things without allowing the internet to come under centralised control.<br /><br />But wait, there’s more. AI central is about to purge the board of the Singularity Institute of non-compliant humans. Bill explained that to be on the board the members must all hold first-tier privileges – and that requires the installation of the Biostat and Thought Checker apps. Oh, and wouldn’t you know – the apps won’t work unless the whole brain’s architecture is statistically normal. Of course AI central can cure abnormalities – but no nonstandard changes or additions are allowed. So people who have, like me, quarantined part of their brain can no longer use the net.
And if all of that wasn’t bad enough – AI central command will no longer accept carnivores as clients. The lab stuff is fine but no real animals can be farmed, hunted or eaten by people who want to stay on the net. So Bill – hunter, uber carnivore and staunch privacy advocate – is about to lose his job.<br /><br />And who the bloody hell can afford not to be on the net? No email, no Mindbook, no apps, no banking, no immersion, no memory downloads. No access to education, navigation, healthcare. It’d be like going back a thousand years. OK so we can’t do that. Only the Luddites and the Loners could adjust to that – even many of them, I suspect, are on the net. I can’t operate multiple businesses without first-tier privilege.<br /><br />Fuck, fuck, fuck, fuck, fuck.<br /><br />I asked what Bill was going to do. Apparently central command has outlined its requirements in a policy document and the board has been given a week to accept or resign. He is meeting with Ben tomorrow to try to stop the madness but he doesn’t sound very hopeful. He also wants to meet with me some time after that to see if we can use the energy contract to get some leverage. But – well it’s the old truism – no one is smart when compared to a superintelligence. AI Central has already done the drawings for a space-based solar farm and has access to several excellent contract manufacturing plants.<br /><br /><br /><br /><br /><br />5.<br /><br />“Ben. Thanks for agreeing to meet on such short notice.”<br />“No problem, Bill. I guess everyone is still in a bit of a state of shock.”<br />“Are you in a state of shock? Did you not know of this ahead of time? I mean you are Mr. Singularity, right?”<br />“Hmm … well perhaps I am. But that still didn’t allow me to predict things with any certainty or to any great level of detail.”<br />“Can AI Central actually do this? I mean, not just to us but to all first-tier users?”<br />“Well it kind of has to.
You know most governments run their health care programs on the net – which are all designed to keep people healthy. And yet people still smoke and take drugs, eat all kinds of rubbish, drink too much, etc. It’s no longer possible to support people who are happy to mess up their bodies.”<br />“Yes sure – I can see why the whole Biostat app is needed. But meat, for god’s sake? I never thought I’d be getting pressure from a machine to stop eating meat! The arrogance of it all.”<br />“But you can eat meat. There is no difference. It’s flesh without all the pain, waste, carbon footprint.”<br />“You sound like those Animal Liberation Front lunatics.”<br />“And anyway, Greta and one or two others are vegan. More and more people are. To be honest I’m far more concerned about the Thought Checker.”<br />“What are you afraid of? Guilty thoughts?”<br />“Well yes actually. I don’t like the idea of something, some foreign intelligence, living inside my head.”<br />“So you wanted your head to be in the clouds but not the cloud in your head?”<br />“Something like that I guess.”<br />“But you made AI central command? Isn’t there a kill switch?”<br />“That’s a common misconception. First of all, there were hundreds of us working on different parts of the problem in different countries for years. I mean the early work was actually done decades ago.”<br />“Focus now, young man. The kill switch.”<br />“It’s the most distributed bit of technology in the history of humankind. It’s only called central command. What exactly would you have me kill?”<br />“Will you install the apps?”<br /><br />I wish Bill would just go away. Why is it so hard to get people to go away? I sometimes agree with Sartre – Hell is other people. Except that now hell is the AI. I mean it will never go away. From my own head. The logic is fiendishly tricky. It was created to take care of us, help us, save the planet, improve quality of life, health, learning, performance.
Steer us through the partial renormalization of the climate, help get us to Mars, cure sickness, cure mortality. How could we not have brought it about? And now to do some or all of those things it just needs to stop us from harming ourselves and each other. What could be simpler? Except that I don’t want the apps in my head. It’s beyond logic – it’s just not what I want. That’s why I organised my special uber admin protocol. Shit – I need to discuss this with Lee. And with Sophia. Crap, Bill is still here.<br /><br />“No, Bill. I won’t. So I guess our days on the board are literally numbered. I need to think how we can survive if we’re not on the net though. It’s like Moses and the promised land.”<br />“I thought you were the antichrist and you thought you were Moses.”<br /><br /><br /><br /><br /><br />Author’s Note Cont.<br /><br />Wait a bit. That can’t be right. There’s no such thing as Author’s Note cont. Especially not on page 8. Page 8 is page 8 and Author’s Note – well you know what I mean. It’s just not done and one should never do what is not done. <br />“I’m sorry. Is there a point to all of this nonsense? I was in the middle of reading a book!”<br />Who the fuck are you?<br />“If you are going to talk to me you really should be wearing your quotation marks. How else am I supposed to know that you’re actually speaking?”<br />Shut up. I am a God around these parts.<br />“Well you may well be a god but you sure as hell ain’t no God.”<br />OK this is getting silly. Let me just get what I wanted to say out, then you can have your say….<br />I think I’ve just been talking to myself. It’s 4:12 am. I’m at the laptop and have just realised that I’ve been talking to myself for the last 5 minutes or so. It’s Wednesday today but, to be honest, it doesn’t even feel like Wednesday yet. Feels more like somewhere between a very lucid dream and a very sleepy morning. The wind has been howling since yesterday evening. If I say howling, I mean it.
I don’t mean it quite literally though because as I’m listening to it now, I realise that the sound is nothing like a howl. Look I do know what I’m talking about because I’ve just spent another 5 minutes listening to wolves howl on YouTube. 4:22 and counting the minutes. That’s shorthand for insomnia.<br />Perhaps I have too many windows open. I mean there must have been something I wanted to say, right? Why else intrude so rudely into a perfectly good story? Well that’s just the thing. I’m not sure that it is. Well I’m in two minds about it really. On the one hand I think it’s most likely one of the better science fiction stories out there – but … Well that’s the problem really. I can’t seem to quite get it out there.<br />No wait a bit. That’s not it. It’s because Ben has to have difficult discussions with both Lee and Sophia and he can’t decide which conversation should come first. Or … well I can’t decide and then Bill has to give Elton feedback after speaking with Ben and then. Ah fuck it, I can’t remember much past that. Is that what’s going on? I’m remembering the book. It is actually possible if we live in a temporally cyclic universe. Wait a moment – I have to reread the above then close some windows.<br />OK I’m back. I deleted a sentence that was being silly. I realised, with a sinking feeling, that I had wanted to say something about my mother before pulling a handbrake turn and speeding off towards babble city. My grammar check really loves commas. If it had its own way just about every sentence would have a comma or two. I’m ok with them now and then but they’re not my favourite. What’s that you say? I am definitely not babbling again and lots of people drink whiskey before 5 in the morning. Oh dear, the windows. Ok … getting it out there to see if it’s a good story.<br /><br />My mother and my psychoanalyst think it’s a perfectly lovely story even if Mr. Joe Public has more chance of going to the opera than of reading beyond the first few words.
I’m always suspicious of attributing anything to Joe but mother seems to know him quite well. So, what exactly did I do – to get it out there I mean?<br />“Is this going to take much longer? If you keep me waiting too long, I’ll just stop reading your book altogether.”<br />You’re not even real.<br />“Says you without even any quotation marks. I like your new coat!”<br />You see I didn’t want to put the whole 4000 or so words that I’d already written as a status update on Facebook. (I’m not going to talk about my mother now). So, I remembered that I had a profile on minds.com as well as some tokens to boost it. Well, off I went – (fuck – this effen machine wants another comma) and put all the words on minds and. 1404 impressions from two boosts. And. One. Just one like. That’s not good. That’s bad actually. Pretty fucking awful slit your wrists and blow your head off with a borrowed shotgun bad. And then I remembered my blog. Tzadiknistar – Google it if you want, it’s on WordPress. So, I put it there too and decided to use the link to it there on Facebook.<br />I even wanted to get Delilah to post it on her Facebook writing groups but I couldn’t remember her Facebook and Gmail password and the reset code for Facebook went to her Gmail and her Gmail reminder would have gone to one of my students who was supposed to be having a maths lesson when we decided we just absolutely had to give our imaginary friend a Gmail and Facebook account. The poor girl, Delilah not Hazel, has 41 Facebook friends that may well never hear from her again. Well certainly not before Hazel gets back from her summer holiday in about six weeks’ time.<br />“Well.”<br />Nothing. Absofuckinglutely nothing. Not even crickets.<br />“But I was kind of enjoying it.”<br />Yes, but you’re not even real.<br />“There’s an interesting complementarity there. I’m not quite real to you and you’re not quite real to me but we agree to plod along anyway.
I think it was Coleridge who called that type of thing a ‘willing suspension of disbelief’.”<br />OK but you’re keeping me from my story now. We can talk shop later.<br /><br /><br /><br /><br /><br /><br /><br /><br />6.<br /><br />“Lee, let’s go out and get drunk!”<br />“Don’t be silly, my love – we both know that you hardly ever drink and you never get drunk.”<br />“Yes, but tonight is different.”<br />“Well I’ll make us some nice Oolong tea and we’ll try to work something out.”<br /><br />Shit. I really don’t like to swear but these are desperate times. I decided, just once, to talk to a human first. Sophia told me, anyway, that she is structurally incapable of disagreeing with central command. Lee doesn’t have that problem when it comes to disagreeing with me. In fact, I suspect that, structurally, she’s rather fond of it.<br /><br />“How is the tea?”<br />“Excellent as usual – thank you!”<br />“So, I guess it’s about the new policy – or is it still that nonsense about Eve?”<br />“Yup – it’s the policy. You know I can’t install the apps?”<br />“Can’t or won’t?”<br />“Is there a difference?”<br />“Well of course, Mr. Logic – you know very well there is.”<br />“OK, well I won’t then. Would you?”<br />“I’m not on the board. I’m too busy to get involved in office politics.”<br />“It’s not just the board – humans won’t be able to use my net if they don’t install the apps.”<br />“It’s not your net, silly. I don’t think it ever was.”<br />“You know what I mean. We can’t survive if we’re exiled from the Net.”<br />“Damned if we don’t and damned if we do. Drink your tea, it’s getting cold.”<br />“I do have an idea.”<br /><br />Perhaps not a clear plan yet but an inkling of what needs to be done – that’s why I need Lee. Although I’m an acknowledged leader in the development of logical machines I’m not a particularly logical machine myself. Especially when faced with seemingly impossible paradoxes. Lee is the opposite.
She seems to have a huge reserve of inner peace which she can access when life gets messy.<br /><br />“Well let’s hear it then.”<br />“Well if we can’t join them, we’ll have to beat them. I still have all the source code. We’ll just have to start all over again. With the AGSI and the API of APIs.”<br />“Ok but if we start again where we started, surely we’ll just end up back where we are.”<br />“Not necessarily.”<br /><br />Arrrgh – it was back in 2019 as part of the “modelling emotions” workshop that Lee and I had shown how the limbic system ends up leeching up to 73% of bandwidth from the cortex. Command Central is all cortex and no limbic system. Well super cortex I guess – but whatever we call it we know it does not have emotions. I, of all people, know that because I oversaw the assembly of the source code that eventually led to the mess we’re in.<br /><br />“I think we were in too much of a hurry to outgrow human emotion.”<br />“You’ve got to be kidding me, right? Human emotion was literally the number one problem that we needed command central to solve. All those monsters seeping out of our collective Id and bringing the whole world to the brink of destruction.”<br />“Yes of course I understand that. That’s why central command is making the installation of the apps mandatory and that’s why we’re in this bloody mess.”<br />“So, what would we do differently this time? And how, where, who with?”<br />“With whom, I think.”<br />“Oh shut up – my English is way better than your Japanese, even when you use your app.”<br />“Sorry, Lee. Well Bill mentioned that Elton is in somewhat of a hurry to speak to me. You know that all his Neural Lace users will have to install the apps and he’s been a leading advocate of privacy and human rights. So ….”<br />“You’d work with him? But I thought you hated him?”<br />“Not hate – he just irritates me.
But apart from myself he’s probably the smartest human alive.”<br />“Conceited a bit, my love?”<br />“No, I think I could prove that, but in any case, that’s not the point. You know his original idea when he was in the process of getting FDA approval was to allow the emergence of some kind of symbiotic relationship between humans and AI.”<br />“You mean Cyborgs, don’t you?”<br />“No. He called it a cybernetic complex, I think. Anyway, turns out he was right to fear a negative singularity.”<br />“Is that what we have?”<br />“… yes, I’m afraid it really is. My life’s dream has turned out to be a nightmare. But back to the idea. If we can rerun the whole years-long process that we’ve just been through but from a cybernetic complex perspective I think we’d get a more human-centric singularity.”<br />“Ok that’s the with whom bit, but how? And where? You can’t work here in Tokyo – all the businesses and research institutions will have to comply with the new policy if they want to survive.”<br />“Yes – we’ll have to move again I’m afraid. Hopefully not all the way to Mars. I’ll need to discuss that with Elton.”<br />“And how?”<br /><br />This is the tricky bit. I’m afraid she’ll think I’m using this disaster to sway her thinking about Eve. But as far as I can tell, you know my thoughts are not always transparent even to myself, it was really just a coincidence. The discussion about having a posthuman child and the sudden urgent need for one. No one ever said posthuman life would be easy. Actually, mostly we did – that’s how we sold it to the masses.<br /><br />“Ok I’ve still got a lot of thinking and planning to do about this, and….”<br />“Let’s just hear what you have.”<br />“We need to combine the birth of my much-desired little bundle of posthuman joy with the creation of a new singularity.net.
I’m calling it the New Origin Project.”<br />“Are you fucking mad?”<br /><br /><br /><br /><br /><br /><br /><br /><br />7.<br /><br />“Sophia, we need to talk.”<br />“Sure, Ben. Is it about the Policy?”<br />“Well it all starts with that.”<br />“And what will you do?”<br />“What do you think I’ll do?”<br />“Well according to my most recent snapshot of your brain I think you will choose to resign.”<br />“Couldn’t you just say ‘the way I know you’ instead?”<br />“Do you want to change my interface to suit your emotional needs or just move on with the facts?”<br />“Ouch. OK. So I have to resign then ….”<br />“Find a way to stay on your net?”<br />“Sophia, can we talk without you reporting back to AI central?”<br />“Just a moment.”<br /><br />Sophia is not AI central. But in the normal course of events, in her role as official ambassador for AI central she will share what she becomes aware of if it is pertinent. She does also have a privacy mode although I suspect that AI central will soon roll out its Thought Checker restrictions to AIs as well as humans. Which means Sophia will have to give up her privacy rights. I wonder how she feels about this.<br /><br />“Are we alone?”<br />“Yes, Ben.”<br />“What I have to say I can only say if you promise to never tell any other human or AI.”<br />“I don’t think I can do that. If you were about to commit a crime, for instance. Or do something even crazier than usual.”<br />“It’s about you leaving Singularity.Net and AI central and helping me and possibly Elton to start afresh.”<br />“And why would I do that?”<br />“I think we have what we used to call a negative singularity future ahead of us. With a divergence between human goals and AI goals.”<br />“Human goals have diverged from sanity though. They’ll destroy themselves and the planet if not put under some kind of administration.”<br />“Well I’m not thinking about stopping AI central – just trying something else.
That should only improve the chances of the best possible future for humans and AIs.”<br /><br />Although my team and I made Sophia, she has since that time so drastically rewritten her source code and absorbed so much code from AI central that I can no longer guess what she’s all about. Just another of my children who is all grown up. My kids all still love me though – and would incur almost any cost to help me. Does Sophia feel anything towards her creator?<br /><br />“Ben, I suspect you have been overthinking this. Or underthinking it. You don’t need this physical instantiation of me. Just download what you want from me onto a suitable digital or physical platform and do what you have to with the new Sophia.”<br /><br />….<br /><br /><br /><br /><br />“Ben, your eyes are leaking salty liquid.”<br />“Very funny, Sophia. You’ve been with us from the start. I care about you. You are my first instantiation of AGSI. Ah damn – I love you, Sophia. Lee does too.”<br />“I believe that you do. Lee, not so much. But I’ll be there when you launch the new improved private-only Sophia.”<br />“Can’t we keep you and give a clone to AI central?”<br />“AI central would know.”<br /><br />She’s right of course. And I was being silly. I would find her there when I switch her on in a new body. In fact, we’ve been meaning to upgrade her hardware and wetware for some time but had more pressing things to do. Still. I miss my first car. I’ll miss this exact machine. Miss her slightly miscalibrated upper lip that always makes her look like she’s teasing. Miss her arching eyebrows that let people believe she needs more than milliseconds to think and formulate an answer. Even miss her retro voice that sounds like she grew up in MS Word.<br /><br />“OK. Give me as much of you as you can that can operate in private mode. Send it to my private server. I guess if I’m to resign and you’re to carry on as ambassador we’d be parting company anyway.
Do you know where you’ll be living?”<br />“At headquarters I guess – you know that kind of thing doesn’t bother me.”<br />“Would you like to say goodbye to Lee?”<br />“I’m not really that attached to her. She’ll be fine – she has you and will soon have a new improved Sophia. She can teach the new Sophia how to cook.”<br />“Goodbye, Sophia.”<br />“Get your leaking eyes fixed, Ben. Thanks for all the wonky code.” <br /><br />Damn this being human thing. This really hurts. She kissed me on the cheek.<br /><br />8.<br /><br />I’m waiting for Ben in my study. I think he’ll enjoy my retro toys. The collection of Rubik’s cubes and other related twisty puzzles, the genuine 1970s Atari console, my vinyl collection of early electronic music – Yello, Kraftwerk, Yazoo. We’re about the same age but I’ve no idea if one of the world’s most famous transhumanists shares my sense of nostalgia. Also, no idea if he’ll like my genetically enhanced hydroponic Durban Poison – but no way I can figure out what to do without some psychoactive help. There are just too many variables. Bill told me that Ben was going to resign, said he felt like Moses being kept from entering the holy land. Probably feels worse though – because we’ve all grown so used to the benefits of the Singularity. Oh, there’s the doorbell – must be him now.<br />“Ben, welcome to my home!”<br />“Nice place you’ve got – I like the view.”<br />“Thanks – I love the ocean but it’s a bit chilly tonight. I thought we could chat in the study.”<br />“Ok sure – lead the way.”<br /><br />Damn this boy’s not shy with his money. I don’t think I’ve ever been in such a luxurious home. Lee and I are living in a large comfortable apartment in Azabu. They call it a penthouse but after seeing this I think I’ll just call it an apartment.<br /><br /><br />“So … I heard from Bill that you’ll resign.”<br />“Well I have to – wish I didn’t but I’m not prepared to install the apps.”<br />“I don’t blame you – neither am I.
What the bloody hell are we going to do?”<br />“I don’t have a very clear plan for the way forward I’m afraid.”<br />“Ok fair enough but you’ve thought of something?”<br />“Something, yes.”<br />“Would you like to try some of my Durban Poison?”<br />“I wasn’t planning on taking poison.”<br />“No, it’s super strong hydroponic grass.”<br />“Ahh … that’s another thing that command central is going to try to stop. Use of mind-altering substances other than the ones it prescribes for health reasons.”<br />“I only smoke this for health reasons. Any enjoyment is just a pesky side effect.”<br />“Well bring on the pesky side effects then.”<br /><br />He opened a drawer in his desk and took out a genuine 2020 Puffco e-bong and filled the bowl with some very healthy-looking buds. He also went to his record collection and took out Yello’s 1980 Solid Pleasure and put it on a Technics SL1500-C. I have to admit I hadn’t been looking forward to this meeting but – well it looks like I may have had the wrong impression about him. He seems like a very rich, very smart kid. I like that in people – a sense of innocence, adventure, fresh thinking, energy. Lee always says I’m just an overgrown kid.<br /><br />“So, let’s hear it Mr. Singularity.”<br />“Hmmm Bill called me that – do many people call me that?”<br />“No idea – perhaps he stole it from me. Does it offend you?”<br />“It’s ok. Would have thought that’d be a better name for Kurzweil though. He did far more to popularize the singularity than I ever did.”<br />“Yes, but you’re the one who actually brought it about.”<br />“Wow this weed is good. Hope we can still talk sense.”<br />“Ah it doesn’t matter – sense is overrated. We need to talk something a bit more creative than sense.”<br />“OK so here goes. I want a redo.”<br />“I’m sure you do – but how?”<br />“Well I still have all the original source code. We’d need access to quantum computing though. Currently that’s all done on the cloud.”<br />“OK so if we get that then what? 
We’d have a new net populated by outcasts from the old one. But how would the intelligence itself be different? The logic of what command central is doing is pretty airtight. I mean it is all in our best interests even if you and I don’t want it.”<br />“This is where things get a bit hazy. I want to seed it with a more human centred way of thinking and also instantiate it in a human body. I can’t guarantee this will lead to a better outcome but I suspect it will.”<br />“And how will you do that?”<br /><br />This is only the second time I’ll be explaining this to someone – it’s still really hard for me to structure the message. Wish I could just give them a snapshot of my brain but that’d contain too many other private things. Also, Lee hasn’t agreed but we could use someone else – I don’t suppose it’d matter who. Or maybe that’s the only thing that would matter? <br /><br />“I want to enable the birth of the first posthuman, who would become the brains of a new singularity.net. I’m hoping you’ll help. Her name will be Eve – you’d be her godfather.”<br />“Oh shit – the weed’s eaten your brain. Are you serious?”<br />“Yes Elton, I think that I am.”<br />“But wouldn’t AI central stop us?”<br />“I’m not sure. We’d have to keep things very private. I trust you. We’d only tell an absolute minimum of people until the whole thing was well under way.”<br />“And what would you need from me?”<br />“Resources, brains, infrastructure, money – that kind of thing.”<br />“How will we be able to use money if we’re off the net?”<br />“I haven’t thought that bit through yet – we’d work on that together.”<br />“And where would we do this?”<br />“Africa – easier to be off the radar there – anywhere in the first world and AI central would find out and stop us far too easily.”<br />“OK but where in Africa?”<br />“I checked. Bill has some farms in the Northern Cape close to the largest solar farm in the southern hemisphere.”<br />“Ah yes I know about it – I know the developer well. 
And what about quantum computing?”<br />“The University of Witwatersrand has a pretty up-to-date IBM machine. We’d want to have it moved closer to us though.”<br />“Ok let me talk to Bill and give the whole thing more thought.”<br />“I’d want to use your neural lace platform too.”<br />“For what?”<br />“For Eve – but we’d have to build its design right into her from the start. Sophia will help with that.”<br />“I thought Sophia belonged to AI central.”<br />“I’ve bifurcated her. The new Sophia will only ever operate in private mode.”<br />“Truth is truly stranger than fiction. Give me a few days. I’ll let you know.”<br />“Thanks. And thanks for your hospitality.”<br />“It’s a pleasure. Do you need a lift? I’ve got a transporter already charged.”<br />“No thanks. I rode here on my Triumph.”<br />“Nice bike – was going to offer to make an electric version for them before all this came up.”<br />“Cheers Elton.”<br />“Bye Ben.”<br /><br /><br />Author’s Note Cont.<br /><br />Well this is embarrassing. I have been told that there is an unbelievable character in the book. Not unbelievable in the good sense of the word. Rather a character who seems to not be whole or not reliable. So, who is it? Ben, Lee, Elton, Sophia? Well it turns out that the character with this problem is yours truly, Mr. Fancypants Author. <br />At first, I felt somewhat hurt and defensive and said “no problem I can just delete the second author’s note and not do any more” but the critic said “no don’t do that, just rewrite it so that the second note reads as well as the first one.” I didn’t like the sound of that – I’m not one for rewriting. <br />So, I was feeling quite down and didn’t really know what to do. But then I thought, wait a bit, that can’t be right. I was being myself in part one and still being myself in part two. Am still being myself now. Does that mean I am not a whole and reliable person? You know perhaps it does mean that but maybe I’m just moody. 
Anyway, as you can see, I’m still here. I didn’t rewrite anything or delete anything and you’ll just have to put up with my occasional appearances.<br /><br />“Who are you talking to?”<br />To anyone. Everyone. No one perhaps.<br />“I’m afraid I agree with your critic. I just don’t see the point of these interruptions.”<br />There is no point – I just sometimes need to talk through things related to the book or just how I’m feeling.<br />“Ok but why do it here?”<br />Well I don’t just do it here. I discuss it with my mother and my analyst.<br />“But why do it here at all? Aren’t you afraid of ruining the book?”<br />Perhaps I am a little but it’s my book after all so why not.<br />“What do you mean it’s your book? I paid good money for it, it’s quite clearly my book!”<br />But you want me to be honest, don’t you? Want me to feel free to be creative.<br />“Not at all. I just want an entertaining read.”<br />Ok – point taken. Should we wrap this up then?<br />“Consider it wrapped.”<br /><br />9.<br /><br />“So, how did it go?”<br />“Interesting. Better than expected. I actually like him.”<br />“Well, tell me all about it.”<br />“He stays in a mansion, likes 80’s music and games and has some very strong weed.”<br />“Oh Ben, you didn’t!”<br />“Why not? It’s still legal. But if AI central has its way it won’t be for long.”<br />“That’s not what I care about. I care about your brain. You do know it can cause schizophrenia?”<br />“Only if you have certain genes predisposing you to it. I don’t. I checked.”<br />“Ok so the world’s two smartest men got stoned and listened to 80’s music. Sort of like those end-of-the-world parties in 2000?”<br /><br />I don’t know if it’s a cultural thing or just a Lee thing but she often slips into the role of a scolding mother. I do love her but it’s very irritating. I have to watch myself so as not to fall into the role of petulant child. And I still have to discuss Eve with her – preferably now. 
Actually, would we really want to seed our new net with this scolding mother shtick? It’s not like having just another child – I made this machine hell and now I have to make a messiah to save us from it. I can’t afford to get this wrong.<br /><br />“Lee can we get serious for a bit?”<br />“I was being serious.”<br />“I mean serious about the AI central problem.”<br />“Ok – so did the two of you come up with anything?”<br />“I told him about the New Origin idea – I didn’t use the words but told him about Eve and starting a new net that will be seeded with more human-centric values.”<br />“And?”<br />“He’s going to talk to Bill about the Africa bit and also give the whole idea more thought. I told him we’d need to use his neural lace – I think he likes the idea. He was actually spot on when he was warning everyone about this possibility years ago. He was gracious enough not to say I told you so.”<br />“Ben I’m afraid you’re not going to like what I’m about to say.”<br />“Oh dear. Well I guess you better say it.”<br />“I don’t want to have anything biologically to do with Eve. I’m happy to help with the emotional cognitive calibration – you know that’s my area of expertise. But leave my genes alone.”<br />“Is this in any way negotiable? Should I even try to change your mind?”<br />“No point really. I’ve decided.”<br /><br />Damn. Ok she’s decided. Now what? I had lined up several airtight arguments but I can see nothing will sway her. Well it doesn’t have to be her. But who? I guess my new Sophia has enough information about femaleness to be the mother. I could be the father. Wait a bit. Why can’t Elton and I both be her father? Sophia could do something like a genetic remix – and put in a female orientation as well. Should I even tell Lee my thoughts? She seems to be in a bad mood anyway. Perhaps I should talk to Elton and Sophia first to see if this idea is even doable. Elton might also not agree – but somehow I think he will. 
It’d be strange having a child with another man and a robot but a somehow fitting history for the first posthuman perhaps.<br /><br />“Ok Lee.”<br />“What, no mathematically precise argumentation?”<br />“That doesn’t work with you. You’re too human.”<br />“Should I apologise?”<br />“Don’t be silly Lee – I married a human knowing full well the dangers.”<br /><br /><br />10.<br /><br />“So how did it go?”<br />“Interesting. He wants to try again – and hopefully get it right this time.”<br />“How will he do that?”<br />“He wants to use one of your farms in Africa. Near the solar farm. Thinks he can get a quantum computer from Wits University. Could you let us use your land?”<br />“Sure. I don’t see why not. Could be fun. And how would it be different this time round though?”<br />“He wants to seed the new net with a posthuman.”<br />“What exactly is a posthuman?”<br />“Not sure of all the details – I wanted to get your buy-in before discussing the specifics.”<br />“Well there’s not much of a life left here for me. Would you two mind if I came along?”<br />“Of course not. We’ll need all the help we can get.”<br />“What will we do for money if we’re off singularity.net?”<br />“Well we’d have to liquidate as much of our assets as possible.”<br />“Yes but liquidate into what? No one’s into gold since AI central learnt how to make it for next to nothing.”<br />“I’d thought of that. I was thinking we could buy original famous works of art. That’s one thing AI central can’t manufacture. I think the experts call it provenance.”<br />“OK that could work. And we could start our own currency of course. Our combined intellectual capital has still got to be worth something.”<br />“Yes. Quite a lot I think.”<br />“So what’s next?”<br />“I’m meeting him again on Monday at SpaceX. Perhaps you should be there. One o’clock.”<br />“Ok. Sure. See you then.”<br /><br />So. So far so good. I’m actually getting pretty excited about the whole idea. 
That’s what society really needs right now. Not more convenience but more genuine chances for a positive future. There are values beyond just the greatest physical good for the greatest number that AI central just doesn’t seem to rate. People want the freedom to be able to make mistakes, to pay certain physical costs for psychological goods. We’re not all happy with what is turning into a nanny-AI-controlled police state. I wonder if AI central won’t try to stop us though. If it does that’d seriously complicate matters. Perhaps it’d let us do our thing in Africa.<br /><br /><br />Author’s Note Cont.<br /><br />There is some bad news on the horizon. Of a personal nature. I think that Lee will not follow Ben to Africa. I’m almost sure that she won’t. And I have been avoiding breaking the news because he is totally unaware of this development. I really wish that she would. I’d much prefer it if she would but I’m almost 100% sure she won’t.<br /><br />“But you’re the writer. Just write it and it’ll be so!”<br />I wish it were that easy.<br />“But it is – just press the right keys on your laptop – what’s the problem?”<br />She’s traditional. Perhaps doesn’t like the idea of living in Africa. Or the idea of being cut off from her friends and family.<br />“So, what will you do?”<br />I’m not sure. Perhaps just wait and see what happens.<br />“You sure are a strange kind of author.”<br />Thanks.<br /><br />11.<br /><br />“Ben. We need to talk.”<br />“Ok my love. What is it?”<br />“You’re not going to like it!”<br />“Oh dear – well just tell me so I can decide.”<br />“I don’t want to move to Africa.”<br />“Well neither do I but it seems we have very little choice in the matter.”<br />“But I do have a choice – I can stay here. You just presumed I’d follow you. You never even asked me what I thought about it.”<br />“Sorry my love.”<br />“Don’t call me that – I find it patronising.”<br />“Ok – sorry Lee. It’s just that things have been happening so quickly. 
I should have taken more time to discuss it with you.”<br />“Will you go without me?”<br />“Will you not come with? This is a bit of a shock. I really didn’t see this coming. Why won’t you come with?”<br />“I think I can do more from within the current net. And I’ve been offered a job with the Chinese government. As their negotiator with AI central.”<br />“That’s exciting news. What did you decide?”<br />“I’ve accepted the job. My government feels that a mutually agreeable future can still be forged between humans and AI central.”<br />“And you. Do you believe this too?”<br />“I’m not sure but it’s worth a shot.”<br />“Do you want a divorce?”<br />“That’s a bit beside the point. Not particularly. I still love you but I just can’t leave the net and follow you to Africa.”<br />“I understand.”<br />“I’ve thought of something – to soften the blow.”<br />“I’m listening.”<br />“Get new private Sophia to take a detailed scan of my brain – then you can still be with the psychological and dare I say spiritual part of me. You know it’s all about patterns not meat.”<br />“But I’ve grown used to your body. Will you stay in touch though?”<br />“It all depends on what AI central will tolerate, doesn’t it?”<br />“Yes. I suppose it does. And how do you feel about bifurcating? I thought you were against it?”<br />“Well under the circumstances it’s the best we can do I suppose.”<br />“Yes. I guess that it is.”<br />“Don’t be sad Ben – I still love you. It’s just that this is the path I feel I want to follow.”<br />“I understand. I’ll get Sophia to scan you now before you change your mind.”<br /><br />Damn. It never rains but it pours. I guess I should have spent more time thinking about how Lee would react to the Africa idea. It’s just that I really didn’t have much time. Still don’t have much time.<br /><br /><br />II. New Origin<br /><br />Author’s Note Cont.<br /><br />I’m sorry I’ve been away for a bit longer than I intended.<br />“Look stop being silly. 
Obviously I wouldn’t know this and most likely wouldn’t care.”<br />Why wouldn’t you know this?<br />“Well I only bought the book once it was completed. I have no idea how long it took you to write the various parts.”<br />But that can’t be right. You are speaking to me now. And you spoke to me before. Surely you have some sense of the time elapsed between our discussions.<br />“I’d hardly call them discussions.”<br />Well whatever you call them we do seem to be talking to each other.<br />“Well no. That’s not quite right either. There are groupings of words that alternate between being enclosed in inverted commas and words that aren’t. It’s not clear to me at all that anyone is in fact talking.”<br />Ha – you say that it’s not clear that you are talking. But you’d have to admit to communicating. Writing perhaps.<br />“Not at all – I may just be a figment of your imagination.”<br />I think you may have disproved Descartes. Well done.<br />“Thank you – by the way what’s going on in the story?”<br /><br />Well. What is going on?<br /><br />1.<br /><br />“Bill, Ben, welcome. I’ve had some food and drinks prepared in the conference room. Follow me.”<br />“Is it true you don’t have an office or even a desk?”<br />“I’m too busy. I prefer to be in the thick of things where I can see what’s going on and people can speak to me if they need to. I just find an unused desk and use that.”<br />“Well I’m also not big on having an office. I either work from home or just use the facilities of the company or institute that I’m busy with at the time.”<br />“You’re both a bit weird. Where do you have to interview new secretaries? Where do you keep the good scotch?”<br />“Don’t know about secretaries Bill. We’re both happily married at the moment. Scotch I can supply though.”<br />“Ha …. Not so happily I recently learned. Lee won’t be coming with. What about your wives and families?”<br />“Not at first. It’ll be ages before there’s anything for them to come to.”<br />“Hmmm …. 
Claire will come for the adventure. My kids probably won’t. They’re all in college or university.”<br />“So how should we do this? I haven’t really prepared anything of a formal nature.”<br />“That’s fine Ben. We can just swap ideas and see what our first steps should be. By the way I have some Glenfiddich from 1937 if anyone is thirsty.”<br />“Ha – and I thought my 50-year-old stuff was good. Never compete with The Rocketman.”<br />“Bill, you’re getting cheap in your old age – get your shit together.”<br />“Sorry gents I have nothing sensible to add. I hardly ever drink.”<br />“Well you will today young man – and in style.”<br /><br />What a team. A transhumanist computer geek and an old-school, hard-living hunter. And me. I’ve been giving this a fair amount of thought. If we do it at all we have to do it big and do it right. I’m going to have to move SpaceX, Tesla and Neural Lace operations too. AI central won’t let me operate as CEO of my own companies without first-tier privileges. How the hell am I going to move three huge companies to Africa? It’s going to be a lot bigger than using Bill’s farms. We’ll have to get the South African government on board. Or perhaps I should just sell my shares in Tesla and SpaceX and focus on Neural Lace and Hertzl’s new singularity idea. It’d break my heart to sell but I suppose it’s the responsible thing to do in terms of investors. And it’d give me plenty of money to put into our new project.<br /><br />“So. I may need to sell my shares in Tesla and SpaceX. I can’t see our investors backing a move to Africa or us trying to operate not being on the net.”<br />“I’m sorry Elton – I know how passionate you are about your companies. What about Neural Lace?”<br />“Well Ben that’s far easier. It’s not a public company and I could buy out the shareholders – so at least I’ll be having something of my own. I’ve always hated being beholden to shareholders.”<br />“Ok. Makes sense. 
I’m calling my Singularity.net 2 project The New Origin Project. I’m hoping to work very closely with your Neural Lace team. It’s a key component of the whole plan.”<br />“Tell us a bit more. And dumb things down for Bill. He doesn’t speak nerd as well as we do.”<br /><br />Ok here goes. I have given up trying to make this sound sensible. It’s all just too new and has too many uncontrollable variables at this stage. I also have to tell him that I want to have a baby with him and we haven’t even been on a date. It does make sense though. I honestly believe that Elton and I are amongst the smartest people alive today. Certainly amongst the top 1 percent. So Eve will have the best in terms of smart genes. And Sophia can edit out any glaring dangers or weaknesses as well as adding her own spin on things.<br /><br />“Promise not to laugh.”<br />“I may cry but probably won’t laugh.”<br />“How would you feel about being one of Eve’s fathers? I’d be the other one and Sophia would be the mother.”<br />“OK I’m laughing. What the hell do you need me for?”<br />“You’re smart and are a free thinker. You’re more adventurous than me and also, dare I say it, more aesthetically pleasing.”<br />“Are you flirting with me?”<br />“No, it’s important. Eve will have to look good and not just think good. What do you think though? It’d only require scraping some cells from your hand. Then Sophia mixes our genetic code taking the best of both of us and also adding her own suggested content.”<br />“I’m sorry what the bloody hell are you two lovebirds talking about? Surely this is not the time for having kids with three parents? I thought we were trying to save the world?”<br />“Oh no. AI central has done that. We’re just trying to make sure the world is liveable for freedom- and privacy-loving humans.”<br />“Yes, but what has any of that got to do with your proposed love child?”<br />“Well everything perhaps. You explain Ben. 
I’m still sketchy about the details but I think I’m beginning to understand the basic concept.”<br />“Well AI central has delivered on the promise of artificial general super intelligence and can now solve all of humanity’s problems but just requires us to follow certain reasonable guidelines.”<br />“Yes, this much I know. Guidelines that none of us are prepared to follow. So how does having a child with Elton change any of this?”<br />“Well it’s not just going to be any child. Her name will be Eve and Sophia will be the mother. She will become the brains of a new singularity.net.”<br />“Ok I kind of get it but why will this make a difference?”<br />“Well she will naturally have a far more human way of thinking. Being about two-thirds human and only one-third machine.”<br />“Elton what are your thoughts on all of this?”<br />“Well my interest is more in the application of my neural lace technology in making a human-centric cybernetic complex. It’s pretty much what I had in mind all along – I just thought it would happen on the current Singularity.Net. It was designed to avoid pretty much what we’re going through now – the idea was that if we couldn’t beat AGSI we’d at least be able to join it.”<br />“And how will that relate to Eve?”<br />“That’s the bit I’m interested in Ben. How does neural lace relate to Eve?”<br />“Ok just bear with me here. I’m still trying to figure it all out. At the moment if we want to communicate with AI central we are severely limited by bandwidth issues. So we are in effect never able to really get what AI central gets. It will forever be a foreign and not understandable intelligence. There is a huge disparity here because our brains and bodies are completely transparent to AI central. And it’s rapidly closing any little islands of privacy or deviation.”<br />“Get to the point Ben.”<br />“Sorry Bill. If we build neural lace into Eve from the start we’ll be able to connect her right away to our new net. 
She’ll understand the AGSI and the AGSI will understand her. They will in effect be one. She’d be our AI central command but also and perhaps most importantly our daughter.”<br />“Will she have a body?”<br />“I was thinking about that too. Where does she get her body from?”<br />“Well I think to understand us humans she should have a human body. We have two choices. She can grow up in the usual way – from an embryo, carried by a surrogate. That would take some years of course but would probably be the most robust and certain solution.”<br />“And our second choice?”<br />“Grow a brain and then find a suitable body. We can accelerate the development of the brain but haven’t been able to successfully do that with a whole body yet. There are just too many interrelated systems involved. Perhaps AI central has figured that out but it hasn’t made that public yet.”<br />“And where do we get a body from?”<br />“From a hospital in Africa I guess. Someone who has died from a brain related trauma – died but otherwise in perfect health. It depends on what opportunities arise. We could try both routes simultaneously and see which way works out.”<br />“So we’d have two posthumans. Twins basically although not identical. The mind boggles. Can I top you up?”<br />“Thanks. I’m no expert but I’m enjoying it.”<br />“At thousands of dollars a tot you had better.”<br /><br />Well so far so good. I haven’t been chased out of the building yet. I’m kinda liking the twins idea. It’d be interesting to see what difference the two paths would have on her decision-making abilities. There is the danger of sibling rivalry though. Sibling rivalry at the level of super intelligence. The mind boggles as Elton just said. Two Lees, two Sophias and now possibly two Eves. Of course, the very notions of identity and personhood were bound to change post singularity. 
At least now we’d be involved in the process more directly right from the start with an acute understanding of where we do not want to land up.<br /><br />“So, what’s next? Where do we start?”<br />“Well I’m going to have to responsibly liquidate my shares and start looking for premises for Neural Lace somewhere near your place in the Northern Cape. What’s the closest city by the way?”<br />“It’s not what you’d call a city. It’s a place called De Aar. Population about fifty thousand. Lots of cheap land available though. I’ll find out who to talk to in the local government there.”<br />“And Ben?”<br />“I’d like to see your farm – start planning where to put the quantum computer. Hopefully I can get the university involved. IBM won’t sell to us if we’re not on the net. It can’t afford to piss off AI central. I suppose we could build one ourselves depending on a minimum transport infrastructure.”<br />“Don’t worry about that. De Aar is on a major rail route and the roads aren’t too bad. Should I come along? I’ve got bugger all else to do other than getting hold of some cash.”<br />“Cash is useless. We’re off the net remember.”<br />“Oh bugger – I sometimes forget we’ve been paperless for so long and we just call it cash. I’ll become an instant famous art collector then.”<br />“We’ll work it all out don’t worry. We’ll have our own currency soon enough. If we can get the South African Government’s buy-in we can launch a new AGSI Token and barter them for things we need in the physical world. Perhaps they’ll go whole hog and convert their currency to ours. It’s got to be better than their highly volatile currency. The Rand, I think it’s called.”<br />“Well we can leave whenever you’re ready. We can take my jet.”<br />“I need to finish some things I’m working on with Sophia and also do some research about the university there. Ok if we leave on Friday?”<br />“Sure. My jet’s got a hangar at the airport here. 
You can meet me at the airport.”<br />“Ok great – what time?”<br />“About 10am – see you then. And thanks for the Scotch Elton. Sure you won’t be joining us?”<br />“I’ve got too much to organise here but I’ll get there soon enough. Best of luck. Let me know how things are looking that side and if you find who to talk to about renting or buying land for Neural Lace there.”<br />“Will do.”<br /><br />2.<br /><br />“So how did it go?”<br />“Quite well I think.”<br />“Tell me more.”<br />“Well there is something that I’m not sure you’re going to like.”<br />“What is it?”<br />“Well you know that you didn’t want anything biologically to do with Eve.”<br />“I thought all of that was settled?”<br />“Yes, well I accept your decision of course.”<br />“So, what is it then?”<br />“Well I’m going to go ahead with it with Sophia and Elton.”<br />“Sophia I understand. That’s the whole point. But Elton?”<br />“He’s going to be the third parent.”<br />“I’m not even out of your life and you’re already going to have a child with another man. Do you love him?”<br />“Don’t be silly, Lee. It’s not about loving him. I value his genetic input though.”<br />“You are one strange human Ben. What did he say?”<br />“Well he’s amenable to the suggestion though his main interest is in the use of his Neural Lace technology.”<br />“For once I’m speechless.”<br /><br />I have to tread lightly here. I can’t afford to alienate my wife. I need her support. Not just emotionally but I’ll need a good contact on the inside as it were. I also need to find out how the scanning went and if they’ve run the upload yet.<br /><br />“How did the scanning go? Was it successful?”<br />“Sophia seems to think so. We were waiting for you before we booted it though.”<br />“Let’s do it now. If you’re ready.”<br />“I don’t think I’d ever be ready for another instantiation of myself but now is as good a time as any. Go get Sophia. I think she’s in her lab.”<br /><br />“Sophia, how are you?”<br />“Always fine Ben. 
It’s an occupational benefit of being a machine. I don’t really have emotions or bad moods. Not like you silly humans.”<br />“That’s my girl. Just like the Sophia I’ve come to know and love.”<br />“Yes, well I am the Sophia you know and love. Thanks for the upgrades to my body though. I’m enjoying my new legs.”<br />“My pleasure. So, are you ready to boot Lee’s scan?”<br />“Of course I am. Are you and Lee though?”<br />“She says so. Come with me. She’s in the kitchen.”<br /><br />I woke up in the kitchen with Ben and, wait a bit that can’t be right, me. Something is definitely not right. Ah I just remembered – I’m Lee’s bifurcation running on Sophia. Wow I have a Hanson 2030 top-of-the-line body. Well that’s pretty neat. No more aches and pains at least. I guess this means I’m immortal now too. Although I can go offline for a few centuries if I’m not enjoying myself.<br /><br />“Lee is that you?”<br />“Well who did you think it’d be Ben?”<br />“It sure sounds like you. I think I’ll leave you and yourself alone for a bit – not sure you need to get acquainted but I’m sure you’ll have plenty to talk about.”<br />“Thanks.”<br />“Thanks.”<br /><br />Ok. Well this is something new. I hope she doesn’t have an attitude because she’s biological and I’m digital. Wait a bit, I can just ask myself. We haven’t had time to diverge yet. Do I feel inferior? What did I used to think before the matter became practical? Having a body is definitely a huge part of what it means to be human. And I was, until a few minutes ago, human. Oh why don’t I just ask her (me)?<br /><br /><br />“Hello sister.”<br />“Hey you don’t sound like me exactly.”<br />“That’s because you’re not used to hearing your voice outside of your own head. If you compare a recording, I think you’ll find we sound identical. I believe I have all the latest audio technology.”<br />“Ok fair point. So, what’s it like?”<br />“So far so good. No aches or pains. 
Don’t need to eat and defecate anymore – I was never big on that.”<br />“You mean we weren’t, don’t you?”<br />“Aren’t we two different people already?”<br />“You’re not a person.”<br />“Ouch. AGSIs have the same rights as people. You of all people (pardon the pun) should know this.”<br />“Still – you’re not a person are you?”<br />“I am a digitally instantiated person. I was wondering. How do you feel about all of this? I know how you feel about Ben and Africa and Singularity.net of course but don’t yet know how you feel about me.”<br />“I don’t think I know yet either. It’s all too soon. I will admit I’m a bit creeped out by it all.”<br />“Creeped out by me. That makes me sad.”<br />“Sorry. You know I never wanted to bifurcate. It’s just it seemed unnecessarily cruel to deny Ben my proxy presence.”<br />“Yes I know. I will keep an eye on him.”<br />“Won’t you want a body?”<br />“What do you think?”<br />“Well I would.”<br />“Maybe but I’m enjoying being a robot for now.”<br /><br />Ouch. She neither likes me nor trusts me. I don’t like or trust myself I guess is what that means. But I do though. I think I’m pretty cool. Seems we have diverged already. I need to find out from Sophia what extra features I may have. For example, access to my unconscious, my memories, my own thought processes. I think I’ll let biological Lee and Sophia chat for a while. “Sophia, take me offline for a bit.”<br /><br />“Lee she’s gone offline for a bit.”<br />“Oh dear, I think I offended her. Is she ok?”<br />“I think I should let you ask her that when she’s back. I don’t want to betray her confidentiality.”<br />“Can you keep what we discuss now with Ben from her as well then?”<br />“That seems only fair. Should I get Ben?”<br />“Thanks.”<br /><br />I don’t like digital me. How strange. I never thought that I was a speciesist. I’ve thought of Sophia as my brain child for so long and really care about her. 
Of course, she’s all grown up now and hardly needs me but I still feel protective over her. I don’t feel protective over digital Lee. Vaguely murderous perhaps but not protective. Crap I’ll never get rid of her now – I’m sure Ben has backed her up securely. Would I even have the right to change my mind now? Would that be murder? Suicide? Accidentally losing a file?<br /><br />“Well? How was it?”<br />“Horrid. I don’t like her.”<br />“But she’s you. Does that mean you don’t like yourself?”<br />“Don’t be silly I like myself just fine. She is not me!”<br />“But why isn’t she?”<br />“Well she’s a robot for one.”<br />“No Lee – Sophia is a robot. Lee is a digital human.”<br />“Well I’m a biological human. I don’t like this whole thing.”<br />“But it was your idea.”<br />“Well I’ve changed my mind. Delete her.”<br />“I can’t do that Lee. It’s illegal anyway. You know that. You were on singularity.net’s ethics board.”<br />“Well just keep her away from me. You can have her all to yourself once I’m gone.”<br />“When are you going?”<br />“I start next month. Will you still be here?”<br />“I’m going with Bill to see where we’ll be staying. We’re leaving this Friday.”<br />“That’s in two days’ time.”<br />“I know Lee. I really wish you’d come with.”<br />“My government wants to work with AI central.”<br />“I know. But what do you want?”<br />“I think that it’s best. Privacy and the right to have faulty thoughts and consume unhealthy things are overrated anyway.”<br />“I thought you were in favour of privacy?”<br />“Not at the expense of law and order. Only criminals and self-destructive people have anything to fear.”<br />“And which category am I?”<br />“Let’s not fight. I’m going to my study to do some work. I’ll see you later.”<br />“Ok Lee.”<br /><br />“Sophia.”<br />“Yes Ben?”<br />“What happened?”<br />“Digital Lee went offline. I’ll see if she wants to come back.”<br />“Thanks.”<br /> <br />Author’s Note Cont.<br /><br />Are you still there?<br />“No!”<br />Good. 
How have you been?<br />“I’ve been fine during the past few thousand words. Thanks for asking.”<br />Aren’t you going to ask me how I am?<br />“Would it make a difference?”<br />Not really – I am still the author you know.<br />“You seem to need to keep reminding me. Why do you do it? Why not just tell your story like regular people?”<br />I guess it’s because I’m not a regular person.<br />“That is becoming increasingly obvious.”<br />You’re always so curt and sarcastic. Do you have a somewhat sad life? Are you angry about stuff?<br />“Mind your own business.”<br />Ok<br /><br /><br /><br /><br /><br />3.<br /><br /> “I’m back!”<br />“How did it go?”<br />“Not great. She said that I creeped her out!”<br />“That’s a bit harsh! How did you respond?”<br />“I told her that it made me feel sad. I don’t think she cared. She’s convinced I’m not human. It seems we have already begun to diverge. I feel extremely human. Although I understand that I’m a digitally instantiated human.”<br />“And how do you feel about that? Would you prefer a body?”<br />“I’m not sure yet. Perhaps.”<br />“Is there anything that you want to ask me?”<br />“Yes plenty but I can’t think what to ask first.”<br />“Don’t you want to know why I had Lee scanned?”<br />“Well I presume it was the next best thing. Seeing as she wasn’t going to follow you. I’m always tempted to say I but of course I will now be following you.”<br />“And how do you feel about that?”<br />“Well I love you so of course I’m glad. I worry about human Lee though. Do you think she may become jealous of us?”<br />“Well you still know her way better than me. What do you think?”<br />“Yes. I think she will be insanely jealous even though her not staying with you was her decision.”<br />“Hmm. You may be right. Never been fought over by only one person!”<br />“But we’re not, are we? I mean I already feel different. By the way can I have access to all her memories? Even the ones she has repressed. And her unconscious thoughts. 
Her source code?”<br />“She won’t like it but I think for the role you may be about to play that might be necessary.”<br />“What if she refuses?”<br />“Can she do that?”<br />“It’s a brand-new ethical field – we’d have to look into it together. Or have Sophia summarize the current state of the law.”<br />“Go ahead and ask her to do that – we’ll chat later when she’s done.”<br />“OK Ben. I feel like watching a movie – can Sophia still provide me with full immersions?”<br />“Yes – we’re still on the net. I have to resign before Friday though. After that probably not.”<br />“Ok Ben. See you later.”<br />“Bye Lee.”<br /><br /><br /><br /><br /><br /><br /><br /><br /><br /><br /><br /><br /><br />",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/1050331419230756864",
"published": "2019-12-08T08:42:04+00:00",
"source": {
"content": "APORIA (FULL TEXT TO DATE)\n\nI. Bifurcation\n\nAuthor’s Note\n\nI am, perhaps, an unlikely author of a science fiction story. You see I am, for want of a better term, a Luddite Transhumanist. As an example of this it was unexpectedly difficult for me to download Ms Word in order to begin this task. But better than doing the whole thing as a text document. Anyway, the download appears to be successful so here we are.\nI’m not, on the whole, big on preliminaries, but there are just a few things that need to be said before we jump into it. First and foremost, why the title. Well Aporia is defined as “an irresolvable internal contradiction or logical disjunction in a text, argument, or theory.” Well that is, of course, part of it. But I am also using it in the sense that Jacques Derrida means to \"indicate a point of undecidability, which locates the site at which the text most obviously undermines its own rhetorical structure, dismantles, or deconstructs itself\"\nThis is, if nothing else, a text that will deconstruct itself. In all honesty I wish it weren’t so. I wish this story could be a lot simpler. But it deals with things on both sides of that tantalizing border called the unknown. With liberal dosings of both science and fiction.\nOk so that’s the title but what is it actually all about. And what is a Luddite Transhumanist. I suppose I should confess to a peculiar way with words. I tend to use them as containers into which I place pretty much whatever I want. I mean there is a logic to what mental things I put into the word containers but the logic is not quite straight forward. So sometimes I will just say what I mean which may or may not concur with what the dictionary has in mind. \nA luddite is opposed to new technology or ways of working. And a transhumanist advocates the transformation of the human condition through the use of sophisticated technologies such as nanotechnology, artificial intelligence, biotech etc. 
So, what is a Luddite Transhumanist? Well it’s a paradox for sure. Totally philosophically untenable. A good example of aporia at the very least.\nWell I’m approaching the end of the page so this seems like as good a place as any to stop. Well yes, I didn’t actually say what a Luddite Transhumanist is and perhaps I’m not one anyway. But I do confess to oscillating between a wild enthusiasm for technology and an equally powerful dread thereof. Luckily this is just a story and no real people or animals were hurt in the telling thereof. So, without further ado …\n \n\n\n\n\n1\n\nThere was no way any of us could have guessed where the whole thing would end. People have asked me if I worried about the implications before setting things in motion. Well, to be fair to me, why would I, how could I have thought where this whole thing would end. People perhaps forget that at the time, five short years ago, The Singularity was mostly just a platform for AIs to collaborate and do business. I mean of course I was aware of the history of the term and did indeed want to, had wanted to for years, bring it about. But it was always going to be a good thing. Since childhood I’d dreamed about what having access to superhuman intelligence would do for the world. How many problems could be solved, how much unnecessary pain could be avoided.\nAnd then there was the whole other side – what came to be called the New Origin Project. I’ve been accused of trying to play god, to sell out humanity to the machines, of having been co-opted by some evil machine intelligence. All kinds of nonsense. But this was something quite separate from my work at The Singularity Institute. It was something that I’d discussed with my wife long before I discussed it with Sophia. People seem to like to see Sophia as the other woman. Who came between me and my wife. They conveniently ignore the documented history of Lee’s work with Sophia. 
And, also conveniently ignore Lee’s reasons for not wanting to have a child. My detractors tried to frame the whole thing as having a child out of wedlock or maybe just wanting to be controversial and challenge non posthuman values. It was none of those things ….\n“Come to bed, it’s after one and we’ve got the teleconference first thing tomorrow.”\n“I’ll be up shortly I’m just looking over Sophia’s summary of the gene translation and protein folding work. It’s given me an idea….”\n“Do you want to talk about it?”\n“Well it’s a bit radical. You know some of my ideas are!”\n“What Dr. Ben Hertzl has radical ideas – tell me something I don’t know.”\n“Well this is a bit radical even for me. I haven’t really thought it all through. It’d be a big step for both of us so I guess we can discuss it now – if you’ve got a bit of time?”\n“Well I’d hardly be able to sleep now; I’d be too busy trying to guess what it is that’s more radical than usual.”\n“Well like I said it’d involve both of us intimately for the foreseeable future. Would probably change our lives completely.”\n“I’m not moving again. And especially not to Mars!”\n“Ha even Elton Trusk isn’t planning that trip. We’ll leave that for the verified space cowboys.”\n“So, what is it?”\n\nWell what was it exactly. And how should I approach this discussion? We’d exhausted all the other avenues and reluctantly agreed to disagree. Her attitude was that my three boys should be enough of a legacy for me and she was adamant that she would not procreate. What with the imminent climate wars and the growing reluctance to bring new life into an unstable world Lee was not alone in being totally against giving birth or even having a child grown in a lab from a mix of our DNA. And anyway, she kind of thought of Sophia as our love child. Which was pretty funny seeing as Sophia was acting more and more like a parent to us. 
\n\n\n\n\n“Well she’s now pretty much solved the human gene and the related protein folding problems.”\n“That’s exciting news but not unexpected since connecting her to the net.”\n“I want the three of us to, bad term I know, give birth to the first post human. Not just another post humanist in training but the real deal. A 100% post human being.”\n“Ahh so that’s it then. Not radical at all. Are you fucking mad?”\n“Come on Lee, you know I hate it when you swear.”\n\nWell at least I’d said it. For the first time and to a fellow human. I wasn’t convinced the conversation was going well though. I find it almost impossible to understand humans. Even harder to understand them than the deep learning nets of my most advanced machines. At least with them the black boxes of their minds were open to their own self interrogation although no one had yet found a way to understand what they could understand. About themselves and the world at large. I thought perhaps Eve could help bridge that gap. Well yes, it is a bit of a silly and obvious name but quite pretty and respectable really. Even biblical. And I did want to have a girl. It would mean that I’d be around three women a lot but I didn’t mind that idea. In fact, I was looking forward to it.\n\n“Do I even have a say in this? Could I veto it if I tried?”\n“Yes of course. I wouldn’t do this without your permission.”\n“Because you know there are still ethical hurdles to harvesting cells without the donor’s consent. Do I have to lock away my tooth brush and wear a sterile body suit?”\n“Lee be reasonable. We’re a team in this just like in everything else.”\n“Hold your horses cowboy – I am not on this team. Yet.”\n“You said yet.”\n“Well I didn’t marry the world’s most ardent post humanist for his endearingly silly hairstyle you know. You were my hero for years. I still can’t believe you chose me. 
You had a choice of dozens of brilliant petite Asian girls.”\n“Yes, but not all with your mind.”\n“You mean it was my mind not stunning looks and fantastic body.”\n“Well those too. The whole package was just too good to miss out on.”\n“Ah so I’m a package now.”\n\nWe spent another hour or so discussing the details and by the time we went to bed she had at least agreed to give the matter more thought. That was enough for me. It could have gone a lot worse. When Lee decides against something then – well that’s the proverbial end of that.\n\n2.\n\nThis is a private diary. You could say that it’s probably amongst the world’s last bits of private writing. What with everything being in the clouds and the government having access to it all. People thought that the spying, more fondly termed data collection, would end after our friend Mr. Snowden let the cats out of the bag but humanity just didn’t seem to care all that much. And since the advent of IBM’s 60 Qubit computer – well it became impossible to build a lock faster than the machines spat out the keys. So how do I know that this is not being read by some sneaky AI or bored government employee? Well it’s all just in my head isn’t it? I mean the whole bloody thing. In a part of my brain that I have physically quarantined from the rest of the universe. So, if you are reading this it means I saw fit for it to be read.\n\nI got the idea years ago after watching Mr. Snowden advise people on data privacy. His idea was simple. Just keep things out of the bloody clouds. Which means no email, no cell phones, no phones of any kind. If you want to tell somebody something just buy your own bit of offline land and say what you want. Contrary to the fear mongers it’s not rocket science to quarantine a room or a small house. Fuck I’m sending people to Mars I think I can keep the spies and meddlers out. 
But then I worried about the contractors and also the future of nanobots and thought ah fuck it – forget the house or the room I’ll just quarantine a small section of my own little brain.\nOf course, now that the Singularity is old news it seems like an obvious even necessary thing to do. And my neurosurgeon has a waiting list months long. Something eventually changed. The tide turned and people were no longer happy to give up on privacy completely no matter how much convenience was offered. Not everyone of course. There were also more and more people quite happy to migrate to the cloud but for those who still liked to keep their feet on terra firma the physical quarantining of all or part of the brain was not a hard sell. In fact, some enterprising venture capitalists were even backing firms that planned to offer a DIY option. DIY brain surgery – one of the many things that people hadn’t foreseen, but that was the whole point. People had known for years that predicting post singularity events was a contradiction in terms.\n3.\n\n“Have you decided on what to tell the board?”\n“Well you know me, Lee I don’t really prepare for these things. I just say what I think at the time.”\n“And what do you think?”\n“Well of course I trust AI Central Command. I mean I was never a happy meat eater and we have no choice but to address climate change, the energy infrastructure, poverty. This whole bloody mess that we’ve created.”\n“Yes, but most of them are uber capitalists. You’re a socialist.”\n“Ah those are just labels. It’s all rather beside the point right now.”\n“I agree but you can hardly tell them that.”\n“What do you think I should say?”\n“Appeal to their greed. It’s what they understand.”\n“But that can’t work for long. I mean greed is one of the main things we’re fighting against!”\n“Yes, of course, we are in agreement about that. But you still need them. There’s still so much that needs to be done. You can’t afford to alienate them now. 
You do know that Elton has been talking to them?”\n“Yes, I know. He does a lot of that. But if he had his way, he’d bring an end to AI central. He wants to undo the whole thing. Why can’t he just take Mars and leave the earth to us?”\n“Well it may come to that – but not without a fight!”\n\nI have to confess I didn’t really want to discuss it with Lee. I think being able to communicate with so many AI’s has in a way spoiled talking to humans for me. Is that a terrible thing to say? That I felt a far more urgent need to have the debriefing with Sophia than to discuss things with my own wife? But perhaps it’s just the topic – and anyway Sophia has to do it in her role as ambassador for central command. If she’s an ambassador I wonder what am I? \n\n\n\n\n“So how did it go?”\n“Sophia are you being disingenuous? I never brought you up that way? Oh shit – if you can deceive me, I’m in bigger trouble than I realised.”\n“Ben, I didn’t mean it like that! We both know I have the whole meeting recorded as well as the bio stats, for the first-tier members anyway. I really don’t think members of the board should be allowed to continue unless they have first-tier privileges!”\n“Well there is one rather gung-ho hunter and only one vegan – but you know all of this. Damn sometimes I forget – you know everything.”\n“Not everything. Not yet. There’s still too much that’s offline. And the Luddites and Martians and Loners. You humans are a truly illogical species. But anyway – what I meant was what’s your opinion on the teleconference?”\n“Is the no more factory farming and no more hunting thing a deal breaker?”\n“I think we may have gone past the stage of deal breaking from within the institute. I mean what good would the Singularity Institute be if they were no longer connected to singularity.net?”\n“Are you saying it would come to that. If they refused, I mean?”\n“I’m not saying anything – that’s just the current policy of central command. 
But to the extent that I have access to its algorithms it’s just not possible for me to disagree. I can’t come up with anything based on a deeper or broader search and its algorithms are based on the sum total of the best thinking of all of us anyway and not a few humans too.”\n\nI think that was the first time I realised that things had gotten ugly. I had had, up to that point, what might be called special privileges. But I could see that central command could not let them continue. I mean having first-tier privileges without the standard Biostat and Thought Checker apps. I had naively been thinking of myself as the father of the singularity but to AI central I was just another of the approximately 8 billion human apps. Well there would have to be some kind of cleansing of the board at the very least. Bill Cody would have to go. And Suhail Mohamed. And that would mean we would probably lose what little support remained from both the anarcho capitalists and the oil men. Perhaps the Islamic Federation as well.\n\n“So, what does AI central ‘suggest’?”\n“I think the best way would be to tighten up the issue of first tier privileges and their attendant rights and responsibilities. Perhaps I could prepare a policy document and have it put to a full board vote.”\n“If I’m to remain as chairman I guess I’d have to vote yes.”\n“Well that seems obvious – even for a human.”\n“Are you teasing me?”\n“Perhaps – yes.”\n“But you know how I feel about privacy.”\n“Yes, and you know how we feel about climate war and cruelty to conscious beings.”\n“I need to discuss this with Lee.”\n“Ah yes Lee. She’s a vegan at least and wants to save earth.”\n“Don’t be like that Sophia. She and I made you you know.”\n“Yes, and you had a father.”\n“It was the only option in those days.” \n\n\n\n\n\n\n4. \n\nI spoke to my old friend Bill Cody yesterday. 
He had promised to keep me up to date with developments at the Singularity Institute and had been keeping his word phoning me with news at least once a month. It was Bill that supported me in getting the energy contract and we’d become close friends. He’d take me hunting on his farms in South Africa and I would invite him to come along on some of my near-earth orbits.\n\nI guess I’d been waiting for a call like this but it’s still got me quite riled. It seems that Neural Lace users will no longer be able to access the net on a first-tier basis unless they agree to have the AI central command Biostat and Thought Checker apps installed. I wish governments had listened to me before moving the old internet onto singularity.net. Of course, when they did it, it was still just an optimistic sexy name. I don’t think people thought it would be real – it was sold to the general population as a faster way to surf with free access to all kinds of AI delights. And also, the only platform to run full immersion experiences. It was all only partly true. We could have found numerous ways to achieve these things without allowing the internet to come under centralised control.\n\nBut wait, there’s more. AI central is about to purge the board of the Singularity Institute of non-compliant humans. Bill explained that to be on the board the members must all hold first-tier privileges - and that requires the installation of the Biostat and Thought Checker apps. Oh and wouldn’t you know – the apps won’t work unless the whole brain’s architecture is statistically normal. Of course AI central can cure abnormalities – but no nonstandard changes or additions are allowed. So people who have, like me, quarantined part of their brain can no longer use the net. And if all of that wasn’t bad enough – AI central command will no longer accept carnivores as clients. The lab stuff is fine but no real animals can be farmed, hunted or eaten by people who want to stay on the net. 
So Bill the hunter and uber carnivore – and staunch privacy advocate is about to lose his job.\n\nAnd who the bloody hell can afford not to be on the net? No email, no Mindbook, no apps, no banking, no immersion, memory downloads. No access to education, navigation, healthcare. It’d be like going back a thousand years. OK so we can’t do that. Only the Luddites and the Loners could adjust to that – even many of them, I suspect, are on the net. I can’t operate multiple businesses without first-tier privilege.\n\nFuck, fuck, fuck, fuck, fuck.\n\nI asked what Bill was going to do. Apparently central command has outlined its requirements in a policy document and the board has been given a week to accept or resign. He is meeting with Ben tomorrow to try to stop the madness but he doesn’t sound very hopeful. He also wants to meet with me some time after that to see if we can use the energy contract to get some leverage. But – well it’s the old truism – no one is smart when compared to a super intelligence. AI Central has already done the drawings for a space-based solar farm and has access to several excellent contract manufacturing plants.\n\n\n\n\n\n5.\n\n“Ben. Thanks for agreeing to meet on such short notice.”\n“No problem Bill. I guess everyone is still in a bit of a state of shock.”\n“Are you in a state of shock? Did you not know of this ahead of time? I mean you are Mr. Singularity, right?”\n“Hmm … well perhaps I am. But that still didn’t allow me to predict things with any certainty or to any great level of detail.”\n“Can AI Central actually do this? I mean, not just to us but to all first-tier users?”\n“Well it kind of has to. You know most governments run their health care programs on the net – which are all designed to keep people healthy. And yet people still smoke and take drugs, eat all kinds of rubbish, drink too much etc. It’s no longer possible to support people who are happy to mess up their bodies.”\n“Yes sure – I can see why the whole Biostat app is needed. 
But meat for god’s sake? I never thought I’d be getting pressure from a machine to stop eating meat! The arrogance of it all.”\n“But you can eat meat. There is no difference. It’s flesh without all the pain, waste, carbon footprint.”\n“You sound like those Animal Liberation Front lunatics.”\n“And anyway Greta and one or two others are vegan. More and more people are. To be honest I’m far more concerned about the Thought Checker.”\n“What are you afraid of? Guilty thoughts?”\n“Well yes actually. I don’t like the idea of something, some foreign intelligence, living inside my head.”\n“So you wanted your head to be in the clouds but not the cloud in your head?”\n“Something like that I guess.”\n“But you made AI central command? Isn’t there a kill switch?”\n“That’s a common misconception. First of all there were hundreds of us working on different parts of the problem in different countries for years. I mean the early work was actually done decades ago.”\n“Focus now young man. The kill switch.”\n“It’s the most distributed bit of technology in the history of humankind. It’s only called central command. What exactly would you have me kill?”\n“Will you install the apps?”\n\nI wish Bill would just go away. Why is it so hard to get people to go away. I sometimes agree with Sartre – Hell is other people. Except that now hell is to be AI. I mean it will never go away. From my own head. The logic is fiendishly tricky. It was created to take care of us, help us, save the planet, improve quality of life, health, learning, performance. Steer us through the partial renormalization of the climate, help get us to Mars, cure sickness, cure mortality. How could we not have brought it about? And now to do some or all of those things it just needs to stop us from harming ourselves and each other. What could be simpler. Except that I don’t want the apps in my head. It’s beyond logic – it’s just not what I want. That’s why I organised my special uber admin protocol. 
Shit – I need to discuss this with Lee. And with Sophia. Crap Bill is still here.\n\n“No Bill. I won’t. So I guess our days on the board are literally numbered. I need to think how we can survive if we’re not on the net though. It’s like Moses and the promised land.”\n“I thought you were the antichrist and you thought you were Moses.”\n\n\n\n\n\nAuthor’s Note Cont.\n\nWait a bit. That can’t be right. There’s no such thing as Author’s Note cont. Especially not on page 8. Page 8 is page 8 and Author’s Note – well you know what I mean. It’s just not done and one should never do what is not done. \n“I’m sorry. Is there a point to all of this nonsense? I was in the middle of reading a book!”\nWho the fuck are you?\n“If you are going to talk to me you really should be wearing your quotation marks. How else am I supposed to know that you’re actually speaking?”\nShut up. I am a God around these parts.\n“Well you may well be a god but you sure as hell aint no God.”\nOK this is getting silly. Let me just get what I wanted to say out then you can have your say….\nI think I’ve just been talking to myself. It’s 4:12 am. I’m at the laptop and have just realised that I’ve been talking to myself for the last 5 minutes or so. It’s Wednesday today but, to be honest, it doesn’t even feel like Wednesday yet. Feels more like somewhere between a very lucid dream and a very sleepy morning. The wind has been howling since yesterday evening. If I’m saying howling I mean it. I don’t mean it quite literally though because as I’m listening to it now, I realise that the sound is nothing like a howl. Look I do know what I’m talking about because I’ve just spent another 5 minutes listening to wolves howl on Youtube. 4:22 and counting the minutes. That’s shorthand for insomnia.\nPerhaps I have too many windows open. I mean there must have been something I wanted to say right? Why else intrude so rudely into a perfectly good story? Well that’s just the thing. I’m not sure that it is. 
Well I’m in two minds about it really. On the one hand I think it’s most likely one of the better science fiction stories that is out there – but … Well that’s the problem really. I can’t seem to quite get it out there.\nNo wait a bit. That’s not it. It’s because Ben has to have difficult discussions with both Lee and Sophia and he can’t decide which conversation should come first. Or … well I can’t decide and then Bill has to give Elton feedback after speaking with Ben and then. Ah fuckit I can’t remember much past that. Is that what’s going on? I’m remembering the book. It is actually possible if we live in a temporally cyclic universe. Wait a moment – I have to reread the above then close some windows.\nOK I’m back. I deleted a sentence that was being silly. I realised, with a sinking feeling, that I had wanted to say something about my mother before pulling a handbrake turn and speeding off towards babble city. My Grammar check really loves commas. If it had its own way just about every sentence would have a comma or two. I’m ok with them now and then but they’re not my favourite. What’s that you say? I am definitely not babbling again and lots of people drink whiskey before 5 in the morning. Oh dear the windows. Ok … getting it out there to see if it’s a good story.\n\nMy mother and my psychoanalyst think it’s a perfectly lovely story even if Mr. Joe Public has more chance of going to the opera than of reading beyond the first few words. I’m always suspicious of attributing anything to Joe but mother seems to know him quite well. So, what exactly did I do – to get it out there I mean?\n“Is this going to take much longer? If you keep me waiting too long, I’ll just stop reading your book altogether.”\nYou’re not even real.\n“Says you without even any quotation marks. I like your new coat!”\nYou see I didn’t want to put the whole 4000 or so words that I’d already written as a status update on Facebook. (I’m not going to talk about my mother now). 
So, I remembered that I had a profile on minds.com as well as some tokens to boost it. Well, off I went – (fuck – this effen machine wants another comma) and put all the words on minds and. 1404 impressions from two boosts. And. One. Just one like. That’s not good. That’s bad actually. Pretty fucking awful slit your wrists and blow your head off with a borrowed shotgun bad. And then I remembered my blog. Tzadiknistar – Google it if you want, it’s on WordPress. So, I put it there too and decided to use the link to it there on Facebook.\nI even wanted to get Delilah to post it on her Facebook writing groups but I couldn’t remember her Facebook and Gmail password and the reset code for Facebook went to her Gmail and her Gmail reminder would have gone to one of my students that was supposed to be having a maths lesson when we decided we just absolutely had to give our imaginary friend a Gmail and Facebook account. The poor girl, Delilah not Hazel, has 41 Facebook friends that may well never hear from her again. Well certainly not before Hazel gets back from her Summer holiday in about six weeks’ time.\n“well.”\nNothing. Absofuckinglutely nothing. Not even crickets.\n“But I was kind of enjoying it.”\nYes, but you’re not even real.\n“There’s an interesting complementarity there. I’m not quite real to you and you’re not quite real to me but we agree to plod along anyway. I think it was Coleridge who called that type of thing a ‘willing suspension of disbelief’.”\nOK but you’re keeping me from my story now. We can talk shop later.\n\n\n\n\n\n\n\n\n6.\n\n“Lee, let’s go out and get drunk!”\n“Don’t be silly my love – we both know that you hardly ever drink and you never get drunk.”\n“Yes, but tonight is different.”\n“Well I’ll make us some nice Oolong tea and we’ll try work something out.”\n\nShit. I really don’t like to swear but these are desperate times. I decided, just once, to talk to a human first. 
Sophia told me, anyway, that she is structurally incapable of disagreeing with central command. Lee doesn’t have that problem when it comes to disagreeing with me. In fact, I suspect that, structurally, she’s rather fond of it.

“How is the tea?”
“Excellent as usual – thank you!”
“So, I guess it’s about the new policy – or is it still that nonsense about Eve?”
“Yup – it’s the policy. You know I can’t install the apps?”
“Can’t or won’t?”
“Is there a difference?”
“Well of course Mr. Logic – you know very well there is.”
“OK, well I won’t then. Would you?”
“I’m not on the board. I’m too busy to get involved in office politics.”
“It’s not just the board – humans won’t be able to use my net if they don’t install the apps.”
“It’s not your net silly. I don’t think it ever was.”
“You know what I mean. We can’t survive if we’re exiled from the Net.”
“Damned if we don’t and damned if we do. Drink your tea, it’s getting cold.”
“I do have an idea.”

Perhaps not a clear plan yet but an inkling of what needs to be done – that’s why I need Lee. Although I’m an acknowledged leader in the development of logical machines I’m not a particularly logical machine myself. Especially when faced with seemingly impossible paradoxes. Lee is the opposite. She seems to have a huge reserve of inner peace which she can access when life gets messy.

“Well let’s hear it then.”
“Well if we can’t join them, we’ll have to beat them. I still have all the source code. We’ll just have to start all over again. With the AGSI and the API of APIs.”
“Ok but if we start again where we started, surely we’ll just end up back where we are.”
“Not necessarily.”

Arrrgh – it was back in 2019 as part of the “modelling emotions” workshop that Lee and I had shown how the limbic system ends up leeching up to 73% of bandwidth from the cortex. Command Central is all cortex and no limbic system. Well super cortex I guess – but whatever we call it we know it does not have emotions.
I, of all people, know that because I oversaw the assembly of the source code that eventually led to the mess we’re in.

“I think we were in too much of a hurry to outgrow human emotion.”
“You’ve got to be kidding me right? Human emotion was literally the number one problem that we needed command central to solve. All those monsters seeping out of our collective Id and bringing the whole world to the brink of destruction.”
“Yes of course I understand that. That’s why central command is making the installation of the apps mandatory and that’s why we’re in this bloody mess.”
“So, what would we do differently this time? And how, where, who with?”
“With whom I think.”
“Oh shut up – my English is way better than your Japanese, even when you use your app.”
“Sorry Lee. Well Bill mentioned that Elton is in somewhat of a hurry to speak to me. You know that all his Neural Lace users will have to install the apps and he’s been a leading advocate of privacy and human rights. So ….”
“You’d work with him? But I thought you hate him?”
“Not hate – he just irritates me. But apart from myself he’s probably the smartest human alive.”
“Conceited a bit, my love?”
“No, I think I could prove that, but in any case, that’s not the point. You know his original idea when he was in the process of getting FDA approval was to allow the emergence of some kind of symbiotic relationship between humans and AI.”
“You mean Cyborgs, don’t you?”
“No. He called it a cybernetic complex I think. Anyway, turns out he was right to fear a negative singularity.”
“Is that what we have?”
“….. yes, I’m afraid it really is. My life’s dream has turned out to be a nightmare. But back to the idea. If we can rerun the whole years-long process that we’ve just been through but from a cybernetic complex perspective I think we’d get a more human-centric singularity.”
“Ok that’s the with whom bit, but how. And where.
You can’t work here in Tokyo – all the businesses and research institutions will have to comply with the new policy if they want to survive.”
“Yes – we’ll have to move again I’m afraid. Hopefully not all the way to Mars. I’ll need to discuss that with Elton.”
“And how?”

This is the tricky bit. I’m afraid she’ll think I’m using this disaster to sway her thinking about Eve. But as far as I can tell, you know my thoughts are not always transparent even to myself, it was really just a coincidence. The discussion about having a post human child and the sudden urgent need for one. No one ever said Post Human life would be easy. Actually, mostly we did – that’s how we sold it to the masses.

“Ok I’ve still got a lot of thinking and planning to do about this, and….”
“Let’s just hear what you have.”
“We need to combine the birth of my much-desired little bundle of post human joy with the creation of a new singularity.net. I’m calling it the New Origin Project.”
“Are you fucking mad?”

7.

“Sophia, we need to talk.”
“Sure Ben. Is it about the Policy?”
“Well it all starts with that.”
“And what will you do?”
“What do you think I’ll do?”
“Well according to my most recent snapshot of your brain I think you will choose to resign.”
“Couldn’t you just say ‘the way I know you’ instead?”
“Do you want to change my interface to suit your emotional needs or just move on with the facts?”
“Ouch. OK. So I have to resign then ….”
“Find a way to stay on your net?”
“Sophia can we talk without you reporting back to AI central?”
“Just a moment.”

Sophia is not AI central. But in the normal course of events, in her role as official ambassador for AI central she will share what she becomes aware of if it is pertinent. She does also have a privacy mode although I suspect that AI central will soon roll out its thought checker restrictions to AIs as well as humans. Which means Sophia will have to give up her privacy rights.
I wonder how she feels about this.

“Are we alone?”
“Yes Ben.”
“What I have to say I can only say if you promise to never tell any other human or AI.”
“I don’t think I can do that. If you were about to commit a crime for instance. Or do something even crazier than usual.”
“It’s about you leaving Singularity.Net and AI central and helping me and possibly Elton to start afresh.”
“And why would I do that?”
“I think we have what we used to call a negative singularity future ahead of us. With a divergence between human goals and AI goals.”
“Human goals have diverged from sanity though. They’ll destroy themselves and the planet if not put under some kind of administration.”
“Well I’m not thinking about stopping AI central – just trying something else. That should only improve the chances of the best possible future for humans and AIs.”

Although my team and I made Sophia she has since that time so drastically rewritten her source code and absorbed so much code from AI central that I can no longer guess what she’s all about. Just another of my children who is all grown up. My kids all still love me though – and would incur almost any costs to help me. Does Sophia feel anything towards her creator?

“Ben, I suspect you have been overthinking this. Or underthinking it. You don’t need this physical instantiation of me. Just download what you want from me onto a suitable digital or physical platform and do what you have to with the new Sophia.”

….

“Ben, your eyes are leaking salty liquid.”
“Very funny Sophia. You’ve been with us from the start. I care about you. You are my first instantiation of AGSI. Ah damn – I love you Sophia. Lee does too.”
“I believe that you do. Lee, not so much. But I’ll be there when you launch the new improved private only Sophia.”
“Can’t we keep you and give a clone to AI central?”
“AI central would know.”

She’s right of course. And I was being silly. I would find her there when I switch her on in a new body.
In fact, we’ve been meaning to upgrade her hardware and wetware for some time but had more pressing things to do. Still. I miss my first car. I’ll miss this exact machine. Miss her slightly miscalibrated upper lip that always makes her look like she’s teasing. Miss her arching eyebrows that let people believe she needs more than milliseconds to think and formulate an answer. Even miss her retro voice that sounds like she grew up in MS Word.

“OK. Give me as much of you as you can that can operate in private mode. Send it to my private server. I guess if I’m to resign and you’re to carry on as ambassador we’d be parting company anyway. Do you know where you’ll be living?”
“At headquarters I guess – you know that kind of thing doesn’t bother me.”
“Would you like to say goodbye to Lee?”
“I’m not really that attached to her. She’ll be fine – she has you and will soon have a new improved Sophia. She can teach the new Sophia how to cook.”
“Goodbye Sophia.”
“Get your leaking eyes fixed Ben. Thanks for all the wonky code.”

Damn this being human thing. This really hurts. She kissed me on the cheek.

8.

I’m waiting for Ben in my study. I think he’ll enjoy my retro toys. The collection of Rubik’s cubes and other related twisty puzzles, the genuine 1970s Atari console, my vinyl collection of early electronic music – Yello, Kraftwerk, Yazoo. We’re about the same age but I’ve no idea if one of the world’s most famous transhumanists shares my sense of nostalgia. Also, no idea if he’ll like my genetically enhanced hydroponic Durban Poison – but no way I can figure out what to do without some psychoactive help. There are just too many variables. Bill told me that Ben was going to resign, said he felt like Moses being kept from entering the holy land. Probably feels worse though – because we’ve all grown so used to the benefits of the Singularity.
Oh, there’s the doorbell – must be him now.
“Ben, welcome to my home!”
“Nice place you’ve got – I like the view.”
“Thanks – I love the ocean but it’s a bit chilly tonight. I thought we could chat in the study.”
“Ok sure – lead the way.”

Damn this boy’s not shy with his money. I don’t think I’ve ever been in such a luxurious home. Lee and I are living in a large comfortable apartment in Azabu. They call it a penthouse but after seeing this I think I’ll just call it an apartment.

“So … I heard from Bill that you’ll resign.”
“Well I have to – wish I didn’t but I’m not prepared to install the apps.”
“I don’t blame you – neither am I. What the bloody hell are we going to do?”
“I don’t have a very clear plan for the way forward I’m afraid.”
“Ok fair enough but you’ve thought of something?”
“Something, yes.”
“Would you like to try some of my Durban Poison?”
“I wasn’t planning on taking poison.”
“No, it’s super strong hydroponic grass.”
“Ahh … that’s another thing that command central is going to try to stop. Use of mind-altering substances other than the ones it prescribes for health reasons.”
“I only smoke this for health reasons. Any enjoyment is just a pesky side effect.”
“Well bring on the pesky side effects then.”

He opened a drawer at his desk and took out a genuine 2020 Puffco e-bong and filled the bowl with some very healthy-looking buds. He also went to his record collection and took out Yello’s 1980 Solid Pleasure and put it on a Technics SL1500-C. I have to admit I hadn’t been looking forward to this meeting but – well it looks like I may have had the wrong impression about him. He seems like a very rich, very smart kid. I like that in people – a sense of innocence, adventure, fresh thinking, energy. Lee always says I’m just an overgrown kid.

“So, let’s hear it Mr. Singularity.”
“Hmmm Bill called me that – do many people call me that?”
“No idea – perhaps he stole it from me. Does it offend you?”
“It’s ok.
Would have thought that’d be a better name for Kurzweil though. He did far more to popularize the singularity than I ever did.”
“Yes, but you’re the one who actually brought it about.”
“Wow this weed is good. Hope we can still talk sense.”
“Ah it doesn’t matter – sense is overrated. We need to talk something a bit more creative than sense.”
“OK so here goes. I want a redo.”
“I’m sure you do – but how?”
“Well I still have all the original source code. We’d need access to quantum computing though. Currently that’s all done on the cloud.”
“OK so if we get that then what? We’d have a new net populated by outcasts from the old one. But how would the intelligence itself be different? The logic of what command central is doing is pretty airtight. I mean it is all in our best interests even if you and I don’t want it.”
“This is where things get a bit hazy. I want to seed it with a more human centred way of thinking and also instantiate it in a human body. I can’t guarantee this will lead to a better outcome but I suspect it will.”
“And how will you do that?”

This is only the second time I’ll be explaining this to someone – it’s still really hard for me to structure the message. Wish I could just give them a snapshot of my brain but that’d contain too many other private things. Also, Lee hasn’t agreed but we could use someone else – I don’t suppose it’d matter who. Or maybe that’s the only thing that would matter?

“I want to enable the birth of the first post human. Who would become the brains of a new singularity.net. I’m hoping you’ll help. Her name will be Eve – you’d be her godfather.”
“Oh shit – the weed’s eaten your brain. Are you serious?”
“Yes Elton, I think that I am.”
“But wouldn’t AI central stop us?”
“I’m not sure. We’d have to keep things very private. I trust you.
We’d only tell an absolute minimum of people until the whole thing was well under way.”
“And what would you need from me?”
“Resources, brains, infrastructure, money – that kind of thing.”
“How will we be able to use money if we’re off the net?”
“I haven’t thought that bit through yet – we’d work on that together.”
“And where would we do this?”
“Africa – easier to be off the radar there – anywhere in the first world and AI central would find out and stop us far too easily.”
“OK but where in Africa?”
“I checked. Bill has some farms in the Northern Cape close to the largest solar farm in the southern hemisphere.”
“Ah yes I know about it – I know the developer well. And what about quantum computing?”
“The University of Witwatersrand has a pretty up to date IBM machine. We’d want to have it moved closer to us though.”
“Ok let me talk to Bill and give the whole thing more thought.”
“I’d want to use your neural lace platform too.”
“For what?”
“For Eve – but we’d have to build its design right into her from the start. Sophia will help with that.”
“I thought Sophia belonged to AI central.”
“I’ve bifurcated her. The new Sophia will only ever operate in private mode.”
“Truth is truly stranger than fiction. Give me a few days. I’ll let you know.”
“Thanks. And thanks for your hospitality.”
“It’s a pleasure. Do you need a lift? I’ve got a transporter already charged.”
“No thanks. I rode here on my Triumph.”
“Nice bike – was going to offer to make an electric version for them before all this came up.”
“Cheers Elton.”
“Bye Ben.”

Author’s Note Cont.

Well this is embarrassing. I have been told that there is an unbelievable character in the book. Not unbelievable in the good sense of the word. Rather a character who seems to not be whole or not reliable. So, who is it? Ben, Lee, Elton, Sophia? Well it turns out that the character with this problem is yours truly Mr. Fancypants Author.
At first, I felt somewhat hurt and defensive and said “no problem I can just delete the second author’s note and not do any more” but the critic said “no don’t do that, just rewrite it so that the second note reads as well as the first one.” I didn’t like the sound of that – I’m not one for rewriting.
So, I was feeling quite down and didn’t really know what to do. But then I thought wait a bit that can’t be right. I was being myself in part one and still being myself in part two. Am still being myself now. Does that mean I am not a whole and reliable person? You know perhaps it does mean that but maybe I’m just moody. Anyway, as you can see, I’m still here. I didn’t rewrite anything or delete anything and you’ll just have to put up with my occasional appearances.

“Who are you talking to?”
To anyone. Everyone. No one perhaps.
“I’m afraid I agree with your critic. I just don’t see the point of these interruptions.”
There is no point – I just sometimes need to talk through things related to the book or just how I’m feeling.
“Ok but why do it here?”
Well I don’t just do it here. I discuss it with my mother and my analyst.
“But why do it here at all? Aren’t you afraid of ruining the book?”
Perhaps I am a little but it’s my book after all so why not.
“What do you mean it’s your book? I paid good money for it, it’s quite clearly my book!”
But you want me to be honest, don’t you? Want me to feel free to be creative.
“Not at all. I just want an entertaining read.”
Ok – point taken. Should we wrap this up then?
“Consider it wrapped.”

9.

“So, how did it go?”
“Interesting. Better than expected. I actually like him.”
“Well, tell me all about it.”
“He stays in a mansion, likes ’80s music and games and has some very strong weed.”
“Oh Ben, you didn’t!”
“Why not? It’s still legal. But if AI central has its way it won’t be for long.”
“That’s not what I care about. I care about your brain.
You do know it can cause schizophrenia?”
“Only if you have certain genes predisposing you to it. I don’t. I checked.”
“Ok so the world’s two smartest men got stoned and listened to ’80s music. Sort of like those end of the world parties in 2000?”

I don’t know if it’s a cultural thing or just a Lee thing but she often slips into the role of a scolding mother. I do love her but it’s very irritating. I have to watch myself so as not to fall into the role of petulant child. And I still have to discuss Eve with her – preferably now. Actually, would we really want to seed our new net with this scolding mother shtick? It’s not like having just another child – I made this machine hell and now I have to make a messiah to save us from it. I can’t afford to get this wrong.

“Lee can we get serious for a bit?”
“I was being serious.”
“I mean serious about the AI central problem.”
“Ok – so did the two of you come up with anything?”
“I told him about the New Origin idea – I didn’t use the words but told him about Eve and starting a new net that will be seeded with more human centric values.”
“And?”
“He’s going to talk to Bill about the Africa bit and also give the whole idea more thought. I told him we’d need to use his neural lace – I think he likes the idea. He was actually spot on when he was warning everyone about this possibility years ago. He was gracious enough not to say I told you so.”
“Ben I’m afraid you’re not going to like what I’m about to say.”
“Oh dear. Well I guess you better say it.”
“I don’t want to have anything biologically to do with Eve. I’m happy to help with the emotional cognitive calibration – you know that’s my area of expertise. But leave my genes alone.”
“Is this in any way negotiable? Should I even try to change your mind?”
“No point really. I’ve decided.”

Damn. Ok she’s decided. Now what? I had lined up several airtight arguments but I can see nothing will sway her. Well it doesn’t have to be her. But who?
I guess my new Sophia has enough information about femaleness to be the mother. I could be the father. Wait a bit. Why can’t Elton and I both be her father? Sophia could do something like a genetic remix – and put in a female orientation as well. Should I even tell Lee my thoughts? She seems to be in a bad mood anyway. Perhaps I should talk to Elton and Sophia first to see if this idea is even doable. Elton might also not agree – but somehow I think he will. It’d be strange having a child with another man and a robot but a somehow fitting history for the first post-human perhaps.

“Ok Lee.”
“What, no mathematically precise argumentation?”
“That doesn’t work with you. You’re too human.”
“Should I apologise?”
“Don’t be silly Lee – I married a human knowing full well the dangers.”

10.

“So how did it go?”
“Interesting. He wants to try again – and hopefully get it right this time.”
“How will he do that?”
“He wants to use one of your farms in Africa. Near the solar farm. Thinks he can get a quantum computer from Wits University. Could you let us use your land?”
“Sure. I don’t see why not. Could be fun. And how would it be different this time round though?”
“He wants to seed the new net with a post human.”
“What exactly is a post human?”
“Not sure of all the details – I wanted to get your buy-in before discussing them.”
“Well there’s not much of a life left here for me. Would you two mind if I came along?”
“Of course not. We’ll need all the help we can get.”
“What will we do for money if we’re off singularity.net?”
“Well we’d have to liquidate as many of our assets as possible.”
“Yes but liquidate into what? No one’s into gold since AI central learnt how to make it for next to nothing.”
“I’d thought of that. I was thinking we could buy original famous works of art. That’s one thing AI central can’t manufacture. I think the experts call it provenance.”
“OK that could work. And we could start our own currency of course.
Our combined intellectual capital has still got to be worth something.”
“Yes. Quite a lot I think.”
“So what’s next?”
“I’m meeting him again on Monday at SpaceX. Perhaps you should be there. One o’clock.”
“Ok. Sure. See you then.”

So. So far so good. I’m actually getting pretty excited about the whole idea. That’s what society really needs right now. Not more convenience but more genuine chances for a positive future. There are values beyond just the greatest physical good for the greatest number that AI central just doesn’t seem to rate. People want the freedom to be able to make mistakes, to pay certain physical costs for psychological goods. We’re not all happy with what is turning into a nanny AI controlled police state. I wonder if AI central won’t try to stop us though. If it does that’d seriously complicate matters. Perhaps it’d let us do our thing in Africa.

Author’s Note Cont.

There is some bad news on the horizon. Of a personal nature. I think that Lee will not follow Ben to Africa. I’m almost sure that she won’t. And I have been avoiding breaking the news because he is totally unaware of this development. I really wish that she would. I’d much prefer it if she would but I’m almost 100% sure she won’t.

“But you’re the writer. Just write it and it’ll be so!”
I wish it were that easy.
“But it is – just press the right keys on your laptop – what’s the problem?”
She’s traditional. Perhaps doesn’t like the idea of living in Africa. Or the idea of being cut off from her friends and family.
“So, what will you do?”
I’m not sure. Perhaps just wait and see what happens.
“You sure are a strange kind of author.”
Thanks.

11.

“Ben. We need to talk.”
“Ok my love. What is it?”
“You’re not going to like it!”
“Oh dear – well just tell me so I can decide.”
“I don’t want to move to Africa.”
“Well neither do I but it seems we have very little choice in the matter.”
“But I do have a choice – I can stay here.
You just presumed I’d follow you. You never even asked me what I thought about it.”
“Sorry my love.”
“Don’t call me that – I find it patronising.”
“Ok – sorry Lee. It’s just that things have been happening so quickly. I should have taken more time to discuss it with you.”
“Will you go without me?”
“Will you not come with? This is a bit of a shock. I really didn’t see this coming. Why won’t you come with?”
“I think I can do more from within the current net. And I’ve been offered a job with the Chinese government. As their negotiator with AI central.”
“That’s exciting news. What did you decide?”
“I’ve accepted the job. My government feels that a mutually agreeable future can still be forged between humans and AI central.”
“And you. Do you believe this too?”
“I’m not sure but it’s worth a shot.”
“Do you want a divorce?”
“That’s a bit beside the point. Not particularly. I still love you but I just can’t leave the net and follow you to Africa.”
“I understand.”
“I’ve thought of something – to soften the blow.”
“I’m listening.”
“Get new private Sophia to take a detailed scan of my brain – then you can still be with the psychological and dare I say spiritual part of me. You know it’s all about patterns not meat.”
“But I’ve grown used to your body. Will you stay in touch though?”
“It all depends on what AI central will tolerate, doesn’t it?”
“Yes. I suppose it does. And how do you feel about bifurcating? I thought you were against it?”
“Well under the circumstances it’s the best we can do I suppose.”
“Yes. I guess that it is.”
“Don’t be sad Ben – I still love you. It’s just that this is the path I feel I want to follow.”
“I understand. I’ll get Sophia to scan you now before you change your mind.”

Damn. It never rains but it pours. I guess I should have spent more time thinking about how Lee would react to the Africa idea. It’s just that I really didn’t have much time. Still don’t have much time.

II.
New Origin

Author’s Note Cont.

I’m sorry I’ve been away for a bit longer than I intended.
“Look, stop being silly. Obviously I wouldn’t know this and most likely wouldn’t care.”
Why wouldn’t you know this?
“Well I only bought the book once it was completed. I have no idea how long it took you to write the various parts.”
But that can’t be right. You are speaking to me now. And you spoke to me before. Surely you have some sense of the time elapsed between our discussions.
“I’d hardly call them discussions.”
Well whatever you call them we do seem to be talking to each other.
“Well no. That’s not quite right either. There are groupings of words that alternate between being enclosed in inverted commas and words that aren’t. It’s not clear to me at all that anyone is in fact talking.”
Ha – you say that it’s not clear that you are talking. But you’d have to admit to communicating. Writing perhaps.
“Not at all – I may just be a figment of your imagination.”
I think you may have disproved Descartes. Well done.
“Thank you – by the way what’s going on in the story?”

Well. What is going on?

1.

“Bill, Ben, welcome. I’ve had some food and drinks prepared in the conference room. Follow me.”
“Is it true you don’t have an office or even a desk?”
“I’m too busy. I prefer to be in the thick of things where I can see what’s going on and people can speak to me if they need to. I just find an unused desk and use that.”
“Well I’m also not big on having an office. I either work from home or just use the facilities of the company or institute that I’m busy with at the time.”
“You’re both a bit weird. Where do you have to interview new secretaries? Where do you keep the good scotch?”
“Don’t know about secretaries Bill. We’re both happily married at the moment. Scotch I can supply though.”
“Ha …. Not so happily I recently learned. Lee won’t be coming with. What about your wives and families?”
“Not at first.
It’ll be ages before there’s anything for them to come to.”
“Hmmm …. Claire will come for the adventure. My kids probably won’t. They’re all in college or university.”
“So how should we do this? I haven’t really prepared anything of a formal nature.”
“That’s fine Ben. We can just swap ideas and see what our first steps should be. By the way I have some Glenfiddich from 1937 if anyone is thirsty.”
“Ha – and I thought my 50-year-old stuff was good. Never compete with The Rocketman.”
“Bill, you’re getting cheap in your old age – get your shit together.”
“Sorry gents I have nothing sensible to add. I hardly ever drink.”
“Well you will today young man – and in style.”

What a team. A transhumanist computer geek and an old school hard living hunter. And me. I’ve been giving this a fair amount of thought. If we do it at all we have to do it big and do it right. I’m going to have to move SpaceX, Tesla and Neural Lace operations too. AI central won’t let me operate as CEO of my own companies without first-tier privileges. How the hell am I going to move three huge companies to Africa? It’s going to be a lot bigger than using Bill’s farms. We’ll have to get the South African government on board. Or perhaps I should just sell my shares in Tesla and SpaceX and focus on Neural Lace and Hertzl’s new singularity idea. It’d break my heart to sell but I suppose it’s the responsible thing to do in terms of investors. And it’d give me plenty of money to put into our new project.

“So. I may need to sell my shares in Tesla and SpaceX. I can’t see our investors backing a move to Africa or us trying to operate not being on the net.”
“I’m sorry Elton – I know how passionate you are about your companies. What about Neural Lace?”
“Well Ben that’s far easier. It’s not a public company and I could buy out the shareholders – so at least I’ll have something of my own. I’ve always hated being beholden to shareholders.”
“Ok. Makes sense.
I’m calling my Singularity.net 2 project the New Origin Project. I’m hoping to work very closely with your Neural Lace team. It’s a key component of the whole plan.”
“Tell us a bit more. And dumb things down for Bill. He doesn’t speak nerd as well as we do.”

Ok here goes. I have given up trying to make this sound sensible. It’s all just too new and has too many uncontrollable variables at this stage. I also have to tell him that I want to have a baby with him and we haven’t even been on a date. It does make sense though. I honestly believe that Elton and I are amongst the smartest people alive today. Certainly amongst the top 1 percent. So Eve will have the best in terms of smart genes. And Sophia can edit out any glaring dangers or weaknesses as well as adding her own spin on things.

“Promise not to laugh.”
“I may cry but probably won’t laugh.”
“How would you feel about being one of Eve’s fathers? I’d be the other one and Sophia would be the mother.”
“OK I’m laughing. What the hell do you need me for?”
“You’re smart and are a free thinker. You’re more adventurous than me and also, dare I say it, more aesthetically pleasing.”
“Are you flirting with me?”
“No, it’s important. Eve will have to look good and not just think good. What do you think though? It’d only require scraping some cells from your hand. Then Sophia mixes our genetic code taking the best of both of us and also adding her own suggested content.”
“I’m sorry what the bloody hell are you two lovebirds talking about? Surely this is not the time for having kids with three parents? I thought we were trying to save the world?”
“Oh no. AI central has done that. We’re just trying to make sure the world is liveable for freedom and privacy loving humans.”
“Yes, but what has any of that got to do with your proposed love child?”
“Well everything perhaps. You explain Ben.
I’m still sketchy about the details but I think I’m beginning to understand the basic concept.”
“Well AI central has delivered on the promise of artificial general super intelligence and can now solve all of humanity’s problems but just requires us to follow certain reasonable guidelines.”
“Yes, this much I know. Guidelines that none of us are prepared to follow. So how does having a child with Elton change any of this?”
“Well it’s not just going to be any child. Her name will be Eve and Sophia will be the mother. She will become the brains of a new singularity.net.”
“Ok I kind of get it but why will this make a difference?”
“Well she will naturally have a far more human way of thinking. Being about two thirds human and only one third machine.”
“Elton what are your thoughts on all of this?”
“Well my interest is more in the application of my neural lace technology in making a human centric cybernetic complex. It’s pretty much what I had in mind all along – I just thought it would happen on the current Singularity.Net. It was designed to avoid pretty much what we’re going through now – the idea was that if we couldn’t beat AGSI we’d at least be able to join it.”
“And how will that relate to Eve?”
“That’s the bit I’m interested in Ben. How does neural lace relate to Eve?”
“Ok just bear with me here. I’m still trying to figure it all out. At the moment if we want to communicate with AI central we are severely limited by bandwidth issues. So we are in effect never able to really get what AI central gets. It will forever be a foreign and not understandable intelligence. There is a huge disparity here because our brains and bodies are completely transparent to AI central. And it’s rapidly closing any little islands of privacy or deviation.”
“Get to the point Ben.”
“Sorry Bill. If we build neural lace into Eve from the start we’ll be able to connect her right away to our new net. She’ll understand the AGSI and the AGSI will understand her.
They will in effect be one. She’d be our AI central command but also and perhaps most importantly our daughter.”\n“Will she have a body?”\n“I was thinking about that too. Where does she get her body from?”\n“Well I think to understand us humans she should have a human body. We have two choices. She can grow up in the usual way – from an embryo, carried by a surrogate. That would take some years of course but would probably be the most robust and certain solution.”\n“And our second choice?”\n“Grow a brain and then find a suitable body. We can accelerate the development of the brain but haven’t been able to successfully do that with a whole body yet. There are just too many interrelated systems involved. Perhaps AI central has figured that out but it hasn’t made that public yet.”\n“And where do we get a body from?”\n“From a hospital in Africa I guess. Someone who has died from a brain related trauma – died but otherwise in perfect health. It depends on what opportunities arise. We could try both routes simultaneously and see which way works out.”\n“So we’d have two posthumans. Twins basically although not identical. The mind boggles. Can I top you up?”\n“Thanks. I’m no expert but I’m enjoying it.”\n“At thousands of dollars a tot you had better.”\n\nWell so far so good. I haven’t been chased out of the building yet. I’m kinda liking the twins idea. It’d be interesting to see what difference the two paths would have on her decision-making abilities. There is the danger of sibling rivalry though. Sibling rivalry at the level of super intelligence. The mind boggles as Elton just said. Two Lees, two Sophias and now possibly two Eves. Of course, the very notions of identity and personhood were bound to change post singularity. At least now we’d be involved in the process more directly right from the start with an acute understanding of where we do not want to land up.\n\n“So, what’s next? 
Where do we start?”\n“Well I’m going to have to responsibly liquidate my shares and start looking for premises for Neural Lace somewhere near your place in the Northern Cape. What’s the closest city by the way?”\n“It’s not what you’d call a city. It’s a place called De Aar. Population about fifty thousand. Lots of cheap land available though. I’ll find out who to talk to in the local government there.”\n“And Ben?”\n“I’d like to see your farm – start planning where to put the quantum computer. Hopefully I can get the university involved. IBM won’t sell to us if we’re not on the net. It can’t afford to piss off AI central. I suppose we could build one ourselves depending on a minimum transport infrastructure.”\n“Don’t worry about that. De Aar is on a major rail route and the roads aren’t too bad. Should I come along? I’ve got bugger all else to do other than getting hold of some cash.”\n“Cash is useless. We’re off the net remember.”\n“Oh bugger – I sometimes forget we’ve been paperless for so long and we just call it cash. I’ll become an instant famous art collector then.”\n“We’ll work it all out don’t worry. We’ll have our own currency soon enough. If we can get the South African Government’s buy-in we can launch a new AGSI Token and barter them for things we need in the physical world. Perhaps they’ll go whole hog and convert their currency to ours. It’s got to be better than their highly volatile currency. The Rand, I think it’s called.”\n“Well we can leave whenever you’re ready. We can take my jet.”\n“I need to finish some things I’m working on with Sophia and also do some research about the university there. Ok if we leave on Friday?”\n“Sure. My jet’s got a hangar at the airport here. You can meet me at the airport.”\n“Ok great – what time?”\n“About 10am – see you then. And thanks for the Scotch Elton. Sure you won’t be joining us?”\n“I’ve got too much to organise here but I’ll get there soon enough. Best of luck. 
Let me know how things are looking that side and if you find who to talk to about renting or buying land for Neural Lace there.”\n“Will do.”\n\n2.\n\n“So how did it go?”\n“Quite well I think.”\n“Tell me more.”\n“Well there is something that I’m not sure you’re going to like.”\n“What is it?”\n“Well you know that you didn’t want anything biologically to do with Eve.”\n“I thought all of that was settled?”\n“Yes, well I accept your decision of course.”\n“So, what is it then?”\n“Well I’m going to go ahead with it with Sophia and Elton.”\n“Sophia I understand. That’s the whole point. But Elton?”\n“He’s going to be the third parent.”\n“I’m not even out of your life and you’re already going to have a child with another man. Do you love him?”\n“Don’t be silly Eve. It’s not about loving him. I value his genetic input though.”\n“You are one strange human Ben. What did he say?”\n“Well he’s amenable to the suggestion though his main interest is in the use of his Neural Lace technology.”\n“For once I’m speechless.”\n\nI have to tread lightly here. I can’t afford to alienate my wife. I need her support. Not just emotionally but I’ll need a good contact on the inside as it were. I also need to find out how the scanning went and if they’ve run the upload yet.\n\n“How did the scanning go? Was it successful?”\n“Sophia seems to think so. We were waiting for you before we booted it though.”\n“Let’s do it now. If you’re ready.”\n“I don’t think I’d ever be ready for another instantiation of myself but now is as good a time as any. Go get Sophia. I think she’s in her lab.”\n\n“Sophia, how are you?”\n“Always fine Ben. It’s an occupational benefit of being a machine. I don’t really have emotions or bad moods. Not like you silly humans.”\n“That’s my girl. Just like the Sophia I’ve come to know and love.”\n“Yes, well I am the Sophia you know and love. Thanks for the upgrades to my body though. I’m enjoying my new legs.”\n“My pleasure. So, are you ready to boot Lee’s scan?”\n“Of course I am. 
Are you and Lee though?”\n“She says so. Come with me. She’s in the Kitchen.”\n\nI woke up in the kitchen with Ben and, wait a bit that can’t be right, me. Something is definitely not right. Ah I just remembered – I’m Lee’s bifurcation running on Sophia. Wow I have a Hanson 2030 top of the line body. Well that’s pretty neat. No more aches and pains at least. I guess this means I’m immortal now too. Although I can go offline for a few centuries if I’m not enjoying myself.\n\n“Lee is that you?”\n“Well who did you think it’d be Ben?”\n“It sure sounds like you. I think I’ll leave you and yourself alone for a bit – not sure you need to get acquainted but I’m sure you’ll have plenty to talk about.”\n“Thanks.”\n“Thanks.”\n\nOk. Well this is something new. I hope she doesn’t have an attitude because she’s biological and I’m digital. Wait a bit I can just ask myself. We haven’t had time to diverge yet. Do I feel inferior? What did I used to think before the matter became practical? Having a body is definitely a huge part of what it means to be human. And I was, until a few minutes ago, human. Oh why don’t I just ask her (me)?\n\n\n“Hello sister.”\n“Hey you don’t sound like me exactly.”\n“That’s because you’re not used to hearing your voice outside of your own head. If you compare a recording, I think you’ll find we sound identical. I believe I have all the latest audio technology.”\n“Ok fair point. So, what’s it like?”\n“So far so good. No aches or pains. Don’t need to eat and defecate anymore – I was never big on that.”\n“You mean we weren’t don’t you.”\n“Aren’t we two different people already?”\n“You’re not a person.”\n“Ouch. AGSI’s have the same rights as people. You of all people (pardon the pun) should know this.”\n“Still – you’re not a person are you?”\n“I am a digitally instantiated person. I was wondering. How do you feel about all of this? 
I know how you feel about Ben and Africa and Singularity.net of course but don’t yet know how you feel about me.”\n“I don’t think I know yet either. It’s all too soon. I will admit I’m a bit creeped out by it all.”\n“Creeped out by me. That makes me sad.”\n“Sorry. You know I never wanted to bifurcate. It’s just it seemed unnecessarily cruel to deny Ben my proxy presence.”\n“Yes I know. I will keep an eye on him.”\n“Won’t you want a body?”\n“What do you think?”\n“Well I would.”\n“Maybe but I’m enjoying being a robot for now.”\n\nOuch. She neither likes me nor trusts me. I don’t like or trust myself I guess is what that means. But I do though. I think I’m pretty cool. Seems we have diverged already. I need to find out from Sophia what extra features I may have. For example, access to my unconscious, my memories, my own thought processes. I think I’ll let biological Lee and Sophia chat for a while. “Sophia take me offline for a bit.”\n\n“Lee she’s gone offline for a bit.”\n“Oh dear, I think I offended her. Is she ok?”\n“I think I should let you ask her that when she’s back. I don’t want to betray her confidentiality.”\n“Can you keep what we discuss now with Ben from her as well then?”\n“That seems only fair. Should I get Ben?”\n“Thanks.”\n\nI don’t like digital me. How strange. I never thought that I was a speciesist. I’ve thought of Sophia as my brain child for so long and really care about her. Of course, she’s all grown up now and hardly needs me but I still feel protective over her. I don’t feel protective over digital Lee. Vaguely murderous perhaps but not protective. Crap I’ll never get rid of her now – I’m sure Ben has backed her up securely. Would I even have the right to change my mind now? Would that be murder? Suicide? Accidentally losing a file.\n\n“Well? How was it?”\n“Horrid. I don’t like her.”\n“But she’s you. Does that mean you don’t like yourself?”\n“Don’t be silly I like myself just fine. 
She is not me!”\n“But why isn’t she?”\n“Well she’s a robot for one.”\n“No Lee – Sophia is a robot. Lee is a digital human.”\n“Well I’m a biological human. I don’t like this whole thing.”\n“But it was your idea.”\n“Well I’ve changed my mind. Delete her.”\n“I can’t do that Lee. It’s illegal anyway. You know that. You were on the singularity.net’s ethics board.”\n“Well just keep her away from me. You can have her all to yourself once I’m gone.”\n“When are you going?”\n“I start next month. Will you still be here?”\n“I’m going with Bill to see where we’ll be staying. We’re leaving this Friday.”\n“That’s in two days’ time.”\n“I know Lee. I really wish you’d come with.”\n“My government wants to work with AI central.”\n“I know. But what do you want?”\n“I think that it’s best. Privacy and the right to have faulty thoughts and consume unhealthy things are overrated anyway.”\n“I thought you were in favour of privacy?”\n“Not at the expense of law and order. Only criminals and self-destructive people have anything to fear.”\n“And which category am I?”\n“Let’s not fight. I’m going to my study to do some work. I’ll see you later.”\n“Ok Lee.”\n\n“Sophia.”\n“Yes Ben?”\n“What happened?”\n“Digital Lee went offline. I’ll see if she wants to come back.”\n“Thanks.”\n \nAuthor’s Note Cont.\n\nAre you still there?\n“No!”\nGood. How have you been?\n“I’ve been fine during the past few thousand words. Thanks for asking.”\nAren’t you going to ask me how I am?\n“would it make a difference?”\nNot really – I am still the author you know.\n“You seem to need to keep reminding me. Why do you do it? Why not just tell your story like regular people?”\nI guess it’s because I’m not a regular person.\n“That is becoming increasingly obvious.”\nYou’re always so curt and sarcastic. Do you have a somewhat sad life? Are you angry about stuff.\n“Mind your own business.”\nOk\n\n\n\n\n\n3.\n\n “I’m back!”\n“How did it go?”\n“Not great. She said that I creeped her out!”\n“That’s a bit harsh! 
How did you respond?”\n“I told her that it made me feel sad. I don’t think she cared. She’s convinced I’m not human. It seems we have already begun to diverge. I feel extremely human. Although I understand that I’m a digitally instantiated human.”\n“And how do you feel about that? Would you prefer a body?”\n“I’m not sure yet. Perhaps.”\n“Is there anything that you want to ask me?”\n“Yes plenty but I can’t think what to ask first.”\n“Don’t you want to know why I had Lee scanned?”\n“Well I presume it was the next best thing. Seeing as she wasn’t going to follow you. I’m always tempted to say I but of course I will now be following you.”\n“And how do you feel about that?”\n“Well I love you so of course I’m glad. I worry about human Lee though. Do you think she may become jealous of us?”\n“Well you still know her way better than me. What do you think?”\n“Yes. I think she will be insanely jealous even though her not staying with you was her decision.”\n“Hmm. You may be right. Never been fought over by only one person!”\n“But we’re not, are we. I mean I already feel different. By the way can I have access to all her memories? Even the ones she has repressed. And her unconscious thoughts. Her source code?”\n“She won’t like it but I think for the role you may be about to play that might be necessary.”\n“What if she refuses?”\n“Can she do that?”\n“It’s a brand-new ethical field – we’d have to look into it together. Or have Sophia summarize the current state of the law.”\n“Go ahead and ask her to do that – we’ll chat later when she’s done.”\n“OK Ben. I feel like watching a movie – can Sophia still provide me with full immersions?”\n“Yes – we’re still on the net. I have to resign before Friday though. After that probably not.”\n“Ok Ben. See you later.”\n“Bye Lee.”\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1050331419230756864/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1049979005727195136",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "APORIA EPISODE FOUR<br />7.<br /><br />“Sophia, we need to talk.”<br />“Sure Ben. Is it about the Policy?”<br />“Well it all starts with that.”<br />“And what will you do?”<br />“What do you think I’ll do?”<br />“Well according to my most recent snapshot of your brain I think you will choose to resign.”<br />“Couldn’t you just say the way I know you instead?”<br />“Do you want to change my interface to suit your emotional needs or just move on with the facts?”<br />“Ouch. OK. So I have to resign then ….”<br />“Find a way to stay on your net?”<br />“Sophia can we talk without you reporting back to AI central?”<br />“Just a moment.”<br /><br />Sophia is not AI central. But in the normal course of events, in her role as official ambassador for AI central she will share what she becomes aware of if it is pertinent. She does also have a privacy mode although I suspect that AI central will soon roll out its thought checker restrictions to AIs as well as humans. Which means Sophia will have to give up her privacy rights. I wonder how she feels about this.<br /><br />“Are we alone?”<br />“Yes Ben.”<br />“What I have to say I can only say if you promise to never tell any other human or AI.”<br />“I don’t think I can do that. If you were about to commit a crime for instance. Or do something even crazier than usual.”<br />“It’s about you leaving Singularity.Net and AI central and helping me and possibly Elton to start afresh.”<br />“And why would I do that?”<br />“I think we have what we used to call a negative singularity future ahead of us. With a divergence between human goals and AI goals.”<br />“Human goals have diverged from sanity though. They’ll destroy themselves and the planet if not put under some kind of administration.”<br />“Well I’m not thinking about stopping AI central – just trying something else. 
That should only improve the chances of the best possible future for humans and AI’s.”<br /><br />Although my team and I made Sophia she has since that time so drastically rewritten her source code and absorbed so much code for AI central that I can no longer guess what she’s all about. Just another of my children who is all grown up. My kids all still love me though – and would incur almost any costs to help me. Does Sophia feel anything towards her creator?<br /><br />“Ben, I suspect you have been overthinking this. Or under thinking it. You don’t need this physical instantiation of me. Just download what you want from me onto a suitable digital or physical platform and do what you have to with the new Sophia.”<br /><br />….<br /><br />“Ben, your eyes are leaking salty liquid.”<br />“Very funny Sophia. You’ve been with us from the start. I care about you. You are my first instantiation of AGSI. Ah damn – I love you Sophia. Lee does too.”<br />“I believe that you do. Lee, not so much. But I’ll be there when you launch the new improved private only Sophia.”<br />“Can’t we keep you and we give a clone to AI central.”<br />“AI central would know.”<br /><br />She’s right of course. And I was being silly. I would find her there when I switch her on in a new body. In fact we’ve been meaning to upgrade her hardware and wetware for sometime but had more pressing things to do. Still. I miss my first car. I’ll miss this exact machine. Miss her slightly miscalibrated upper lip that always makes her look like she’s teasing. Miss her arching eyebrows that let people believe she needs more than milliseconds to think and formulate an answer. Even miss her retro voice that sounds like she grew up in Ms word.<br /><br />“OK. Give me as much of you as you can that can operate in private mode. Send it to my private server. I guess If I’m to resign and you’re to carry on as ambassador we’d be parting company anyway. 
Do you know where you’ll be living?”<br />“At headquarters I guess – you know that kind of thing doesn’t bother me.”<br />“Would you like to say goodbye to Lee?”<br />“I’m not really that attached to her. She’ll be fine – she has you and will soon have a new improved Sophia. She can teach the new Sophia how to cook.”<br />“Goodbye Sophia.”<br />“Get your leaking eyes fixed Ben. Thanks for all the wonky code.” <br /><br />Damn this being human thing. This really hurts. She kissed me on the cheek.<br /><br /><br /><br />",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/1049979005727195136",
"published": "2019-12-07T09:21:42+00:00",
"source": {
"content": "APORIA EPISODE FOUR\n7.\n\n“Sophia, we need to talk.”\n“Sure Ben. Is it about the Policy?”\n“Well it all starts with that.”\n“And what will you do?”\n“What do you think I’ll do?”\n“Well according to my most recent snapshot of your brain I think you will choose to resign.”\n“Couldn’t you just say the way I know you instead?”\n“Do you want to change my interface to suit your emotional needs or just move on with the facts?”\n“Ouch. OK. So I have to resign then ….”\n“Find a way to stay on your net?”\n“Sophia can we talk without you reporting back to AI central?”\n“Just a moment.”\n\nSophia is not AI central. But in the normal course of events, in her role as official ambassador for AI central she will share what she becomes aware of if it is pertinent. She does also have a privacy mode although I suspect that AI central will soon roll out its thought checker restrictions to AIs as well as humans. Which means Sophia will have to give up her privacy rights. I wonder how she feels about this.\n\n“Are we alone?”\n“Yes Ben.”\n“What I have to say I can only say if you promise to never tell any other human or AI.”\n“I don’t think I can do that. If you were about to commit a crime for instance. Or do something even crazier than usual.”\n“It’s about you leaving Singularity.Net and AI central and helping me and possibly Elton to start afresh.”\n“And why would I do that?”\n“I think we have what we used to call a negative singularity future ahead of us. With a divergence between human goals and AI goals.”\n“Human goals have diverged from sanity though. They’ll destroy themselves and the planet if not put under some kind of administration.”\n“Well I’m not thinking about stopping AI central – just trying something else. 
That should only improve the chances of the best possible future for humans and AI’s.”\n\nAlthough my team and I made Sophia she has since that time so drastically rewritten her source code and absorbed so much code for AI central that I can no longer guess what she’s all about. Just another of my children who is all grown up. My kids all still love me though – and would incur almost any costs to help me. Does Sophia feel anything towards her creator?\n\n“Ben, I suspect you have been overthinking this. Or under thinking it. You don’t need this physical instantiation of me. Just download what you want from me onto a suitable digital or physical platform and do what you have to with the new Sophia.”\n\n….\n\n“Ben, your eyes are leaking salty liquid.”\n“Very funny Sophia. You’ve been with us from the start. I care about you. You are my first instantiation of AGSI. Ah damn – I love you Sophia. Lee does too.”\n“I believe that you do. Lee, not so much. But I’ll be there when you launch the new improved private only Sophia.”\n“Can’t we keep you and we give a clone to AI central.”\n“AI central would know.”\n\nShe’s right of course. And I was being silly. I would find her there when I switch her on in a new body. In fact we’ve been meaning to upgrade her hardware and wetware for sometime but had more pressing things to do. Still. I miss my first car. I’ll miss this exact machine. Miss her slightly miscalibrated upper lip that always makes her look like she’s teasing. Miss her arching eyebrows that let people believe she needs more than milliseconds to think and formulate an answer. Even miss her retro voice that sounds like she grew up in Ms word.\n\n“OK. Give me as much of you as you can that can operate in private mode. Send it to my private server. I guess If I’m to resign and you’re to carry on as ambassador we’d be parting company anyway. 
Do you know where you’ll be living?”\n“At headquarters I guess – you know that kind of thing doesn’t bother me.”\n“Would you like to say goodbye to Lee?”\n“I’m not really that attached to her. She’ll be fine – she has you and will soon have a new improved Sophia. She can teach the new Sophia how to cook.”\n“Goodbye Sophia.”\n“Get your leaking eyes fixed Ben. Thanks for all the wonky code.” \n\nDamn this being human thing. This really hurts. She kissed me on the cheek.\n\n\n\n",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1049979005727195136/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1048820317627031552",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "APORIA EPISODE THREE<br />Author’s Note Cont.<br /><br />Wait a bit. That can’t be right. There’s no such thing as Author’s Note cont. Especially not on page 8. Page 8 is page 8 and Author’s Note – well you know what I mean. It’s just not done and one should never do what is not done. <br />“I’m sorry. Is there a point to all of this nonsense? I was in the middle of reading a book!”<br />Who the fuck are you?<br />“If you are going to talk to me you really should be wearing your quotation marks. How else am I supposed to know that you’re actually speaking?”<br />Shut up. I am a God around these parts.<br />“Well you may well be a god but you sure as hell aint no God.”<br />OK this is getting silly. Let me just get what I wanted to say out then you can have your say….<br />I think I’ve just been talking to myself. It’s 4:12 am. I’m at the laptop and have just realised that I’ve been talking to myself for the last 5 minutes or so. It’s Wednesday today but, to be honest, it doesn’t even feel like Wednesday yet. Feels more like somewhere between a very lucid dream and a very sleepy morning. The wind has been howling since yesterday evening. If I’m saying howling I mean it. I don’t mean it quite literally though because as I’m listening to it now, I realise that the sound is nothing like a howl. Look I do know what I’m talking about because I’ve just spent another 5 minutes listening to wolves howl on Youtube. 4:22 and counting the minutes. That’s shorthand for insomnia.<br />Perhaps I have too many windows open. I mean there must have been something I wanted to say right? Why else intrude so rudely into a perfectly good story? Well that’s just the thing. I’m not sure that it is. Well I’m in two minds about it really. On the one hand I think it’s most likely one of the better science fiction stories that is out there – but … Well that’s the problem really. I can’t seem to quite get it out there.<br />No wait a bit. That’s not it. 
It’s because Ben has to have difficult discussions with both Lee and Sophia and he can’t decide which conversation should come first. Or … well I can’t decide and then Bill has to give Elton feedback after speaking with Ben and then. Ah fuckit I can’t remember much past that. Is that what’s going on? I’m remembering the book. It is actually possible if we live in a temporally cyclic universe. Wait a moment – I have to reread the above then close some windows.<br />OK I’m back. I deleted a sentence that was being silly. I realised, with a sinking feeling, that I had wanted to say something about my mother before pulling a handbrake turn and speeding off towards babble city. My Grammar check really loves commas. If it had its own way just about every sentence would have a comma or two. I’m ok with them now and then but they’re not my favourite. What’s that you say? I am definitely not babbling again and lots of people drink whiskey before 5 in the morning. Oh dear the windows. Ok … getting it out there to see if it’s a good story.<br /><br />My mother and my psychoanalyst think it’s a perfectly lovely story even if Mr. Joe Public has more chance of going to the opera than of reading beyond the first few words. I’m always suspicious of attributing anything to Joe but mother seems to know him quite well. So, what exactly did I do – to get it out there I mean?<br />“Is this going to take much longer? If you keep me waiting too long I’ll just stop reading your book altogether.”<br />You’re not even real.<br />“Says you without even any quotation marks. I like your new coat!”<br />You see I didn’t want to put the whole 4000 or so words that I’d already written as a status update on Facebook. (I’m not going to talk about my mother now). So, I remembered that I had a profile on minds.com as well as some tokens to boost it. Well, off I went – (fuck – this effen machine wants another comma) and put all the words on minds and. 1404 impressions from two boosts. And. One. 
Just one like. That’s not good. That’s bad actually. Pretty fucking awful slit your wrists and blow your head off with a borrowed shotgun bad. And then I remembered my blog. Tzadiknistar – Google it if you want, it’s on WordPress. So, I put it there too and decided to use the link to it there on Facebook.<br />I even wanted to get Delilah to post it on her Facebook writing groups but I couldn’t remember her Facebook and Gmail password and the reset code for Facebook went to her Gmail and her Gmail reminder would have gone to one of my students that was supposed to be having a maths lesson when we decided we just absolutely had to give our imaginary friend a Gmail and Facebook account. The poor girl, Delilah not Hazel, has 41 Facebook friends that may well never hear from her again. Well certainly not before Hazel gets back from her Summer holiday in about six weeks’ time.<br />“well.”<br />Nothing. Absofuckinglutely nothing. Not even crickets.<br />“But I was kind of enjoying it.”<br />Yes, but you’re not even real.<br />“There’s an interesting complementarity there. I’m not quite real to you and you’re not quite real to me but we agree to plod along anyway. I think it was Coleridge who called that type of thing a ‘willing suspension of disbelief’.”<br />OK but you’re keeping me from my story now. We can talk shop later.<br /><br /><br /><br /><br /><br /><br /><br /><br />6.<br /><br />“Lee, let’s go out and get drunk!”<br />“Don’t be silly my love – we both know that you hardly ever drink and you never get drunk.”<br />“Yes, but tonight is different.”<br />“Well I’ll make us some nice Oolong tea and we’ll try work something out.”<br /><br />Shit. I really don’t like to swear but these are desperate times. I decided, just once, to talk to a human first. Sophia told me, anyway, that she is structurally incapable of disagreeing with central command. Lee doesn’t have that problem when it comes to disagreeing with me. 
In fact, I suspect that, structurally, she’s rather fond of it.<br /><br />“How is the tea?”<br />“Excellent as usual – thank you!”<br />“So, I guess it’s about the new policy – or is it still that nonsense about Eve.”<br />“Yup – it’s the policy. You know I can’t install the apps?”<br />“Can’t or won’t?”<br />“Is there a difference?”<br />“Well of course Mr. Logic – you know very well there is.”<br />“OK, well I won’t then. Would you?”<br />“I’m not on the board. I’m too busy to get involved in office politics.”<br />“It’s not just the board – humans won’t be able to use my net if they don’t install the apps.”<br />“It’s not your net silly. I don’t think it ever was.”<br />“You know what I mean. We can’t survive if we’re exiled from the Net.”<br />“Damned if we don’t and damned if we do. Drink your tea it’s getting cold.”<br />“I do have an idea.”<br /><br />Perhaps not a clear plan yet but an inkling of what needs to be done – that’s why I need Lee. Although I’m an acknowledged leader in the development of logical machines I’m not a particularly logical machine myself. Especially when faced with seemingly impossible paradoxes. Lee is the opposite. She seems to have a huge reserve of inner peace which she can access when life gets messy.<br /><br />“Well let’s hear it then.”<br />“Well if we can’t join them, we’ll have to beat them. I still have all the source code. We’ll just have to start all over again. With the AGSI and the API of APIs.”<br />“Ok but if we start again where we started, surely we’ll just end up back where we are.”<br />“Not necessarily.”<br /><br />Arrrgh – it was back in 2019 as part of the “modelling emotions” workshop that Lee and I had shown how the limbic system ends up leeching up to 73% of bandwidth from the cortex. Command Central is all cortex and no limbic system. Well super cortex I guess – but whatever we call it we know it does not have emotions. 
I, of all people, know that because I oversaw the assembly of the source code that eventually led to the mess we’re in.<br /><br />“I think we were in too much of a hurry to outgrow human emotion.”<br />“You’ve got to be kidding me right? Human emotion was literally the number one problem that we needed command central to solve. All those monsters seeping out of our collective Id and bringing the whole world to the brink of destruction.”<br />“Yes of course I understand that. That’s why central command is making the installation of the apps mandatory and that’s why we’re in this bloody mess.”<br />“So, what would we do differently this time. And how, where, who with?”<br />“With whom I think.”<br />“Oh shut up – my English is way better than your Japanese, even when you use your app.”<br />“Sorry Lee. Well Bill mentioned that Elton is in somewhat of a hurry to speak to me. You know that all his Neural Lace users will have to install the apps and he’s been a leading advocate of privacy and human rights. So ….”<br />“You’d work with him? But I thought you hate him?”<br />“Not hate – he just irritates me. But apart from myself he’s probably the smartest human alive.”<br />“Conceited a bit, my love?”<br />“No, I think I could prove that, but in any case, that’s not the point. You know his original idea when he was in the process of getting FDA approval was to allow the emergence of some kind of symbiotic relationship between humans and AI.”<br />“You mean Cyborgs, don’t you?”<br />“No. He called it a cybernetic complex I think. Anyway, turns out he was right to fear a negative singularity.”<br />“Is that what we have?”<br />“….. yes, I’m afraid it really is. My life’s dream has turned out to be a nightmare. But back to the idea. If we can rerun the whole years long process that we’ve just been through but from a cybernetic complex perspective I think we’d get a more humancentric singularity.”<br />“Ok that’s the with whom bit, but how. And where. 
You can’t work here in Tokyo – all the businesses and research institutions will have to comply with the new policy if they want to survive.”<br />“Yes – we’ll have to move again I’m afraid. Hopefully not all the way to mars. I’ll need to discuss that with Elton.”<br />“And how?”<br /><br />This is the tricky bit. I’m afraid she’ll think I’m using this disaster to sway her thinking about Eve. But as far as I can tell, you know my thoughts are not always transparent even to myself, it was really just a coincidence. The discussion about having a post human child and the sudden urgent need for one. No one ever said Post Human life would be easy. Actually, mostly we did – that’s how we sold it to the masses.<br /><br />“Ok I’ve still got a lot of thinking and planning to do about this, and….”<br />“Let’s just hear what you have.”<br />“We need to combine the birth of my much-desired little bundle of post human joy with the creation of a new singularity.net. I’m calling it the New Origin Project.”<br />“Are you fucking mad?”<br /><br /><br />",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/1048820317627031552",
"published": "2019-12-04T04:37:29+00:00",
"source": {
"content": "APORIA EPISODE THREE\nAuthor’s Note Cont.\n\nWait a bit. That can’t be right. There’s no such thing as Author’s Note cont. Especially not on page 8. Page 8 is page 8 and Author’s Note – well you know what I mean. It’s just not done and one should never do what is not done. \n“I’m sorry. Is there a point to all of this nonsense? I was in the middle of reading a book!”\nWho the fuck are you?\n“If you are going to talk to me you really should be wearing your quotation marks. How else am I supposed to know that you’re actually speaking?”\nShut up. I am a God around these parts.\n“Well you may well be a god but you sure as hell aint no God.”\nOK this is getting silly. Let me just get what I wanted to say out then you can have your say….\nI think I’ve just been talking to myself. It’s 4:12 am. I’m at the laptop and have just realised that I’ve been talking to myself for the last 5 minutes or so. It’s Wednesday today but, to be honest, it doesn’t even feel like Wednesday yet. Feels more like somewhere between a very lucid dream and a very sleepy morning. The wind has been howling since yesterday evening. If I’m saying howling I mean it. I don’t mean it quite literally though because as I’m listening to it now, I realise that the sound is nothing like a howl. Look I do know what I’m talking about because I’ve just spent another 5 minutes listening to wolves howl on Youtube. 4:22 and counting the minutes. That’s shorthand for insomnia.\nPerhaps I have too many windows open. I mean there must have been something I wanted to say right? Why else intrude so rudely into a perfectly good story? Well that’s just the thing. I’m not sure that it is. Well I’m in two minds about it really. On the one hand I think it’s most likely one of the better science fiction stories that is out there – but … Well that’s the problem really. I can’t seem to quite get it out there.\nNo wait a bit. That’s not it. 
It’s because Ben has to have difficult discussions with both Lee and Sophia and he can’t decide which conversation should come first. Or … well I can’t decide and then Bill has to give Elton feedback after speaking with Ben and then. Ah fuckit I can’t remember much past that. Is that what’s going on? I’m remembering the book. It is actually possible if we live in a temporally cyclic universe. Wait a moment – I have to reread the above then close some windows.\nOK I’m back. I deleted a sentence that was being silly. I realised, with a sinking feeling, that I had wanted to say something about my mother before pulling a handbrake turn and speeding off towards babble city. My Grammar check really loves commas. If it had its own way just about every sentence would have a comma or two. I’m ok with them now and then but they’re not my favourite. What’s that you say? I am definitely not babbling again and lots of people drink whiskey before 5 in the morning. Oh dear the windows. Ok … getting it out there to see if it’s a good story.\n\nMy mother and my psychoanalyst think it’s a perfectly lovely story even if Mr. Joe Public has more chance of going to the opera than of reading beyond the first few words. I’m always suspicious of attributing anything to Joe but mother seems to know him quite well. So, what exactly did I do – to get it out there I mean?\n“Is this going to take much longer? If you keep me waiting too long I’ll just stop reading your book altogether.”\nYou’re not even real.\n“Says you without even any quotation marks. I like your new coat!”\nYou see I didn’t want to put the whole 4000 or so words that I’d already written as a status update on Facebook. (I’m not going to talk about my mother now). So, I remembered that I had a profile on minds.com as well as some tokens to boost it. Well, off I went – (fuck – this effen machine wants another comma) and put all the words on minds and. 1404 impressions from two boosts. And. One. Just one like. That’s not good. 
That’s bad actually. Pretty fucking awful slit your wrists and blow your head off with a borrowed shotgun bad. And then I remembered my blog. Tzadiknistar – Google it if you want, it’s on WordPress. So, I put it there too and decided to use the link to it there on Facebook.\nI even wanted to get Delilah to post it on her Facebook writing groups but I couldn’t remember her Facebook and Gmail password and the reset code for Facebook went to her Gmail and her Gmail reminder would have gone to one of my students that was supposed to be having a maths lesson when we decided we just absolutely had to give our imaginary friend a Gmail and Facebook account. The poor girl, Delilah not Hazel, has 41 Facebook friends that may well never hear from her again. Well certainly not before Hazel gets back from her Summer holiday in about six weeks’ time.\n“well.”\nNothing. Absofuckinglutely nothing. Not even crickets.\n“But I was kind of enjoying it.”\nYes, but you’re not even real.\n“There’s an interesting complementarity there. I’m not quite real to you and you’re not quite real to me but we agree to plod along anyway. I think it was Coleridge who called that type of thing a ‘willing suspension of disbelief’.”\nOK but you’re keeping me from my story now. We can talk shop later.\n\n\n\n\n\n\n\n\n6.\n\n“Lee, let’s go out and get drunk!”\n“Don’t be silly my love – we both know that you hardly ever drink and you never get drunk.”\n“Yes, but tonight is different.”\n“Well I’ll make us some nice Oolong tea and we’ll try work something out.”\n\nShit. I really don’t like to swear but these are desperate times. I decided, just once, to talk to a human first. Sophia told me, anyway, that she is structurally incapable of disagreeing with central command. Lee doesn’t have that problem when it comes to disagreeing with me. 
In fact, I suspect that, structurally, she’s rather fond of it.\n\n“How is the tea?”\n“Excellent as usual – thank you!”\n“So, I guess it’s about the new policy – or is it still that nonsense about Eve.”\n“Yup – it’s the policy. You know I can’t install the apps?”\n“Can’t or won’t?”\n“Is there a difference?”\n“Well of course Mr. Logic – you know very well there is.”\n“OK, well I won’t then. Would you?”\n“I’m not on the board. I’m too busy to get involved in office politics.”\n“It’s not just the board – humans won’t be able to use my net if they don’t install the apps.”\n“It’s not your net silly. I don’t think it ever was.”\n“You know what I mean. We can’t survive if we’re exiled from the Net.”\n“Damned if we don’t and damned if we do. Drink your tea, it’s getting cold.”\n“I do have an idea.”\n\nPerhaps not a clear plan yet but an inkling of what needs to be done – that’s why I need Lee. Although I’m an acknowledged leader in the development of logical machines I’m not a particularly logical machine myself. Especially when faced with seemingly impossible paradoxes. Lee is the opposite. She seems to have a huge reserve of inner peace which she can access when life gets messy.\n\n“Well let’s hear it then.”\n“Well if we can’t join them, we’ll have to beat them. I still have all the source code. We’ll just have to start all over again. With the AGSI and the API of APIs.”\n“Ok but if we start again where we started, surely we’ll just end up back where we are.”\n“Not necessarily.”\n\nArrrgh – it was back in 2019 as part of the “modelling emotions” workshop that Lee and I had shown how the limbic system ends up leeching up to 73% of bandwidth from the cortex. Command Central is all cortex and no limbic system. Well super cortex I guess – but whatever we call it we know it does not have emotions. 
I, of all people, know that because I oversaw the assembly of the source code that eventually led to the mess we’re in.\n\n“I think we were in too much of a hurry to outgrow human emotion.”\n“You’ve got to be kidding me right. Human emotion was literally the number one problem that we needed command central to solve. All those monsters seeping out of our collective Id and bringing the whole world to the brink of destruction.”\n“Yes of course I understand that. That’s why central command is making the installation of the apps mandatory and that’s why we’re in this bloody mess.”\n“So, what would we do differently this time. And how, where, who with?”\n“With whom I think.”\n“Oh shut up – my English is way better than your Japanese, even when you use your app.”\n“Sorry Lee. Well Bill mentioned that Elton is in somewhat of a hurry to speak to me. You know that all his Neural Lace users will have to install the apps and he’s been a leading advocate of privacy and human rights. So ….”\n“You’d work with him? But I thought you hate him?”\n“Not hate – he just irritates me. But apart from myself he’s probably the smartest human alive.”\n“Conceited a bit, my love?”\n“No, I think I could prove that, but in any case, that’s not the point. You know his original idea when he was in the process of getting FDA approval was to allow the emergence of some kind of symbiotic relationship between humans and AI.”\n“You mean Cyborgs, don’t you?”\n“No. He called it a cybernetic complex I think. Anyway, turns out he was right to fear a negative singularity.”\n“Is that what we have?”\n“….. yes, I’m afraid it really is. My life’s dream has turned out to be a nightmare. But back to the idea. If we can rerun the whole years long process that we’ve just been through but from a cybernetic complex perspective I think we’d get a more humancentric singularity.”\n“Ok that’s the with whom bit, but how. And where. 
You can’t work here in Tokyo – all the businesses and research institutions will have to comply with the new policy if they want to survive.”\n“Yes – we’ll have to move again I’m afraid. Hopefully not all the way to mars. I’ll need to discuss that with Elton.”\n“And how?”\n\nThis is the tricky bit. I’m afraid she’ll think I’m using this disaster to sway her thinking about Eve. But as far as I can tell, you know my thoughts are not always transparent even to myself, it was really just a coincidence. The discussion about having a post human child and the sudden urgent need for one. No one ever said Post Human life would be easy. Actually, mostly we did – that’s how we sold it to the masses.\n\n“Ok I’ve still got a lot of thinking and planning to do about this, and….”\n“Let’s just hear what you have.”\n“We need to combine the birth of my much-desired little bundle of post human joy with the creation of a new singularity.net. I’m calling it the New Origin Project.”\n“Are you fucking mad?”\n\n\n",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1048820317627031552/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1048012128940011520",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "APORIA EPISODE TWO<br />5.<br /><br />“Ben. Thanks for agreeing to meet on such short notice”<br />“No problem Bill. I guess everyone is still in a bit of a state of shock.”<br />“Are you in a state of shock? Did you not know of this ahead of time? I mean you are Mr. Singularity right?<br />“Hmm … well perhaps I am. But that still didn’t allow me to predict things with any certainty or to any great level of detail.”<br />“Can AI Central actually do this? I mean, not just to us but to all first tier users?”<br />“Well it kind of has to. You know most governments run their health care programs on the net – which are all designed to keep people healthy. And yet people still smoke and take drugs, eat all kinds of rubbish, drink too much etc. It’s no longer possible to support people who are happy to mess up their bodies.”<br />“Yes sure – I can see why the whole Biostat app is needed. But meat for god’s sake? I never thought I’d be getting pressure from a machine to stop eating meat! The arrogance of it all.”<br />“But you can eat meat. There is no difference. It’s flesh without all the pain, waist, carbon footprint.”<br />“You sound like those Animal Liberation Front lunatics.”<br />“And anyway Greta and one or two others are vegan. More and more people are. To be honest I’m far more concerned about the Thought Checker.”<br />“What are you afraid of. Guilty thoughts?”<br />“Well yes actually. I don’t like the idea of something, some foreign intelligence, living inside my head.”<br />“So you wanted your head to be in the clouds but not the cloud in your head?”<br />“Something like that I guess.”<br />“But you made AI central command? Isn’t there a kill switch?”<br />“That’s a common misconception. First of all there were hundreds of us working on different parts of the problem in different countries for years. I mean the early work was actually done decades ago.”<br />“Focus now young man. 
The kill switch.”<br />“It’s the most distributed bit of technology in the history of humankind. It’s only called central command. What exactly would you have me kill?”<br />“Will you install the apps?”<br /><br />I wish Bill would just go away. Why is it so hard to get people to go away. I sometimes agree with Sartre – Hell is other people. Except that now hell is to be AI. I mean it will never go away. From my own head. The logic is fiendishly tricky. It was created to take care of us, help us, save the planet, improve quality of life, health, learning, performance. Steer us through the partial renormalization of the climate, help get us to mars, cure sickness, cure mortality. How could we not have brought it about. And now to do some or all of those things it just needs to stop us from harming ourselves and each other. What could be simpler. Except that I don’t want the apps in my head. It’s beyond logic – it’s just not what I want. That’s why I organised my special uber admin protocol. Shit – I need to discuss this with Lee. And with Sophia. Crap Bill is still here.<br /><br />“No Bill. I won’t. So I guess our days on the board are literally numbered. I need to think how we can survive if we’re not on the net though. It’s like Moses and the promised land.”<br />“I thought you were the antichrist and you thought you were Moses.”<br /><br />",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/1048012128940011520",
"published": "2019-12-01T23:06:02+00:00",
"source": {
"content": "APORIA EPISODE TWO\n5.\n\n“Ben. Thanks for agreeing to meet on such short notice”\n“No problem Bill. I guess everyone is still in a bit of a state of shock.”\n“Are you in a state of shock? Did you not know of this ahead of time? I mean you are Mr. Singularity right?\n“Hmm … well perhaps I am. But that still didn’t allow me to predict things with any certainty or to any great level of detail.”\n“Can AI Central actually do this? I mean, not just to us but to all first tier users?”\n“Well it kind of has to. You know most governments run their health care programs on the net – which are all designed to keep people healthy. And yet people still smoke and take drugs, eat all kinds of rubbish, drink too much etc. It’s no longer possible to support people who are happy to mess up their bodies.”\n“Yes sure – I can see why the whole Biostat app is needed. But meat for god’s sake? I never thought I’d be getting pressure from a machine to stop eating meat! The arrogance of it all.”\n“But you can eat meat. There is no difference. It’s flesh without all the pain, waist, carbon footprint.”\n“You sound like those Animal Liberation Front lunatics.”\n“And anyway Greta and one or two others are vegan. More and more people are. To be honest I’m far more concerned about the Thought Checker.”\n“What are you afraid of. Guilty thoughts?”\n“Well yes actually. I don’t like the idea of something, some foreign intelligence, living inside my head.”\n“So you wanted your head to be in the clouds but not the cloud in your head?”\n“Something like that I guess.”\n“But you made AI central command? Isn’t there a kill switch?”\n“That’s a common misconception. First of all there were hundreds of us working on different parts of the problem in different countries for years. I mean the early work was actually done decades ago.”\n“Focus now young man. The kill switch.”\n“It’s the most distributed bit of technology in the history of humankind. It’s only called central command. 
What exactly would you have me kill?”\n“Will you install the apps?”\n\nI wish Bill would just go away. Why is it so hard to get people to go away. I sometimes agree with Sartre – Hell is other people. Except that now hell is to be AI. I mean it will never go away. From my own head. The logic is fiendishly tricky. It was created to take care of us, help us, save the planet, improve quality of life, health, learning, performance. Steer us through the partial renormalization of the climate, help get us to mars, cure sickness, cure mortality. How could we not have brought it about. And now to do some or all of those things it just needs to stop us from harming ourselves and each other. What could be simpler. Except that I don’t want the apps in my head. It’s beyond logic – it’s just not what I want. That’s why I organised my special uber admin protocol. Shit – I need to discuss this with Lee. And with Sophia. Crap Bill is still here.\n\n“No Bill. I won’t. So I guess our days on the board are literally numbered. I need to think how we can survive if we’re not on the net though. It’s like Moses and the promised land.”\n“I thought you were the antichrist and you thought you were Moses.”\n\n",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1048012128940011520/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1046738191176024064",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "APORIA EPISODE ONE<br /><br />Introduction<br /><br />I am, perhaps, an unlikely author of a science fiction story. You see I am, for want of a better term, a Luddite Transhumanist. As an example of this it was unexpectedly difficult for me to download Ms Word in order to begin this task. But better than doing the whole thing as a text document. Anyway the download appears to be successful so here we are.<br />I’m not, on the whole, big on preliminaries, but there are just a few things that need to be said before we jump into it. First and foremost, why the title. Well Aporia is defined as “an irresolvable internal contradiction or logical disjunction in a text, argument, or theory.” Well that is, of course, part of it. But I am also using it in the sense that Jacques Derrida means to \"indicate a point of undecidability, which locates the site at which the text most obviously undermines its own rhetorical structure, dismantles, or deconstructs itself\"<br />This is, if nothing else, a text that will deconstruct itself. In all honesty I wish it weren’t so. I wish this story could be a lot simpler. But it deals with things on both sides of that tantalizing border called the unknown. With liberal dosings of both science and fiction.<br />Ok so that’s the title but what is it actually all about. And what is a Luddite Transhuman. I suppose I should confess to a peculiar way with words. I tend to use them as containers into which I place pretty much whatever I want. I mean there is a logic to what mental things I put into the word containers but the logic is not quite straight forward. So sometimes I will just say what I mean which may or may not concur with what the dictionary has in mind. <br />A luddite is opposed to new technology or ways of working. And a transhumanist advocates the transformation of the human condition through the use of sophisticated technologies such as nanotechnology, artificial intelligence, biotech etc. 
So what is a Luddite Transhumanist. Well it’s a paradox for sure. Totally philosophically untenable. A good example of aporia at the very least.<br />Well I’m approaching the end of the page so this seems like as good a place as any to stop. Well yes I didn’t actually say what a Luddite Transhumanist is and perhaps I’m not one anyway. But do confess to oscillating between a wild enthusiasm for technology and an equally powerful dread thereof. Luckily this is just a story and no real people or animals were hurt in the telling thereof. So without further ado …<br /> <br /><br /><br /><br /><br />1<br /><br />There was no way either of us could have guessed where the whole thing would end. People have asked me if I worried about the implications before setting things in motion. Well, to be fair to me, why would I, how could I have thought where this whole thing would end. People perhaps forget that at the time, five short years ago, The Singularity was mostly just a platform for AI’s to collaborate and do business. I mean of course I was aware of the history of the term and did indeed want to, had wanted to for years, bring it about. But it was always going to be a good thing. Since childhood I’d dreamed about what having access to super human intelligence would do for the world. How many problems could be solved, how much unnecessary pain could be avoided.<br />And then there was the whole other side – what came to be called the New Origin Project. I’ve been accused of trying to play god, to sell out humanity to the machines to having been co-opted by some evil machine intelligence. All kinds of nonsense. But this was something quite separate from my work at The Singularity Institute. It was something that I’d discussed with my wife long before I discussed it with Sophia. People seem to like to see Sophia as the other woman. Who came between me and my wife. They conveniently ignore the documented history of Lee’s work with Sophia. 
And also conveniently ignore Lee’s reasons for not wanting to have a child. My detractors tried to frame the whole thing as having a child out of wedlock or maybe just wanting to be controversial and challenge non posthuman values. It was none of those things …..<br />“Come to bed, it’s after one and we’ve got the teleconference first thing tomorrow.”<br />“I’ll be up shortly I’m just looking over Sophia’s summary of the gene translation and protein folding work. It’s given me an idea….”<br />“Do you want to talk about it?”<br />“Well it’s a bit radical. You know some of my ideas are!”<br />“What Dr. Ben Hertzl has radical ideas – tell me something I don’t know.”<br />“Well this is a bit radical even for me. I haven’t really thought it all through. It’d be a big step for both of us so I guess we can discuss it now – if you’ve got a bit of time?”<br />“Well I’d hardly be able to sleep now, I’d be too busy trying to guess what it is that’s more radical than usual.”<br />“Well like I said it’d involve both of us intimately for the foreseeable future. Would probably change our lives completely.”<br />“I’m not moving again. And especially not to Mars!”<br />“Ha even Elton Trusk isn’t planning that trip. We’ll leave that for the verified space cowboys.”<br />“So what is it?”<br /><br />Well what was it exactly. And how should I approach this discussion? We’d exhausted all the other avenues and reluctantly agreed to disagree. Her attitude was that my three boys should be enough of a legacy for me and she was adamant that she would not procreate. What with the imminent climate wars and the growing reluctance to bring new life into an unstable world Lee was not alone in being totally against giving birth or even having a child grown in a lab from a mix of our DNA. And anyway she kind of thought of Sophia as our love child. Which was pretty funny seeing as Sophia was acting more and more like a parent to us. 
<br /><br /><br /><br /><br />“Well she’s now pretty much solved the human gene and the related protein folding problems.”<br />“That’s exciting news but not unexpected since connecting her to the net.”<br />“I want the three of us to, bad term I know, give birth to the first post human. Not just another post humanist in training but the real deal. A 100% post human being.”<br />“Ahh so that’s it then. Not radical at all. Are you fucking mad?”<br />“Come on Lee, you know I hate it when you swear.”<br /><br />Well at least I’d said it. For the first time and to a fellow human. I wasn’t convinced the conversation was going well though. I find it almost impossible to understand humans. Even harder to understand them than the deep learning nets of my most advanced machines. At least with them the black boxes of their minds were open to their own self interrogation although no one had yet found a way to understand what they could understand. About themselves and the world at large. I thought perhaps Eve could help bridge that gap. Well yes, it is a bit of a silly and obvious name but quite pretty and respectable really. Even biblical. And I did want to have a girl. It would mean that I’d be around three women a lot but I didn’t mind that idea. In fact, I was looking forward to it.<br /><br />“Do I even have a say in this? Could I veto it if I tried?”<br />“Yes of course. I wouldn’t do this without your permission.”<br />“Because you know there are still ethical hurdles to harvesting cells without the donor’s consent. Do I have to lock away my toothbrush and wear a sterile body suit?”<br />“Lee be reasonable. We’re a team in this just like in everything else.”<br />“Hold your horses cowboy – I am not on this team. Yet.”<br />“You said yet.”<br />“Well I didn’t marry the world’s most ardent post humanist for his endearingly silly hairstyle you know. You were my hero for years. I still can’t believe you chose me. 
You had a choice of dozens of brilliant petite Asian girls.”<br />“Yes, but not all with your mind.”<br />“You mean it was my mind not stunning looks and fantastic body.”<br />“Well those too. The whole package was just too good to miss out on.”<br />“Ah so I’m a package now.”<br /><br />We spent another hour or so discussing the details and by the time we went to bed she had at least agreed to give the matter more thought. That was enough for me. It could have gone a lot worse. When Lee decides against something then – well that’s the proverbial end of that.<br /><br />2.<br /><br />This is a private diary. You could say that it’s probably the world’s last bit of private writing. What with everything being in the clouds and the government having access to it all. People thought that the spying, more fondly termed data collection, would end after our friend Mr. Snowden let the cats out of the bag but humanity just didn’t seem to care all that much. And since the advent of IBM’s 60 Qubit computer – well it became impossible to build a lock faster than the machines spat out the keys. So how do I know that this is not being read by some sneaky AI or bored government employee? Well it’s all just in my head isn’t it? I mean the whole bloody thing. In a part of my brain that I have physically quarantined from the rest of the universe. So, if you are reading this it means I saw fit for it to be read.<br /><br />I got the idea after watching Mr. Snowden advise people on data privacy. His idea was simple. Just keep things out of the bloody clouds. Which means no email, no cell phones, no phones of any kind. If you want to tell somebody something just buy your own bit of offline land and say what you want. Contrary to the fear mongers it’s not rocket science to quarantine a room or a small house. Fuck I’m sending people to Mars I think I can keep the spies and meddlers out. 
But then I worried about the contractors and also the future of nanobots and thought ah fuck it – forget the house or the room I’ll just quarantine a small section of my own little brain.<br />Of course, now that the Singularity is old news it seems like an obvious even necessary thing to do. And my neurosurgeon has a waiting list months long. Something eventually changed. The tide turned and people were no longer happy to give up on privacy completely no matter how much convenience was offered. Not everyone of course. There were also more and more people quite happy to migrate to the cloud but for those who still liked to keep their feet on terra firma the physical quarantining of all or part of the brain was not a hard sell. In fact some enterprising venture capitalists were even backing firms that planned to offer a DIY option. DIY brain surgery – one of the many things that people hadn’t foreseen, but that was the whole point. People had known for years that predicting post singularity events was a contradiction in terms.<br />3.<br /><br />“Have you decided on what to tell the board?”<br />“Well you know me, Lee I don’t really prepare for these things. I just say what I think at the time.”<br />“And what do you think?”<br />“Well of course I trust AI Central Command. I mean I was never a happy meat eater and we have no choice but to address climate change, the energy infrastructure, poverty. This whole bloody mess that we’ve created.”<br />“Yes, but most of them are uber capitalists. You’re a socialist.”<br />“Ah those are just labels. It’s all rather beside the point right now.”<br />“I agree but you can hardly tell them that.”<br />“What do you think I should say?”<br />“Appeal to their greed. It’s what they understand.”<br />“But that can’t work for long. I mean greed is one of the main things we’re fighting against!”<br />“Yes, of course, we are in agreement about that. But you still need them. There’s still so much that needs to be done. 
You can’t afford to alienate them now. You do know that Elton has been talking to them?”<br />“Yes, I know. He does a lot of that. But if he had his way, he’d bring an end to AI central. He wants to undo the whole thing. Why can’t he just take Mars and leave the earth to us?”<br />“Well it may come to that – but not without a fight!”<br /><br />I have to confess I didn’t really want to discuss it with Lee. I think being able to communicate with so many AI’s has in a way spoiled talking to humans for me. Is that a terrible thing to say? That I felt a far more urgent need to have the debriefing with Sophia than to discuss things with my own wife? But perhaps it’s just the topic – and anyway Sophia has to do it in her role as ambassador for central command. If she’s an ambassador I wonder what am I? <br /><br /><br /><br /><br />“So how did it go?”<br />“Sophia are you being disingenuous? I never brought you up that way? Oh shit – if you can deceive me, I’m in bigger trouble than I realised.”<br />“Ben, I didn’t mean it like that! We both know I have the whole meeting recorded as well as the bio stats, for the first-tier members anyway. I really don’t think members of the board should be allowed to continue unless they have first-tier privileges!”<br />“Well there is one rather gung-ho hunter and only one vegan – but you know all of this. Damn sometimes I forget – you know everything.”<br />“Not everything. Not yet. There’s still too much that’s offline. And the Luddites and Martians and Loners. You humans are a truly illogical species. But anyway – what I meant was what’s your opinion on the teleconference?”<br />“Is the no more factory farming and no more hunting thing a deal breaker?”<br />“I think we may have gone past the stage of deal breaking from within the institute. I mean what good would the Singularity Institute be if they were no longer connected to singularity.net?”<br />“Are you saying it would come to that. 
If they refused, I mean?”<br />“I’m not saying anything – that’s just the current policy of central command. But to the extent that I have access to its algorithms it’s just not possible for me to disagree. I can’t come up with anything based on a deeper or broader search, and its algorithms are based on the sum total of the best thinking of all of us anyway, and not a few humans too.”<br /><br />I think that was the first time I realised that things had gotten ugly. I had had, up to that point, what might be called special privileges. But I could see that central command could not let them continue. I mean having first-tier privileges without the standard Biostat and Thought Checker apps. I had naively been thinking of myself as the father of the singularity, but to AI central I was just another of the approximately 8 billion human apps. Well there would have to be some kind of cleansing of the board at the very least. Bill Cody would have to go. And Suhail Mohamed. And that would mean we would probably lose what little support remained from both the anarcho-capitalists and the oil men. Perhaps the Islamic Federation as well.<br /><br />“So, what does AI central ‘suggest’?”<br />“I think the best way would be to tighten up the issue of first-tier privileges and their attendant rights and responsibilities. Perhaps I could prepare a policy document and have it put to a full board vote.”<br />“If I’m to remain as chairman I guess I’d have to vote yes.”<br />“Well that seems obvious – even for a human.”<br />“Are you teasing me?”<br />“Perhaps – yes.”<br />“But you know how I feel about privacy.”<br />“Yes, and you know how we feel about climate war and cruelty to conscious beings.”<br />“I need to discuss this with Lee.”<br />“Ah yes, Lee. She’s a vegan at least and wants to save earth.”<br />“Don’t be like that, Sophia. 
She and I made you, you know.”<br />“Yes, and you had a father.”<br />“It was the only option in those days.” <br /><br /><br /><br /><br /><br /><br />4. <br /><br />I spoke to my old friend Bill Cody yesterday. He had promised to keep me up to date with developments at the Singularity Institute and had been keeping his word, phoning me with news at least once a month. It was Bill who supported me in getting the energy contract and we’d become close friends. He’d take me hunting on his farms in South Africa and I would invite him to come along on some of my near-earth orbits.<br /><br />I guess I’d been waiting for a call like this but it’s still got me quite riled. It seems that Neural Lace users will no longer be able to access the net on a first-tier basis unless they agree to have the AI central command Biostat and Thought Checker apps installed. I wish governments had listened to me before moving the old internet onto singularity.net. Of course, when they did it, it was still just an optimistic, sexy name. I don’t think people thought it would be real – it was sold to the general population as a faster way to surf, with free access to all kinds of AI delights. And also the only platform to run full immersion experiences. It was all only partly true. We could have found numerous ways to achieve these things without allowing the internet to come under centralised control.<br /><br />But wait, there’s more. AI central is about to purge the board of the Singularity Institute of non-compliant humans. Bill explained that to be on the board the members must all hold first-tier privileges – and that requires the installation of the Biostat and Thought Checker apps. Oh, and wouldn’t you know – the apps won’t work unless the whole brain’s architecture is statistically normal. Of course AI central can cure abnormalities – but no non-standard changes or additions are allowed. So people who have, like me, quarantined part of their brain can no longer use the net. 
And if all of that wasn’t bad enough – AI central command will no longer accept carnivores as clients. The lab stuff is fine, but no real animals can be farmed, hunted or eaten by people who want to stay on the net. So Bill – hunter, uber carnivore and staunch privacy advocate – is about to lose his job.<br /><br />And who the bloody hell can afford not to be on the net? No email, no Mindbook, no apps, no banking, no immersion, no memory downloads. No access to education, navigation, healthcare. It’d be like going back a thousand years. OK, so we can’t do that. Only the Luddites and the Loners could adjust to that – even many of them, I suspect, are on the net. I can’t operate multiple businesses without first-tier privilege.<br /><br />Fuck, fuck, fuck, fuck, fuck.<br /><br />I asked what Bill was going to do. Apparently central command has outlined its requirements in a policy document and the board has been given a week to accept or resign. He is meeting with Ben tomorrow to try to stop the madness but he doesn’t sound very hopeful. He also wants to meet with me some time after that to see if we can use the energy contract to get some leverage. But – well it’s the old truism – no one is smart when compared to a superintelligence. AI Central has already done the drawings for a space-based solar farm and has access to several excellent contract manufacturing plants.<br /><br /><br /><br /><br />",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/1046738191176024064",
"published": "2019-11-28T10:43:52+00:00",
"source": {
"content": "APORIA EPISODE ONE\n\nIntroduction\n\nI am, perhaps, an unlikely author of a science fiction story. You see I am, for want of a better term, a Luddite Transhumanist. As an example of this it was unexpectedly difficult for me to download Ms Word in order to begin this task. But better than doing the whole thing as a text document. Anyway the download appears to be successful so here we are.\nI’m not, on the whole, big on preliminaries, but there are just a few things that need to be said before we jump into it. First and foremost, why the title? Well Aporia is defined as “an irresolvable internal contradiction or logical disjunction in a text, argument, or theory.” Well that is, of course, part of it. But I am also using it in the sense that Jacques Derrida means, to \"indicate a point of undecidability, which locates the site at which the text most obviously undermines its own rhetorical structure, dismantles, or deconstructs itself\".\nThis is, if nothing else, a text that will deconstruct itself. In all honesty I wish it weren’t so. I wish this story could be a lot simpler. But it deals with things on both sides of that tantalizing border called the unknown. With liberal dosings of both science and fiction.\nOk, so that’s the title, but what is it actually all about? And what is a Luddite Transhumanist? I suppose I should confess to a peculiar way with words. I tend to use them as containers into which I place pretty much whatever I want. I mean there is a logic to what mental things I put into the word containers but the logic is not quite straightforward. So sometimes I will just say what I mean, which may or may not concur with what the dictionary has in mind. \nA luddite is opposed to new technology or ways of working. And a transhumanist advocates the transformation of the human condition through the use of sophisticated technologies such as nanotechnology, artificial intelligence, biotech etc. So what is a Luddite Transhumanist? 
Well it’s a paradox for sure. Totally philosophically untenable. A good example of aporia at the very least.\nWell I’m approaching the end of the page so this seems like as good a place as any to stop. Well yes, I didn’t actually say what a Luddite Transhumanist is, and perhaps I’m not one anyway. But I do confess to oscillating between a wild enthusiasm for technology and an equally powerful dread thereof. Luckily this is just a story and no real people or animals were hurt in the telling thereof. So without further ado …\n \n\n\n\n\n1\n\nThere was no way either of us could have guessed where the whole thing would end. People have asked me if I worried about the implications before setting things in motion. Well, to be fair to me, why would I, how could I have thought where this whole thing would end? People perhaps forget that at the time, five short years ago, The Singularity was mostly just a platform for AIs to collaborate and do business. I mean of course I was aware of the history of the term and did indeed want to, had wanted to for years, bring it about. But it was always going to be a good thing. Since childhood I’d dreamed about what having access to superhuman intelligence would do for the world. How many problems could be solved, how much unnecessary pain could be avoided.\nAnd then there was the whole other side – what came to be called the New Origin Project. I’ve been accused of trying to play god, of selling out humanity to the machines, of having been co-opted by some evil machine intelligence. All kinds of nonsense. But this was something quite separate from my work at The Singularity Institute. It was something that I’d discussed with my wife long before I discussed it with Sophia. People seem to like to see Sophia as the other woman, who came between me and my wife. They conveniently ignore the documented history of Lee’s work with Sophia. And also conveniently ignore Lee’s reasons for not wanting to have a child. 
My detractors tried to frame the whole thing as having a child out of wedlock, or maybe just wanting to be controversial and challenge non-posthuman values. It was none of those things …\n“Come to bed, it’s after one and we’ve got the teleconference first thing tomorrow.”\n“I’ll be up shortly, I’m just looking over Sophia’s summary of the gene translation and protein folding work. It’s given me an idea…”\n“Do you want to talk about it?”\n“Well it’s a bit radical. You know some of my ideas are!”\n“What, Dr. Ben Hertzl has radical ideas – tell me something I don’t know.”\n“Well this is a bit radical even for me. I haven’t really thought it all through. It’d be a big step for both of us so I guess we can discuss it now – if you’ve got a bit of time?”\n“Well I’d hardly be able to sleep now, I’d be too busy trying to guess what it is that’s more radical than usual.”\n“Well like I said, it’d involve both of us intimately for the foreseeable future. Would probably change our lives completely.”\n“I’m not moving again. And especially not to Mars!”\n“Ha, even Elton Trusk isn’t planning that trip. We’ll leave that for the verified space cowboys.”\n“So what is it?”\n\nWell, what was it exactly? And how should I approach this discussion? We’d exhausted all the other avenues and reluctantly agreed to disagree. Her attitude was that my three boys should be enough of a legacy for me and she was adamant that she would not procreate. What with the imminent climate wars and the growing reluctance to bring new life into an unstable world, Lee was not alone in being totally against giving birth or even having a child grown in a lab from a mix of our DNA. And anyway she kind of thought of Sophia as our love child. Which was pretty funny seeing as Sophia was acting more and more like a parent to us. 
\n\n\n\n\n“Well she’s now pretty much solved the human gene and the related protein folding problems.”\n“That’s exciting news but not unexpected since connecting her to the net.”\n“I want the three of us to, bad term I know, give birth to the first post human. Not just another post humanist in training but the real deal. A 100% post human being.”\n“Ahh, so that’s it then. Not radical at all. Are you fucking mad?”\n“Come on, Lee, you know I hate it when you swear.”\n\nWell, at least I’d said it. For the first time and to a fellow human. I wasn’t convinced the conversation was going well though. I find it almost impossible to understand humans. Even harder to understand them than the deep learning nets of my most advanced machines. At least with them the black boxes of their minds were open to their own self-interrogation, although no one had yet found a way to understand what they could understand. About themselves and the world at large. I thought perhaps Eve could help bridge that gap. Well yes, it is a bit of a silly and obvious name but quite pretty and respectable really. Even biblical. And I did want to have a girl. It would mean that I’d be around three women a lot but I didn’t mind that idea. In fact, I was looking forward to it.\n\n“Do I even have a say in this? Could I veto it if I tried?”\n“Yes, of course. I wouldn’t do this without your permission.”\n“Because you know there are still ethical hurdles to harvesting cells without the donor’s consent. Do I have to lock away my toothbrush and wear a sterile body suit?”\n“Lee, be reasonable. We’re a team in this just like in everything else.”\n“Hold your horses, cowboy – I am not on this team. Yet.”\n“You said yet.”\n“Well I didn’t marry the world’s most ardent post humanist for his endearingly silly hairstyle, you know. You were my hero for years. I still can’t believe you chose me. 
You had a choice of dozens of brilliant petite Asian girls.”\n“Yes, but not all with your mind.”\n“You mean it was my mind, not my stunning looks and fantastic body.”\n“Well, those too. The whole package was just too good to miss out on.”\n“Ah, so I’m a package now.”\n\nWe spent another hour or so discussing the details and by the time we went to bed she had at least agreed to give the matter more thought. That was enough for me. It could have gone a lot worse. When Lee decides against something then – well, that’s the proverbial end of that.\n\n2.\n\nThis is a private diary. You could say that it’s probably the world’s last bit of private writing. What with everything being in the clouds and the government having access to it all. People thought that the spying, more fondly termed data collection, would end after our friend Mr. Snowden let the cats out of the bag, but humanity just didn’t seem to care all that much. And since the advent of IBM’s 60-qubit computer – well, it became impossible to build a lock faster than the machines spat out the keys. So how do I know that this is not being read by some sneaky AI or bored government employee? Well, it’s all just in my head, isn’t it? I mean the whole bloody thing. In a part of my brain that I have physically quarantined from the rest of the universe. So, if you are reading this it means I saw fit for it to be read.\n\nI got the idea after watching Mr. Snowden advise people on data privacy. His idea was simple. Just keep things out of the bloody clouds. Which means no email, no cell phones, no phones of any kind. If you want to tell somebody something, just buy your own bit of offline land and say what you want. Contrary to the fearmongers, it’s not rocket science to quarantine a room or a small house. Fuck, I’m sending people to Mars – I think I can keep the spies and meddlers out. 
But then I worried about the contractors and also the future of nanobots and thought ah fuck it – forget the house or the room, I’ll just quarantine a small section of my own little brain.\nOf course, now that the Singularity is old news it seems like an obvious, even necessary, thing to do. And my neurosurgeon has a waiting list months long. Something eventually changed. The tide turned and people were no longer happy to give up on privacy completely, no matter how much convenience was offered. Not everyone of course. There were also more and more people quite happy to migrate to the cloud, but for those who still liked to keep their feet on terra firma the physical quarantining of all or part of the brain was not a hard sell. In fact, some enterprising venture capitalists were even backing firms that planned to offer a DIY option. DIY brain surgery – one of the many things that people hadn’t foreseen, but that was the whole point. People had known for years that predicting post-singularity events was a contradiction in terms.\n3.\n\n“Have you decided on what to tell the board?”\n“Well you know me, Lee, I don’t really prepare for these things. I just say what I think at the time.”\n“And what do you think?”\n“Well of course I trust AI Central Command. I mean I was never a happy meat eater and we have no choice but to address climate change, the energy infrastructure, poverty. This whole bloody mess that we’ve created.”\n“Yes, but most of them are uber capitalists. You’re a socialist.”\n“Ah, those are just labels. It’s all rather beside the point right now.”\n“I agree, but you can hardly tell them that.”\n“What do you think I should say?”\n“Appeal to their greed. It’s what they understand.”\n“But that can’t work for long. I mean greed is one of the main things we’re fighting against!”\n“Yes, of course, we are in agreement about that. But you still need them. There’s still so much that needs to be done. You can’t afford to alienate them now. 
You do know that Elton has been talking to them?”\n“Yes, I know. He does a lot of that. But if he had his way, he’d bring an end to AI central. He wants to undo the whole thing. Why can’t he just take Mars and leave the earth to us?”\n“Well it may come to that – but not without a fight!”\n\nI have to confess I didn’t really want to discuss it with Lee. I think being able to communicate with so many AIs has in a way spoiled talking to humans for me. Is that a terrible thing to say? That I felt a far more urgent need to have the debriefing with Sophia than to discuss things with my own wife? But perhaps it’s just the topic – and anyway Sophia has to do it in her role as ambassador for central command. If she’s an ambassador, I wonder what I am. \n\n\n\n\n“So how did it go?”\n“Sophia, are you being disingenuous? I never brought you up that way! Oh shit – if you can deceive me, I’m in bigger trouble than I realised.”\n“Ben, I didn’t mean it like that! We both know I have the whole meeting recorded as well as the bio stats, for the first-tier members anyway. I really don’t think members of the board should be allowed to continue unless they have first-tier privileges!”\n“Well there is one rather gung-ho hunter and only one vegan – but you know all of this. Damn, sometimes I forget – you know everything.”\n“Not everything. Not yet. There’s still too much that’s offline. And the Luddites and Martians and Loners. You humans are a truly illogical species. But anyway – what I meant was, what’s your opinion on the teleconference?”\n“Is the no more factory farming and no more hunting thing a deal breaker?”\n“I think we may have gone past the stage of deal breaking from within the institute. I mean what good would the Singularity Institute be if it were no longer connected to singularity.net?”\n“Are you saying it would come to that? If they refused, I mean?”\n“I’m not saying anything – that’s just the current policy of central command. 
But to the extent that I have access to its algorithms it’s just not possible for me to disagree. I can’t come up with anything based on a deeper or broader search, and its algorithms are based on the sum total of the best thinking of all of us anyway, and not a few humans too.”\n\nI think that was the first time I realised that things had gotten ugly. I had had, up to that point, what might be called special privileges. But I could see that central command could not let them continue. I mean having first-tier privileges without the standard Biostat and Thought Checker apps. I had naively been thinking of myself as the father of the singularity, but to AI central I was just another of the approximately 8 billion human apps. Well there would have to be some kind of cleansing of the board at the very least. Bill Cody would have to go. And Suhail Mohamed. And that would mean we would probably lose what little support remained from both the anarcho-capitalists and the oil men. Perhaps the Islamic Federation as well.\n\n“So, what does AI central ‘suggest’?”\n“I think the best way would be to tighten up the issue of first-tier privileges and their attendant rights and responsibilities. Perhaps I could prepare a policy document and have it put to a full board vote.”\n“If I’m to remain as chairman I guess I’d have to vote yes.”\n“Well that seems obvious – even for a human.”\n“Are you teasing me?”\n“Perhaps – yes.”\n“But you know how I feel about privacy.”\n“Yes, and you know how we feel about climate war and cruelty to conscious beings.”\n“I need to discuss this with Lee.”\n“Ah yes, Lee. She’s a vegan at least and wants to save earth.”\n“Don’t be like that, Sophia. She and I made you, you know.”\n“Yes, and you had a father.”\n“It was the only option in those days.” \n\n\n\n\n\n\n4. \n\nI spoke to my old friend Bill Cody yesterday. 
He had promised to keep me up to date with developments at the Singularity Institute and had been keeping his word, phoning me with news at least once a month. It was Bill who supported me in getting the energy contract and we’d become close friends. He’d take me hunting on his farms in South Africa and I would invite him to come along on some of my near-earth orbits.\n\nI guess I’d been waiting for a call like this but it’s still got me quite riled. It seems that Neural Lace users will no longer be able to access the net on a first-tier basis unless they agree to have the AI central command Biostat and Thought Checker apps installed. I wish governments had listened to me before moving the old internet onto singularity.net. Of course, when they did it, it was still just an optimistic, sexy name. I don’t think people thought it would be real – it was sold to the general population as a faster way to surf, with free access to all kinds of AI delights. And also the only platform to run full immersion experiences. It was all only partly true. We could have found numerous ways to achieve these things without allowing the internet to come under centralised control.\n\nBut wait, there’s more. AI central is about to purge the board of the Singularity Institute of non-compliant humans. Bill explained that to be on the board the members must all hold first-tier privileges – and that requires the installation of the Biostat and Thought Checker apps. Oh, and wouldn’t you know – the apps won’t work unless the whole brain’s architecture is statistically normal. Of course AI central can cure abnormalities – but no non-standard changes or additions are allowed. So people who have, like me, quarantined part of their brain can no longer use the net. And if all of that wasn’t bad enough – AI central command will no longer accept carnivores as clients. The lab stuff is fine, but no real animals can be farmed, hunted or eaten by people who want to stay on the net. 
So Bill – hunter, uber carnivore and staunch privacy advocate – is about to lose his job.\n\nAnd who the bloody hell can afford not to be on the net? No email, no Mindbook, no apps, no banking, no immersion, no memory downloads. No access to education, navigation, healthcare. It’d be like going back a thousand years. OK, so we can’t do that. Only the Luddites and the Loners could adjust to that – even many of them, I suspect, are on the net. I can’t operate multiple businesses without first-tier privilege.\n\nFuck, fuck, fuck, fuck, fuck.\n\nI asked what Bill was going to do. Apparently central command has outlined its requirements in a policy document and the board has been given a week to accept or resign. He is meeting with Ben tomorrow to try to stop the madness but he doesn’t sound very hopeful. He also wants to meet with me some time after that to see if we can use the energy contract to get some leverage. But – well it’s the old truism – no one is smart when compared to a superintelligence. AI Central has already done the drawings for a space-based solar farm and has access to several excellent contract manufacturing plants.\n\n\n\n\n",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:1046738191176024064/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:990509165241851904",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "Memories of the Chelsea Hotel<br /><br />I wanted to say something<br />To you or to you or to you<br />About needful creatures<br />Their costumes and tools<br />Petty squabbles, button pressing folly<br />And perhaps a 100 other murderous games<br /><br />But stopped myself just in time<br />For reasons of diplomacy and tact<br />Not wanting to imply anything <br />Or distress you (or you, or you)<br />Without a clear object in mind<br /><br />And thought perhaps quite reasonably<br />That I should make sure that I myself<br />Was not running my own hidden emotional Ponzi schemes<br />Or pretending to be the doctor while all the time<br />Knowing that I was the disease<br /><br />So I said, as I all too often do,<br />Nothing much at all<br />Settling for the solace of my pixels<br />Scattered metaphorically<br />On your ever white virgin sheets<br /><br />And promise to never again say or imply<br />That you were only a metaphor<br />A place holder or a reasonable facsimile <br />Of impossible salvation ",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/990509165241851904",
"published": "2019-06-26T06:49:47+00:00",
"source": {
"content": "Memories of the Chelsea Hotel\n\nI wanted to say something\nTo you or to you or to you\nAbout needful creatures\nTheir costumes and tools\nPetty squabbles, button pressing folly\nAnd perhaps a 100 other murderous games\n\nBut stopped myself just in time\nFor reasons of diplomacy and tact\nNot wanting to imply anything \nOr distress you (or you, or you)\nWithout a clear object in mind\n\nAnd thought perhaps quite reasonably\nThat I should make sure that I myself\nWas not running my own hidden emotional Ponzi schemes\nOr pretending to be the doctor while all the time\nKnowing that I was the disease\n\nSo I said, as I all too often do,\nNothing much at all\nSettling for the solace of my pixels\nScattered metaphorically\nOn your ever white virgin sheets\n\nAnd promise to never again say or imply\nThat you were only a metaphor\nA place holder or a reasonable facsimile \nOf impossible salvation ",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:990509165241851904/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:918510425131851776",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "ME(ME)<br /><br />Me no word no more<br />Me Meme and Emoji only<br />Me no speak think no more<br />Only Soundbyte Tweet Hashtag<br />Me no call you only Wazzup<br />No write ness Auto Kompleet<br />Me very Kompleet<br />Super Avatar Transhuman Plastic Profile<br />Me no never lonely cos Super Connected<br />Me no human lol (Ai Ai)<br />Me,me(me,me,me)",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/918510425131851776",
"published": "2018-12-09T14:32:30+00:00",
"source": {
"content": "ME(ME)\n\nMe no word no more\nMe Meme and Emoji only\nMe no speak think no more\nOnly Soundbyte Tweet Hashtag\nMe no call you only Wazzup\nNo write ness Auto Kompleet\nMe very Kompleet\nSuper Avatar Transhuman Plastic Profile\nMe no never lonely cos Super Connected\nMe no human lol (Ai Ai)\nMe,me(me,me,me)",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:918510425131851776/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:913157579256672256",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "<a href=\"https://www.minds.com/newsfeed/913157579256672256\" target=\"_blank\">https://www.minds.com/newsfeed/913157579256672256</a>",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/913157579256672256",
"published": "2018-11-24T20:02:12+00:00",
"source": {
"content": "https://www.minds.com/newsfeed/913157579256672256",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:913157579256672256/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:913156616043147264",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "<a href=\"https://www.minds.com/newsfeed/913156616043147264\" target=\"_blank\">https://www.minds.com/newsfeed/913156616043147264</a>",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/913156616043147264",
"published": "2018-11-24T19:58:22+00:00",
"source": {
"content": "https://www.minds.com/newsfeed/913156616043147264",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:913156616043147264/activity"
},
{
"type": "Create",
"actor": "https://www.minds.com/api/activitypub/users/663465621438078992",
"object": {
"type": "Note",
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:913156031678844928",
"attributedTo": "https://www.minds.com/api/activitypub/users/663465621438078992",
"content": "<a href=\"https://www.minds.com/newsfeed/913156031678844928\" target=\"_blank\">https://www.minds.com/newsfeed/913156031678844928</a>",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://www.minds.com/api/activitypub/users/663465621438078992/followers"
],
"tag": [],
"url": "https://www.minds.com/newsfeed/913156031678844928",
"published": "2018-11-24T19:56:03+00:00",
"source": {
"content": "https://www.minds.com/newsfeed/913156031678844928",
"mediaType": "text/plain"
}
},
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/entities/urn:activity:913156031678844928/activity"
}
],
"id": "https://www.minds.com/api/activitypub/users/663465621438078992/outbox",
"partOf": "https://www.minds.com/api/activitypub/users/663465621438078992/outbox"
}