I’ve been binge-watching Cyberpunk 2077 reviews and playthroughs. The game looks amazing, and also amazingly glitchy. I have an older generation console, which means I’d either be getting a buggy mess or have to install massive updates, which with my internet connection would take forever. But what can one make of a game that gives us this:

And also this:

I’m sure I wouldn’t mind the bugs, though. My favorite game of all time is KotOR II. Fallout: New Vegas and Mass Effect: Andromeda are not far behind. Kids these days are spoiled, expecting their immersive sandbox RPGs to work perfectly. Why, in my day, we played KotOR on a backwards-compatible Xbox 360, and the frame rate would slow down to, like, three frames per second during combat and the sound would randomly cut out. And we were grateful!

No, my real issue with this game is the same thing that makes me so interested in it: it’s too big. The sheer amount of time I could spend in the neon dystopia of Night City, fighting street gangs and cyborgs, driving futuristic cars and customizing my avatar’s weaponry, abilities and appearance, is daunting.

But of course, that’s the whole selling point of the thing! We want to distract ourselves from daily life by escaping into an anarchic world polluted by drugs and depravity, where the only law is made by ruthless mega-corporations. [Editor’s note: something about that sentence seems off–consider revising.] Who wouldn’t want that? But then I look at these skill trees and shake my head. There’s no way I’d have time for all this.

An aesthetic, at least to someone who appreciates it, is like a drug. The more of it you get, the more you need to get that same feeling it gave you on the first hit. The thrill of a bizarre techno-city that filmgoers of 1927 could get from Metropolis now requires 100GB of a high-saturation battlefield with police chases and cyborgs. And Keanu Reeves–although somehow I feel that if he were transported to 1927, he could have fit into Fritz Lang’s world, too. Maybe he’s a time-traveling cyber-wizard.

There are a couple of ways this trend of accelerating aesthetic experience can go: one is total immersion in VR worlds. The technology is almost there. I wouldn’t be one myself, but I can believe there are people so obsessed with aesthetic experiences that they would submerge their physical bodies in some type of suspended animation and don the headgear to live 24/7 in a simulation. If this sounds creepy–and/or reminds you of another film starring Mr. Reeves–it should. But it shouldn’t be surprising. The hallmark of the classic decadent is, frankly, a weird obsession with aesthetic authenticity. As the crazed protagonist of Lovecraft’s 1922 short story The Hound put it:

Wearied with the commonplaces of a prosaic world; where even the joys of romance and adventure soon grow stale, St John and I had followed enthusiastically every aesthetic and intellectual movement which promised respite from our devastating ennui. The enigmas of the symbolists and the ecstasies of the pre-Raphaelites all were ours in their time, but each new mood was drained too soon, of its diverting novelty and appeal. 

Only the somber philosophy of the decadents could help us, and this we found potent only by increasing gradually the depth and diabolism… till finally there remained for us only the more direct stimuli of unnatural personal experiences and adventures. 

If you’re a horror fan who has never read The Hound, do yourself a favor and check it out–it’s one of HPL’s best. If, on the other hand, just reading that has skeeved you out so much you don’t even want to know what the narrator and St John got up to next, well, I can’t blame you. Let’s just say it ends badly for them.

While researching this post–yes, believe it or not, I actually do research on these things before I vomit them forth into the blogosphere–I found this fascinating US ad for Metropolis:

This looks like a poster for a musical comedy. Maybe even a parody of a musical comedy, as written by P.G. Wodehouse: “What ho! It seems young Freder has fallen into the soup once more…”

Can you imagine going to see the movie you think you’re getting based on that ad and actually seeing Metropolis? I’m not sure I can, although I’d sure like to. The dissonance is so strong that it’s actually beautiful. I don’t care if you’re running it on the best gaming PC with the best monitor in existence; I’ll bet you anything that Cyberpunk 2077 can’t hold a candle to the pure, unexpected, unadulterated shock of techno-decadentism you would get from that experience.

Which brings me to the other path I could see this aesthetic going: a future where cyber-aesthetes learn discipline, focus, and restraint. Values which are, admittedly, fundamentally not the values of a decadent… but hear me out on this.

I’ve used this analogy before, but think of entertainment media as a sort of expansion pack for your imagination. A book gives you raw words that you have to imagine. A black-and-white silent film like Metropolis gives you some images to work with, but it’s still not a whole visual world. A color film gives you more, a video game still more, and VR games are basically taking the place of your imagination. 

The more you’re relying on media, the less you’re using your imagination. Which I totally understand, by the way. Using your imagination is hard. It’s awfully tempting to just sit back and let the computer do the work. And we’re right back to option one, where we lose ourselves in VR worlds.

Except, as we have seen, we’re already at the limits of how much of this we can take. At least, I am. I admit that your mileage may vary. But for me, the prospect is overwhelming. Immersion is supposed to be a feature, not a bug, and yet it’s also my principal reason for not buying this game.

But all that is really needed to experience an aesthetic–whether techno-decadentism or anything else–is imagination. Like Arthur Conan Doyle’s famous fictional detective said, “From a drop of water… a logician could infer the possibility of an Atlantic or a Niagara without having seen or heard of one or the other.” Likewise, a cyberpunk future can be inferred simply from reading a single sentence describing it. All that’s left is the imagining. The hard part is sharing it. But since the entire thesis of decadents is that art is personal, does that even matter that much? Surely anyone like-minded enough to care will be able to understand it well enough to share it.

Either we will learn enough self-discipline and mental concentration that we can satisfy our desire to create whole worlds made purely of imagination, without the need to manifest them through technology, or we will create simulation technology so powerful that our own most human qualities become obsolete. 

As Paul Graham once wrote, “we’ll increasingly be defined by what we say no to.” Aesthetics, which are really just the crystallization of moods that we imagine, are wonderful and strange. But saying “no” to the ability to have such things created for us in simulations is increasingly our only hope of retaining the ability to imagine at all. And after all, saying “no” to what is normal and routine is the essence of counter-culture, punk, and decadence.


I still use an old flip phone. It makes calls. It can send texts, albeit not long ones. It even has a camera, although the lens is so smudged it’s basically useless.

Would it be fun to have a phone with apps and a better camera and a connection to cloud storage? Sure, it would. In fact, that’s exactly the problem–I’d spend all of my time on it.

Carrie Rubin tweeted this earlier today:

By coincidence, I was reading Paul Graham’s 2010 essay, “The Acceleration of Addictiveness” earlier in the day, in which he says:

“Most people I know have problems with Internet addiction. We’re all trying to figure out our own customs for getting free of it. That’s why I don’t have an iPhone, for example; the last thing I want is for the Internet to follow me out into the world.”

He’s right. Our challenge now is to get away from all the technology. Like I wrote the other week, it’s getting harder and harder to escape technology’s ever-accelerating growth. We are getting swamped by it.

The flip phone is bad enough as it is. Recently, I read that keeping your phone in your pocket (where I’d always kept it) can cause male infertility.¹ So I started keeping my phone in a briefcase, and leaving it behind when I go for a walk or go to the gym. It was amazing how liberating this felt—rather than checking the time every couple minutes, or looking to see if I had new messages, I just figured “it can wait”. And it can. 

I realize that sometimes you want to have your phone. I’m fortunate in that my gym is practically next door to where I live. If it were farther, and I wanted to take my phone, I’d take a gym bag. But I’m rapidly getting addicted to going for walks without it. If you feel unsafe walking alone without your phone, I suggest trying to find a friend or group of friends to go with you—you can have better conversation and get some exercise as well.²

When I wrote The Directorate, I ran up against the problem of how to devise some even more powerful and omnipresent technology than smart phones for the characters to use. It seemed like they’d have that by 2223. But the more I thought about it, the more I started to think our current technologies dominate life to a degree that already seems like something out of sci-fi. And at that point, I realized the really futuristic innovation might be people opting out of being constantly attached to their communication devices.

I’m not anti-technology by any stretch. I couldn’t do most of the stuff that I do for work and for fun without computers, game consoles and, of course, my trusty iPad. I wouldn’t have anybody to write this for if the internet didn’t connect me with wonderful people all over the world. But as with all good things, you need to have some discipline so you don’t overdo it. A smart phone just makes it that much harder for me to maintain that discipline.

Footnotes

  1. To be fair, the evidence on this is mixed. When I researched it, I found plenty of places saying there was “no clear link” as well. Cell phones are relatively new; it’ll probably be a while yet before the researchers come to any definite conclusions. But I’m playing it safe on this one.
  2. I know, there’s something to be said for solo walks, too. Believe me, I’m a misanthrope and an introvert; I get it.


The number one issue that humanity faces today is technological growth. If you look under the surface of most political issues, what drives them is the way technology has given us abilities we did not previously have, or information we could not previously have accessed.

What makes this especially powerful is that technology evolves much faster than human beings do. Technology can go through many generations in the course of one human’s lifetime.

This is important, because in evolutionary biology, new traits usually emerge over the course of generations. (This is why biologists usually study traits in organisms with short generations. You can observe multiple generations of flies over the course of a year.)

But since technology moves faster than humans can evolve new traits, it means that we are always playing from behind. When I was born, cell phones were huge, unwieldy things used by rich people, sales reps, and techies. Now they’re everywhere, and are more powerful than the top-of-the-line supercomputers of three decades ago.

For the last 200 years, technological progress has been increasing at an incredible rate. And humans have often suffered by being slow to adapt. This is illustrated most dramatically by wars: in World War I, the officers had all been trained in tactics derived from the Napoleonic era. This resulted in huge massacres, as cavalry and infantry charges–which would have worked against men with inaccurate single-shot rifles–were torn to pieces by machine guns. Technology had made a huge leap in the century between the battle of Waterloo and the battle of Artois. And that was as nothing compared to the leap it would make in the next thirty years, with the advent of the Atomic Bomb.

The thing is, while it may seem to us like a long time since the days of cavalry charges and flintlock rifles, in terms of human history, that’s a drop in the bucket. Homo sapiens first emerged roughly 200,000 years ago. On that scale, Waterloo might as well be yesterday, and the Roman Empire was just last week.

For the vast majority of our existence, life was pretty much the same: people worked the land and hunted and raised families. Periodically, they organized into larger tribes to make war or trade. If you took a citizen of Athens from, say, 450 BCE and transported him to Renaissance Italy—nearly 2000 years later–he’d still have a pretty good handle on how things worked once he got past the language barrier. Whereas if you transported somebody from 1890s London to the present day—a mere 128 years!—he’d have no idea what was happening.

When you read history, it’s easy to be struck by how human nature seems unchanged over the centuries. We can recognize things in the texts of the earliest historians and philosophers that seem analogous to modern phenomena. While it may seem like this means human nature is eternal, what it really signifies is that it hasn’t been that long, in biological terms, since what we think of as “ancient times”.

It’s commonplace to observe that technology changes but human nature remains the same. But observing it is one thing; grasping the full implications is another.

For instance, there is a major “culture war” debate in the U.S. over the issue of transgender rights. Those who favor transgender rights view their opponents as closed-minded bigots. Those opposed see the others as radicals bent on destroying the social order. What both sides ignore is the fact that until very recently, transgender people had no medical treatment available to them. For hundreds of thousands of years, transgender people had no option but to live in the body they were born with. And the rest of the population scarcely even knew they existed, and so built whole societies premised on two rigid gender roles. It wasn’t until very recent breakthroughs in medical technology that any other option became viable.

Once you view it in these terms, you realize it isn’t a liberal plot to destabilize society, but simply a group of people able to access treatment that previously did not exist. Likewise, you also realize the reason so many people are responding with fear and suspicion is that history and tradition provide no guidelines for how to deal with the issue. It simply wasn’t possible in the past.

A number of social conflicts, I suspect, are in fact the result of people being optimized for a very different world than the one we actually live in. Ancient prohibitions against homosexuality, sodomy, and other non-reproductive sexual behavior made some sense in the context of their time—in the past, when mortality rates were high, people needed everyone who was physically capable of reproducing to do so, personal feelings notwithstanding. It was about survival of the group, not any one individual.

Nowadays humanity is threatened more by overpopulation than by extinction—but we’re still adapted to the world of thousands of years ago. That’s just one example. I think people in the developed world still have a slightly irrational fear of famine, simply because we evolved over millennia in which food was, in fact, extremely scarce. (This is why philosophies like the so-called “abundance mentality” seem so counter-intuitive. In the past, it would’ve been suicide to assume there were enough resources for everybody.)

Instinct is a powerful thing, and incredibly dangerous when it gets outdated. To borrow an example from Paul Graham: because human beings didn’t have the power of flight until recently, it’s easy for our senses to be fooled in bad visibility.

Of course, this is something where we use technology to make up for our own shortcomings. A human being would have no idea how to fly a plane if not for instruments that correctly show the position of the aircraft. And this leads to another obvious point about technological evolution—it is, in many ways, nothing short of miraculous for humans. It allows us to accomplish things our ancestors could never have imagined. Whatever bad side effects it has, no one could ever rationally argue that we’d be better off getting rid of all of it and returning to primitive life.

The saving grace is that technology has been designed by humans and for humans, and so generally is compatible with the needs of humans. The things that conflict with human needs aren’t usually a direct result of this, but rather side-effects the designers never thought of.

But side-effects, almost by definition, are insidious. Any obvious, seriously harmful side-effect gets fixed early on. The ones that don’t usually fall into one or more of the following categories:

  • Not obvious
  • Don’t seem harmful at first
  • Can’t be fixed without destroying the benefit

The designers of automobiles probably never thought the exhaust would cause pollution; even if they had, they probably wouldn’t have realized that cars would be widely used enough for it to matter. Marie and Pierre Curie had no idea the new element they had discovered was dangerous. It seemed like just a useful luminous substance. And pretty much every communications technology in history, from the printing press on, has the potential to spread pernicious lies and propaganda just as much as news and useful information. But no one can figure out a way to remove the bad information without also getting rid of the good—the only option is censorship, which can pose a danger in its own right.

I’ll say it again for emphasis: technology is evolving faster than humans. As a result, our instincts will often lie to us when it comes to dealing with technology. It’s the same way modern junk food is engineered to please our taste buds while poisoning our bodies—it’s designed to set off all the right sensors that tell us “get more of this”.

The rise of nationalism throughout the world in the last decade has gone hand-in-hand with the rise of social media. It’s not a coincidence. Social media plays to an old instinct that takes human society back to its most basic state: the tribe, and the desire to win approval from that tribe. But in the past, when we were in tribes, we didn’t know what the rival tribes or nation-states were doing—they were in far-off lands, and we rarely encountered them. But now, we always know what they are doing—they are just a click away. And because they are a different tribe, our instincts tell us to fear them, as our ancestors feared invaders from distant places.

What can we do about this? We can’t get rid of technology; nor would we want to. And I don’t think it’s a good idea to make it into a political question. Politicians want easy, non-nuanced issues, where they can cast themselves as leaders of a huge, virtuous majority against a tiny, vaguely-defined band of evildoers. It would be a terrible thing for that to happen with this issue. As we’ve already seen with the social issues I mentioned earlier, politicians tend to cast these things as moral questions rather than questions of technological change.

We’re going to have to deal with this one on our own. But how? After all, technology brings huge benefits. How can we keep getting those while minimizing the side effects? We don’t want to completely ignore our instincts—not all of them are outdated, after all—but we can’t always trust them, either.

The best advice I can give is to always be on the lookout for what side-effects technology produces in your own life. Always ask yourself what it’s causing you to do differently, and why. Then you’ll start to look for the same in the wider world. We know human nature doesn’t change that much; so when you see or read about a large number of people behaving in an unusual way, or a new cultural phenomenon, there’s a decent chance that it’s in response to some new technology.

It’s easy to look at the dangers of technology and decide you want to opt out, throw it all away, and return to the simple life. This is probably healthy in small doses but it’s impractical on a large scale or for an entire lifetime. What I’m advising is cultivating an attitude of extreme adaptability, where you are so flexible that you can both use new technology and see the potential problems with it coming before they hit you. Listen to your instincts, but know when you need to disregard them. Remember, your instincts are optimized to give you the best chance at survival in a world of agrarian societies and/or tribes of hunter-gatherers. And they are damn good at it; but their mileage may vary in a world of computers, nanomachines, and space travel.

First of all, thanks are in order to loyal reader Natalie of boatsofoats.com. She notified me about a problem with the annotations on this page. I’m not even sure if I’ve completely fixed it yet, but I figure if not, I can at least make it up to her by directing some traffic to her excellent blog.

As for the annotations: I know nothing about HTML. But doing the original annotations for that page was not bad–it was just this:

<span title="Whatever blithering comment I had">Actual story text</span>

I then highlighted it in red to make it obvious which parts to mouse over.

But the problem was, it wouldn’t work on mobile devices–tablets, phones etc. And this bothered me. I tried to tell myself it was ok. But it was the sort of thing that would nag at me.

There must be a better way, I thought.

After consulting with a family member who does web design, downloading some plugins, and experimenting with CSS and JavaScript, I think I’ve got something.

Mind you, I said I think. I’m not actually sure if it works on all devices yet. It definitely works on my iPad, which it didn’t originally when I was just using HTML.
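In case anyone’s curious what the general idea looks like, here’s a minimal sketch–just an illustration of the approach, not the actual plugin code running on the page: put each note in its own element, show it with CSS when the mouse hovers over it, and toggle it with a few lines of JavaScript when it’s tapped, so touch screens work too.

<style>
  .annot { color: #c00; cursor: pointer; position: relative; }
  .annot .note {
    display: none;               /* hidden until hover or tap */
    position: absolute;
    left: 0; top: 1.4em;
    background: #fffbe6;
    border: 1px solid #999;
    padding: 0.4em;
    max-width: 20em;
  }
  .annot:hover .note,                      /* desktop: show on mouse-over */
  .annot.open .note { display: block; }    /* mobile: shown after a tap */
</style>

<span class="annot">Actual story text
  <span class="note">Whatever blithering comment I had</span>
</span>

<script>
  // Tapping (or clicking) an annotation toggles its note,
  // so devices without mouse hover still work.
  document.querySelectorAll('.annot').forEach(function (el) {
    el.addEventListener('click', function () { el.classList.toggle('open'); });
  });
</script>

The red highlighting carries over from the old version, so it’s still obvious which parts to mouse over (or tap).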

That’s where you come in. I am calling on readers to come to my aid and check out the page to see if the annotations work for them. In exchange…

Uh…

Let’s see… I will teach you something about weird fiction from the 1890s?

How’s that sound?

Oh, another thing: some of the modifications I did seemed to (temporarily) play merry hell with the comments (e.g. reducing my all-time comment count to zero, removing comment ‘likes’, stuff like that). I think it’s fixed now, but if you notice any comment issues, let me know… unless the issue is that you are unable to comment, in which case you can use the form below or tweet at me.

 


There’s a lot to hate about social media.  From idiot trolls to widespread fake news stories, there’s some reason to believe social media is responsible for many of the problems in the world today. In fact, I’d say social media is a net negative for humanity.

(This is pretty ironic, because I used to be in charge of social media for my employer.  And also I’m writing this blog, and I’m going to tweet the link after I’m done.)

But social media does sometimes have benefits.  The other day I was doing what most millennials do with Twitter: using it to look for some good Gilbert and Sullivan information.  Quite by chance, I came across Dr. Alison Vincent’s Twitter account.

Dr. Vincent is the CTO for Cisco UK and Ireland, and an all-around cool person. Her C.V. is very impressive, but the reason I recognized her was from some very enjoyable performances of Gilbert and Sullivan by the Southampton Operatic Society that I had seen many years ago.

I tweeted my thanks to her for the performances, and she very kindly replied.  Then, the Southampton Operatic Society replied as well, with the above clip of one of their performances. Then another one of the performers, Mr. Mike Pavitt, also kindly responded. It was a thoroughly nice exchange all around.

I’d seen those performances about eight years ago on YouTube, but it had never occurred to me in all that time to thank the people involved.  Without social media, I never would have been able to do so.

I was thinking today about some of the great thinkers in history, and how the vast majority of the great minds had so little access to information compared to the average person in the present day.

It’s sort of sad when you think about it.  Take any great thinker from history, and then think about the logistics required for him or her to get the level of education they received.  They had to go to school, study, get books from libraries–if they were available at all.  If you were reading and you found a word you didn’t know, you had to go find a dictionary and hope you could find it in there. Not to mention that the mundane day-to-day tasks also took longer and were more difficult.  And yet, there were people thinking deep philosophical thoughts, inventing new technologies, writing great books, founding nations, etc. etc.

Compare them to me: I have almost instantaneous access to all the recorded knowledge in human history via the internet, I can have it translated instantly if need be, and I can do it all while sitting at my desk.  On paper, I should probably be better educated and more accomplished than the entire population of the world in the 1600s.  But I’m not.  If somebody from the past came to the present, they’d be appalled by how little I’d done with the wealth of resources I have.

Suppose John Locke had been able to access the internet.  He probably would have invented the perfect system of government in 10 minutes, if he kept up his past rate of productivity. How many times over could the great economic minds have solved the U.S. economic crisis in the time I spent watching cat videos?

I feel like an under-achiever, I guess is what I’m saying.

Shamus Young had a good post about the history of the internet. It introduced me to a phrase I’d never heard before, describing the moment when the internet became what it is now: full of trolls and imbeciles. It’s from someone named Dave Fischer, who said: “September 1993 will go down in net.history as the September that never ended.”

What did he mean by that? Young explains that prior to ’93:

September was a big deal for the internet back in those days. As you can imagine, etiquette was important in a world where there were no moderators and everyone was on the honor system. Every September a flood of college freshmen would be given internet access for the first time in their lives. Then they would blunder online and make a mess of things by posting things to the wrong place, or typing in all caps, or failing to read the FAQ…. So every September was this chaotic time where the net had to assimilate a few thousand newcomers all at once, and it usually took about a month for things to calm down again.

It’s funny to read about the internet as a civilized place where ideas could be discussed in a thoughtful manner.  I came later to the internet, so I feel like somebody in a post-apocalyptic setting reading about the lost Golden Age before the great collapse.

Still, there are pockets of intelligent discourse–I like to think of this blog as one of them. Shamus’s is another (although he manages that by banning any talk of religion or politics). But it’s funny to think that there was a time when you didn’t have to hunt for sites where people could have discussions without sinking into a Topix-like morass of name-calling.

So, no doubt even non-gamers have heard the fuss about the new gaming consoles coming out this month.  It’s the first new console generation in which I’ve had no desire to buy any of the new consoles.  Here’s why:

Now, graphics aren’t all that matters, and if there were a good launch title–say, a Fallout 4, made by Obsidian–on these consoles, I would likely get one.  But there isn’t. All there is is Madden and Call of Duty: Ghosts.   (So named, I assume,  because everyone is a ghost after all the apocalyptic world wars depicted in previous Calls of Duty.)

I am not seeing any reason to upgrade.

The internet has been making fun of super-rich Dallas Cowboys owner Jerry Jones because of this picture of him using a flip phone at a game the other day.

I’m not usually one to defend Jerry Jones, mostly because his team has humiliated mine many times over the years.  But I also use a flip phone–it works well enough for making calls, so why shouldn’t I?  Actually, this is probably why Jerry Jones is so rich–by not paying extra for useless stuff, like a phone with lots of superfluous bells and whistles.

Then again, maybe not.

Anyway, I didn’t realize having a flip phone was so weird.  That makes me feel extra cool for having one.  It’s probably because I don’t like to talk on the phone–I don’t use it much, so it’s not like I would want to spend a lot of money on it.

Despite the fact that I like history and I like movies, I don’t think a lot about the history of the movie industry.  But I was reading the other day about the 1964 movie The Fall of the Roman Empire, which I’d never even heard of, but which sounds very interesting, as it has a very strong cast.  (Too bad Edward Gibbon didn’t get screenwriting credit.)

The film was a fairly big box office failure, reminding me of another epic historical film that famously lost money: Cleopatra, which I blogged about here.  It wasn’t that people didn’t want to see Cleopatra; it was just that the film was so expensive it couldn’t make back its massive cost.

It seems like “epic” movies were big in the 1960s, until they ran into bombs like Cleopatra, at which point the industry turned towards smaller, more “personal” movies, until George Lucas and Steven Spielberg came along and turned things back toward the epic scale.

I think “epic” movies–think movies with ornate sets and large crowds–became prohibitively expensive to make, so they turned away from them in the ’70s.  Then the advent of CGI made it possible for the genre to be resurrected.  Look at the Wikipedia article on historical epic films, and take note of the dates:

Examples of historical epics include Intolerance (1916), Gone with the Wind (1939), The Ten Commandments (1956), Ben-Hur (1959), Spartacus (1960), Lawrence of Arabia (1962), Cleopatra (1963), Doctor Zhivago (1965), Barry Lyndon (1975), Gandhi (1982), Braveheart (1995), Titanic (1997), Joan of Arc (1999), Gladiator (2000), Troy (2004), Alexander (2004), Kingdom of Heaven (2005), and Les Misérables (2012).

Now, the “new” epics are not really the same as the “old” epics–it’s hard to put your finger on exactly how, but there is a feeling of unreality about the new CGI-based movies.  They lack “grittiness”–a term normally associated with the non-epics made in the 1970s, but which applies to the macro scale as well.

“Capriccio Romano”, by Bernardo Bellotto. 1740s. Image via Wikipedia.

It can be done–one reason I think the Star Wars prequels are better than people give them credit for is that they do a better job emulating the “feel” of the bygone epic films than most other modern epics do.  George Lucas may be over-reliant on CGI, and he may have done more than anyone else to usher in the era of cheap epics, but he himself knows what he’s doing when it comes to CGI effects.  This could just be because Lucas (and Spielberg) are old enough to remember the original epic movie era, and so understand those films well enough to imitate them expertly.

But now that CGI is so prevalent and makes epics so easy (relatively speaking) to produce, they all end up overdone, too focused upon spectacle, and drained of deeper meaning.  I believe that some historians feel the same thing happened to cause the decline of Rome.  “Bread and circuses” indeed…