Coding vs. fiction


At work, I sometimes interact with computers using code. This can be a strangely soothing exercise for anti-social types like myself, which I think has to do with the unambiguous nature of code. Computers don’t seem to be very good at dealing with ambiguity, at least for now. Tell a computer that the Answer to the Ultimate Question of Life, the Universe, and Everything is 42, and the computer will freak out and complain that it was expecting a string instead of an integer (passing ’42’ instead of 42 should hopefully clear that up). The actual human world is full of hypocrites using ambiguous phrases and systems, and code can be a nice escape into a different sort of mindset.
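To make the complaint concrete, here is a minimal Python sketch of that kind of pickiness. The function `record_answer` is hypothetical, invented just for this illustration; it insists on a string and rejects the bare integer:

```python
def record_answer(answer: str) -> str:
    """Store the Answer; the computer insists on a string, not an integer."""
    if not isinstance(answer, str):
        raise TypeError(f"expected a string, got {type(answer).__name__}")
    return f"The answer is {answer}"

# The integer 42 makes the computer freak out:
try:
    record_answer(42)
except TypeError as err:
    print(err)  # expected a string, got int

# The string '42' clears it right up:
print(record_answer('42'))  # The answer is 42
```

No ambiguity, no negotiation: the exact same fact, expressed with quotation marks instead of without, is the difference between an error and an answer.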

Learning to code is generally praised by policymakers and Your Parents because it can lead to jobs that are safe (both from physical danger and from industry trends) and pay decent wages. But driving this prosaic public image is the fact that code gives people the power to interact creatively with information using the tech infrastructure that has transformed the world and is still in growth mode. So it seems to me that more and more people will be required to code in some way or another as time goes on—even the types of people who have no reason at all to interact with code in the year 2016. And it might be interesting to think about the possible cultural ramifications of such a shift in society.

The mindset required for code is diametrically opposed to something like fiction. Fiction celebrates ambiguity, whereas in code, ambiguity is a defect. I am pessimistic that something like fiction will be able to resist the economic clout of code in the future. The code mentality might increasingly exert its influence on pop-oriented artistic media. We may start to find stories with strong but morally ambiguous characters, like those in Princess Mononoke, to be confusing, or conversely, we may start to see the illogical increasingly referenced, but merely as a gag, à la Family Guy. Obviously people will continue to have diverse tastes and viewpoints in the future, but depending on how widespread coding becomes, it may give a not insignificant nudge to the tastes of a not insignificant number of people.

I did get the impression during college that my fellow students did not have much patience for ambiguity in fiction, or in any other field for that matter. Everyone, myself included, was much too eager to find the “point” of a piece of writing, or to impose some outside structure on it. The professors seemed much more level-headed on this point, which I’m betting is because it’s impossible to spend so much time researching a particular area without seriously dealing with the question of ambiguity at some point; but students who actually let themselves be influenced by professors on topics like these seemed rare. Perhaps this sort of mindset has some relationship with technology, but perhaps not.

There is nothing necessarily wrong with coding as a refuge of sorts for those troubled by ambiguity, but there can be a danger when the desire for consistency becomes unnaturally strong and takes on a social dimension. The novelist Haruki Murakami has done some interesting research into Japanese cult movements, and from his experiences he identifies the inability to accept contradiction as a trait that is exploited by cult leaders to gain devotees. Cult leaders can promise a fictional world that is free of contradiction, and some people are pained enough by the contradictions in society that they will accept anything that offers them relief.

An observation Murakami makes is that the perpetrators of the Aum Shinrikyo Tokyo subway attack were smart and educated, but despite, or perhaps even because of this, they were corrupted into committing horrific acts of violence. It would of course be wrong to read too much into the perpetrators’ biographies, but personally, I can’t shake the feeling that smart people often tend to have weird political views (my latest confirmation of this being Peter Thiel’s strange endorsement of Donald Trump). Smart people are, of course, better at the mental gymnastics of enforcing consistency, which is why they are better at writing code, or any other text that hews to the rules of a prescriptive grammar. However, in the social realm, I get the feeling that forming political views based solely on intellect can lead a person into territory that most normal people would recognize to be “bad” on a gut level.

Murakami’s antidote to the destructive fictions meant to exclude ambiguity is a different kind of fiction—the kind found in his novels, which celebrates ambiguity. Despite Murakami’s conflict of interest in making such a statement, it seems valid to me, and it seems that fiction would also be an important balancing force against any cultural influences from technology. Given the inevitability of the march of technology, I hope it progresses in such a way that it enriches rather than displaces art and its social functions.