The destiny of Western culture: An open letter to Peter Kingsley



"Well over two thousand years ago, science as we know it was offered to the West with a warning tag attached to it: Use this, but don't be tricked by it. And of course, impatient little children that we are, we tore off the tag and ignored the warning."
Peter Kingsley, in Reality (2003).

One of the more salient intellectual events of 2019 for me, personally, was my discovery of the work of Peter Kingsley. Earlier this year, a friend gifted me a book by Kingsley, having perspicaciously suspected it would resonate with me at some level. And it surely did, for which I am deeply grateful to my friend (you know who you are). Unlike most of the books I read—which I tend to regard rather soberly and coolly—Kingsley's work left me irate, inspired, bemused, delighted and a few other things, all at the same time. Whatever the case, I am anything but indifferent to it, which is probably the greatest compliment I could pay to any author.

More importantly, Kingsley's work helped me situate my own in a historical context. It has become clearer to me what it is, exactly, that I am trying to do, where it fits in the long line of Western thought, what role it is supposed to play in our culture, and what the ultimate purpose of doing it is. I see the path ahead more clearly, have a sharper sense of direction for my work, and recognize how it ultimately comes together with that of others. This is what I'd like to discuss in this long post, which is doubtlessly the most important of the year.

In what follows, I refer to two books by Kingsley: Reality (2003) and Catafalque (2018). For the sake of simplicity, I shall cite them as 'R' and 'C,' respectively.




A culture's source and telos

Kingsley's central premise is that all cultures have a sacred source and purpose, including our own Western civilization: "everything, absolutely everything, anyone can name that makes our so-called civilization unique has a sacred source—a sacred purpose" (C: 228). The seed of every culture, including our own, is planted not through mere chance, habit or deliberate planning, but instead through visionary experience in altered states of consciousness. It is prophets who learn, and then inform us of, what our purpose is: "western civilization, just like any other, came into being out of prophecy; from revelation" (C: 231).

In our case, we can trace our roots back to visionary Greek philosopher-poets living in southern Italy about two and a half thousand years ago, particularly Parmenides. In Parmenides' poem On Nature we can find the origins of our Western culture. Uniquely, however, we are the only civilization that has neglected and forgotten its origin: "nowhere on this planet are you going to find one single traditional culture that doesn't remember ... having its sacred purpose and source" (C: 230).

Misunderstanding Parmenides

Indeed, Kingsley claims that we in the West have been misinterpreting and misrepresenting Parmenides' ideas since Plato, and modern scholarship has compounded the problem even further. Parmenides is seen as the founder of logic and rationality, of our particular way of discriminating truth from untruth, fact from fiction, through reasoning. According to this mainstream view, the Promethean powers of Western science, as embodied in technology, are the culmination of a way of thinking, feeling and behaving that can be traced back to Parmenides' manner of argumentation in his famous poem.

But Kingsley argues very persuasively (R: 1-306) that what Parmenides was trying to say was nothing of the kind. According to him, logic for Parmenides wasn't a formal system based on fixed axioms and theorems, meant to help us discern true from false ideas about reality; it wasn't grounded in some metaphysically primary realm of absolutes akin to Platonic Forms; it didn't derive its validity from some external reference. In summary, Kingsley argues that, for Parmenides, logic wasn't what we now call reason, but something much broader, deeper, unconstrained by fixed rules and formalisms.

True logic as incantation

As a matter of fact, according to Kingsley Parmenides' logic was a kind of incantation. The idea is that we live in a world of illusions, caught up in our own internal narratives and made-up categories about what is going on, completely oblivious to the true world that surrounds us and from which we derive our very being—i.e. reality. This illusion is unfathomably persuasive, has tremendous power and momentum. So to help one see through it and ultimately overcome it, an even more persuasive rhetorical device is required, a kind of spell or incantation woven with words, meant to disrupt our ordinary mental processes by poking them in just the right spots. This incantation is the true logic Parmenides gifted us: "We were dragged into this illusion by a force far greater than ourselves. Something even stronger has to drag us out. That's what logic is" (R: 143). True logic is thus a kind of spell meant to trick our internal story-telling, make it catch itself in contradiction and thereby release its grip, so we can escape the illusion. But unlike ordinary logic or reason, true logic is not grounded in fixed or absolute axioms and rules of derivation. It is malleable, flexible, not bound to external references. A 'logical' argument in this sense is whatever argument will actually persuade its target, whatever it takes.

This is a critical point, so allow me to belabor it a bit. If I were to use Parmenides' true logic on you, I would weave whatever argument line I felt would be compelling to you, irrespective of whether the argument is strictly rational or not, strictly consistent with a given set of fixed axioms or not. The ultimate goal of true logic is way too pragmatic for that: it is to get you out of the bind in which you continuously tie yourself up. True logic, thus, is a rhetorical incantation meant to be more persuasive than our inner narratives and categories. In essence, it is a semantic trick meant to break the spell of illusion, like cracking a crystal by gently tapping on it in just the right spot.

Kingsley explains that, for Parmenides, there were only two ways to approach reality: either we judge that everything we feel, think, perceive, imagine or otherwise experience exists as such—regardless of any correspondence with objective facts—or we must ultimately dismiss everything as non-existing. The latter option goes nowhere, for obvious reasons, which leaves only one viable path. The bind we find ourselves in is due to our hopeless attempt to find some compromise or middle ground between those two canonical options: we try to discriminate which of our mental states correspond to actual existents—i.e. to some external reference—and which don't. This, according to Kingsley's interpretation of Parmenides, is the core of the illusion. And true logic is a rhetorical tool meant to show that all such discriminations—if pursued consistently to their final implications—are ultimately self-defeating.

Parmenides' metaphysics

The implicit metaphysics being adopted here is, of course, subjective idealism: "for Greeks, the world of the gods [i.e. reality] had one very particular feature. This is that simply to think something is to make it exist: is to make it real" (R: 71-72). Therefore, "whatever we are aware of is, whatever we perceive or notice is, whatever we think of is" (R: 77). Everything that has mental existence exists as such—i.e. as a mental existent—and there is no other way in which it can exist: "There is nothing that exists except what can be thought or perceived" (R: 78). Therefore, the use of reason to discriminate between what exists from what doesn't exist is, well, ultimately unreasonable: "To choose good thoughts is to reject the bad ones—and to reject something is to entertain it, is to make it exist" (R: 80). The act of deciding that something does not, or cannot, exist immediately backfires and makes it exist, by the mere fact that the act forces us to think it into existence to begin with. Reason, as we normally apply it, is thus ultimately incoherent, even though it has its practical applications within the context of the illusion.

It is the subjective idealism he attributes to Parmenides that renders Kingsley's interpretation plausible and internally consistent: subjective idealism does away with the correspondence theory of truth, according to which mental states that correspond to objective facts are true, whereas those that don't aren't. Once these external references are done away with, all criteria of truth and existence become internal ones, and thus logic boils down to persuasion: what exists or is true is whatever mind has been persuaded to make exist or true. There is nothing outside mind, no objective facts out there, to make it otherwise. This is important, so allow me to repeat it: without external references, such as objective facts, logic boils down to persuasion; there is nothing else it can be.

Kingsley explains: "facts are of absolutely no significance in themselves: it's just as easy to get lost in facts as it is to get lost in fictions. ... All our facts, like all our reasoning, are just a façade" (R: 21-22), they hide something more essential behind them. And this 'something' is reality: pure stillness, a realm in which nothing ever moves or changes, in which everything is intrinsically connected to everything else in an indivisible whole, and where no time but the eternal present exists. That's why true logic is "a magical lure drawing us into oneness" (R: 144)—i.e. back to reality. But what is the metaphysical ground of this reality? It is consciousness: "Wherever it seems that you go, or come, everything happens in your consciousness. And that consciousness never moves, is always the same" (R: 80).

Notice that Kingsley's attribution of subjective idealism to Parmenides is based on the implicit assumption that the consciousness in question isn't just your or my personal consciousness alone; it is, instead, a transpersonal, universal consciousness within which all existence unfolds. Kingsley: "our thoughts are not ours; never have been. They are simply reality thinking itself" (R: 80); reality, or consciousness, is "utterly impersonal" (R: 160). Therefore, from the point of view of seemingly personal, individual minds, such as yours and mine, the idealism in question is actually objective idealism, such as the one I pursue in the body of my work. It is crucial to keep this understanding in mind, otherwise you will dismiss Kingsley's story way too quickly. His metaphysics isn't solipsism; he isn't saying that reality is your personal dream, or the materialization of your egotistic fantasies; he is not giving the ego divine powers of creation.

Reason is not true logic

Kingsley explains that, because we have historically misinterpreted and misrepresented Parmenides' intended meaning, we've ended up conjuring up reason out of what was meant to be true logic. But reason is a tool precisely for discriminating between mental states that correspond to ostensible external facts and those that don't. Under the metaphysical view that to think is to make exist, such discrimination is incoherent.

Therefore, by misunderstanding true logic, we've also departed from what was meant to be Western culture's foundational metaphysics. We've invented external references outside consciousness—i.e. outside reality—such as matter, energy, space and time. And then we've forced true logic "to operate, distorted and disfigured, in the world it had been designed to undermine" (R: 144). The result is reason, the rational discrimination of fact from fiction in an ostensibly autonomous material world independent of consciousness.

For Kingsley, it is reason that keeps us stuck in the middle ground between the two canonical paths—namely, between judging either that everything we can conceive of exists as such, or that nothing exists. This, according to him, is the seminal mistake that has put our entire culture on the wrong footing. Logic is no longer regarded as a magical incantation meant to persuade us out of illusion, but has turned into a tool for perpetuating the illusion: "All our attempts to discriminate between reality and deception or between truth and illusion are exactly what keeps on tricking us" (R: 211).

The telos of Western culture

But what was it that we were originally supposed to do? What goal are we supposed to pursue? What is the "burning purpose at the heart of our Western world" (C: 205)?

Kingsley is not terribly explicit about it, but he does drop enough hints. For instance, he says that the modern attitude towards the divine can be summarized in the words,
“Let’s make sure the divine takes good care of us. But as for finding what, in reality, the divine might possibly need: let it look after itself.” From here onwards one can sit back and watch how the idea of looking after the gods starts, almost by magic, vanishing from the western world. ... And now it never for a moment occurs to us that the divine might be suffering, aching from our neglect; that the sacred desperately longs for our attention far more than we in some occasional, unconscious spasm might feel a brief burst of embarrassed longing for it. (C: 29-30)
The suggestion is that the meaning and purpose of our lives is to help fulfill some divine need, which can only be fulfilled in, or by means of, the state of consciousness we call life. This is reinforced by the fact that Kingsley overtly associates himself with the thought of Swiss psychiatrist Carl Jung, particularly Jung's book Answer to Job. And in that book, we find Jung saying:
what does man possess that God does not have? Because of his littleness, puniness, and defencelessness against the Almighty, he possesses ... a somewhat keener consciousness based on self-reflection: he must, in order to survive, always be mindful of his impotence. God has no need of this circumspection, for nowhere does he come up against an insuperable obstacle that would force him to hesitate and hence make him reflect on himself.
It seems to me that all cultures have the purpose of serving the divine by means of the state of consciousness we call life, the latter not being available to the divine itself. But each culture is meant to fulfill this sacred task in its own particular way, according to its own particular dispositions or strengths. In the case of Western culture, our strength is our sharply developed meta-cognition, or self-reflection; our introspective ability to turn our own thoughts, emotions, perceptions and fantasies into objects of thought, recursively. Western culture is thus meant to serve the divine by contributing to it the meta-cognitive insight of self-realization: through us and our Western science—"a gift offered by the gods with a sacred purpose" (C: 229)—the divine recognizes itself.

The failure of the West

However, Kingsley ultimately concludes that we, in the West, have failed in our divine task. We've failed not only because we've misunderstood Parmenides—and thus bungled our metaphysics and became unable to properly use the sacred tool we were given, i.e. true logic—but for other, more insidious reasons as well.

Indeed, to serve the divine requires "a deeply religious attitude, the sense that it's all for the sake of something far greater than ourselves" (C: 122). But to nurture and sustain such a religious attitude, people must "step out of their personal dramas" (Ibid.). Yet we, in the West, indulge in personal dramas, having conflated individual freedom and expression with egocentrism, even subtle forms of narcissism. We've forgotten that, "as humans we are archetypes" (C: 143), instances of a universal template of being, so that "Whatever we think of as personal is in fact profoundly inhuman, while it's only in the utter objectivity of the impersonal that we find our humanity" (Ibid.).

We've immersed ourselves in the dehumanizing "brutality of our western society with its normality and triviality as well as the hollow emptiness of its surveillance" (C: 230). And "when a culture forces a human to act so automatically, talk so robotically, the humanity inside the person is lost ... Everything can seem to go on working and functioning, for a while. But our role in existence has been hollowed out; our human purpose on this planet turned completely upside down" (C: 434-435). By losing touch with our own humanity, which is what links us with the divine, we've forfeited intimacy with our sacred destiny.

Worse yet, Kingsley maintains that there is no fixing the problem, no rescuing Western culture, no finding our path again: "this world of ours is already dead. It existed for a while, did the best it could, but is nothing more than a lifeless remnant of what it was meant to be. ... And this is the moment for marking, and honouring, the passing of our culture ... to keep on indulging in optimism is a shameless dereliction of our duty" (C: 442).

Well, I am not an optimist... But I don't agree.

De Facto Western culture & the value of error

The first thing to notice is that, although Kingsley has convinced at least me that we did misinterpret Parmenides, and that the correct interpretation is the one he offers, the fact of the matter is that what we call 'Western culture' embodies and is based on the values, premises and modes of cognition set by Plato, Aristotle, and the rest of the post-Socratic philosophers and scientists. According to Kingsley himself, Parmenides was misinterpreted already within a single generation, so there has never been a 'correct' Western culture, so to speak. Factually, even if it is based on a seminal misunderstanding, being Western effectively means what Plato and his successors defined it to be; it has never really meant anything else.

Western culture, it seems to me, has three central, differentiating characteristics:

  1. More than many other cultures, its approach to reality is based on self-reflection, critical meta-cognitive reasoning, so as to discriminate between fact and fiction, truth and falsity; (Empiricism is a relatively recent invention of the late Renaissance or early Enlightenment, so I won't list it as a central characteristic of the West. We have had, for instance, well over half a millennium of scholasticism, during which empiricism played hardly any role.)
  2. More than many other cultures, the West's metaphysics unreservedly acknowledges the existence of personal, individual minds and, therefore, the existence of an objective world out there, outside such individual minds;
  3. More than many other cultures, the West fully embraces the illusion we call the world.
Notice that, although the view that objective facts are material has dominated Western culture for the past couple of centuries, over the more than two millennia of its existence the West has also entertained other possibilities: Western idealists, for instance, posit that objective facts are grounded in a transpersonal mind, whereas Western computationalists posit that they are grounded in pure information.

Now, I acknowledge that the three characteristics listed above are not what Parmenides intended. Moreover, I also acknowledge that they are all ultimately illusory: logic is largely a mental invention, not a Platonic absolute; the very distinction between my personal mind and the world out there is ultimately illusory; and the physical things I perceive are mere representations, not essence.

But I don't think that these Western errors are a waste of time either. Wisdom sometimes comes only with error, as any wounded healer will know. Sometimes a misstep is more useful and important than the correct way forward, because of the experiences and insights it creates the space for. Getting to the right answer only after having exhaustively tried, and failed with, seductive but wrong ones arguably leads to a deeper, fuller insight than getting things right first time round. For in the former case, one is more intimately acquainted with why and how those seductive answers are actually wrong, and therefore has an equally fuller comprehension of the right answer.

More specifically, by having embraced objective facts and reasoning fully, unreservedly, we are making sure that every stone is turned, particularly the most seductive ones; we are laying the ground for a deeper future insight than what those shooting straight for the end can achieve. For in the latter case, there may always remain a residual seed of doubt or temptation to go and have a look under that beautiful, round, shiny stone over there on the corner, which has never been fully turned.

The destiny of Western culture may, for all I know, intrinsically entail experimenting with extremely seductive but wrong answers first, exhausting the alternatives, and only then setting itself straight. Of course, the price we pay for this is unfathomable. Generation upon generation has endured grief, despair, unspeakable suffering of every kind for having followed the siren song of illusion. This is the West's sacrifice. The only question is whether we will eventually get it right or not.

Prison break

But just how can we eventually get out of this bind and unveil reality? Kingsley talks often about μῆτις (mêtis), a kind of cunning wisdom that can be used to trick, enchant or persuade. The illusion we live in is a product of μῆτις, and only more persuasive μῆτις, such as true logic, can get us out of it.

Now ask yourself: What would be truly persuasive for the Western mind? What kind of story could short-circuit our inner narratives, expose their inner contradictions and force us to review our unexamined assumptions? The answer seems absolutely crystal clear to me: reasoning consistently pursued to its ultimate implications.

The Western mind only acknowledges reasoning as a valid story. It will dismiss anything else without even looking at it. So if one wants to use true logic to trick the West out of illusion, this true logic must come disguised as reason; it must entail embracing the illusion fully, objective facts and all, and judiciously applying reason within it. That's the μῆτις required here; there's just no other way. And Kingsley himself left space open for this approach: "when we live the illusion to the full, to its furthest limits, we are nothing but reality fulfilling its own longing" (R: 258).

Kingsley could counter this argument by claiming that those who use reason today aren't at all aware of true logic; they aren't trying to get us out of the bind, but simply hand-waving and gesticulating furiously and frivolously within the illusion, which only makes things worse. But is that really the case?

With deep and absolutely sincere respect for Kingsley, I should like to suggest the following: If one doesn't have affinity with hard-nosed reasoning, one will probably not become acquainted with present-day efforts to use hard-nosed reasoning in the spirit of true logic. And in failing to notice these efforts, one may become unjustifiably pessimistic, concluding that true logic has died. Maybe it hasn't; maybe it's still alive, just disguising itself as reason—a tactic of μῆτις—so as to not be immediately recognized and dismissed by the vulgar spirit of this time.

To free the West from illusion, we must first break into the prison wherein the West finds itself, and then break out again carrying the rest of the culture with us. We must fight the duel with the weapons chosen by the opposition, for those are the only weapons the opposition recognizes as real. Kingsley himself is well aware of this approach: "there are methods that reality can use to work its own way into our illusion and start to draw us out" (R: 255). Ditto. What a fantastic movement of μῆτις it would be to use pure, strict, sharp reasoning to undermine reason itself... wouldn't it?

The nuclear option is inevitable


Our civilization faces tremendous challenges today, and its very survival is at stake. The world population is expected to approach ten billion by mid-century and stabilize at around eleven billion by the end of the century. Given that the average person's standard of living—with its associated resource consumption and pollution—is also increasing, this may more than double the already unsustainable strain we put on the planet. The resulting human-induced climate change is a big threat, but it isn't the only one. Within a couple of decades, large cities are forecast to run out of drinking water—the so-called 'water crisis.' The velocity and ferocity with which we extract resources from the planet far outstrip our ability to recycle them. Our current waste management strategies soon won't be able to cope with the load. Food production will have to more than double, although the planet's surface isn't getting bigger. The challenges are many.

You see, the planet itself will do just fine even if we throw our very worst at it: give it a million years or so—the blink of an eye for a rock that's been around for 4.5 billion years—and it shall have luxuriant forests, rich oceans and abundant fauna again, after we are gone. As a matter of fact, even our species will survive: there are a few of us in Africa, Australia, the Amazon and the Arctic Circle who have the skills to ensure human survival even if technological life ceases to exist.

I am not concerned about the planet or even our species. My concern is our civilization, our culture. Letting these die would be a waste of, literally, planetary proportions. We've striven and suffered for thousands of years to learn a thing or two, have an insight or two, and now we are about to reset the clock on all that. Despite the deplorable state of our metaphysics today, we have made progress. True insight is only achieved when we've turned every stone and flirted with every vaguely attractive but ultimately stupid idea conceivable, at great cost to ourselves. And now that we've finally done much of the suffering and are about to emerge into daylight, to reset the whole process and go back to square one would be just unspeakably, unthinkably catastrophic. All the wars, all the famine, all the despair... for nothing? Just to start over before we bank anything?

No, we must survive. But to escape catastrophe we require what in military jargon is called a 'forward escape.' Technology—used for resource extraction, industry, transportation, manufacturing, etc.—carries much of the responsibility for the crises we now face. Yet, to overcome these crises while preserving the positive things about our culture and civilization, we have no other option but to deploy more technology. If we were just a billion or two, perhaps we could do without technology, but not with ten or eleven billion people on such a small rock.

To effectively address the many challenges we face, we need energy; no, abundant levels of energy; no, even more: we need practically inexhaustible and cheap sources of energy everywhere. The reason is simple: recycling consumes huge amounts of energy, and we need to recycle a whole lot more than we do now, for the planet is not getting any bigger or richer; desalination of ocean water consumes enormous amounts of energy, and we will soon need to do a lot more desalination, for only about 1% of the planet's water is suitable for drinking (that is, after it is treated and pumped to the people who need it, which also requires significant energy); waste management, from sewage treatment to incineration to air pollution control, requires a lot of energy; vertical farming—of which we will need to do much more to keep a growing population fed—requires a lot of energy because of its reliance on artificial lighting and automated systems; and so on. You get the picture. Abundant cheap energy everywhere is the key to addressing our problems through the use of advanced technology, in a forward-escape to avoid catastrophe.
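Just to give a feel for the orders of magnitude involved, here is a crude back-of-envelope sketch for the desalination item alone. The figures are rough, illustrative assumptions of mine—typical published values, not precise engineering data:

```python
# Back-of-envelope estimate: electricity needed if a modest share of the
# world's domestic water had to come from seawater desalination.
# All figures are rough, illustrative assumptions, not precise data.

POPULATION = 11e9                 # assumed eventual world population
WATER_M3_PER_PERSON_DAY = 0.15    # ~150 litres of domestic water per person per day
DESALINATED_SHARE = 0.10          # assume only 10% has to come from desalination
KWH_PER_M3 = 3.5                  # typical seawater reverse osmosis: ~3-4 kWh per m3

desalinated_m3_per_day = POPULATION * WATER_M3_PER_PERSON_DAY * DESALINATED_SHARE
twh_per_year = desalinated_m3_per_day * KWH_PER_M3 * 365 / 1e9

print(f"Desalinated water: ~{desalinated_m3_per_day / 1e6:.0f} million m3 per day")
print(f"Extra electricity: ~{twh_per_year:.0f} TWh per year")
# => roughly 200 TWh per year for this single item -- on the order of the annual
#    electricity consumption of a mid-sized European country -- before recycling,
#    waste treatment, vertical farming or any rise in living standards.
```

Even under these deliberately modest assumptions, one item alone adds a country's worth of electricity demand; and this is only one of the energy sinks listed above.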

But wind farms, solar panels and the other sustainable, non-polluting energy sources embraced by eco-conscious people today cannot provide it. Sun and wind aren't reliable or abundant sources of energy, even if we project significant advances in the associated technologies. And they have their own cost for the planet, given the huge areas they require. These otherwise sustainable energy sources face enormous challenges to merely meet our current energy needs, let alone what is required for a forward-escape. I know this isn't a popular opinion, but I have had the chance to look at the numbers. For a forward-escape, we will need a lot more energy than we currently consume; wind and solar just won't do, I'm afraid.
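For those who want a sense of the numbers I mean, here is an equally crude sketch of the land area solar farms would need just to match today's global energy consumption—again with rough, illustrative figures, before storage, before any growth in demand, and before the extra energy a forward escape would require:

```python
# Back-of-envelope estimate: land area needed for solar farms to supply the
# world's current average rate of energy consumption.
# Both figures are rough, commonly cited order-of-magnitude values.

GLOBAL_AVG_POWER_TW = 18.0   # world primary energy use, averaged over the year
SOLAR_FARM_W_PER_M2 = 10.0   # typical delivered power density of a solar farm
                             # (commonly cited range: roughly 5-20 W per m2)

area_km2 = (GLOBAL_AVG_POWER_TW * 1e12 / SOLAR_FARM_W_PER_M2) / 1e6

print(f"Area needed: ~{area_km2 / 1e6:.1f} million km2")
# => ~1.8 million km2, roughly three times the area of France -- just to match
#    today's consumption, with nothing yet set aside for storage to bridge
#    nights and becalmed, overcast weeks.
```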

Yet we do have the knowledge to solve any conceivable energy challenge within my lifetime, or even sooner (I am 45 years old as I write these words): nuclear energy.

Okay, before you dismiss me, please continue reading just a little further. I am keenly aware of the problems associated with nuclear energy, not the least of which are safety and radioactive waste. I know why you probably despise this idea. But perhaps what you don't know is that there are extremely robust and effective solutions to the problems of nuclear energy.

The nuclear reactors we despise—think of Chernobyl and Fukushima—belong to an old generation, based on technology from the 1950s and '60s. These reactors depend on active safety: unless one actively intervenes to keep the reaction under control, the reactor can overheat and melt down. Such systems are inherently unsafe, no matter how many levels of redundancy one builds in to prevent a runaway: there can always be an unfortunate alignment of circumstances that leads to catastrophe. And catastrophe in these cases is unacceptable even if it happens only once. So I believe we should eventually phase out all reactors that depend on active safety, which is just about every reactor in operation in the world today.

But there are also passive-safety reactors: these require active intervention to stay running and are inherently incapable of a runaway reaction. If you shut down all power to the reactor, or if every system in the plant fails, the reactor simply stops; it cannot keep itself running unless it is somehow poked or stimulated from the outside. Such reactors are inherently safe; they just can't go out of control. And as if this weren't enough, passive-safety reactors are being developed that use, as fuel, what current nuclear reactors produce as waste! Many passive-safety designs do not require uranium enrichment either, so the technology cannot readily be diverted to weapons. It's hard to think of any significant risk or disadvantage associated with these technologies.

The holy grail of passive safety is, of course, the fusion reactor, which produces no long-lived radioactive waste (its main by-product is helium, the inert gas used to fill party balloons). Many groups are now actively doing research to develop nuclear fusion power plants. The problem is that we are still decades away from large-scale commercial deployment—time we may not have. Right now, China, for instance, isn't waiting: the Chinese are building new active-safety nuclear fission reactors at a very fast pace.

There are options to bridge the gap between now and the time when fusion reactors can be deployed. Liquid fluoride thorium reactors come to mind, although a more prominent recent example is the TerraPower reactor, pushed by Bill Gates; the latter is a fission reactor with passive safety. The problem is that 'nuclear energy' has acquired such a bad name in our culture that many people, including politicians and regulators, aren't even aware that these new developments effectively solve the problems of the older technology. To simply assume that all nuclear energy is bad is, frankly, a dangerously uninformed position. Here we have the most promising—perhaps even the only viable—way to effectively address the many incredibly difficult challenges we now face, and we dismiss it unthinkingly. We don't have the luxury of acting on slogans and prejudices here; the issue requires thoughtfulness and a rather pragmatic attitude.

I believe governments and regulators must aggressively facilitate research and development of passive-safety nuclear technologies; we must allow prototypes to be built, which right now isn't possible in the West. Moreover, Europe, the USA and Japan must use their technological lead and entrepreneurial culture to not only allow, but also foster and accelerate these developments. Significant government funds must be allocated for it, for we are dealing with a matter of survival here. Passive-safety nuclear reactors can potentially solve our world's growing energy needs in an inherently safe way, without significant pollution or waste.

We have a way out, but we must want to explore it.

What we get wrong about democracy


In a previous post, I've discussed the fact that elite-thinking and monolithic, mainstream narratives no longer hold as much sway as they once did in determining the general views and ethos of our culture. I shared my opinion that this is, by and large, a positive development in human history, but one that invests us, the people, with more responsibility than ever before. In this context, and in view of the latest UK election, a proper understanding and use of the system of democracy—the power (kratia) of the people (dēmos)—becomes a matter of survival for organized human activity.

The legitimization of demagogy

We have become extremely desensitized to what, in my view, are naked abuses and distortions of the democratic system. We hear politicians, parties and pundits alike talking very matter-of-factly about the need to 'listen to their bases,' to 'take the pulse of their constituencies,' to 'understand what the voter wants,' etc. We have focus groups, polling organizations, marketing consultants and whatnot, all trying to grasp what most voters want, so the candidate's program and rhetoric can reflect those wishes and win elections. In this latest election cycle in the UK, this approach was so extreme that a party's very position on the defining issue of the election was left ambiguous for fear of alienating part of its base.

As you read this, you may be saying to yourself "of course, that's the point of a democracy, isn't it? We want politicians to listen to what the people want." It sounds so self-evident, doesn't it? Yet it isn't; in my view, it is in fact a fatal error. We misunderstand democracy so drastically that we don't even know anymore when we contradict it. I don't want a government formed by people without convictions or views of their own, who won an election merely because they were best at doing focus groups so as to mirror my own tentative and uninformed views back to me. This is manipulation, not campaigning; marketing, not politics. This is telling the people what they want to hear anyway, for the sake of winning. We have a word for this: demagogy. Google's dictionary defines demagogy as
political activity or practices that seek support by appealing to the desires and prejudices of ordinary people rather than by using rational argument.
The point of a democracy is not to choose those who are best at persuasively telling back to us what we want to hear. Such a system merely elects the best liars, manipulators, actors, tacticians, demagogues. Yet we seem to have legitimized demagoguery through the normalization of the notion that politicians should be 'attuned to their base,' or 'know what the people want,' or 'listen to the voters' wishes.' We've replaced convictions, reasoned views and integrity with focus groups, polling and naked manipulation. We've conflated democracy with demagogy. This is a disaster. Not only are we deceived, we now think it is good and proper to be deceived.

In a democracy, I want to vote for people who actually think generally like I do, who share my values and understand my difficulties, for that's the compass that will actually steer them in their job. This is very, very different from wanting politicians to tell me what I want to hear, without any conviction or rationale behind it. An election is not a competition to win a prize or a job (well, at any rate it shouldn't be that), but an opportunity for the population to identify which politicians hold sincere, heart-felt views and positions that happen to resonate with the people. For an election to work properly, it must reveal what views these politicians actually hold, and why they hold them, as opposed to evaluating how well they can attune their program to the results of a focus group.

What else, then?

Okay, so if politicians shouldn't campaign by promising the people what they think they want, what then? How should a democracy work? Who should I vote for, if not those who are telling me what I want to hear?

The core of the democratic system is deputization: to invest a politician with the power to act in our name. To deputize someone entails trusting that he or she will make decisions in our own best interest. Normally, this means that we choose deputies who have our wellbeing at heart, are competent to carry out their responsibilities, and adhere to core values and views that overlap generally with our own. None of this, however, means that the ideal deputy will simply do exactly as we think we would, for in such a case he or she wouldn't be a deputy at all: we might as well do the whole job ourselves, instead of entrusting someone to do it in our name.

Why do we deputize attorneys, accountants, financial planners, insurance brokers, doctors, etc., to act on our behalf, instead of dealing with all associated issues ourselves? Because we think they are in a better position—due to education, expertise, experience, availability of time, information, infrastructure, etc.—to deal with the issues, in our best interest, than we ourselves are. I don't have the time, expertise, information or infrastructure to deal with all my tax, insurance, financial planning and health issues myself. I don't have the time or expertise to follow the latest developments in the law, regulations, economy, jurisprudence, medical research, etc., so as to best act in my own self-interest. I think my attorney, tax advisor, financial planner, accountant and doctor can best do it for me, so I entrust them to do it; and I think I will be better off for it, instead of trusting my own tentative and uninformed opinions on these matters. Moreover, if it turns out they fail me, next time round I will simply choose someone else, instead of adopting the unreasonable and rather naive position that I could, say, defend myself better in a court of law than a defense attorney, or treat my own medical condition better than a doctor.

In doing so, the very last thing I expect from my attorney, accountant, doctor, etc., is that they will simply do exactly what I tell them to do! That would defeat the point of the whole exercise. Why pay them to help me out if ultimately they won't leverage their own expertise? I don't want to hire monkeys, but thinking human beings instead, who have more knowledge and time to deal with the issues in question. After all, I am busy with a number of other things—my actual life—besides the details of my taxes, mortgage plan, insurance package, the guy suing me, or my medical condition.

That's the essence of a democracy: to deputize someone we trust, who has more expertise, sound judgment and time to deal with society-level issues than we would as individual citizens. In choosing my candidate, I would expect him or her to leverage this time and expertise to look more carefully at the data and arguments in question than I ever could as an individual citizen, and then come to his or her own informed conclusions and decisions in my name, as opposed to merely echoing my own partial judgments and tentative opinions.

The fallacy of direct democracy

The alternative is to do away with deputies and organize ourselves according to some extreme form of direct democracy: every issue would have to be decided by a direct referendum.

Imagine for a moment that this could be logistically practical, which of course it isn't: it would still require that each and every one of us be sufficiently informed about all the relevant data and arguments associated with each issue, in order to make an informed choice. It would require that the entire population—as opposed to a parliament or a cabinet—be properly apprised of everything of significance for the decision. It would require that you took the personal time needed to acquaint yourself with, and ponder, everything of relevance. It would also presuppose that every citizen has the education and cognitive capacity to carry out all these extensive and rather overwhelming evaluations.

This is just impossible. And that's why democracy is based on deputization: it is possible to ensure that a parliament or a cabinet is sufficiently apprised of all data and arguments relevant to the issues, in order to make informed choices. It just isn't possible to ensure that an entire population can play such a role. So we have to entrust deputies, who we believe share our values and general views and are competent to do the job, to act in our name in government.

The consequence, of course, is that these deputies may make choices that differ from the ones we would make, based on our tentative and partial knowledge and understanding of the issues. This is their very job; that's what happens when people have more time and expertise to study an issue: they choose differently than those who don't. Entailed in the trust we grant to our elected officials is the trust that, when they choose differently from what we would, they probably have good reasons to do so; reasons we would understand better if we were as privy to the relevant data and arguments as they are.

The point I am trying to make is not that we should give carte blanche to our elected officials; no. The point is to judge them based on the results they achieve for us, as opposed to whether each particular decision they make matches with the decision we would make ourselves. I don't judge my doctor or my tax advisor based on the specific technical choices they make about my treatment plan or tax filing, but on the results of this treatment plan and tax filing. If, based on these results, their choices prove to consistently go against my best interests, I will pick a new doctor and tax advisor next time round. But if I get good results, I will stick with them even if I don't quite understand or initially agree with the choices they make. That's the point: I trust they have more expertise and time to study the issues and make better choices for me.

Final thoughts

As I discussed in an earlier post, it is a good thing that delusions held by previous generations are being shattered: elites don't always know best; we must make up our own minds about the issues we care most about in our lives, such as our metaphysical positions. However, we can't possibly expect ourselves to be sufficiently informed and cognizant of everything. Running a city, a state or a country is a very complex task that requires deputization; it is a demanding, difficult, full-time job, not something we can do on the side, next to everything else in our normal lives. It is frankly quite naive and presumptuous to think otherwise, in a world that is facing so many extremely complex problems.

The power we hold in a democracy is not to decide on every issue ourselves—or to expect that a politician always does exactly as we would, despite his or her having more time and access to information to ponder the issues—but to deputize someone we trust and resonate with to look after our interests. Our power resides also in our ability to choose differently next time round, if the results achieved disappoint us.

The problem is that, now more than ever before, many of the relevant results are long-term ones that can't be judged properly at the end of a term. Often, reckless economic choices may in fact improve the economy in the short term, only to wreck it in the long run. Global issues, such as climate change, are also long-term ones: they can be comfortably ignored until, suddenly, catastrophe is upon us. This increasingly significant reality doesn't fit naturally with relatively short political cycles and may become a key problem for our democratic system: it creates more space for demagoguery and threatens the very survival of organized human activity.

Democracy has many other problems as well. Arguably, it levels society out at its lowest cognitive degree, since only arguments that can be understood and embraced by a majority can win elections. Unfortunately, however, truth doesn't always correlate with simplicity and appeal. And the room democracy creates for demagoguery, as discussed above, isn't a new problem either: while the long-term character of our issues today may have worsened the problem, democracy is structurally vulnerable to those who prey on people's prejudices and simplistic views. After all, it's much quicker and simpler to check whether politicians promise to do what we think we would do in their place, than whether they can achieve the (long-term) results that will actually improve our wellbeing. It is so simple to high-five, just before falling into a precipice, the guy who took us exactly in the direction we wanted to go, although he knew (or should have known) of the precipice, whereas we didn't.

So let us not be naive here: democracy is a deeply and structurally flawed system. What it has going for it is that everything else we can think of is even more structurally flawed, which is a rather decisive differentiation. So yes, we may have the least problematic governing system available, but that doesn't mean we have a good system. We shouldn't inadvertently translate our pride in our democracy—a form of government achieved at great cost over generations—into naive and dangerous complacency with its gargantuan flaws. On the contrary, I believe we should be very critical and alert, for democracy can take us straight to hell if we don't continuously pay attention to its inherent flaws in an almost paranoid manner. We are not driving a reliable vehicle here, so we should always take precautions, such as having a toolbox and spare parts for repairs on the road, a first-aid kit at hand, extra fuel in jerrycans, and perhaps even a satellite phone to call for help as a last resort. Our situation is rather precarious.

A suggestion for Church reform


A polemical initiative for reform of the Catholic Church in Germany is under way, as reported by Deutsche Welle. The context is all the recent scandals about child abuse and sexual misconduct by priests, as well as a continuing, significant decline in Church attendance. The latter has been going on for decades, but is now reaching a point where the very survival of the Church is at stake. Many parishes have already closed. In my country, even the Cathedral of Utrecht, home of the archbishop, had to close last year. It is fair to say that the situation is coming to a head and the future of religion in the Western world looks bleak.

In my book, More Than Allegory, I have stated my views on religion: I think it is a valid and important part of human life that we neglect at our own peril. Religious mythology, although obviously not literally true, is symbolic of something that, while transcending our rational faculties, is integral and critical to being human. The primordial religious impulse reflects, in my view, a true, transcendent aspect of reality; it must be nurtured if we are to be complete human beings. As such, I believe the Catholic Church, whose history has been inextricably intertwined with that of the West since Constantine, has a critical role to play. The European collective mind, obfuscated by the rational and secular spirit of the Enlightenment as it may have been, continues nonetheless to rest on Christian mythological foundations. The continuing erosion of these foundations will exert—well, is already exerting—a heavy toll on our psychic balance and health, as the modern epidemics of depression, anxiety, ennui and despair attest to.


I am thus very interested in the survival and revitalization of the Church. Without extensive institutional support (more specifics on this below), it is difficult to see how the flame of a religious life can be kept alive in the West. However—and to merely state the obvious—the Church can only be saved with uninhibited, extensive, far-reaching, courageous reform, for it is completely out of synch with the spirit of this time. Should it continue on its present course, it doesn't take a genius to see that the Church will be relegated to irrelevance and become, at best, a kind of museum or tourist attraction (anyone visiting e.g. Cologne Cathedral for Sunday mass will see that this, in fact, is already happening). In this post, I dare to offer a suggestion for what this reform should entail; must entail.

In times past, the Church performed the function of social control through its moral dogmas. Priests used their Sunday sermons to keep people straight, so to speak. Religious moralizing may have had a role to play in those times, absent the proper rule of law. Today, however, things are very different. Ever fewer people will take that kind of moralizing seriously, and many will think it pathetic. To be judged and absolved for their alleged sins is not what people today are looking for. They have a whole new attitude to life in which the very idea that they are sinners doesn't resonate. I don't feel like a sinner; do you? I do feel confused, but not guilty. I miss a more personal relationship with transcendence, but not judgment. I would like to experience a deeper meaning in my life, but not to be given an outdated list of behavioral norms. Moreover, we have perfectly good, secular rationales for our laws, as well as law enforcement. We don't need the Church to keep society working at an operational level.

What we do need the Church for is meaning, contact with something transcendent. Our daily, secular lives lack depth and true purpose. Ordinary goings-on are banal and ultimately pointless. Consumerism offers an ostensible escape route, but it doesn't work for long, for mere things do not have the numinous power of religious symbols. We've replaced the altar with cigarettes, alcohol, porn and new pairs of shoes, but that hasn't worked out quite so well for us, has it? A doorway to transcendence and meaning is what the Church could help us with, if only it would drop the moralizing and focus on liturgy, i.e. the ritualistic part of a religious life.

So here is my suggestion for the Church authorities: drop the focus on moral codes, judgment and guilt trips. Nobody is looking for that today and nobody will go to the Church on Sunday to get that. Jesus Himself did not focus on judgment, so why should those who labor in His name do so? Replacing judgment and moralizing with the attitude of tolerance and understanding characteristic of modern psychotherapists is, in my view, entirely consistent with Christianity.


Focus on liturgy, on the ritual of the mass. Be conservative in that regard, go back to using Latin and the elaborate rituals of times bygone. The mass doesn't need to be understood, for it is not meant for the intellect. Goodness knows we have enough stuff keeping our intellect engaged already. The mass should be precisely a way for us to defocus from the intellect and open space for other psychic faculties, such as transcendent intuition and feeling. If the mass achieved such goal, I, for one, would attend it every Sunday and contribute more to the Church. For then the Church would nurture an aspect of my humanity that nothing else in this secular society can.

Priests already receive extensive training in philosophy and counseling. They are already well equipped to play the role of helping, understanding, non-judging guides to a life of meaning, without all the moralizing that puts people off. They could play a role that no secular psychotherapist today could, for priests can navigate the waters of metaphysics. They are also invested with the formidable energy of tradition; an energy that constantly circulates—unnoticed—through the deepest layers of our psyches and, if mobilized properly, could have a hugely positive impact on our lives.

So here you go: priests as counselors. But priests also as actors in a symbolic drama staged as a ritual—the mass—whose purpose is to reawaken within us our dormant but innate link to transcendence; a symbolic ritual that evokes transcendence in us, so the attendants of the mass can have a direct religious experience facilitated by the Church. This, in my view, is the vital role of the Church: to point to transcendence, as facilitators, so we can find our way there. The notion of the Church and its priests as intermediaries, or spokespeople for God, is not one that will thrive in the 21st century. We don't need to regard priests as superhuman beings with privileged access to God; they have never been that anyway, and today we all know it. Trying to maintain that implausible image is a dangerous waste of time for the Church. Yet, priests have vital roles to play in our society; and they can play those roles at the drop of a hat—for many are already equipped to do so—if only the Church would reform its orientation and purpose accordingly.

Will my suggestion be heard? Of course not. It won't even be noticed. More than likely, the Church will die a slow, agonizing, sad death into irrelevance, because those in it who pronounce themselves adherents of tradition fail to see that the core of the tradition has itself been buried under layers of social moralizing. Christianity became the foundation of the West's spiritual life not on account of its dogmatic prescriptions, but because, originally, it touched something alive deep within us. Now it will only survive and thrive if it re-learns, once more, how to touch us.

This post is not an attempt to patronize anyone. I am no authority in these matters anyway. But I am very sincerely interested in seeing the vitality of the Church restored, while I despair at being confronted with its decline everywhere around me. So this is my somewhat clumsy attempt to do something about it, for what it's worth. Whatever faults this post may contain, it is at least sincere and heartfelt.

Neo-skepticism and post-truth: a call to reason


This is a relatively long essay in which I address a variety of highly polemical topics, such as science skepticism, post-truth, climate change, etc. You will not really know what my positions are until you read this post through. Partial reads will likely lead to misinterpretation.

Story control

Until not so long ago, our cultural mindset about most issues of importance was largely determined by only a few outlets of the mainstream media. These outlets were, by and large, trusted implicitly and rather uncritically by our parents and grandparents, perhaps even by our younger selves. Their reporting, even if occasionally suspected of bias, was mostly seen as an expression of the truth. Indeed, these outlets were our key channels to perceived truth: what was actually happening in society, who was friend or foe, which countries were good and which were bad, what were the proper values to live by, systems to comply with, philosophies to give credence to, etc. They exerted what I call 'story control': editorial power over a mainstream narrative that massively influenced how we thought and lived.

This relative monopolization of a society's view of the truth goes back to the Church's firm grip on the hearts and minds of the people in the Middle Ages and, much farther back still, to when emperors set the tone for how entire populations were to think. The very few free, critical thinkers who managed to raise their heads above the story control were anathematized throughout most of history. The power of centralized, monolithic broadcasting systems was as formidable as it was unnoticed: so pervasive and taken for granted was it that most of us didn't even notice how deeply manipulated we all were by their editorial choices and subliminal suggestions.

I am not talking about a premeditated conspiracy here. Human beings can hardly extricate themselves from their own beliefs and views, and so their actions inevitably reflect those beliefs and views. The people responsible for mass communication in yesteryears—be they priests holding Sunday sermons in the Middle Ages or editors of the 8:00pm news in the 20th century—did their job informed by their own perspectives and biases, because doing so is only human.

Be that as it may, the result is that the storyline they enforced reflected the particular prejudices, at a particular point in history, of the intellectual and economic elites that held control over the centralized broadcasting infrastructure of the time. One could even make the case that metaphysical materialism itself spread beyond academia only with the popularization of newspapers in the 19th century and of radio—and later television—broadcasts in the early 20th century.

Centralization and the elites

For economic and technological reasons, the means to persuasively broadcast views to the general population—and, thereby, exert story control—have been limited and centralized for most of recorded history. In the middle ages, the Church not only retained control of scholarship (everybody else was too busy fighting wars or tilling the fields), but was also in a unique position to broadcast its message through its formidable logistical infrastructure and, frankly, marketing appeal. In the 20th century, the know-how and investment required to start a significant radio or television broadcast operation rendered it feasible for only a very few. And so the general storyline that informed entire civilizations reflected the particular perspectives of relatively few individuals with privileged access to both knowledge and economic power. I am not passing judgment on whether this was good or bad; it just was, for reasons we can easily understand.

As a result, entire societies were subtly subjugated to the views developed by the elites, who had more access not only to the knowledge of their time, but also to expensive, centralized broadcasting infrastructure. Again, I am not passing political judgment on this state of affairs; in my crazier daydreams, I even imagine that some form of enlightened absolutism—if it were realistic, which it is not—would be the ideal governing system. Nonetheless, I believe it to be an ascertainable fact that our culture's mainstream views were developed and maintained—over the past several centuries—as I've just described: enforced top-down by an elite with privileged access to knowledge and centralized broadcasting infrastructure.

Knowledge drunkenness

The problem is that knowledge isn't a very reliable or even stable thing, nor does it always come hand-in-hand with economic power. Even science—the most reliable method for the development of objective knowledge ever devised by humans—is done by humans and, as such, vulnerable to the entire gamut of human shortcomings: ego, ambition, pride, prejudice, etc. Chronicling the early days of science in the 17th and early 18th centuries, Ernst Benz wrote:
The findings of modern science were forged in this atmosphere of passionate conflict. The favor of the court, the intrigues of ministers, the rivalry of colleagues, the competition of university chairs, personal dislike and self-justification, social concerns, political cabals and covert influences, the pride and triumph of inventors, human weakness, gossip and convention all played their part in this drama. (p. 45)
If this sounds familiar, it is because little has changed. We are, after all, still human. Most of science is done by academics who have families and egos to feed and, therefore, a vested interest in the social recognition of their work. Doubtlessly, the vast majority displays integrity and honesty. Often enough, however, human weaknesses translate into false or biased research results that are nonetheless published. Though there are safeguards to keep such spurious results in check, chaff does pass through the filters. The raging replication crisis in science is the result.

Equally concerning is what I shall call the 'drunkenness of knowledge.' Acquiring more knowledge—for instance, upon achieving a doctorate—exposes one to a broader horizon of things still unknown. In principle, this should have a humbling effect. In practice, however, one often starts believing that one's mere opinions or intellectual dispositions are in some sense privileged or superior.

The problem is that accumulating knowledge within a certain field is one thing, but being able to sensibly interpret and apply this knowledge beyond the restricted boundaries of that field is another thing entirely. Many scientists fail miserably at the latter challenge. When one becomes drunk on one's own limited knowledge, one starts making unjustified—and often outright ridiculous—extrapolations of that knowledge beyond its boundaries of validity. This is particularly visible amongst the self-appointed spokespeople of science when they inadvertently venture into the untamed horizons of philosophy. In recent times, painful examples have been provided by Lawrence Krauss and Neil deGrasse Tyson. It is understandable—though unfortunate, as I shall discuss shortly—that, in the face of such raw stupidity raging amongst PhDs, one might throw one's arms up and become a science skeptic.

The intellectual and economic elites are not immune to bias, delusion and even in-your-face stupidity. The highly specialized stupidity of PhDs is particularly pernicious, and I say this as a double PhD myself. Knowledge-drunken doctors are dangerous because of the self-confidence with which they extrapolate their limited understanding, the authority they command while doing so, and the access they have to centralized broadcasting infrastructure. Their foolishness and hubris infect entire societies and ways of life.

Decentralization and neo-skepticism

Since the turn of the century, however, things have been changing fast. Old idols are being burned, old illusions seen through. There is a new level of skepticism about the mainstream narrative, which has been rendering story control less and less effective. People see through the pompous but ultimately hollow attitude of arrogant elites. They realize they have been, to some significant extent, systematically manipulated. With a renewed critical attitude, they realize their emperors have no clothes.

I am not claiming that skepticism about the mainstream narrative is a new phenomenon in history. There have always been free thinkers and skeptics; there has always been doubt about what people are told from the higher echelons of society. But there is something happening now that nurtures this skepticism to levels never before seen: the decentralization of broadcasting technologies enabled by the Internet and social media.

For the first time, people are able to broadcast their skepticism, their own alternative views, and connect with other likeminded people so as to build entire communities. For the first time, free thinkers no longer find themselves in social isolation and can reach an audience. Would you have heard of my own work if not for this? One no longer needs to be invited for an interview at a major television channel in order to be heard: an engaging blog post can go viral and make one's non-mainstream ideas popular overnight, without any kind of editorial control. The social and cultural dynamics this 'neo-skepticism' is introducing in our civilization are dizzying, and—I suspect—will only be fully appreciated decades from now, with the context and perspective that only hindsight can provide.

A new level of responsibility

Undoubtedly, there are tremendous positive aspects to this new dynamism. The chains of story control are being broken by a democratization of broadcasting technologies and a widening of unmediated social interactions. There is less editorial control by elites effectively censoring what one can hear. We have more options, more hypotheses to consider about everything: from the very nature of reality to how best to live our lives. Our individuality—our ability to choose how and what we think, and how we act in the world—has been empowered to levels only accessible to the aristocracy in previous ages.

However, this means that never before in history has each one of us carried so much responsibility for our choices. The empowerment of the individual—of our personal reasoning, opinions, beliefs and actions—places the future of organized human activity in our hands. We are now actors, not mere audience. The conclusions we arrive at, out of our personal and sovereign assessment of our situation, determine our collective future. The steering wheel is now in the hands of the many perhaps as much as in those of the elite few.

Throwing the baby out with the bath water

And here is where things can go terribly wrong. It is definitely naive, in this day and age, to hold on to the belief that the authorities are infallible; that the opinions of PhDs must always be right; that science never gets it wrong; that what the mainstream media says is always true. Of course it isn't. We are all just humans, plagued by prejudice and bias and desperately trying to make sense of things. Nobody has the final answers. Everyone is confused and, frankly, afraid in this extraordinarily strange situation of being alive in the 21st century. The illusions of sobriety and control held by our forefathers have been broken for good and we must learn to live with it.

But we have made progress over the past centuries. We are not starting from scratch. As fallible as many scientists are, science itself, as a method of inquiry, is one of the greatest achievements of human civilization. As often as it gets things wrong, the entire technology infrastructure that surrounds us every day, from the moment we wake up to the moment we fall asleep, owes its existence to past scientific accomplishments. You wouldn't be reading this if science didn't get things right, and chances are you wouldn't even have survived your childhood. If you trust that the car you drive will bring you to where you need to be, or that the phone you use will put you in contact with the people you want to talk to, or that the medicine you take will help cure your health condition, you implicitly trust science. And you do implicitly trust science every day.

By the same token, as much as knowledge drunkenness is a scourge, knowledge itself has obvious and undeniable value. To disregard the value of learning is to place our future in the hands of complete ignorance, to give the steering wheel to a blind man. That some PhDs proudly pronounce stupidities doesn't mean that there is no role for PhDs to play in our society. If tomorrow I require surgery, I will want to be operated on by a very learned and experienced doctor who knows what he or she is doing; not by a butcher. Next time I fly across the Atlantic, I will want my plane to be piloted by a very learned and experienced pilot who knows what he or she is doing; not by the guy sitting next to me.

I don't know how to perform surgery or to fly a plane. That's why I shall continue to entrust these tasks to those who know how to perform them. By the same token, I think I know a thing or two about philosophy and computer science; more than Lawrence Krauss or Neil deGrasse Tyson. So when it comes to the essential nature of reality or issues around artificial intelligence/consciousness, I trust my judgment over theirs. Knowledge matters.

If neo-skepticism ends up leading to an aversion to knowledge itself, our civilization will end and we will live like apes. Some people do know more than others, especially when it comes to the subjects to which they have dedicated their lives, and that is a good and very important thing. These people should be appreciated and respected for what they do know. To ignore or deny this reality is deadly: try flying in a plane piloted by your surgeon. It is legitimate that knowledge commands authority, provided that such authority be granted within its appropriate scope.

Objective facts

The value of knowledge resides in its apprehension of objective facts; that is, facts that obtain whether we like them or not, believe them or not, are aware of them or not. Even metaphysical positions that deny materialism and grant primacy to mind don't reject objective facts: my own analytic idealism grants that there is an objective world out there, beyond our personal mentation, even though I maintain that such world is constituted by transpersonal mental states.

The denial of materialism is not a denial of objective facts. Neither does it entail or imply that reality is entirely what we make of it. While I grant that the observer's role in perceived reality goes much beyond what metaphysical materialists accept, I don't think my own personal ego is constructing my entire life out of its own innate whims and dispositions. There is something out there that doesn't care about what I personally wish or think. The salient aspects of this something are what we call objective facts.

Science is the best method ever devised for giving us knowledge of the behavior of the objective world out there. Its essence resides in the following axiom: if you want to know whether a statement about nature's behavior is true or false, look at nature's behavior. Put this way, it's a truism. Yet, it is surprising how often this truism is neglected. When Aristotle claimed that heavier objects fall faster than lighter objects—a statement about nature's behavior—he forgot to look at nature to see whether it was really true. It took nearly two thousand years until Galileo famously dropped balls of different weights from the top of the Leaning Tower of Pisa to show that it isn't true.

By looking at how the world behaves—preferably according to the scientific method—we acquire knowledge about the objective facts that constitute the world. As complex and imperfect as this process may have become, it is still essentially correct, for the same reason that any sane person will believe Galileo over Aristotle in the question of falling bodies. As such, there are no such things as 'alternative facts.' Believing that a heavier body falls faster than a lighter one (if air resistance can be disregarded) doesn't make it so, no matter how strong the belief.
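As an aside, Newtonian mechanics (formulated long after Galileo) makes the same point with a one-line derivation: when air resistance is disregarded, the acceleration of a falling body does not depend on its mass at all. The sketch below is merely illustrative and not part of Galileo's own reasoning, which was empirical:

```latex
% Newton's second law, F = ma, combined with the gravitational force F = mg, gives
a \;=\; \frac{F}{m} \;=\; \frac{mg}{m} \;=\; g .
% The mass m cancels out: heavy and light bodies fall with the same
% acceleration g whenever air resistance can be neglected.
```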

The value of being skeptical about certain views is precisely the renewed space it opens up for the contemplation of other views that may, in turn, correspond better to objective facts. Skepticism that denies all objective facts themselves is just madness, and defeats the very spirit of skepticism.

Climate change

Everything discussed thus far comes together in the debate about climate change. It has been claimed that scientists have been caught manipulating data about it, some of the other data available are contradictory, politicians make preposterous statements, activists go to extraordinary lengths to mobilize action, etc. Represented in this wild discussion one finds the (very human) shortcomings of doing and reporting science, story control by elites, knowledge drunkenness, etc. It would be naive not to acknowledge this, as it would be naive to simply believe, uncritically, what any individual scientist may say about a subject of such enormous complexity as the behavior of our planet's climate. It is legitimate to be cautious and guarded about climate change. It is legitimate to be taken aback by Climategate.

But if the reaction to this appropriate skepticism is to simply disbelieve that human-caused climate change is taking place, then one ultimately betrays skepticism. After all, disbelief is just another form of belief: negative belief. Proper skepticism should prompt us not to outright reject a hypothesis—and thereby effectively adopt the contrary belief, i.e. that humans are not causing climate change—but to investigate the issue more thoroughly and thoughtfully. While many research results are flawed, misleading, and even outright wrong, these failures can be discerned and overcome if one looks at a more complete body of research. Science does have a knack for ultimately correcting itself.

Although the observations required for figuring out whether we are causing climate change are much more complex than dropping balls from the leaning tower of Pisa, the essence of the approach is the same, and the reasons for trusting its validity are also the same: we want to look at nature's behavior to see if we have any reason to believe we are screwing up the climate. The proper way to go about it is studying the question from multiple different angles, looking at a variety of independent sources of data, applying multiple different models, all of which should ideally be done by multiple independent research groups, funded by multiple independent parties. It is this global overview of a body of research that gives us confidence in a given conclusion, even in the presence of spurious results: the reliable conclusion is that which emerges independently from multiple lines of investigation.

I don't want to make this post about climate change. I am simply using it as a carrier to illustrate my previous points. But by looking at the body of research in the way described in the previous paragraph, I have convinced myself, to my own satisfaction, that human-caused climate change is a reality. You may agree or disagree, but this is my own sovereign conclusion, and I live my life accordingly. This is my way of taking responsibility.

Climate change, in my view, is the most critical case in which runaway neo-skepticism can overshoot the boundaries of reason, throw the baby out with the bath water and, given the level of responsibility we are now personally invested with in the world of social media, eventually lead to the collapse of our civilization.

Conclusions

The dynamics underlying the rise of neo-skepticism are, in my view, primarily a positive development in human history. They open the door to a new, broader, uncensored relationship with truth. They help our society move more quickly away from entrenched but ultimately wrong views held by the elites. Delusions and deluders are seen through and given the appropriate treatment. Masks are removed. A more caustic, perhaps even cynical, but truer view of reality is achieved after centuries of sweet and sober manipulation. To consider neo-skeptics foolish or deplorable is to ignore the important and, in my view, valid realizations that underlie and motivate their attitude.

Yet, swinging the pendulum all the way to the other extreme overshoots reason, betrays skepticism and ultimately may bring catastrophe upon our civilization. In modern Western democracies, we have the power to elect demagogues who prey on our frustrations at having been deceived by elites in the past. These demagogues may ultimately allow the world to be destroyed in the interest of maintaining the image of being skeptical of everything, even the existence of objective facts or the validity of science as a method.

Skepticism can be preyed upon by demagogues. The way this is done is to draw overly broad, generalized conclusions from the realization that something previously believed is actually untrue. For instance, from the realization that scientists are flawed human beings and many scientific results are spurious, one infers the far too general and irrational conclusion that some specific scientific result must also be spurious. The latter just doesn't logically follow from the former. Similarly, that knowledge drunkenness renders certain learned individuals pernicious doesn't entail or imply that knowledge itself isn't valuable and important. Finally, that the elites have manipulated society doesn't logically imply that all positions they hold are untrue. It would be rather surprising if they all were, wouldn't it?

I urge neo-skeptics to remain alert and truly skeptical, even—perhaps particularly—about those who have something to gain from the emergence of neo-skepticism. To escape from one form of manipulation only to fall head-on into another, perhaps even more dangerous, one would be tragic. Let us not throw the baby out with the bath water; let us protect the future of our civilization with reason and level-headedness.

Brain image extraction: Is it metaphysically significant?


Brain image extraction technology has been around for years now: researchers measure brain activity patterns and are then able to translate these measurements into an approximation of the imagery the subject is either seeing or imagining. This way, one can 'read your mind' or 'extract images' from your brain, so to speak: one can make inferences about your first-person visual experience based purely on objective brain activity measurements.

A new study in Russia on brain image extraction may again—understandably, but nonetheless regrettably—lead lay people to the following conjecture: if we are able to translate brain activity measurements into the visual imagery the person is actually experiencing from a first-person perspective, doesn't that mean we have bridged the explanatory gap? Philosophers have maintained for decades now that we cannot deduce the qualities of experience from objective measurements. There is an 'explanatory gap' between these two domains, in that we can't explain qualities in terms of quantities. But if—as shown in the Russian study—technology can translate EEG measurements into visual imagery, surely we have eliminated the gap; haven't we?

Surely we haven't. The conjecture—understandable and forgivable as it may be—is totally wrong; it is based on a deep misunderstanding of what is going on here. This is what I shall attempt to explain in this post.

But before we start, let me clarify first that I won't be judging the quality or accuracy of the Russian study, as reported in this preprint. I will simply assume that it is accurate, as reported. Even if this particular study turns out to be flawed—which I have no reason to believe—something along the same lines is or will surely be possible. In addition, the general public summary prepared by the Moscow Institute of Physics and Technology is quite accurate, level-headed and well written. The popular science media in the West—with some honorable exceptions—could learn a thing or two from them on how to communicate science in an accessible but non-hysterical and non-misleading manner. So you don't really need to read the full technical paper to follow this post; the popular summary will do.

The first thing the researchers did was to train an artificial neural network (ANN) to link certain patterns of brain activity, as measured with an EEG, to certain images. This sounds complicated but it really isn't. All they needed to do was to take EEG readings of a subject as he or she was looking at a known set of images displayed on a screen. Researchers then knew, by construction, what brain activity pattern corresponded to each image, since the subject was actually looking at the image as his or her brain activity was being measured. Next, the researchers provided each EEG measurement as input to the ANN and trained it to produce the corresponding image as output. Again, the latter image was known—it was what the subject was looking at when his or her brain activity was measured—so the trick consists merely in getting the ANN to produce a similar-enough copy of the image. We say that the image is the target output of the ANN during training, which it should produce when given the corresponding EEG data as input.

The ANN's training goes something like this: imagine that the input is just a number—say, 5—and the target output another number—say, 21. What you then want is to configure the ANN such that, when it is given 5 as input, it produces 21 at the output. The function the ANN is configured to perform could be as simple as multiplying the input by 4 and then adding 1. In other words, the ANN could simply implement the function f(input) = 4 x input + 1. When the input is 5, we get f(5) = 4 x 5 + 1 = 21. 'Training' the ANN consists in finding this function f(input) through directed trial and error, so the ANN matches the target output. Once it's found, the function constitutes an ad hoc mapping between input and output data. It enriches and processes the input until it adds up to the target output.
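To make this concrete, here is a minimal, purely illustrative sketch in Python of what such 'directed trial and error' amounts to. The numbers, the learning rate and the variable names below are made up for illustration only and have nothing to do with the actual code used in the Russian study:

```python
# A toy illustration (not the study's code): find parameters w and b such
# that f(5) = w * 5 + b comes out close to the target output 21.
def f(x, w, b):
    return w * x + b

w, b = 0.0, 0.0           # start from an arbitrary initial guess
x, target = 5.0, 21.0     # a single training pair: input 5, target output 21
learning_rate = 0.01

for step in range(2000):
    error = f(x, w, b) - target       # how far off the current guess is
    # nudge w and b in the direction that shrinks the squared error
    w -= learning_rate * 2 * error * x
    b -= learning_rate * 2 * error

print(round(f(5.0, w, b), 3))  # ~21.0: an ad hoc mapping from 5 to 21 has been found
```

Notice that the parameters found need not be exactly 'multiply by 4 and add 1'; any mapping that happens to send 5 to 21 will do, which is precisely what makes it ad hoc.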

In the case of the Russian study, instead of a single number as input, the ANN receives an array of numbers corresponding to each EEG measurement. Instead of a single number as target output, the ANN receives an array of numbers corresponding to the images. And then, instead of just one pair of input / target output, it receives several training pairs—that is, a series of EEG measurements, each with its corresponding image—so the function f(input) generalizes for a variety of inputs. Yet, the essence of what happens during training is what I described in the previous paragraph. The ANN implements an ad hoc mapping between EEG data and target image. It enriches and processes the EEG data until it adds up to the target image.
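Schematically, and again only as an illustrative sketch (the array sizes below are invented, the data are random stand-ins, and a real ANN would use nonlinear layers rather than the simple linear mapping shown), the array-to-array case looks like this:

```python
# Hypothetical toy version of the array-to-array case. Each 'EEG measurement'
# is a vector of numbers and each 'image' is another vector of numbers
# (flattened pixels); training finds a mapping that turns the former into
# something close to the latter. All sizes and data here are made up.
import numpy as np

rng = np.random.default_rng(0)
n_pairs, eeg_dim, img_dim = 200, 64, 256      # invented sizes, not the study's

eeg = rng.normal(size=(n_pairs, eeg_dim))     # stand-in EEG vectors
images = rng.normal(size=(n_pairs, img_dim))  # stand-in target images

W = np.zeros((eeg_dim, img_dim))              # the mapping to be 'trained'
b = np.zeros(img_dim)
lr = 0.01

for epoch in range(500):
    pred = eeg @ W + b                        # current guesses for the images
    error = pred - images                     # difference from the known targets
    W -= lr * eeg.T @ error / n_pairs         # adjust the mapping to shrink the error
    b -= lr * error.mean(axis=0)

print(np.mean((eeg @ W + b - images) ** 2))   # the fit to the known targets improves
```

The point of the sketch is simply that training is curve-fitting between known inputs and known targets; at no step is anything about experience deduced from the EEG data alone.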

The figure below, from the Russian paper, illustrates the images the ANN was trained to produce (two upper rows) and the images the ANN actually produced (two lower rows). Notice how training gets the ANN to produce images pretty similar to the target ones.


That the ANN manages to do this is no miracle; it is in fact trivial, the straightforward result of having been trained to do so with actual images. The ANN doesn't magically deduce visual qualities from electrochemical patterns of brain activity; it doesn't bridge the explanatory gap; it already receives images, to begin with, from the researchers, who knew what the subject was looking at. The ANN outputs images because it was already shown images during its training, so it just learned to copy them when given EEG data as input. That's all. It generates roughly the right images because it has been forced—during training—to find an ad hoc mathematical way to process and enrich EEG data so as to produce certain sequences of numbers that can be visualized, by you and me, as images. As a matter of fact, as far as the ANN is concerned there actually aren't images at all, just sets of numbers that—it so happens—you and I, conscious human beings, can interpret as images.

The next step in the Russian study was to present the trained ANN with new EEG patterns that it had not seen during training. The idea is to check whether the ANN has learned enough to extrapolate from what it has seen and make inferences when it is presented with new inputs—that is, to check whether the ad hoc mapping between EEG data and images, produced during training, remains valid for data not used during the training. If the training was effective, the images the ANN will then produce will be similar to the images the subject was actually being shown when the new EEG measurements were taken. If the training was poor, it will produce images that don't correspond to what the subject was experiencing.
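In terms of the toy sketch above (and only to illustrate the mechanics of such a check, since with random stand-in data there is of course no real relationship to generalize), this step looks as follows, continuing from the previous code:

```python
# Continuing the toy sketch: hold out pairs never used for training, fit the
# mapping on the training pairs only, then see how well it guesses the images
# corresponding to EEG vectors it has never seen.
train_eeg, test_eeg = eeg[:150], eeg[150:]
train_img, test_img = images[:150], images[150:]

W = np.zeros((eeg_dim, img_dim))
b = np.zeros(img_dim)
for epoch in range(500):
    error = train_eeg @ W + b - train_img
    W -= lr * train_eeg.T @ error / len(train_eeg)
    b -= lr * error.mean(axis=0)

train_mse = np.mean((train_eeg @ W + b - train_img) ** 2)
test_mse = np.mean((test_eeg @ W + b - test_img) ** 2)
print(train_mse, test_mse)  # the error on unseen data is typically the larger one
```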

In the figure below, also from the Russian paper, we can see how well the ANN managed to infer the new images. The two upper rows show the images the subject was actually looking at when EEG measurements were performed, and the two lower rows show the images the ANN produced in response to these new EEG readings. The match, though still reasonable, isn't as good as that obtained during training, since now the ANN is trying to guess from data it has never before seen.


By explaining how this whole thing works, I hope to have made it clear to you that none of it has anything to do with the explanatory gap or the hard problem of consciousness; the Russian study, in fact, has no new metaphysical relevance. All it establishes is that there are correlations between patterns of brain activity and inner experience, but this we already knew. Such correlations are also entirely consistent with many other metaphysics aside from materialism (e.g. different versions of panpsychism and idealism account for the same correlations; even some versions of dualism do), so it doesn't privilege materialism at all.

The ANN produces images because it was trained with known images to begin with. It succeeds in linking EEG data to images because it was trained on the EEG measurements of subjects who were actually looking at the images. So it merely leverages the fact that the researchers already knew what the subjects were experiencing to begin with. The ANN presupposes the subject's experiences in its training set; it doesn't explain them at all. Do you see the point?

Insofar as it merely assumes the qualities of experience to begin with, brain image extraction technology doesn't explain these qualities. It can't explain that which it presupposes. All it does is to find a mathematical function that links two sets of data (inputs and outputs); it doesn't even begin to explain how qualities can emerge or be produced by quantifiable physical parameters.

Introducing 'Decoding Schopenhauer's Metaphysics'


My new book, Decoding Schopenhauer's Metaphysics (DSM), is now available for pre-ordering from amazon UK, amazon USA, and other retailers as well. In this post, I want to give you a brief overview of the book, tell you why I wrote it and why I think it is important.

Introduction

After I finished The Idea of the World—over a year before the book was actually published—I started an effort to trace my ideas back to their historical predecessors and anchor them in the Western philosophical tradition. In regard to 19th-century philosopher Arthur Schopenhauer, I took it lightly at first and read Christopher Janaway's little book Schopenhauer: A Very Short Introduction. I describe this experience, and what happened next, in DSM:
In the many quotes of Schopenhauer’s works included in [Janaway's] book, I believed to discern—to my surprise—clear similarities with the metaphysics laid out in my own work. Naturally, I felt his points were compelling. Yet, Janaway peppered his book with criticisms of Schopenhauer’s metaphysics. What he seemed to be making—or failing to make—of Schopenhauer’s words was quite different from what I thought to discern in them. Janaway saw problems and contradictions where I thought to see clarity, elegance and consistency. But since Janaway is the professed expert and I was just perusing quotes out of context, I initially suspected I was reading too much into them.

The only way to clarify the issue was to sink my teeth into Schopenhauer’s magnum opus: the two-volume, 1,200-page-long third edition of The World as Will and Representation [1859], in the same translation that Janaway himself used. ... In the ensuing months, I devoured the lengthy two-volume set, reading and re-reading it. I recognized in it numerous echoes and prefigurations of ideas I had labored for a decade to bring into focus. The kinship between my own work and what I was now reading was remarkable, down to details and particulars. Here was a famous 19th century thinker who had already figured out and communicated, in a clear and cogent manner, much of the metaphysics I had been working on. What better ally could I have found? And yet, bewilderingly to me, Schopenhauer’s “metaphysics has had few followers” (Janaway 2002: 40). Its utter failure to impact on our culture for the past 200 years is self-evident to even the most casual observer.
With DSM, I try to change this, for I think there is tremendous value in Schopenhauer's legacy for a 21st century readership, particularly in the modern context of quantum mechanics and the 'hard problem of consciousness':
I believe Schopenhauer’s most valuable legacy is precisely his metaphysical views: they anticipate salient recent developments in analytic philosophy, circumvent the insoluble problems of mainstream physicalism and constitutive panpsychism, and provide an avenue for making sense of the ontological dilemmas of quantum mechanics. ... Had the coherence and cogency of Schopenhauer’s metaphysics been recognized earlier, much of the underlying philosophical malaise that plagues our culture today—with its insidious effects on our science, cultural ethos and way of life—could have been avoided. (emphasis added)
In the book,
I offer a conceptual framework—a decoding key—for interpreting Schopenhauer’s metaphysical arguments in a way that renders them mutually consistent and compelling. With this key in mind, it is my hope that even those who have earlier dismissed Schopenhauer’s metaphysics will be able to return to it with fresh eyes and at last unlock its sense.

Value-add

A perfectly legitimate and good question that can be asked of a book about someone else's writings has been put forward by a participant of my discussion forum:
I never understood why would anyone read a book about a book wrote by someone else. ... why not just read the original and use your own mind to decide what the author wanted to say? ... why bother with third parties and not just read the original?
I replied to him by stating that, with DSM, I think I can help to

  1. disambiguate Schopenhauer's conceptually loose use of terminology;
  2. clarify his argument under the light of modern psychology;
  3. place his ideas in the context of quantum mechanics, which did not exist in his time;
  4. relate his discourse to modern issues emerging in ontology and philosophy of mind, which also did not exist in his time;
  5. summarize and bring together his contentions in a coherent framework articulated in modern language, which people today can easily relate to.
All this said, I do think the best course is indeed to read Schopenhauer's own words, if people are willing to face them: Schopenhauer's The World as Will and Representation alone has 1,200+ very dense pages in a tiny font, written in an accessible but old-fashioned style. Because I suspect that most people don't have the time or the interest to plow through that, I felt an alternative would be valuable, for I want to make Schopenhauer's thought available to them too. DSM has only 144 pages and costs a fraction of Schopenhauer's original. After reading it, if their curiosity is piqued, the more interested readers can approach Schopenhauer himself with a solid basis for making sense of his words.

Goals

I have two main goals with DSM:
on the one hand, I aim to rehabilitate and promote Schopenhauer’s metaphysics by offering an interpretation of it that resolves its apparent contradictions and unlocks the meaning and coherence of its constituent ideas. On the other hand—and on a more self-serving note—I hope to show that my own metaphysical position, as articulated in my earlier works, isn’t peculiar or merely fashionable, but part instead of an established, robust and evolving chain of thought in Western philosophy.

Polemic

A key element in achieving both goals is my refutation—elaborated upon in detail in DSM—of present-day criticisms and misrepresentations of Schopenhauer's metaphysics, which unfortunately are rampant in academia. As a philosopher who has produced original work myself, the idea of my own writings being one day subjected to the kind of disfiguration and outright abuse suffered by Schopenhauer, at the hands of presumed experts, makes me sick. My sympathy for Schopenhauer compels me to try and improve the standing of his work.

Unfortunately, instead of producing original work of their own, some scholars in academia choose to make a career out of (mis)representing and criticizing dead philosophers' works. That these philosophers are no longer around to defend themselves seems to give license to the scholars in question to pass off their own interpretative difficulties as errors on the part of the late philosophers; errors one wouldn't attribute even to a high-school student today. In other words, some critics seem to mistake their own intellectual obtuseness for (completely implausible) shortcomings in the argument of the philosophers they criticize. By presumptuously portraying themselves as intellectually superior, these critics perhaps feel that the recognition hard-earned by their targets—thanks to the latter's original work—rubs off on them.

Christopher Janaway characterizes Schopenhauer's metaphysical contentions as "something ridiculous" or "merely embarrassing," which should be "dismissed as fanciful" if interpreted in the way Schopenhauer clearly intended them to be. He claims that "Schopenhauer seems to stumble into a quite elementary difficulty" in an important passage of his argument. And so on. The freedom Janaway allows himself to bash Schopenhauer, and the arrogant, disrespectful tone with which he does it, are breathtaking. It is so easy to bash a dead man who can't defend himself, isn't it?

Ironically, all this actually accomplishes is to betray the utter failure of Janaway's attempt to grok Schopenhauer. Indeed, his apparent inability to comprehend even the most basic points Schopenhauer makes, and to think within the logic and premises of Schopenhauer's argument, is nothing short of stunning. Here is someone who just doesn't get it at all, and yet feels entitled not only to write books about Schopenhauer; not only to characterize Schopenhauer's argument as "ridiculous," "embarrassing" and "fanciful" (Oh, the irony!); but even to edit Schopenhauer's own works! By now Schopenhauer has not only turned in his grave, but strangled himself to a second death.

Even more peculiar is Janaway's suggestion that it is Schopenhauer who is obtuse, for the "elementary difficulties" Janaway attributes to him couldn't be seriously attributed even to a high-school student today, let alone a renowned philosopher. At no point does Janaway seem to stop, reflect and ponder the glaringly obvious possibility that perhaps Schopenhauer does know what he is talking about and it is he (Janaway) who just doesn't get it. Instead, he portrays Schopenhauer as an idiot; how precarious, silly and conceited. He even accuses Schopenhauer of crass materialism, despite Schopenhauer's repeated ridiculing of materialism and the fact that Schopenhauer's whole argument consistently refutes it in unambiguous terms. I discuss all this in detail in DSM. Here it shall suffice to observe that, to be an expert on anything, it takes more than just study; for if one can't actually understand what one is studying, no amount of scholarly citations will turn vain nonsense into literature.

I richly substantiate my criticism of Janaway in DSM: I carefully take his contentions apart, while clarifying Schopenhauer's points in a way that should be clearly understandable even to Janaway. So if you think I am exaggerating in this post, please peruse DSM: it can be leisurely read in a weekend or, with focus, in a single sitting, so it won't cost you much time at all to see whether I actually have a valid point.

Tackling other misrepresentations

Amazingly, some attribute dual-aspect monism to Schopenhauer. Indeed, as of this writing, Wikipedia lists his metaphysics as an instance thereof. I can only imagine two reasons for such a vulgar misunderstanding: either one has read only the title of Schopenhauer's main work (The World as Will and Representation) and arrived at conclusions from it alone, or one doesn't actually know what dual-aspect monism means. Again, I elaborate much more in DSM.

Conclusions

Despite all this, DSM isn't primarily about polemics and refuting misunderstandings and misrepresentations, even though it is about that too. Primarily, it is about elucidating, in a concise and easily-accessible manner, Schopenhauer's extraordinary and sophisticated ideas on the nature of mind and reality; ideas whose plausibility, explanatory power and importance have only increased over the past two centuries. Schopenhauer's work is a veritable metaphysical treasure that deserves much more recognition than it has gotten. Even more importantly, we, 21st-century readers, deserve the gift Schopenhauer has left us as inheritance.

Sometimes, those who preceded us weren't just naive and ignorant, 'primitive' versions of ourselves—as some scholars conceitedly seem to think—but in fact saw farther than most of us do today (including scholars). We ignore and dismiss them at our own peril.