Valedictorian Speech

Maeser Prep Academy Class of 2017

I am honored to give this speech, I really am, almost as much as I am under-qualified to give it. My speech will be kind of like high school: the best part is when it’s finally over.

Street philosopher and leader of the free world, Drake, once said “Came up, that’s all me, stay true, that’s all me. No help, that’s all me, all me for real.” I just have to say, those lyrics are the opposite of how I feel right now. I am inexpressibly grateful for the people who have shaped me, inspired me, forgiven me, and helped me in endless ways.

First, my parents and family. Their patience with me is inexhaustible, and there is almost nothing that they haven’t done for me. I’m sorry I don’t appreciate you enough. I’m thankful for my friends and fellow students; you almost certainly don’t expect it, but I look up to so many of you. You have made high school an exciting, enlightening, and sometimes strange experience. And of course, our teachers. We have the most inspiring teachers the world has ever seen. There are so many teachers here who could give better advice than I ever could.

Thank you to Mr. Watabe for teaching me that math is simple, and more importantly, life is simple. If you put in the effort, you will get faster, you will get better, and you might even learn calculus.

Thank you to Mrs. Plott for making me realize that velocity is more important than speed; what direction you’re going matters more than how quickly you’re going.

I am grateful to Mr. Dowdle for redeeming humanity despite the horrors of our past, for your effortless humor, and for your life-changing analysis of literature. I’m finally in uniform, by the way.

Thank you to Mrs. Frampton, for showing me that without art, writing, and even poetry, we cannot understand the sacredness of life.

Thank you to Mrs. Martinez, for encouraging us all to be more compassionate. Through her character, she showed us there is no qualification, no skill, and no aspect of your resume that is more valuable than the effect you have on others.

And Mr. Simmons – ah, what can be said about Mr. Simmons that he has not already told in excruciating detail to his freshman Socratic class? Thank you Mr. Simmons for demonstrating the meaning of education as sacred, and for your genuine dedication to your players and students.

Thank you to Mrs. Slade, for showing us the wonder of the natural world, the miracle of the human body, and the power of human potential.

Thank you so much to Mrs. Sidwell, who isn’t here today. You introduced me to the sound of my own voice. The impact you had on my life cannot be overstated.

I can’t list all of my teachers and mentors, but I have been changed by every class I took at Maeser – yes, even Financial Lit. Especially Financial Lit.

Maeser is one of the best high schools in the state. Honestly, it is easy to feel intimidated and inadequate in a school of stars like Maeser.

After all, not all of us have Logan Norris’ relentless determination;

Not all of us are as kind as Morgan Millet;

Not all of us have James Johnson’s flawless acting ability;

Not all of us have the perfect calves of Felix Resendiz;

Few of us are as fearless as Bennett Felsted and Jared Drake;

None of us can do as many pull-ups as Jacob Radmall;

And David Van Horn isn’t here, but few of us are as mature, contemplative, or understanding as him;

Not all of us are as ready to give compliments as Brittany Oveson;

None of us can beat Taylor Brand’s scoring record;

Few of us have the constant cheerfulness and optimism of Ben Hailstone or Maliana Corniea;

Few of us have the easygoing attitude of Braden Christensen;

Few of us have the quiet wisdom of Tashara Muhlstein;

None of us party as hard as Dakota Clubb;

None of us work as hard as Lia Joo;

We can’t all jump as far as Sammy Windley, sing as high as Varia Aird, or play the piano like Anastasia Felt;

None of us can parkour like Olle Hansen;

Few of us have the resilience of Mustafa Hamidi;

And even fewer of us have the hope and strength of Sophie Cannon.

I could keep going, I promise you.

But despite the fact that we can’t compare ourselves to the people around us, each of us has an opportunity – no, an obligation – to share our unique and powerful selves with the world. Growing up with you all in the last few years has been an incredible experience. Because of this, I will ask that you remember one thing after this speech: never think you have nothing original to say, never think you have nothing important to do, and never think you have no purpose.

Often, our endless routine overwhelms us like the numbing smell of burnt popcorn in the cafeteria microwaves. The beating repetitiveness of everyday life causes us to reduce ourselves to just another student, just another worker, just another brick in the wall.

People often say things like this: “I don’t have time for philosophy. I have bills to pay.” If you have not yet said this, then just wait – in college, you will say it or at least think it. How do I know this? Well, I don’t, but I’m up here making sounds, and that’s the important thing. That’s what I’m here for.

But honestly, it is easy to become swallowed up by the mundane parts of living. We should all be terrified that without even noticing, we will slip into a trivial life of grinding for grades, working for mere money, depending on empty praise, obsessing over flimsy worries and meaningless stresses, becoming something we aren’t to impress people we don’t like. Unless we watch closely, the ordinary will consume us. Unless you pay attention, your purpose will disappear like solid ground turning to quicksand: smooth and natural, with barely any sign you could notice or suspect, ground you walk on easily. By the time you notice your purpose is gone, it’s too late.

So when you find yourself saying “I don’t have time for philosophy,” remember that this is not philosophy, this is life. You don’t have enough time to not think about it. With every day, you should consider the fundamental, the deep, the sacred, the truth beyond the mundane. Otherwise, why wake up each morning?

Clear away all the debris that separates you from that essential, bright, illuminating part of yourself – that core that is much deeper than personality – your soul, your life-force, your chakra, or maybe it’s just your appendix or something. Avoid the things that reduce you and make you trivial, and look for those things that make you passionate, deep-thinking, and full.

Lin Yutang said that “Those who are wise won’t be busy, and those who are too busy can’t be wise.” Reject the idea that you can afford to merely fill or kill time; your busyness will prevent you from seeing beyond routine.

Well, as Dylan Armstrong said, I’m sorry Maeser, but I’m breaking up with you. Luckily, I’ve already got someone else lined up, so the rebound will be quick. She’s kind of out of my league, so I proposed to a bunch of backups in case she rejects me. Also, I’m going to need her to pay me like 50,000 dollars or it’s not going to work out. That’s basically what applying to college is.

I’ll eventually forget about my times with Maeser, but I will remember what I learned from the relationship. After all, it lasted four years, and I have done so much with Maeser – my sweat is on her field, my tears are on her desks, and yes I admit it, my popcorn is in her microwaves. I learned that the pursuit of truth, the process and journey itself, is always worth it – even if that truth is as difficult to find as a high-quality meme on Facebook. I learned to always remember my real purpose. To live beyond the mundane. To be inspired by those around me.

Now is the time to put these principles into action. As Epictetus said, “From now on, whenever you encounter anything, remember that the contest is now: you are at the Olympic Games, you cannot wait any longer, and your progress is wrecked or preserved by a single day and a single event.”

And most importantly, pretend my last line was super funny.

Last Lecture

“Somebody once told me the definition of hell: On your last day on earth, the person you became will meet the person you could have become.” — Anonymous


What if a demon slithered up to you after your graduation and forced you to relive all of high school?

No, not just the dances and the weekends, but every single moment. Would you scream, fall to the ground, and curse the demon? Or would you call him an angel and plead to do it again? Are you ready to walk down the middle school hallway an infinity more times? Are you willing to put on a polo every morning until Maeser’s logo is permanently stuck to your chest?

Along with the joy, are you willing to reenact the routine, the broken relationships, the depression, the rejection, the failures, the boredom, the ugliness that comes packaged with every beautiful experience? I don’t care if you’re a senior or a 7th grader – could you affirm your life so far without a doubt? Could you live it again and again until each second becomes familiar? Do you want this – all of this – again, and countless times more?

If not, then start living so you can say yes to every single moment.

How can you live this way? Well, as usual, the answer you need is not what you’d like to hear: Don’t be yourself. Be more. Dare to reach outside of what you normally call yourself.

For most of high school, I was exactly what others expected of me, and what I expected of myself. I was never anything but my usual self. But I needed to let go of those comfortable habits that were wearing away at my potential; I needed to stop being me and start being more.

Stop wandering the halls aimlessly, thinking of people in terms of stereotypes and first impressions. Stop looking at a person and assuming you know who they are. Start learning names and seeing eyes.

Stop walking past the teacher’s door, glancing in, afraid to enter. Start building relationships with these spectacular people who can be your inspirations and your mentors.

Stop ignoring your Socratic books and stop skimming SparkNotes the day of the test. Start learning so passionately that you can’t remember what the assignment was.

Stop listening to music you don’t love because you feel like you should. Stop pretending you’re a baller because you wear joggers with fake Yeezys. Stop thinking you’re edgy because you use words like ‘edgy.’ Stop bragging about how much you’ve procrastinated. Stop thinking that you have enough time. Stop being yourself, and start being more.

In every typical suburban Utah house, right in between the picture of six kids and the religious painting, there’s a cute little sign that says: Remember who you are. But in some ways, that’s bad advice. So much joy can come when who you’re supposed to be just slips your mind.

Some of your greatest moments will come when you forget who you are. The only reason I tried out for the soccer team was because I forgot I was supposed to be the awkward kid who reads philosophy books and talks too much in Socratic. Because I stopped being myself, I suffered through hours of sprints and San Diegos; weeks of stinging turf burns and soreness; food fights and brawls on a cramped bus; the painful inadequacy of missed passes. At the end of it all, I found unexpected strength and friends I never would have had otherwise.

The only reason I took AP Calculus is because I forgot I’m bad at math. I barely kept up in Math One; I was mocked in elementary school for fumbling at long division and staring blankly at equations. For me, watching people manipulate numbers was like chasing cars on the freeway; I couldn’t possibly keep up. But eventually, I decided to stop being myself, and I signed up for Calculus.

Do what you can’t. Whenever there is something you are afraid of, something you would love but you know you’ll never do – that is what you need to do. You know what I’m talking about. The girl. The performance. The adventure. Stop avoiding them. Start scaring yourself every day. If you jump off the cliff, you have no need to explain yourself to those who stand and watch.

Don’t be yourself.

Stop ‘discovering yourself.’ Stop waiting to stumble upon who you really are. You are more than just you; you are who you are striving to become. As Nietzsche said, “your true self does not lie buried deep within you, but rather rises immeasurably high above you, or at least above what you commonly take to be yourself.”

Stop looking for yourself and start building yourself.


Moving Out

Never again will I have to bear

the shrill of screams

As some pitiful sibling endures a bath.

The mess of a house tearing at the seams,

The grateful hugs of a fragile sister.

Nights cut short by a sharp curfew,

Brothers wrestling as dirt flew,

And you know no one will ever hurt you.

Parents give unwanted advice, incessant

Always treat you like an adolescent.

Home falls far short of ideals.

Less than perfect homemade meals,

Every bedtime raucous with squeals.

Floor festooned with debris, never clean

But at least it’s never empty.

Every week another crisis,

Mom stressing about rice prices,

Laughs absorbed into yellowed sidewalls

Stepping on the floor where a child sprawls.

How can I walk out a door I’ve always walked in?

Where I’ve slept every year since birth,

Where I know every edge and margin

Where six placentas are buried in the garden.

Someday soon I’ll come home,

To walk again the hills I once would roam,

To see if the cracks in the ceiling have moved,

And show somehow my worth I’ve proved.

To meet with my moth-eaten bed so old,

And test if its springs still recognize my mold

Someday I’ll walk as always,

through the same hallways,

Look in the same eyes and see the same worries,

But it won’t ever be the same.

A Kierkegaardian Approach to Philosophy: Replacing an Objective Career with a Subjective Relationship

Philosophy has been firmly and comfortably institutionalized. It exists primarily to teach useful, marketable, career-building skills, buzzwords like critical thought and complex reasoning and clear writing. The philosopher is another cog of capitalism, mass-produced in university departments either to join the economy or to perpetuate the academic study of philosophy. Like physicists and biologists, philosophers are judged by their ability to produce original, peer-reviewed work. Philosophy is a specialized field along with the sciences, practiced in research institutions.

Any ‘serious’ philosopher is found solely in the university; no longer can philosophy be practiced by any audacious questioner. Measured by gross product, philosophy is more successful and productive than at any other time in history. Thousands of philosophers throughout the world produce well-researched, logical papers that engage with the traditional problems. Many of these papers are ingenious marvels of philosophical reasoning. Who can deny the brilliance of Slavoj Zizek’s cultural criticism, John Searle’s philosophy of mind, the late Hilary Putnam’s rigorous analysis?

And yet perhaps we are missing something in this Cambrian explosion of philosophical activity. Philosophy has gained a permanent place in the academy. To paraphrase an introduction to “Socrates Tenured” by Robert Frodeman and Adam Briggle, philosophy has its own arcane language, its hyper-specialized concerns, a network of undergraduate programs, and an ecosystem of journals. Like Bertrand Russell, I wonder whether institutional philosophy is “anything better than innocent but useless trifling, hair-splitting distinctions, and controversies on matters concerning which knowledge is impossible.”

The goal of philosophy is no longer to guide individuals on their attempt to become paragons of wisdom and virtue. Its purpose is not to assist the philosopher in reconciling the absurdity of the world, nor can it instigate the creation of meaning out of this nihilism. It is not meant to be related subjectively to the individual who is actually living the philosophy and experiencing its effects. It is meant to produce knowledge. An honest, valuable task – yes. But it is not enough.

After all, this knowledge is produced in the typical academic fashion. It is based on rigorous, ‘impartial’ research by graduates trained in cleverly analyzing arguments. This research is then applied to produce objective knowledge that has no moral impact. I mean: the knowledge is not meant to make one a better person, but to be used as a “de-moralized tool” for civilizational progress (source). Throughout the process, the producers of knowledge remain separate from the knowledge they produce. It is sterilized, abstracted – without impact.

The philosopher argues passionately for his/her thesis, employing every art of logic available. But what if the thesis is true? No matter. It will not change the philosopher’s life nor anyone else’s. It is merely another postulate verified until refutation, another step on the track to tenure. As Kierkegaard wrote, “not until he lives in it, does it cease to be a postulate for him.” It is ridiculous to imagine the philosophers living by their theses.

All this production and institutionalization is merely disguising a catastrophe. We are facing the same problem in philosophy that Kierkegaard faced in religion: the demise of the subjective relationship. We have lost the duty to philosophy that Socrates felt and in some ways died for, the intimately personal, individual relationship to the content of our study. In the Apology, Socrates said, “as long as I draw breath and am able, I shall not cease to practice philosophy.” He had so fully embraced philosophy that any other form of life was “not worth living.”

Philosophy is not merely a set of practices and shared academic norms. It is a way of living. And if it fails to be a way of living, all its academia will be unmasked as hollow. Then could Nietzsche’s madman declare with certainty that along with God, the Philosopher must be buried as well – “We have killed him — you and I. We are all his murderers” (The Gay Science). Divorced from the lived experience of the philosopher, how can philosophy be meaningful? If philosophy is to have any value, it must be “through its effects upon the lives of those who study it” (Bertrand Russell).

Perhaps no one but Kierkegaard has articulated this problem in all its scope. In his Truth is Subjectivity, he wrote:

Our discussion is not about the scholar’s systematic zeal to arrange the truths of Christianity in nice tidy categories but about the individual’s personal relationship to this doctrine, a relationship which is properly one of infinite interest to him. (source)

To be a philosopher, one must take a radical leap. It is like Kierkegaard’s leap of faith, the jump into the abyss. Namely: one must be willing to live by one’s conclusions. After all, philosophy claims to both describe and prescribe reality, the ethical life, the social sphere. If this is the case, then how should we live differently because of it? If Plato is indeed correct about the immortality of the soul, what then should we do? Every philosophical premise, when carried the full length, has an ethical conclusion. They dictate what should be done.

But even here we encounter a problem. We are supposed to live by our conclusions, yes, but we are also supposed to live by the method of philosophy. And yet this method questions every conclusion. How can we live by a conclusion that can be called into question and invalidated the day after? Philosophy is built upon the dialectic, the constant shift in thought and relentless doubt of each and every premise. An objector might claim that the dialectical is fundamentally antithetical to the meaningful, as a life cannot be built upon an ever-shifting foundation.

Perhaps, then, Kierkegaard’s approach only makes sense in religion. After all, religion is not dialectical. It remains solid, and thus a subjective relationship can be built. A subjective relationship to religion builds a bridge between rock-solid cliffs; a subjective relationship to philosophy builds a bridge between wind and tossing waves. One must be able to stop somewhere if a meaning of life is to be built, and religion provides the stopping-point.

Philosophy, however, is not incapable of providing meaning. It has merely been so often misapplied that it seems impossible to truly live by. The solution is this: one must be willing to set a direction for the dialectic. Kierkegaard himself did not set down a rock of Christianity and declare “now, build a relationship with this!” His project was to advance, not end, the dialectic. His intention was to “create difficulties everywhere” (Concluding Unscientific Postscript). He sought to push individuals to recognize the flaws of dogma, and through this to create their own relationship with Christianity independent of the traditional Nicene doctrine. But this dialectic was not unguided: its goal was to become a Christian.

This aim, in a somewhat paradoxical sense, could only be achieved by first negating it: recognizing that I am not a Christian now. Thus, the dialectic became essential to the process. The creation of a true relationship with Christianity was made possible only through destruction – through eliminating the bromides and dependence on institutions.

This same process of guided dialectic applies to philosophy. By focusing on a specific end goal, our dialectic gains a foundation. New information does not destroy the foundation. Rather, it clarifies and polishes the foundation and assists in its construction. For example, I may decide my end goal is to become a virtuous person. When I discover that one of my practices was not after all virtuous, this does not destroy my ability to live by my philosophy. After all, my end goal is not called into question. I still aim to be virtuous. But my process has been refined, as I now know one more thing that I should not aim for.

Therefore, we now have a basis for developing a personal relationship with philosophy. Before anything, an aim must be set as the goal of all dialectic. Then, we must live by this aim, constantly seeking to refine and expand it. The aim cannot be called into question. This aim is the fundamental, subjective truth, one that is lived by the individual so intensely that it cannot be invalidated. It must be of infinite significance to the individual. It should not be taught in universities, but pursued by the individual.

We must assume the end goal in order to justify the process itself. After all, if we open up our end goal to justification – and therefore criticism – how can we truly be devoted to it? It becomes merely another premise among many, one that may be quickly invalidated by a new paper in an academic journal. Only through the powerful, subjective force of faith can we believe in the end goal. (To clarify, I do not at all mean faith in the religious sense. You’ll catch my meaning as I develop the idea.)

Faith, after all, is inevitable in life. Over the centuries, the most powerful trend in philosophy has been skepticism. We have certainly not completed Aristotle’s task of classifying and demonstrating everything, but we have made tremendous progress in Socrates’ task of calling everything into question. In a philosophy where there is no convincing demonstration of the possibility of knowledge itself, how can we claim to have a comprehensive system that eliminates the need for faith? Rather, we have merely shown the overwhelming need for faith.

Faith is a teleological suspension in the face of uncertainty. The need for this suspension derives from three fundamental and undeniable aspects of existence. First, we are uncertain at the most basic level, and unable to make a decision based on the process of systematic reasoning. Second, despite our uncertainty, we must make a decision. Third, and too often ignored, we have ends, desires, dreams – things that we feel we must accomplish or at least strive for. We may know objectively that these dreams are irrational, unjustified – and yet every individual feels a need to search after some aim. In the face of this telos, it is unacceptable to merely deny agency and revoke our ability to make a choice, and it is unacceptable to choose arbitrarily.

How can we reconcile these three facts of our existence – uncertainty, agency, and the telos? Only through faith. We ignore our uncertainty, and make the decision based on the telos, the goal. We decide our ends and believe them by faith. Every decision must be directed by this touchstone. It is constantly refined through the dialectical process, but the telos itself is never subject to the dialectic.

In his Existenzphilosophie, Karl Jaspers wrote, “Philosophic meditation is an accomplishment by which I attain Being and my own self, not impartial thinking which studies a subject with indifference.” The philosopher should be intimately engaged with the philosophy; it should not be an object of study, but a way of living that has infinite impact. The goal of philosophy must be reexpressed in Kierkegaard’s terms, as the search to “find a truth that is truth for me, to find the idea for which I am willing to live or die.”

The Fetishization of Individuals: From Hitler to Ken Bone

Humans have a relentless tendency to treat individuals as microcosms for the world. If we can identify a certain individual who fits into a group, we generalize this individual and make him/her representative of the group or concept as a whole. When we speak about these concepts or groups, we are implicitly thinking of these fetishized individuals. Thus, the ‘philosopher’ becomes Plato; the ‘drug lord’ becomes Pablo Escobar; the ‘autocrat’ becomes Hitler. These people that stand as concrete symbols for entire ideas are what I call ‘fetishized individuals.’

There is a constant political battle for control over these fetishized individuals. If someone humanizes and normalizes Pablo Escobar, they successfully humanize and normalize the drug trade as a whole. They take control of the image of the drug trade – the vivid, personalized, and individual representation. Then, when someone thinks of the drug trade, they think of Pablo Escobar – the friend of the poor, the anti-corruption, anti-communist activist, the family man.

Pablo Escobar as a criminal – the negative fetishization of a drug lord.
Pablo Escobar with a child – the positive fetishization of a drug lord.

When another representation is introduced, it is considered in the context of the existing fetish. Thus, it is extremely difficult to convince someone that El Chapo is terrible when they have internalized a positive version of Pablo Escobar as the representation of drug lords. Any logical argument is subordinate to their personal, ideology-based ‘experience’ of Escobar. Perhaps a poor man heard that Escobar gave out money in the streets and built schools for the impoverished; this gives him an emotional attachment – a fetish in a non-sexual sense – to the narrative of Pablo Escobar.

Modern political conflicts have begun using fetishized individuals in more obvious ways than ever before. The most clear example of this is Hitler, for he is the most completely fetishized person in the world. For almost everyone with an elementary education, mentions of autocracy, fascism, dictatorship, and genocide generate immediate images of Hitler with arm raised. One cannot win the ideological battle of making autocracy acceptable until one has made Hitler acceptable.

The first ideological step of neo-Nazis, therefore, is making the fetish of Hitler positive. This can be done in a variety of ways. For example, the extreme right-wing and anti-semitic site Rense.com published a series of images of the ‘hidden’ Adolf Hitler. Using these images of him – holding children, walking in gardens, smiling – makes it much harder to imagine him in other contexts. We find it conceptually difficult to unite the many disparate aspects of a person into a single unified identity. How could the same Hitler that ordered the Holocaust also kiss babies? Psychological research shows that cognitive dissonance like this causes tangible pain. The drive to eliminate the dissonance, then, leads some to fetishize Hitler in a wholly positive way.

A positive fetishization of Hitler

The Netflix original Narcos powerfully represents our difficulty in categorizing individuals. You see Escobar in a variety of contexts – at home with his family, in drug labs, on a farm working, and at war. It becomes difficult to remember his horrific crimes when he is watching clouds with his young children. We can’t really conceptualize a ‘whole’ person – only the person we are seeing at the time. Uniting all the different Escobars into one unified individual is almost impossible. Ideologies take advantage of this inability to unify, and summarize individuals by a single aspect. For some, the need to resolve cognitive dissonance means forgetting Escobar’s crimes to enable a positive fetishization of his figure.

The most recent presidential elections made fetishization a key aspect of political strategy. In 2008, Samuel Wurzelbacher asked Obama a simple question about small business tax policy – almost instantly making him a key symbol of the presidential election. He mentioned something about wanting to buy a plumbing company, and the McCain campaign leaped at the chance to relate to an ‘ordinary American.’ They coined his new name – Joe the Plumber – and repeatedly used him as an example in campaign rhetoric. McCain used the symbol of Joe the Plumber to show that Obama was ‘out of touch with the average Joe.’ It didn’t matter that Samuel wasn’t really a plumber and his name wasn’t really Joe. Throughout the campaign, writes Amarnath Amarasingam, “A fictional plumber’s false scenario dominated media discourse” (source).

In the modern election, it seems that the myth of the ordinary Joe has taken hold even more firmly. America has a need to believe in the normal citizen, a 9-to-5er who wants only to pursue his dreams, stick to his moral standards, and support his family. And yes, this citizen is a he – we seem unable or unwilling to use a female figure as a symbol of American life.

Why do we feel a drive for the ordinary? After all, we are obsessing over the nonexistent. There is no ‘ordinary Joe.’ Every citizen has quirks, mistakes, sins, hidden lies, and extravagant dreams that prevent them from being ordinary. Joe can only exist as an idealized symbol, not a concrete individual. And yet the idea of the ordinary citizen is permanently entrenched in our minds. In some way, many people aspire to be average. This aspect of the psyche creates political battles over the ability to protect the ordinary individual, who stands as a metaphor for the whole American citizenry.

Thus, Ken Bone was created. He was a symbol of an ordinary person – appropriately but not excessively involved in politics, working the day job, dreaming small dreams, providing for the family. He was 2016’s version of 2008’s Joe the Plumber. He represented simple authenticity, the everyman – as his Twitter profile proclaims, he is merely an “average midwestern guy.”

He did not decide to become a meme. The media did not make him a meme; they merely capitalized on the attention once Ken Bone had already gone viral. He was not mass-produced by campaign offices and political propagandists. In an act of near-randomness, he was dubbed a meme by the distributed irrational network of sensation-seeking individuals we call the Internet.  The random series of viral creations in 2016 revealed that memes are fundamentally uncontrollable. After Harambe, damn Daniel, Ted Cruz the zodiac killer, how could we be surprised that Ken Bone was crowned a meme? 

Ken Bone could not even control what he himself symbolized. He attempted to control his own signifier by consistently exhorting people to vote and make their voices heard. But all his efforts, for the most part, failed. Ken Bone does not symbolize democratic participation. After all, memes are inherently dehumanizing. To become a meme, an image must be dissociated from its reality and turned into something else. In linguistics terms, it’s a sign whose signifier is malleable — the image’s meaning, thus, is created by those who share it. The meme itself has no power over its meaning.

This is the danger of living memes – they are tossed around by the whims of the Internet. And when these whims turn sour, the person suffers. Ken’s slightly quirky Reddit history was revealed, and he was painted as a monster.

I expect this process to continue endlessly: an individual becomes a sign that stands as a placeholder for a piece of political ideology. The individual is the object of immense attention, and then is tossed out like trash. We should be careful that our memes do not make us think this is what people truly are. And we should not be surprised when the myth of the ‘ordinary citizen’ is shattered by the reality of the individual’s life and being.

 

The Agent-Age Problem for Consequentialism

Suspend your disbelief for a moment, and imagine that the 6-year-old daughter of a major world leader travels with her father to a nuclear launch site. She is left unsupervised, and happens to wander into the launch room. There, out of curiosity, she presses the big red button.

This launches a nuclear weapon that immediately kills millions of people. Before the weapon has even detonated, other nations have launched missiles of their own. A single launch rapidly escalates to nuclear war. Billions of humans and nonhumans are killed, and the planet is left barely habitable.

This scenario is clearly implausible to the point of impossibility – the big red button, after all, doesn’t even exist. However, it is a useful archetype that raises serious questions for consequentialism. Consequentialism’s inadequacy in certain moral issues is clear when the accidental action of a small child leads to immense suffering. I’ll add another example that deals with similar issues, but is far more likely.

A young boy happens to find a few matches on the floor of his family’s garage. While playing with them and scraping them across the rough floor, one of them ignites. In panic, the child rushes to the garbage and throws the match in. Then, losing interest, he walks inside and finds something else to do. The match ignites the garbage, and the fire spreads into the house. The house burns to the ground, killing everyone inside. The fire spreads to nearby houses and kills or injures several more people.

This type of counterexample to consequentialism is demonstrably plausible, as there is empirical documentation of similar cases. According to the Washington Post, at least 265 Americans were accidentally shot by children in 2015. Many of these shootings resulted in tragic deaths. Meanwhile, the number of American fatalities due to terrorism in 2015 was about 20, depending on certain counting methods.

In a truly consequentialist atmosphere, accidental shootings by children would be discussed far more than terror attacks – precisely 13.25 times as much. Moral deliberation on an action would be indexed to the amount of pain or happiness caused by the action. But in reality, the ethical issue of terrorism is discussed prolifically, while accidental shootings by children are virtually ignored. Why is this the case? I argue that while the amount of discussion on terrorism doesn’t reflect consequentialism, it does reflect our moral intuitions. We assign greater condemnation to actions not based on the numerical impact of these actions, but based on the intention of the actor, the nature of the action, and the emotional impact of the action.
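The 13.25 figure is just the ratio of the two estimates above. As a trivial back-of-envelope sketch (using the article’s own numbers, which I haven’t independently verified):

```python
# Rough 2015 U.S. estimates cited in the text (not verified data):
#   ~265 Americans accidentally shot by children (Washington Post)
#   ~20 American fatalities from terrorism (depends on counting method)
accidental_shootings = 265
terrorism_deaths = 20

# Under a strictly consequentialist weighting, moral attention would be
# proportional to harm, giving this ratio of discussion:
ratio = accidental_shootings / terrorism_deaths
print(ratio)  # 13.25
```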

The probability of child accidents will only increase in our modern world, as dangerous technologies proliferate and become more available, the population expands, and systems become more interconnected. A single accidental action by a child can result in unfathomable pain. However, our moral intuitions indicate that accidental actions by children are not blameworthy. Can consequentialism reconcile this problem?

Interpretation

I will use this definition of consequentialism, based loosely on the one from Stanford Encyclopedia of Philosophy:

“The belief that consequences are the only normative property that affects the rightness of an action.”

Or, in simpler terms, an action is made right or wrong only by its consequences. Consequences are the only morally relevant consideration.

Thus, if this essay shows that there are non-consequential normative properties that affect the rightness of an action, then consequentialism is false. I will use primarily an intuitionist approach to prove this claim – that is, showing that consequentialism is incompatible with clear moral intuitions. I will not touch on whether intuitionism is true; I will just discuss the consequences of its assumed truth.

Normative properties are defined as any ethical aspects of an action. This is a simple and non-rigorous definition that would be considered inadequate by many metaethicists, but it works well for the essay. For example, “rightness of intent” is a normative property, as it is an aspect that could impact the ethics of the action. Furthermore, this would be a non-consequential normative property.

What are the relevant normative properties in the examples above? I will consider the following:

  1. Intent – the actor’s purpose or intended goal in a certain action.
  2. Actor – the individual who commits the act.
  3. Consequence – the morally relevant impacts of the action.

Different moral theories place different emphases on these properties; consequentialism is the theory that only the third property is relevant to the rightness of an action.

In the case of the child pressing the red button, I believe we have clear answers as to the ‘value’ of these properties. The consequences are certainly bad. The intent is morally indifferent, as the child did not intend for anyone to suffer nor for anyone to benefit from her pressing the button.

The most interesting property is the second. Our moral intuitions agree that the age of the actor is morally significant. If a child commits a crime, they are considered less morally responsible than an adult. This intuition is ingrained in law – individuals are usually not held fully responsible until the age of 18. Some religions have an ‘age of accountability,’ after which people become accountable to God for their sins. Since children are less capable of complex moral reasoning, they are less responsible for mistakes in this reasoning.

Furthermore, there are also arguments for the moral relevance of the age of the actor that are not based on intuitions. For example, the following deductive argument:

P1. One is not morally responsible for what one does not know.

P2. If one is not morally responsible for what one does not know, then people who know less than others are less morally responsible.

P3. Children know less than adults.

C. Therefore, children are less morally responsible than adults.

Thus, when the child presses the red button, and she does not know that this will fire a nuclear weapon, she is not morally responsible for the nuclear war that ensues. This argument attempts to prove that children in general are less responsible, but it can also be applied in any case where lack of knowledge is involved. If someone does not know the consequences or nature of an action, they are not morally responsible for this action.
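The knowledge-based application of the argument can be written in first-order notation. This is my own formalization, not the author’s; $K(x)$ abbreviates “$x$ knows the nature and consequences of the act” and $R(x)$ abbreviates “$x$ is morally responsible for the act”:

```latex
\begin{align*}
\text{P1.}\quad & \forall x\,\bigl(\lnot K(x) \rightarrow \lnot R(x)\bigr)
  && \text{(no responsibility without knowledge)} \\
\text{P2.}\quad & \lnot K(c)
  && \text{(the child $c$ does not know the button fires a weapon)} \\
\text{C.}\quad  & \lnot R(c)
  && \text{(universal instantiation and modus ponens)}
\end{align*}
```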

Based on clear moral intuitions and the above deductive argument, the action of pressing the button is either (1) less wrong or (2) morally indifferent when the actor is very young or when the actor does not know the consequences. Either case means that the actor – a non-consequential normative property – affects the rightness of the action, disproving consequentialism.

The Gradual Causes and Long History of the ‘Fake News Crisis’

It seems undeniable that the specter of fake news has taken control of the media. It seems that we’ve now entered a dark age of journalism, where the fake is indistinguishable from the real. It seems that we have entered an unprecedented era of hoaxing and counterfeiting.

But journalism has never been free of fake news. The Columbia Journalism Review published a detailed history of fake news in the United States. In short: fake news isn’t new, and it has real impacts. For example, people fled the city in droves and marched into public parks with guns after the New York Herald published a fabricated report that dangerous animals had escaped the zoo.

And fake news is older than modern journalism itself. In 1475, an Italian preacher claimed that Jews had drunk the blood of an infant (source). This led a local bishop to order the arrest of all local Jews, and fifteen Jews were burned alive. The fake story spawned even more hysteria about vampiric Jews, which spread across Europe despite the Pope’s declarations attempting to end the panic.

Fake news has unbelievable power. In Journalism: A Critical History, Martin Conboy demonstrated its dramatic role in history. In 1898, the USS Maine exploded off the coast of Havana, killing over 250 people. The cause was never explained. The Spanish government, which controlled Cuba, expressed sympathy for the disaster and denied any involvement. The captain of the Maine, one of the few survivors, urged Americans to withhold judgement to prevent conflict with the Spanish.

The headlines after the Maine explosion – based on fake news.

Regardless, Joseph Pulitzer, publisher of the New York World, quickly condemned Spain, claiming that they had sabotaged the Maine. The World published a cable showing that the Maine explosion was not an accident – even though this cable was completely fake. Newspapers published imaginary drawings of the explosion – even though no one had seen it. Sales of the World skyrocketed, and the public demanded revenge. Fake news helped start the Spanish-American War. Maybe we shouldn’t be surprised that the namesake of the highest award in journalism, the Pulitzer Prize, was a purveyor of fake news.

So is there anything new about the recent fake news? Yes – because Americans are far more dependent on news. News, both print and digital, takes far more forms than at any other point in history – videos, images, blogs, tweets, posts, articles. Almost all Americans can read basic English (source), 84% of Americans use the internet (source), and 79% of American internet users are on Facebook (source).

Never before the last few decades has the vast majority of the population been simultaneously connected to a source of instant news. A meme, story, or fake event can spread through public awareness in a few hours. The fundamental nature of fake news hasn’t changed. It has just become far more common and accessible – just as the modern transportation system allows viruses to spread far more quickly.

A map of global disease spread – not too far from the transmission of fake news.

Furthermore, perhaps the American public has become increasingly vulnerable to fake news. While this claim is hard to demonstrate and somewhat unverifiable, it’s possible that the average reading level has declined. In 1776, the relatively complex, sophisticated pamphlet Common Sense sold 500,000 copies, roughly 20% of the colonial population (source). Now, less than 13% of Americans are proficient in “reading lengthy, complex, abstract prose texts” like Common Sense (source). It seems that the percentage of Americans who can understand Common Sense today is smaller than the proportion that owned a copy in 1776. Plus, the most recent studies show that American reading proficiency has declined over the last two decades (source). Even among college graduates, the proportion that can understand and reason about complex texts has decreased to less than 31% over the last decade (source).

It’s a viable theory that these two trends – increasing access to news and decreasing reading ability – have shaped a perfect storm for fake news. Americans aren’t as likely, or as able, to make nuanced, reasoned analyses of complex texts. They’re more likely to have access to the oversimplified and sensationalized world of internet news (and news in general). More people can be infected by the virus (fake news), and fewer people have the vaccine (critical thought). As a result, a single tweet can spawn a flurry of fake news that quickly becomes an accepted part of the American psyche.

However, the concept of fake news is also dangerous in other ways. It has already been used as a political weapon to shut down opposing journalism. The left has used it to deride right-wing sources, and the right has co-opted it to attack left-wing news. Already, the LA Times and Washington Post have claimed that right-wing sites like the Ron Paul Institute and Breitbart.com are ‘fake news’ (source).

These sites could be derided as biased producers of dangerous propaganda, but this is not the type of fake news I’m interested in. Breitbart may be skewed, but it does base news loosely on actual events. Fake news is completely counterfeit – without referent in the real world. To avoid ‘fake news’ becoming a tool to eliminate enemy voices, we need to delineate the concept clearly and create solutions carefully.

This is an intro to some of my further research into fake news. This week, I’m going to write another article about the philosophy of fake news, and then one about the solutions to the problem. I’ll try to relate the issue to Baudrillard’s theory of hyperreality, examine the differences between Kantian and utilitarian journalistic ethics, and look back to Plato’s critique of postmodernism. Maybe I’ll even make up some ideas of my own.

Why am I so interested? I think that fake news is a microcosm of the larger issue of the ‘postmodern condition,’ which is what I’m focusing on for my three-week independent study. It relates to the need for classical education, which is what I’m studying in a directed readings class. And it’s a good area for philosophical research that hasn’t been fully explored.

Why I Love the Office

I love it because it juxtaposes absurd, delusional people against unabashed authenticity. This comparison isn’t exactly subtle, but it is never explicitly said. Jim and Pam become protagonists not because they receive the most screen time, or the story is told from their perspective, or they overcome all their challenges and become exceptional – rather, it’s precisely the opposite. They aren’t heroes. They are merely authentic, and we can only relate to them because they are the only real people within this office landscape of hollow appearances.

Michael’s relentless scrambling to avoid blame, display virtue, and underscore his importance always falls flat. Usually, episodes end with a convoluted explanation from Michael about how he didn’t really fail, how he wasn’t really a bad person. The actual events of the episode, though, create a cringeworthy irony. Michael is never outright condemned as a hypocrite, but he is painted as one by the contrast between his own words and reality.

Dwight indefatigably grapples with the pain of an uncertain existence, where unfortunate realities can’t simply be labeled ‘false.’ He struggles to reconcile lived experience and his emotions with the theoretical constructs he has used to rigorously define the world. For example, he completely misses out on the party while examining the construction of the house. He ignores lived experience if it does not fit his hypothetical framework.

Inauthentic people – and by that, I mean people in general – use elaborate schemes to portray themselves in certain ways and ignore others. In The Office, these schemes are almost as obvious, hilarious, and pathetic as they are in the real world. The Office just points out how funny and cringey they are, usually through Jim or Pam. Now back to binging The Office. 

What Matters in a President, and Why Electability Doesn’t

I’m not going to argue for any of the candidates in this post. That’ll come later. For now, I think there are three main factors that should be considered in a president. They are interrelated and listed in order of importance. However, if a candidate fails any one of these criteria, it is practically impossible for them to meet the others.

  1. Character – This consists mostly of the moral standards and honesty of the candidate. If I do not trust a candidate, their competence becomes irrelevant, as it will not be used ethically. Their positions become meaningless because they will abandon policy and ethical standards at will. Character also includes temperament and personality, as an angry, irrational, and unstable candidate is a danger to the world and ineffective in diplomacy.
  2. Competence – The proven experience of the candidate, their intelligence, and their ability to implement policies effectively. If a candidate isn’t politically competent, their policies won’t matter because they will never be implemented. Intelligence is not measured by IQ, but by the candidate’s understanding of the world, their rationality, their education, and their working ability.
  3. Policy – The stated positions of the candidate. If every candidate could be trusted to follow their policy statements exactly and implement them effectively, this would be the only issue. Despite its importance, policy is by far the least-discussed issue in this election.

On Electability

Electability, for me, is mostly a non-issue. Of course, a candidate must have some chance of becoming president, or we will be divided into minuscule factions and candidates will only have to win a small portion of the vote to take the election. However, “some chance” is a low bar. For example, Zoltan Istvan, the transhumanist candidate, is not on the ballot in any state and is not polling at more than 5% in any state (source). This is below the “some chance” threshold, as 25 days from the election, he has no path to the presidency. However, Evan McMullin, an independent candidate, is on the ballot in 11 states (source), has a significant chance of winning Utah (source), and has a growing campaign nationally. If a candidate passes this minimum threshold of electability, we should move on and consider the three most important factors.

Our democratic obligation is to vote for the candidate we support. Otherwise, our system degrades and no longer represents the population, as the economist John Maynard Keynes described:

We have reached the third degree where we devote our intelligence to anticipating what average opinion expects the average opinion to be.

If we do not vote our conscience, we as a population fail to represent ourselves. We do not ‘throw our vote away’ when we vote for an unlikely candidate we genuinely support; rather, we throw our vote away when we do not vote for what we believe. We are not voting for ourselves, but for someone else, for the polls, for the average. Popular opinion becomes the popular opinion of what the popular opinion is; democracy devolves into regressive guessing at the average. Furthermore, government is only legitimate when it represents the governed. When we do not represent ourselves, our government becomes illegitimate.

Finally, there are a ton of misconceptions about voting power in our democracy.

First, statistical analysis shows that, in general, your vote has the most power if you vote for a third-party candidate, not for a major party. I don’t really see the point of explaining this, as the linked post explains it very well. I’d definitely recommend reading it.

Second, the power of a single vote is extremely close to zero. This election, your vote will likely be one of roughly 125 million. Therefore, the best reason to vote is not really to control the election, but to represent ourselves. Don’t do it merely for the results; do it because you believe in your candidate.

Third, a lot of the time, your publicly expressed opinions matter more than your vote, because these opinions influence a significant number of votes. Who you support actively matters more than who you vote for quietly.

Fourth, whether or not your candidate is elected is not the only measure of voting power. You could say all the Bernie Sanders votes this year were wasted because he didn’t win, but he still radically influenced the election and changed American politics permanently. Winning ≠ success.

Fifth, when you vote for a third-party candidate, you break out of the mold. This draws attention far more than obediently voting for established candidates who adhere to the two-party system. Therefore, votes for a third-party candidate are more influential than other votes.

That’s why I don’t think electability matters, and why I don’t think it should matter. Vote your conscience this election.