Why GamerGate Needs to Address SJWs

It’s been four months since GamerGate kicked off its efforts, and proponents remain just as dedicated to the cause entering 2015. There is wide agreement on the goal of restoring ethical standards to gaming journalism, but opinions differ on whether GamerGate should concern itself with “social justice warriors” (SJWs) such as Zoe Quinn, Anita Sarkeesian, Brianna Wu, Leigh Alexander, Arthur Chu, Chris Kluwe, and other public figures lambasting gamer culture and seeking to sanitize games by scrubbing “offensive” content from the medium.

This isn’t really a conflict; GamerGate is a grassroots movement, and each of us can prioritize efforts as we see fit. But for those who view SJWs as a distraction, I urge you to venture a bit further down the rabbit hole and consider how SJW behavior fits into a larger, troubling pattern of attack across different components of society.

“We got ourselves a progressive problem.”

This is the first time many in GamerGate have faced down the “progressive” political animal, so the viciousness and tenacity of those espousing “tolerance” may come as a surprise. Similarly, they may be ill-prepared to deal with the extreme Alinskyite tactics progressives have used since the early 20th century to isolate and intimidate targets into submission.

GamerGate itself is not a political movement in a partisan or governance sense, boasting membership from all across the political spectrum, but the progressive objective is to politicize gaming to advance their agenda, which makes progressives a political enemy of gamers. It’s a fine distinction but an important one: it avoids mistaking opposition to politicization for politicization itself, and it avoids confusing the identification of progressive tactics with the identity politics progressives use to exploit various social identity groups. Another key difference is that, while progressives use intense negative social pressure and intimidation to suppress content they dislike, GamerGate favors traditional market mechanisms to assess value and provide utility to consumers.

Those who dispute progressive ideology as the source of the cultural rot are hard-pressed to explain why journalists at the heart of the corruption all identify with this particular far-left-wing political orientation. They are all progressives – yes, all of them – which strains the credibility of those ascribing the homogeneity to mere happenstance. To understand why this is the case, a useful conceptual framework comes courtesy of economist Thomas Sowell in A Conflict of Visions (summarized by Wikipedia):

“The unconstrained vision relies heavily on the belief that human nature is essentially good. Those with an unconstrained vision distrust decentralized processes and are impatient with large institutions and systemic processes that constrain human action. They believe there is an ideal solution to every problem, and that compromise is never acceptable. Collateral damage is merely the price of moving forward on the road to perfection. Ultimately they believe that man is morally perfectible. Because of this, they believe that there exist some people who are further along the path of moral development, have overcome self-interest and are immune to the influence of power and therefore can act as surrogate decision-makers for the rest of society.”

Remind you of anyone you know? Like their Marxist cousins, progressives subscribe to a Utopian vision of mankind in which the inherently good nature of man is corrupted by damaging environmental influences around him, whether they be racist institutions; the trappings of poverty; or misogynist, ableist, homophobic cishet video games. To the unconstrained visionary, the solution to all of the above is simple: remove the offending content and the less desirable qualities of mankind will disappear – we’ll all live together in perfect harmony in a world free of privilege, prejudice, violence, and oppression.

If your eyes are rolling at the idealism, then you probably fall somewhere in the “constrained vision” camp, which holds a more tragic and classical view of mankind – ambitious, flawed, and resistant to social engineering. GamerGate is, at its heart, a constrained-vision social movement, enthusiastic about video games as a healthy outlet for exploring humanity’s natural compulsions toward lust, greed, violence, and power through fantasy and recreation. As such, GamerGate is incompatible with progressive political philosophy, and the two sides are destined to butt heads over values and visions for the future of gaming.

Progressivism is also, at its core, a utilitarian philosophy, achieving its designs via the Machiavellian principle that ends can justify the means. This is the source of the often atrocious and hypocritical behavior progressives engage in when confronted with obstacles to their vision – when the opponent is racism or sexism incarnate, it’s acceptable to be a little racist or sexist yourself if it means vanquishing your foe; hence, coordinated attacks on #NotYourShield minorities as “Uncle Toms,” or on female GamerGate members as “whores” of males in the movement. Similarly, gaming journalists have no moral qualms about using the power of their positions to advance their progressive cause. If a game uses “tropes” against women, or is too violent for the reviewer’s taste, then it deserves a bad review for being damaging to society, i.e., corruption is an acceptable price to fight the even greater corrupting influence of negligent social messaging. Whatever other value games may offer is eclipsed by the social justice mission to the self-anointed agents of human evolution and progress.

GamerGate can ignore SJWs and succeed in limited aims, but it would do so at its own existential peril. We the constrained face an enemy hellbent on tearing down our very identity and culture in the name of human “progress.” The progressive sees video games as yet another territory to be conquered and used as a launchpad for further political conquest. If you wish to see the result of capitulation, you only need look at the washed-out cultural wasteland Hollywood and academia became when they fell to political correctness decades ago.


Society’s Polygraph

A writer for the Washington Post describes the harrowing experience of receiving stares when collecting welfare benefits in her husband’s luxury car. Her true self-awakening occurs only after deep reflection reveals what a devastating effect collective judgment has had on her feelings of identity and self-worth. The greatest injustice of all, she realizes, is that she has been made to feel ashamed for having a mortgage, car, and children she can’t afford.

In the following passage she describes the utter humiliation to which she was subjected in the process of receiving her WIC and Medicaid benefits:

I had to fill out at least six forms and furnish my Social Security card, birth certificate and marriage license. I sat through exams, meetings and screenings. They had a lot of questions about the house: Wasn’t it an asset? Hadn’t we just bought it? They questioned every last cent we’d ever made. Did we have stock options or pensions? Did we have savings? I had to send them my three most recent check stubs to prove I was making as little as I said I was.


Proof of identity! Verifying financial status! Horror of horrors!

Isn’t her giving her word enough? Who knew getting your hands on free money would be so difficult?


The question-mark guy on TV never mentioned any of this!

Back to the titular Mercedes:

That’s the funny thing about being poor. Everyone has an opinion on it, and everyone feels entitled to share. That was especially true about my husband’s Mercedes. Over and over again, people asked why we kept that car, offering to sell it in their yards or on the Internet for us.

“You can’t be that bad off,” a distant relative said, after inviting himself over for lunch. “You still got that baby in all its glory.”

Sometimes, it was more direct. All from a place of love, of course. “Sell the Mercedes,” a friend said to me. “He doesn’t get to keep his toys now.”

If you look closely, you can see faint indications of properly functioning cultural norms. Being on welfare is supposed to suck. People are supposed to question your expenses and offer you help and advice. You are supposed to feel ashamed about not working and living off other people, regardless of whether you are primarily responsible for your financial condition. These are all components of a healthy society, with incentives properly aligned toward working, managing expenses, and getting off of public dependence.

All of this is supposed to be the case because the alternative is a system that encourages dependency and inaction. The other option is letting people on welfare who drive Mercedes-Benz sports cars go totally unexamined. Is that the world you want to live in?

Iraq: Let It Go

The trick to repeating history is you’re supposed to aim for the good parts while avoiding what led to devastating loss of life and resources.

With the 2003 Iraq War, the United States failed to heed this principle by repeating its Vietnam experience in yet another decade-long ground war against a country that posed no threat, lacking a clear objective, exit strategy, or any hope of gaining substantial local support.

By April 1975, the United States had completely pulled out of South Vietnam, and Saigon fell shortly after to North Vietnamese forces, reunifying the war-torn country under a brutal communist regime. Domino theory predicted that the fall of Vietnam would destabilize the region, leading to potential communist takeovers in Thailand, Malaysia, Indonesia, Burma, and India (Laos and Cambodia had already fallen in spillover from the Vietnam War itself). What happened in fact was that the communist countries fought each other for the following decade, then eventually decided to normalize relations with the United States to improve their struggling economies (communism has a way of dampening the entrepreneurial spirit).


We have no way of knowing whether the Islamic factions attempting to reclaim Iraq in 2014 would follow this pattern, but they do have a remarkable history of fighting each other and would almost certainly keep each other occupied for the foreseeable future. Unless the U.S. wants to be the permanent babysitter of the region, at the expense of trillions of additional dollars, it should repeat what little it got right with Vietnam by getting out and staying out of where it isn’t wanted.

Millennials: The Most Anti-Liberty Generation

On many Objectivist and liberty blogs, it’s almost as if Charlie Sheen has been writing the narrative: individualism is decisively #winning. To those for whom hope truly does spring eternal, Obama and his evildoers are perpetually one scandal away from impeachment (surely the VA scandal will be the one to topple his house of cards). The American people, meanwhile, are just one election cycle shy of taking back their government in a Fifth Great Awakening for liberty.

Stark reality, however, tells a different story in a new report from the Brookings Institution. Based on surveys of thousands of individuals, the report concludes that Millennials will bring to the workplace and the political scene a major shift in attitudes and voting behavior toward progressive values. Millennials (born 1982-2003), if you believe the data, may well be the most anti-liberty generation in American history. Reading their responses to questions about America’s culture, politics, and economy, one gets the distinct impression that the culture war has not only long been won by progressivism – it wasn’t even a close contest.

Particularly striking is the growing hostility toward business and for-profit activity generally:

“About two-thirds of the Millennials surveyed in 2012 also agreed that “businesses make too much profit,” which was the highest level of agreement among all generations. At the same time, less than half of Millennials thought “unions had too much power”; by contrast, a majority within all other generations agreed with that statement. Even more telling from a generation noted for its general lack of trust in institutions, 72 percent of Millennials, compared to only 61 percent of Xers and Boomers, agreed with the statement that “labor unions were necessary to protect the working person,” a level statistically significantly higher than that of older generations.”

Millennials are the generation most comfortable with regulation of private market activity, with roughly three-quarters agreeing that the marketplace needs government regulation. Telling also is where Millennials say they want to work: the CIA, FBI, and NSA ranked highly across multiple surveys, the State Department came in second on one, and government agencies as a group placed second only to high-tech companies like Google and Facebook.

Most troubling is the degradation of societal trust that has occurred across the generations:

In its latest study of the Millennial Generation, Millennials in Adulthood, the Pew Research Center found that America’s youngest adults were the least trusting of any generation. Only 19 percent of Millennials agreed with the statement that “most people can be trusted,” a percentage that was about half of all other older generations.

This is the Millennial voter in a nutshell. You make too much profit, you can’t be trusted, and you need to be watched. Liberty has its work cut out for it.

Deserve’s Got Nothing to Do with It

Viewing capitalism through the progressive lens, we can find examples of cosmic injustice everywhere in the marketplace. All around us, the hardworking go unrewarded; the creative lack backing for their visions; the foolish can succeed despite themselves, whether out of nepotism, cronyism, or most commonly, plain dumb luck. It is a natural tendency for many, therefore, to grow frustrated with the whole endeavor and to set out seeking a better way – a way to force this chaos into a more enlightened order. However, this framework misunderstands the proper role of an economy, which is not to dispense justice at all, be it cosmic justice, social justice – even financial justice – or any other sort. As Clint Eastwood’s character explained in Unforgiven: deserve’s got nothing to do with it.


Markets are a mechanism for valuing scarce resources and distributing them according to dispersed but critical pieces of information. Besides being more efficient than top-down control of the economy, which lacks any effective means of objectively ranking material wants and needs, a market channels man’s self-interest into productive ends instead of the destructive behaviors that inevitably result from political jockeying against others.

One misconception progressives (and even many capitalists) share is that capitalism is, or should be, rational on the individual level. To the contrary, capitalism relies upon micro-irrational risk-taking to produce the wide array of trials and errors necessary for macro-level advancement.

A perfect example of micro-irrationality/macro-rationality is the Silicon Valley start-up scene, which for the vast majority of young self-starters will prove to be their financial coffin:

The most expensive lottery ticket in the world

For any computer engineer who accurately assesses his chances of success versus catastrophic ruin, starting a Silicon Valley company is the height of irrationality. Yet without this widespread overconfidence among entrepreneurs, we wouldn’t have the transformational breakthroughs of one-in-a-million companies like Facebook and Google. Markets are beautiful because they convert man’s self-delusion not into massive public boondoggles, but into a net-rational evolution, straining out stumbled-upon successes through the law of large numbers. The truly remarkable aspect of the process is that it does so entirely according to the consent of the participants. Seen through a wider lens than progressivism offers, capitalism turns out to be a highly rational and compassionate arrangement.
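To make the aggregate arithmetic concrete, here is a minimal simulation sketch. Every input below (founder count, odds, cost, payoff) is a hypothetical round number chosen for illustration, not real market data:

```python
import random

# Hypothetical inputs (illustration only, not real market data):
N_FOUNDERS = 100_000      # engineers taking the startup gamble
P_SUCCESS = 0.001         # one-in-a-thousand breakout odds
COST = 500_000            # forgone salary per founder over the attempt
PAYOFF = 1_000_000_000    # value created by a single breakout success

random.seed(42)
outcomes = [PAYOFF if random.random() < P_SUCCESS else 0
            for _ in range(N_FOUNDERS)]

winners = sum(1 for o in outcomes if o > 0)
net_value = sum(outcomes) - COST * N_FOUNDERS

print(f"Breakout successes: {winners} of {N_FOUNDERS:,} founders")
print(f"Typical founder outcome: -${COST:,}")  # micro-irrational
print(f"Aggregate net value: ${net_value:,}")  # macro-rational
```

Under these assumed numbers, the typical founder loses half a million dollars while the population as a whole nets tens of billions in created value – irrational for almost every individual, rational for the system.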

Minimum Logic

We’re told that today is Fast Food Workers’ Strike Day, although you might not have known it if, instead of watching sensationalized media coverage of the event, you visited a fast-food restaurant where it was extraordinarily unlikely anyone was actually on strike.

Each year, the Service Employees International Union (SEIU), one of the largest labor unions in the United States, eggs on or pays a handful of people to stand outside fast food joints protesting while the union contacts the media to drum up coverage. This year’s message was that fast food workers should be paid an absurd $15/hour – more than double the national minimum wage and over 50% more than even the nuttiest progressive states have set their minimum wages.

If you believe a six-figure union boss earnestly cares about the plight of the fry chef, you probably haven’t worked in a union environment before. Two years ago, I found myself in a large collective bargaining unit when a series of budget cuts came down the pike. Rather than have everyone take a slight cut in pay or benefits so that everyone could keep his job (my personal choice), the union took about 10 seconds to demand instead that every contract worker in the organization be laid off so the bargaining unit wouldn’t be affected.

It’s predictably about money in the end – lots of it. Union dues run $400 to $1,000 or more per year, so with millions of fast food workers under the SEIU’s umbrella, that would equate to billions of dollars in guaranteed additional revenue – a whopper return on the initial investment.
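The back-of-the-envelope revenue math is worth spelling out. The worker count below is an assumed stand-in for “millions”; the dues range comes from the figures above:

```python
# Rough annual dues revenue from an organized fast food sector
# (the worker count is an assumption; the dues range is cited above)
workers = 2_000_000                 # stand-in for "millions of fast food workers"
dues_low, dues_high = 400, 1_000    # annual dues per member, in dollars

print(f"Low end:  ${workers * dues_low:,} per year")   # $800,000,000
print(f"High end: ${workers * dues_high:,} per year")  # $2,000,000,000
```

Even at the low end of the dues range, an organized fast food sector would hand the union the better part of a billion dollars every year, in perpetuity.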

“Would you like a side of fries for just $7 more?”

Set aside what reasonable economists immediately recognize: there must be an entry level below “living wage” to keep still-dependent young adults from being priced out of the job market entirely. Even so, if someone is still making minimum wage after more than a few years of employment, that indicates a serious underlying problem with that worker. In my brief stint in the fast food industry, I learned this much: anyone who stays put more than six months with the same company is all but guaranteed regular pay raises and a fairly reliable track to lower management. Therein lies the rub – most fast food workers flit from company to company and sacrifice seniority and experience with every lateral move they make. Besides getting the equivalent of $15-20 of free food and drink every day – no small perk in itself – it was made quite clear to me that my wage was just a starting point, and that increased pay and responsibility would come once I proved I wasn’t a total flake by staying more than a few months (in fact, I didn’t, so there you have it).

If Glassdoor reviews are any indication, most fast food workers aren’t buying into the sob story. On the lower end of the satisfaction scale, McDonald’s still manages a respectable employee rating of 3.1 out of 5. Most of the 560 employee reviews echo the sentiments above, stressing the ample room for advancement that comes with just a smidgen of company loyalty. Progressive-vilified Chick-fil-A amazingly scores a 3.8 out of 5, which is 0.2 higher than employees rate working at Microsoft.

There is nothing wrong with organizing to improve one’s working conditions, but when unions employ the mechanisms of the state to drive their own narrow revenue and agenda, that no longer falls under the banner of free association. If unions really do provide their members value worth the hefty dues payments, they are hard-pressed to explain why fast food workers on the whole don’t seem very interested – and, for that matter, why unions tend to lose half their membership as soon as employees are afforded the option of not joining.

Breaking Parkinson’s Law

British historian Cyril Northcote Parkinson observed in a 1955 article in The Economist that government bureaucracies tend to grow regardless of need. Drawing on his own experience in the British civil service, Parkinson pointed out that the Colonial Office had its greatest number of employees at the point when it was dissolved for lack of colonies to administer.

Parkinson explained the trend by pointing to the natural incentive of public officials to multiply subordinates and thereby increase their own power within the organization, along with the tendency of officials to make work for one another. The principle is best remembered in the general form Parkinson himself gave it: “Work expands so as to fill the time available for its completion.”

At the end of his piece, Parkinson states: “It is not the business of the botanist to eradicate the weeds. Enough for him if he can tell us just how fast they grow.” So the difficult task falls to the rest of us to contain or reverse the trend, if possible.

If Parkinson’s Law is in fact a law of economics, does that make it an immutable component of the human condition? Is there an institutional way to shift incentives away from growing one’s sphere of influence in an agency through bloated budgets and unnecessary hiring?
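For a sense of scale, Parkinson estimated staff growth of roughly 5 to 6 percent per year regardless of workload. Here is a minimal compounding sketch, with the growth rate and starting headcount as assumed parameters:

```python
# Compounding headcount under Parkinson's Law (illustrative parameters).
staff = 372      # assumed starting headcount, loosely echoing Parkinson's
                 # Colonial Office figures
rate = 0.0575    # assumed annual growth, midpoint of Parkinson's 5-6% estimate

for year in range(0, 31, 5):
    print(f"Year {year:2d}: {staff * (1 + rate) ** year:6,.0f} employees")
# With no change in workload, headcount roughly quintuples in 30 years.
```

If the law holds, an agency needs no new mission to quintuple in a generation; the incentives alone are sufficient.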

Rhode Island: A Case Study in Progressivism

From the New York Times:

Poor Little Rhody. Not only is it the smallest state, it is often a punch line. And in many state rankings, it comes out on top for the wrong things, like having the nation’s highest rate of unemployment. Now comes yet another blow to the state’s fragile self-esteem. A Gallup poll found that of all 50 states, Rhode Island was the least appreciated by its own residents. Only 18 percent of Rhode Islanders said their state was the best place or one of the best places to live.

Rhode Island is heavily Democratic, heavily Roman Catholic and heavily unionized. Some say the state is beholden to its unions, as evinced by its generous pension system. But the high cost of government, said Robert D. Atkinson, the former executive director of the defunct Rhode Island Economic Policy Council, is not matched by a high quality of services. “Rhode Island has the high-cost structure of Minnesota but the low-quality services of Mississippi,” Mr. Atkinson said.

Over the past 80 years of continuous Democratic control of the state, an unshakable alliance of organized labor, progressive social interest groups, and corrupt politicians has tanked the culture and economy of Rhode Island. In addition to having the highest unemployment rate in the U.S., as mentioned above, the state has consistently ranked dead last for business over the past five years. The Providence commercial tax rate is the highest in the U.S. Welfare benefits pay over $12 per hour, among the highest in the U.S. State temporary disability insurance costs are likewise among the nation’s highest.

Welcome to Rhode Island.

In the last three state election cycles, unions donated more money than all businesses in the state combined. In 2011, the state education board granted illegal immigrants in-state tuition at public universities. One in seven Rhode Islanders is on food stamps. In the state’s shrinking private sector, politically directed public loans and tax breaks have created a planned economy in which businesses break bread with the governor, house speaker, and senate president before seeking private investment. Bolstering Rhode Island’s existing reputation for political corruption, progressive-endorsed House Speaker Gordon Fox recently resigned after being raided by the Federal Bureau of Investigation.


Bringing up the rear in nearly every national ranking there is, Rhode Island is a damning case study in the social and economic effects of longstanding progressive politics. The state serves as an important reminder that, however far liberty has eroded in most states, hand the keys over to progressives and things can get so much worse.

The Real Value of a College Degree

Everyone knows college has degraded into an expensive four-year party and prolonged adolescence for young adults. Sleep in, show up to classes, regurgitate what the professor says onto some papers, and a hundred thousand dollars (or more) later, all students meeting the barest minimum requirements will walk on stage to collect their diplomas.



This bemoaned but persistent state of affairs has led some to declare that the college degree is “worthless,” or that it “tells us nothing” about its bearer. Such critics have a point: degrees no longer guarantee deeper understanding in a field of study (if they ever did). Yet employers across the board continue to give significant weight to higher education in important hiring decisions. This paradox implies another variable in play – some hidden value of the degree that the credential deniers are missing.

The best and simplest explanation, from the economic perspective, is that employers are using degrees as a proxy for information they are unable to obtain through other means. This information could include an applicant’s economic status, home environment, capacity for long-term commitments, and freedom from any crippling physical or mental disorders lurking under the surface.
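A toy Bayesian calculation makes the proxy mechanism concrete. Every probability below is invented for illustration; what matters is the screening logic, not the particular magnitudes:

```python
# Degree as a noisy signal of applicant "stability" (all inputs are made up).
p_stable = 0.60                 # assumed base rate of stable applicants
p_degree_if_stable = 0.80       # stable applicants usually finish college
p_degree_if_unstable = 0.30     # unstable applicants rarely do

p_degree = (p_degree_if_stable * p_stable
            + p_degree_if_unstable * (1 - p_stable))

# Bayes' rule: what a degree (or its absence) tells the employer
p_stable_given_degree = p_degree_if_stable * p_stable / p_degree
p_stable_given_none = (1 - p_degree_if_stable) * p_stable / (1 - p_degree)

print(f"P(stable | degree)    = {p_stable_given_degree:.2f}")  # 0.80
print(f"P(stable | no degree) = {p_stable_given_none:.2f}")    # 0.30
```

With these made-up inputs, a diploma moves the employer’s estimate of stability from a 60 percent prior up to 80 percent, while its absence drops it to 30 percent – the credential carries real information even if it certifies no expertise at all.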

First and foremost, a diploma means the graduate had access to the substantial amount of money necessary to obtain it, which is an excellent indication of a stable family life and financial background. Some of the most pervasive problems employers face are employee absenteeism, tardiness, and turnover, all of which are closely associated with the trappings of poverty or lack of family support (e.g., single parenthood, second jobs, lack of adequate transportation, changes in living situations, etc.). The colossal price of admission to college is itself the best indication that these complications are less likely to flare up in the workplace and cause problems.

A four-year degree also means the completion of at least one long-term, voluntary, full-time undertaking. The commitment might not have involved much day-to-day heavy lifting, but the sheer length of time involved, combined with the requirement that the student stay put for the duration of a program, says a lot about what the applicant did not do instead. He didn’t decide it was too hard and drop out; he didn’t run off and join the Peace Corps; he didn’t fall into a deep, dark, debilitating pit of alcoholism or depression; and so on. The fact that none of this occurred on the college’s watch, when it was statistically most likely to happen, is the best possible predictor that it will not happen in the next four years of employment.


Viewed not as a reliable certification of subject-matter expertise (which we all know it isn’t), but as the most reliable predictor of stability available to employers, the diploma makes the nearly universal preference for graduates over non-graduates look like entirely rational, self-interested behavior.