I’m No Fancy Military Strategist But…

How is trading five Taliban commanders for a single low-level soldier (who according to most accounts deserted his post) anything but a colossally stupid blunder that makes the United States an international laughing stock?


Abdul Haq Wasiq

Thought to be in his early 40s, Wasiq served as the Taliban deputy minister of intelligence and “had direct access to Taliban and Hezb-e-Islami Gulbuddin leadership,” according to an internal memo that assessed risk at Guantanamo. He reportedly used his office to support Al Qaeda “and to assist Taliban personnel elude capture.” Wasiq belongs to the Khogyani tribe and began his religious training under his father, Muhammad Saleem, who died in 1981. Three years later, he went to study Islam at Warah, a school located on the Afghanistan-Pakistan border near the Khyber Pass. When the Taliban assumed control in Afghanistan, a number of Islamic students, including Wasiq, went to Kabul. Wasiq has been accused by Human Rights Watch of mass killings and torture. According to a report by the Joint Task Force Guantanamo, Wasiq “arranged for Al Qaeda personnel to train Taliban intelligence staff in intelligence methods.”

Mullah Norullah Noori

As a senior Taliban military commander, Noori has been described in government reports as a military mastermind of sorts who engaged in hostilities “against U.S. and Coalition forces in Zabul Province.” Noori, who is estimated to be around 46 or 47 years old, has developed close ties to Taliban leader Mullah Omar and other senior Taliban officials, according to a JTF-GTMO report. Noori, who was named as the Taliban governor for the Balkh and Laghman provinces, is wanted by the United Nations for war crimes, including the murder and torture of thousands of Shiite Muslims. Noori has been able to remain a “significant figure” to Taliban supporters and sympathizers. According to government records, which are based on conversations with Noori, he grew up in Shajoy, where he learned to read and write at a mosque in his village. His father was the imam at the mosque. As a boy, he worked as a farmer on his father’s land. In March 1999, he traveled to Kabul, where he met with Mullah Yunis, the commander of the Taliban security base, and expressed interest in joining the Taliban. After the Taliban front lines fell in November 2001, Noori traveled to Konduz, where he was trained and worked with Omar. Noori has been implicated in the murder of thousands of Shiites in northern Afghanistan. When asked about the killings, Noori “did not express any regret and stated they did what they needed to do in their struggle to establish their ‘ideal state.’”

Mullah Mohammad Fazi

As the Taliban’s former deputy defense minister, Fazi was held at Guantanamo after being identified as an enemy combatant by the United States. Fazi is an admitted senior commander who served as chief of staff of the Taliban Army and as a commander of its 22nd Division. He is also wanted by the United Nations for war crimes, including the murder of thousands of Shiite Muslims in Afghanistan. According to documents, Fazi “wielded considerable influence throughout the northern region of Afghanistan and his influence continued after his capture.” The Taliban has used Fazi’s capture as a recruiting tool. “If released, detainee would likely rejoin the Taliban and establish ties” with other terrorist groups, the Guantanamo report says.

Mullah Khairullah Khairkhwa

Khairkhwa is the former governor of the Herat province and has close ties with Usama bin Laden and Mullah Omar. According to the Joint Task Force Guantanamo file, Khairkhwa “represented the Taliban during meetings with Iranian officials seeking to support hostilities against US and coalition forces.” Khairkhwa and his deputies are suspected of being associated with an extremist military training camp run by the Al Qaeda commander Abu Musab al Zarqawi, who was killed in 2006. U.S. authorities have also accused Khairkhwa of becoming a powerful opium trafficker.

Mohammad Nabi Omari

As a senior Taliban leader, Nabi Omari has held multiple leadership roles in various terror-related groups. Pre-9/11, Nabi Omari, who is estimated to be in his mid-40s, worked border security for the Taliban – a position that gave him “access to senior Taliban commander and leader of the Haqqani Network, Jalaluddin Haqqani,” according to the JTF-GTMO report. Born in the Khowst Province of Afghanistan, Nabi Omari and his family were forced to resettle as refugees in Miram Shah, Pakistan, after the Soviet Union’s occupation of Afghanistan. In the late 1980s, Nabi Omari returned to Afghanistan, where he fought with the mujahideen against the Soviets. During the early 1990s, he ping-ponged between Taliban-related positions and others, including a stint as a used car salesman. In August 2002, Nabi Omari reportedly helped two al Qaeda operatives smuggle missiles into Pakistan. The weapons were smuggled in pieces, and the plan was to reassemble the missiles once all of the pieces had been brought across. Nabi Omari was caught in September 2002 and eventually moved to Guantanamo.




Millennials: The Most Anti-Liberty Generation

On many Objectivist and liberty blogs, it’s almost as if Charlie Sheen has been writing the narrative: individualism is decisively #winning. To those for whom hope truly does spring eternal, Obama and his evildoers are perpetually one scandal away from impeachment (surely the VA scandal will be the one to topple his house of cards). The American people, meanwhile, are just one election cycle shy of taking back their government in a Fifth Great Awakening for liberty.

Stark reality, however, tells a different story in a new report from the Brookings Institution. Based on surveys of thousands of individuals, the report concludes that Millennials will bring to the workplace and political scene a major shift in attitudes and voting behavior toward progressive values. Millennials (born 1982–2003), if you believe the data, might very well be the most anti-liberty generation in American history. Reading their responses to questions on America’s culture, politics, and economy, one gets the distinct impression that the culture war has not only long been won by progressivism – it wasn’t even a close contest.

Particularly striking is the growing hostility toward business and for-profit activity generally:

“About two-thirds of the Millennials surveyed in 2012 also agreed that “businesses make too much profit,” which was the highest level of agreement among all generations. At the same time, less than half of Millennials thought “unions had too much power”; by contrast, a majority within all other generations agreed with that statement. Even more telling from a generation noted for its general lack of trust in institutions, 72 percent of Millennials, compared to only 61 percent of Xers and Boomers, agreed with the statement that “labor unions were necessary to protect the working person,” a level statistically significantly higher than that of older generations.”

Millennials are the generation most comfortable with regulation of private market activity, with roughly three quarters agreeing that the marketplace needs government regulation. Telling also is where Millennials say they want to work: the CIA, FBI, and NSA ranked high across multiple surveys, with the State Department coming in second on one survey and government agencies placing second after high-tech companies like Google and Facebook.

Most troubling is the degradation of societal trust that has occurred across the generations:

In its latest study of the Millennial Generation, Millennials in Adulthood, the Pew Research Center found that America’s youngest adults were the least trusting of any generation. Only 19 percent of Millennials agreed with the statement that “most people can be trusted,” roughly half the rate found in every older generation.

This is the Millennial voter in a nutshell. You make too much profit, you can’t be trusted, and you need to be watched. Liberty has its work cut out for it.

The Rise of Rapism

A preposterous new meme has been snowballing through the progressive blogosphere in recent months: the charge that America fosters a “rape culture” that normalizes, excuses, tolerates, and condones rape and violence against women. Don’t bother pointing out that rape and sexual assault are by all credible accounts way down in the United States – rape culture is stronger than ever, festering under the surface, the progressives will tell you. Not only that, something has to be done about it as a compelling public policy issue of our time.

While the false statistic often disseminated by women’s groups – that 1 in 4 women survives rape or attempted rape – has been widely debunked (it derives from a series of 1980s surveys in which women were asked if they had ever had sex when they didn’t want to, not whether they had ever been “raped”), the White House has recently jumped on the bandwagon with its own bald-faced lie that 1 in 5 women is sexually assaulted while in college.


Life imprisonment – a slap on the wrist in rape culture.


Radix Journal has an excellent piece that sheds light on what lies behind the “rape culture” accusation now reaching a fever pitch in op-eds across the national media. Besides the tried-and-true progressive strategy of pitting social-identity groups against each other with government set up as savior, the “rape culture” hypothesis operates on an even more fundamental level by totally subjugating the individual to collective judgment. Unsurprisingly, the campaign is about power and control:

When anti-rape activists tell men they simply want to “teach men not to rape,” that sounds reasonable enough. When they say that there is a “culture of rape” that perpetuates rape, men are hesitant to disagree because they don’t want to be regarded as forgiving of rape or accessories to rape. It is precisely because most men are already against rape that women are able to use rape as a kind of personal holocaust. Anti-“rape culture” advocates are exploiting male disgust for rape and using it as a tool to silence criticism of women and exert control over men’s sexual behavior and conceptions of their own masculinity.

Where have we seen this strategy employed before?

Rape culture is a lot like racism. Maybe they should just call it “rapism.” It’s an abstract “evil” that a certain group, in this case women, reserves the right to identify and use to manipulate another group, in this case men, into increasingly defensive and impotent positions. As long as they can keep men apologizing, they can keep controlling them.

While the idea of a “rape culture” in the United States may seem laughable to any sane man (or woman), it would be a disastrous mistake to dismiss this latest attack as naive alarmism. If nothing else, progressives have demonstrated that they are extremely skilled at spreading propaganda and turning demographics against each other for political gain. How to fight this pandemic is as yet unclear: the facts don’t seem to be effective medicine.

Reparations – Wouldn’t It Be Worth It?

Admission: I didn’t read The Case For Reparations by Ta-Nehisi Coates, which appeared in The Atlantic on May 21, 2014, and has since been liked and shared in my Facebook news feed by nearly every progressive or black person I know. I recognize the “reparations” argument as little more than socio-political trolling at this point, which doesn’t warrant serious intellectual attention. Furthermore, breathing life into these ancient racial issues over and over again ad nauseam does incredible damage to what little social fabric and trust we have remaining in American society, which is exactly the sort of social-identity fodder progressives need to fuel their national political machine.

But the subject matter did get me thinking this time around – wouldn’t it be worth it? Just imagine: one lump sum payment and we’re done with the slavery and white-privilege issues – forever.

How much would you pay to never have to see or listen to social-identity troll LZ Granderson ever again?


Like most litigation, the vast majority of discrimination lawsuits in this country are settled out of court. Most of them are predictably bogus on the merits and amount to little more than a disgruntled employee striking back at an employer out of spite with whatever weapons are legally available to him – race, sex, age, and disability being the most difficult to disprove and therefore most likely to survive the summary judgment phase of litigation. Employers settle nearly all of these lawsuits in the low five figures because A) it’s cheaper than proceeding to trial, B) it avoids any further bad publicity, and C) there is typically a non-disclosure agreement which bars the litigant from ever suing or discussing the matter again under penalty of voiding the settlement. Think of it as the simple cost of doing business in America, because that’s what it is.

Wouldn’t it be *glorious* to structure a one-time nationwide settlement in this manner with every black individual who signed on – say, in the $10,000–$20,000 range (roughly 3 percent of the cost of the Iraq War) – and never have to listen to “slavery reparations,” “institutionalized racism,” or “white privilege” rehashed ever again under penalty of settlement forfeiture and repayment?

No more affirmative action. No more Title VII litigation. No more MSNBC race panels or LZ Granderson CNN op-eds. Done. Forever.

How much would that be worth to you?

Deserve’s Got Nothing to Do with It

Viewing capitalism through the progressive lens, we can find examples of cosmic injustice everywhere in the marketplace. All around us, the hardworking go unrewarded; the creative lack backing for their visions; the foolish can succeed despite themselves, whether out of nepotism, cronyism, or most commonly, plain dumb luck. It is a natural tendency for many, therefore, to grow frustrated with the whole endeavor and to set out seeking a better way – a way to force this chaos into a more enlightened order. However, this framework misunderstands the proper role of an economy, which is not to dispense justice at all, be it cosmic justice, social justice – even financial justice – or any other sort. As Clint Eastwood’s character explained in Unforgiven: deserve’s got nothing to do with it.


Markets are a mechanism for valuing scarce resources and distributing them according to dispersed but critical pieces of information. Besides being more efficient than top-down control over the economy, which lacks any effective means of objectively ranking material wants and needs, a market channels man’s self interest into productive ends instead of the destructive behaviors that inevitably result from political jockeying versus others.

One misconception progressives (and even many capitalists) share is that capitalism is, or should be, rational at the individual level. To the contrary, capitalism relies upon micro-irrational risk-taking to produce the wide array of trials and errors necessary for macro-level advancement.

A perfect example of micro-irrationality/macro-rationality is the Silicon Valley start-up scene, which for the vast majority of young self-starters will prove to be their financial coffin:

The most expensive lottery ticket in the world

It is the height of irrationality for any computer engineer who accurately assesses his chances of success against catastrophic ruin to start a Silicon Valley company. Yet without this widespread overconfidence among entrepreneurs, we wouldn’t have the transformational breakthroughs of one-in-a-million companies like Facebook and Google. Markets are beautiful because they convert man’s self-delusion not into massive public boondoggles, but into a net-rational evolution, straining out stumbled-upon successes through the law of large numbers. The truly remarkable aspect of the process is that it does so entirely with the consent of the participants. Viewed through a wider lens than progressivism offers, capitalism turns out to be a highly rational and compassionate arrangement.

Minimum Logic

We’re told that today is Fast Food Workers’ Strike Day, although you might not have known it if, instead of watching sensationalized media coverage of the event, you visited a fast-food restaurant where it was extraordinarily unlikely anyone was actually on strike.

Each year, one of the largest labor unions in the United States, the Service Employees International Union (SEIU), eggs on or pays a handful of people to stand outside fast food joints protesting while the union contacts the media to drum up coverage. This year’s message was that fast food workers should be paid an absurd $15/hour – more than double the national minimum wage and more than 50 percent above even the minimum wages set by the nuttiest progressive states.

If you believe a six-figure union boss earnestly cares about the plight of the fry chef, you probably haven’t worked in a union environment before. Two years ago, I found myself in a large collective bargaining unit when a series of budget cuts came down the pike. Rather than have everyone take a slight cut in pay or benefits so that everyone could keep his job (my personal choice), the union took about 10 seconds to demand instead that every contract worker in the organization be laid off so the bargaining unit wouldn’t be affected.

It’s predictably about money in the end – lots of it. Union dues run as high as $400 to $1,000 or more per year, so with millions of fast food workers under the SEIU’s umbrella, that would equate to billions of dollars in guaranteed additional revenue – a whopper return on the initial investment.
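A quick back-of-envelope sketch makes the scale of that revenue stream concrete. The inputs below are rough, illustrative assumptions drawn from the paragraph above (dues of $400 to $1,000 a year, and an assumed three million fast food workers standing in for “millions”), not audited figures:

```python
# Back-of-envelope dues revenue, using the post's rough figures.
# All inputs are illustrative assumptions, not audited data.
dues_low, dues_high = 400, 1_000   # annual dues per member, USD
workers = 3_000_000                # "millions" of fast food workers (assumed 3M)

revenue_low = dues_low * workers   # lower-bound annual dues revenue
revenue_high = dues_high * workers # upper-bound annual dues revenue

print(f"${revenue_low / 1e9:.1f}B to ${revenue_high / 1e9:.1f}B per year")
# → $1.2B to $3.0B per year
```

Even at the low end of the assumed range, the arithmetic lands comfortably in the billions per year.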

“Would you like a side of fries for just $7 more?”

Setting aside what reasonable economists immediately recognize – that there must be an entry level below “living wage” to prevent still-dependent young adults from being priced out of the job market entirely – if someone is still making minimum wage after more than a few years of employment, that points to a serious underlying problem with that worker. In my brief stint in the fast food industry, I learned this much: anyone who stays put more than six months with the same company is guaranteed regular pay raises and a fairly reliable track to lower management. But therein lies the rub – most fast food workers flit from company to company and sacrifice any seniority and experience with every lateral move they make. Besides getting the equivalent of $15–20 of free food and drink every day – no small perk in itself – it was made quite clear to me that my wage was just a starting point, and that increased pay and responsibility would come once I proved I wasn’t a total flake by staying more than a few months (in fact, I didn’t, so there you have it).

At least if Glassdoor reviews are any indication, most fast food workers aren’t buying into the sob story. On the lower end of the satisfaction scale, McDonald’s still manages a respectable employee rating of 3.1 out of 5. Most of the 560 employee reviews echo the sentiments above, stressing the ample room for advancement that comes with just a smidgen of company loyalty. Progressive-vilified Chick-fil-A amazingly scores a 3.8 out of 5, which is 0.2 points higher than employees rate working at Microsoft.

There is nothing wrong with organizing to improve one’s working conditions, but when unions employ the mechanisms of the state to drive their own narrow revenue and agenda, that no longer falls under the banner of free association. If unions really do provide value for their members worth the hefty dues payments, they are hard-pressed to explain why fast food workers on the whole don’t seem very interested – and, for that matter, why unions tend to lose half their membership as soon as employees are given the option of not joining.

Breaking Parkinson’s Law

British historian Cyril Parkinson observed in a 1955 article in The Economist that government bureaucracies tend to grow themselves regardless of need. Using as an example his own experience in the British civil service, Parkinson pointed out that the Colonial Office had its greatest number of employees at the point when it was dissolved because of a lack of colonies to administer.

Parkinson explained the trend by pointing to the natural incentive of public officials to multiply subordinates and thereby increase their own power within the organization, along with the tendency of officials to make work for one another. He famously summarized the principle in his opening line: “Work expands so as to fill the time available for its completion.”

At the end of his piece, Parkinson states: “It is not the business of the botanist to eradicate the weeds. Enough for him if he can tell us just how fast they grow.” So the difficult task falls to the rest of us to contain or reverse the trend, if possible.

If Parkinson’s Law is in fact a law of economics, does that make it an immutable component of the human condition? Is there an institutional way to shift incentives away from growing one’s sphere of influence in an agency through bloated budgets and unnecessary hiring?

Rhode Island: A Case Study in Progressivism

From the New York Times:

Poor Little Rhody. Not only is it the smallest state, it is often a punch line. And in many state rankings, it comes out on top for the wrong things, like having the nation’s highest rate of unemployment. Now comes yet another blow to the state’s fragile self-esteem. A Gallup poll found that of all 50 states, Rhode Island was the least appreciated by its own residents. Only 18 percent of Rhode Islanders said their state was the best place or one of the best places to live.

Rhode Island is heavily Democratic, heavily Roman Catholic and heavily unionized. Some say the state is beholden to its unions, as evinced by its generous pension system. But the high cost of government, said Robert D. Atkinson, the former executive director of the defunct Rhode Island Economic Policy Council, is not matched by a high quality of services. “Rhode Island has the high-cost structure of Minnesota but the low-quality services of Mississippi,” Mr. Atkinson said.

Over the past 80 years of continuous Democratic control of the state, an unshakable alliance of organized labor, progressive social-interest groups, and corrupt politicians has tanked the culture and economy of Rhode Island. In addition to having the highest unemployment rate in the U.S., as mentioned above, the state has consistently ranked over the past five years as the worst state for business in the country. The Providence commercial tax rate is the highest in the U.S. Welfare benefits pay over $12 per hour, among the highest in the U.S. The cost of the state’s temporary disability insurance payments is also among the highest in the country.

Welcome to Rhode Island.

In the last three state election cycles, unions donated more money than all businesses in the state combined. In 2011, the state education board granted illegal immigrants in-state tuition at public universities. One in seven Rhode Islanders is on food stamps. In the shrinking private sector of the state, politically directed public loans and tax breaks have created a planned economy where businesses break bread with the governor, house speaker, and senate president before seeking private investment. Bolstering Rhode Island’s existing reputation for political corruption, progressive-endorsed House Speaker Gordon Fox recently resigned after being raided by the Federal Bureau of Investigation.


Bringing up the rear in nearly every national ranking there is, Rhode Island is a damning case study in the social and economic effects of longstanding progressive politics. The state serves as an important reminder that, however far liberty has eroded in most states, hand the keys over to progressives and things can get so much worse.

The Real Value of a College Degree

Everyone knows college has degraded into an expensive four-year party and a prolonged adolescence for young adults. Sleep in, show up to class, regurgitate what the professor says onto some papers, and a hundred thousand dollars (or more) later, every student meeting the barest minimum requirements will walk on stage to collect a diploma.



This bemoaned but persisting state of affairs has led some to declare that the college degree is “worthless,” or that it “tells us nothing” about its bearer. Such critics have a point: degrees no longer guarantee deeper understanding in a field of study (if they ever did). Yet employers across the board continue to weight higher education in important hiring decisions. This paradox implies another variable in play – some hidden value of the degree that the credential deniers are missing.

The best and simplest explanation, from the economic perspective, is employers are using degrees as a proxy for information they are unable to obtain through other means. This information could include an applicant’s economic status, home environment, capacity for long-term commitments, and freedom from any crippling physical or mental disorders lurking under the surface.

First and foremost, a diploma means the graduate had access to the substantial amount of money necessary to obtain it, which is an excellent indication of a stable family life and financial background. Some of the most pervasive problems employers face are employee absenteeism, tardiness, and turnover, all of which are closely associated with the trappings of poverty or lack of family support (e.g., single parenthood, second jobs, lack of adequate transportation, changes in living situations, etc.). The colossal price of admission to college is itself the best indication that these complications are less likely to flare up in the workplace and cause problems.

A four-year degree also means the completion of at least one long-term and voluntary full-time undertaking. The commitment might not have involved much day-to-day heavy lifting, but the sheer length of time involved, combined with the requirement that the student stay put for the duration of a program, says a lot about what the applicant did not do instead. He didn’t decide it was too hard and drop out; he didn’t run off and join the Peace Corps; he didn’t fall into a deep, dark, debilitating pit of alcoholism or depression; and so on. The fact that none of this occurred on the college’s watch, when it was statistically most likely to happen, is the best possible predictor that it will not happen in the next four years of employment.


Viewed not as a reliable certification of subject-matter expertise (which we all know it isn’t), but as the most reliable predictor of stability available to employers, the diploma makes the nearly universal preference for graduates over non-graduates look like entirely rational, self-interested behavior.