Tuesday, March 17, 2020

On Stocks



Over the last couple of days, I've started and stopped a number of posts, but with our current pandemic (COVID-19) moving so fast, I kept falling behind. I'll work some of them in later if I can. But for now, let's focus on something a little more evergreen: the stock market.

I'll be honest, my notes for this post were just the words “Crash Good.” I do plan on justifying that at least somewhat, but I felt it would be best to be upfront about it. Another qualifying parenthetical: I own stocks, and they are (or were) valuable enough to give me a strong personal interest in the market staying high.

I went on at length about the nature of capital in my previous post, and it’s relevant again here. My borderline antipathy towards the market has to do with the fundamental nature of stocks and corporations. Simply put, a corporation is a machine that makes money; it is the essential tool for extracting capital. Each stock entitles the owner to a tiny piece of the machine, and an equivalent share in the profits. If the machine is good at extraction, the pieces are valuable as a source of capital. Great, except that for capital to be extracted, it has to come from somewhere. For corporations, that value comes from the workers. (Yes, I am aware that this is a grotesquely oversimplified view of the market, but let's focus on the forest, not the trees.)

I don’t think I was always a Marxist. But it’s hard to say. Most of the time, your beliefs change so slowly as to be imperceptible. If there was a moment, though, or a single fact that made it clear, it was when I understood the degree to which capitalism requires continuous growth. On a long enough timeline (shorter now), we’ll inevitably run out of places to grow, and — like bacteria in a petri dish — will end up eating each other to survive just a little longer.

Whatever replaces “capitalism” must be zero growth — a stable system. It can have money and commerce, sure, but it must be able to survive without growth. There is a way, but it comes right up against the stock market: zero growth means zero net profits.

Back to the capital created by workers: the profit of a corporation is the ‘excess’ value created by its employees. For worker-owned businesses, this is not a problem, as the value is (presumably) evenly distributed amongst the employees. The more common scenario, however, is that the created capital is disbursed to the owners of the company via stock, while the workers are paid whatever small monies must be paid in order to retain them. Each dollar of wages is a dollar that does not go into the stock.

The problem with this is obvious, and we’ve seen the results firsthand. Wages have stagnated while the rich get richer. The stock market reflects (or reflected…) this: as the workers lose power, influence, and share of capital, the stocks grow accordingly.

If wages rose, say via an increased minimum wage or worker ownership of a significant percentage of stock, the market would correct itself accordingly. The market’s fall would be a good thing, since it would mean that the profits were more widely shared, instead of being concentrated in sociopaths like Bezos and Bloomberg. Average quality of life would rise, in exchange for a reduction in profits.

Now, the current crash isn’t quite like that. Instead, we have a precipitating event that is causing a dramatic drop in profits (forced closures due to quarantine, etc.). While this is less ideal, it counter-intuitively still represents an opportunity for improved quality of life for our poor beleaguered workers, albeit an indirect one. Assuming that wages do not fall, the workers’ share of the total available capital increases as the corporations lose value. (If wages are $50 against $50 of corporate profits, workers hold half the pie; if profits fall to $25 while wages stay at $50, they hold two-thirds.) What this looks like in practice is falling prices: deflation. The stable wage becomes more valuable, not less. We’ve been used to inflation for a long, long while now, and inflation benefits only the people at the top, those who own the means of production.

So yes, it’s good that the market goes down. For all of our sakes, I hope it goes down further. While the natural inclination of capital is to cut wages proportionally, that outcome is not at all guaranteed, and can, in theory, be easily stymied by worker organization. For without the workers, the corporations have nothing.




Tuesday, March 3, 2020

A Late-Stage Endorsement


"Those who make peaceful revolution impossible will make violent revolution inevitable." ~ John F. Kennedy

After a long hiatus, I’m finding it refreshing to write something for myself, without any real thought for my audience. Still, it’s important to address the reader, at least as a basic conceit, so with that in mind I will endeavor to explain why it is imperative that Bernie Sanders become president of the United States.

If I had to lay out my thesis in simple terms, it’s this: Bernie is the compromise candidate.

I alluded to the economic precarity in the latter part of my welcome back letter, but repetition is important. In general, the people in power (i.e. the “boomers”) do not understand the depth of the problem. To use an example, when Greta Thunberg said “You have stolen my dreams and my childhood,” she was being entirely literal. The youth (Generation Z) grow up understanding something that millennials (that’s me) had to learn the hard way: there is no hope. The problems I wrote about 3 years ago have only gotten more dire — we even have a pandemic now, which is exciting.

There are so many issues that I struggled to winnow them down enough to start this sentence, but I think the most central aspect of the problem is so-called “Late-Stage Capitalism.” I will explain: at its heart, capitalism is an exploitative system. This is not even meant as a criticism, merely a statement of simple fact. The main mechanism is to extract and retain value, which we call capital. The craftsman, assuming he sells his goods, trades his time, expertise, and capital for materials and a greater amount of capital. If he manages to retain enough, he might be able to open a new workshop and thereby increase his ability to retain capital. Or, if he fails to retain capital, he can find himself unable to make the initial trade. He then might have only his time and expertise, and be obligated to trade those for capital directly. In the real world, we would call this a “job” or a “wage.”

This system works… fine. In theory. Unfortunately for all of us, there is an underlying assumption — a zeroth law, to make a reference. The greatest source of capital has always been the natural world. It grows food, there are shiny rocks to pick up; I need not elaborate further. Follow the trail back, and it will always lead to some basic extraction of value from the earth. This eventually creates a problem, as we are now seeing quite clearly. The great majority of ‘natural’ capital is finite, and the parts of it that aren’t are generally unable to meet sustained demand (for instance, overfishing). So-called “human capital” is inherently reliant on natural capital, as human beings need things like food, water, and clean air in order to live.

In the past, perhaps, natural capital was effectively infinite. Examples of overuse were anomalous, such as consuming all the passenger pigeons. However, the days when the aggregate impact of human activity was insignificant are long gone, if they ever existed. This is basically just a long-winded way of talking about the tragedy of the commons, except that the planet itself is a giant commons and we’re the ones facing the tragedy. The idea is pretty straightforward: every individual has a base level of need below which they cannot survive, the human population is increasing, and the world is inherently finite, so there will come a point where people’s needs will not be met. As Malthus might say, something’s got to give. Obviously, we are not at this point yet — isolated pockets aside — but we don’t need to get all the way down to bare subsistence level before we see problems. Capital accumulation snowballs, dividing us into the haves and the have-nots. We’ll talk more about the billionaires soon. The have-nots are obligated to exchange their time in order to live and are — as a rule — structurally unable to accumulate capital. As capitalism has been around for a long time, there are very few unexploited natural sources of capital — not a lot of places you can go and just stake out a claim. Just about everything is owned by someone.

Now, of course, the average American worker is significantly worse off than in our stick-figure model. Not only do they not have capital, they are forced to leverage their future ability to accumulate capital and to make economic exchanges that are less than ideal. Student debt and paying rent, to give examples of the former and the latter. This double whammy exacerbates the aforementioned structural disadvantages — i.e. the system is rigged. The fundamental assumptions — that everyone has access to natural capital, that the supply of natural capital is inexhaustible, that making use of natural capital has no downsides — no longer hold. This is a fact that the younger generations feel in their bones, their basic survival needs forcing them inevitably into a cage of work and debt. It is economic slavery, plain and simple — the chains are simply more abstract.

The boomers should not be surprised that socialism is popular among the young; if anything, they should be surprised that it is not more popular. It goes without saying that the beneficiaries of this system have no desire to dismantle it, and programs of wealth redistribution that would otherwise blunt the system’s sharp edges are manifestly unpopular among the billionaires.

Of course, they would not be billionaires if they were not short-sighted in this regard. Better writers than I have explained at length why it is unethical to be a billionaire in a world where people are literally starving and dying preventable deaths; I will merely gesture in the direction of the argument. But the fact of the matter is that very few people would choose this system they are forced to live in, if given the slightest choice in the matter. The phrase “the consent of the governed” comes to mind, and it’s appropriate here, as the entirety of the edifice rests on mutual understanding and respect for the rules that govern capital accumulation. While it’s true that we live in a society, that concept is a far more fragile thing than most people realize.

I implore you, boomers: try to enter the zoomer mindset. Looking around, you see a world full of riches and luxuries that you have no hope of obtaining. If you work hard and go to a good school, you too can make 35K and live in an apartment with roommates. Look around the mall, at the tentacled facades of multinational corporations, staffed by minimum-wage workers. Who decided this? Why not just take what we want? The only reason the person behind the counter cares is that their manager might fire them if too much stuff gets stolen. And so on up the chain, until it’s just a billionaire owner trying to eke out a higher profit margin.

The instant someone offers a plausible way out, people will take it. Capitalism consumes and churns, creating millions of people who have literally nothing to lose. Historically speaking, this comes via populism, and eventually it’s always a choice between socialism and fascism (or imperialism, if you prefer). We can either build a society that frees us from the malicious incentives inherent in capitalism, or we can scramble to become the group on top, profiting by hurting anyone we can get away with hurting.

I have no illusions that Bernie will be able to do that; his proposals are only extreme by American standards — by the standards of other first-world countries, they are mundane. Most of those countries have made a conscious effort to take care of their citizens, cushioning them from the thresher at the foot of Moloch. Try not to let people starve to death, take care of the public health, invest in the youth. Pretty basic stuff, as society goes. We’re all in this world together, “rugged individualism” is the world’s biggest scam, and Atlas Shrugged should be rightfully ignored by anyone more mature than a stunted teenage boy. Bernie merely represents a step in the right direction, and one that we desperately need. Not taking the step is possible, but unwise. There’s simply too much momentum, historically speaking. One can only rule over a disaffected populace for so long; though I’m sure the Romanovs felt near invincible right up until the end.

Ultimately, though, our hand is already collectively forced. We’re currently being dragged headlong into fascism by the party in power, who are gleefully destroying anything they can’t loot or subvert. The scaffolding upon which our society rests is actively collapsing; it’s merely a matter of where we want to land. For me, the choice is clear.

A government of the people, by the people, for the people.

Saturday, February 29, 2020

On Writing


I was working on a story pitch the other day, and —

Actually, let’s start over. Hi. It’s been a while. Since last we spoke, I went to journalism grad school, got married, and had a kid. Scams all around, but that’s a different essay. And now, as the grim specter of another election season is upon us, I have returned to wrestle with my demons in a semi-public way, in full view of friends and potential employers. Despite the poor decision of linking my personal blog with my professional writing portfolio, I’ve been moderately productive. Published a few pieces. Ghost-wrote a book. Worked a 9-5. Changed the poopy diapers. All good life things.

So anyway, I was working on this pitch. I had seen one too many bad political “Harry Potter” memes, and I needed to find a way to channel the way I was feeling into something productive.

Now, a necessary parenthetical: I am politically liberal. I like Harry Potter quite a bit, and I am happy to talk about it at length, though I will try to persuade whatever poor soul is stuck in a conversation with me to read the Methods. But that’s neither here nor there. I say this to underscore the fact that, by all rights, I should be favorably disposed to these memes, at least theoretically. Instead, I feel the same visceral embarrassment that I felt when I saw that some well-meaning group sent sheet cakes reading “You’re in the room where it happens. Let Bolton testify” to all 53 Republican senators during the impeachment. And we all saw how well that went.

What could possess people — “my side” — to be so hopelessly incompetent? Who is going to be persuaded to vote for Elizabeth Warren because she is, and I quote, “[A] grown up Hermione… with a plan to give every elf a sock”?  I note that I am confused, and frustrated.

But of course, to add insult to injury, I was far too late with my thinkpiece idea. Instead of just having to dodge the iconic 4chan rant on the subject, I saw that Jacobin Magazine had published “Politics Is Not Harry Potter,” a well-written, functional piece on… basically everything I was looking at writing about. So much for that. If you’re looking for a takedown of Harry Potter’s neoliberal morality, you’ll have to go somewhere else.

It is difficult for me to avoid negative thought patterns. I’m not even sure I’m supposed to, as a journalist, as long as they aren’t all-consuming and demotivating — the world is full of bad news and things to rightfully be concerned about. Indeed, as you can see by the contents of this blog, it’s kind of “my thing”.

You know that saying about trying to make it in the big city? “If you can make it there, you’ll make it *bump bump* anywhere.” It’s just that much harder to stand out/be successful/find a niche when there are more people trying to do it. The internet still has many advantages, but its net effect on the strugglers and strivers has not been a positive one. How could you want to open a store if you have to compete with Amazon? How can you make a living creating when there are a thousand or ten thousand people all trying to do the same thing, and for free? No, seriously: how? The real currency is human attention, and it’s not exactly fungible. The nature of our capitalistic system fundamentally discourages creative expression for its own sake — people are infantilized whenever possible, turned into passive consumers. Everything polished and gleaming, impressive and beyond the abilities of us mere mortals. Content as a service and as a way of life, invisible, ever-present, and as impossible to replicate as tap water. Time is a precious commodity, and while the simple joy of creation can be monetized, it usually looks like people paying money to paint by numbers or getting MFAs — someone else is inherently profiting by the enablement.

90 percent of everything is crap, and it’s really more like 99. The more people getting in on the game, the harder it is to find something that isn’t entirely shit. And for those of us who are talentless, uncreative boobs, the act of putting words on paper is akin to defecation. The problem is enormous in scope. I’m a voracious reader; I spend hours every day looking for interesting things to read or funny comics that reinforce my existing political beliefs. Forget articles; there are more interesting, quality publications than I can keep track of, and I am ostensibly (among other things) a freelance journalist who tries to write for a living. What does it say about the volume of ‘content’ if I can’t even be aware of a publication’s existence? I could send this garbage piece to a hundred different sites, receive 100 polite rejections, and still not have scratched the surface of what’s out there.

Human beings have choice anxiety; provide them with enough options and they shut down. Which is, I guess, by someone’s design. It certainly matches up with the previous paragraph. We’re also good at making connections (even if they’re spurious), which is why I can see a straight line from this to our imminent technological unemployment, and to the reason why there’s nothing for anyone to do in [Insert Small Town of Choice Here] except to do drugs and die. People get justifiably mad at the pharmaceutical companies, but at least it’s oxy, not krokodil. It’s the desire of people who have little purpose or meaning remaining to feel good for a hot second. I certainly can’t blame them; it makes the horrible droning pressure of this hellworld lift for a time, and shows just how horrible life can be when the real world comes crashing back.


And that’s where we are. In a world dark enough for this inchoate nonsense, on the bitter fragile edge of the void.

Welcome back.

Saturday, July 15, 2017

How Global Warming Will Kill You

Blogging is frustrating, because no matter what the post is about, most of the clicks come from having an exciting title. Last week’s post ("The Destruction of the American Cuisine") had more reads in two days than my last 3 “On Secession” posts combined. This is something I should have learned back when I wrote “Everything Wrong with College”. On Secession will remain on hiatus until I can figure out how to rebrand it. So, this week’s post is another entry in the Cassandra series, inspired by personal experience in the American southwest.

Previous Cassandra posts: A.I. — Global Warming — Antibiotics — Pandemic — Virtual Reality — Monocultures  

This is how it starts
Let’s start with some basic, uncontroversial biology: human beings can exist only within a narrow range of temperatures. The reason for this is chemical: each enzyme in the body has a temperature range in which it can function. Some ranges are wider than others, but all have hard-stop temperatures beyond which enzymatic activity ceases.

A normal enzymatic curve
There are a lot of different enzymes, and some of them have surprisingly complex interactions — but detailed knowledge of them isn’t necessary (not even to get through medical school). To put it simply, enzymes are biological catalysts, and almost every biological process relies on them. They are extraordinarily sensitive, and evolution has gone to great lengths to optimize the reaction rates (see: reasons for external genitalia in human males). Without functioning enzymes, life becomes impossible.

This is exactly why the hypothalamus raises body temperature in response to infection (i.e. fever). While the primary purpose is somewhat contested, one of the reasons is that fever inhibits the growth of many infectious agents. Viruses and bacteria suffer for the same reason that we do, they are less able to function when their temperature is outside of their ideal range. Raised temperatures also cause white blood cells to function better, making it easier for your body to fight off infection (that it is almost impossible for people to do anything besides lie down and wish for death is considered a positive side effect, albeit an unpleasant one).

As we can see, the temperature range compatible with life is extremely narrow:
Pyrexia is another word for fever
Less than 2 degrees Celsius (about 4°F) separates hypothermia from fever. 2 degrees is a very coincidental and auspicious number, and I’ll come back to it later, but first let us consider certain morbid implications: what is the temperature at which people die?

This is a difficult question to answer, not just because we can’t exactly test it, but because bodies experience the same air temperature very differently depending on conditions. The short answer is that the same temperature feels hotter when it’s humid. But to explain in a little more detail, we have to talk about “wet-bulb temperature”.

As anyone who has spent time on both US coasts will attest, 90 degrees Fahrenheit feels very different when the air is dry compared to when it is humid. Wet-bulb temperature is an attempt to reconcile the two, defined as the lowest temperature that can be reached by evaporating water into the air. (It’s often measured by sticking a thermometer in a wet sock and swinging it around. No, that’s not a joke.)
They look really stupid and you'll feel dumb swinging it around
The wet-bulb temperature is always lower than the “dry-bulb” temperature because evaporating liquid has a cooling effect; it measures how much cooling is possible (i.e. the difference between standing outside [dry] and standing outside in front of a fan while being misted with water [wet]). The two temperatures approach each other as the relative humidity of the air increases; when the humidity is 100%, no additional water can evaporate, and the two are identical. Since humans are much more like wet socks than pieces of glass or metal [citation needed?], the wet-bulb temperature provides a good approximation of the subjective temperature experienced. Therefore, we can use it to determine the temperature at which people die.

Don’t act surprised! Obviously there is such a temperature: we already know that all the necessary enzymes shut off outside their (narrow) ranges, and we know that the main way human bodies cool themselves is by sweating profusely, taking advantage of the evaporative cooling effect. At some point, the body is no longer able to prevent itself from heating up, and will rapidly go into hyperthermia and die. This chain of events is why heat stroke is so dangerous; it is a common, real-life example of the phenomenon. Using the wet-bulb reading, we can isolate important temperatures, including the onset of heat stroke and the temperature above which death is certain. That temperature is approximately 95 degrees Fahrenheit (35 degrees Celsius).

This is a counterintuitively low number, but it is important to remember that our bodies are constantly generating heat. Hence it is easy to feel overheated at 80°F (27°C), even though a similar internal temperature would be fatal hypothermia; controlled dissipation of heat is necessary to prevent overheating. There are many online calculators available to convert dry-bulb readings to wet-bulb readings (most depend on atmospheric pressure, which influences the percent humidity). A decent dry-bulb estimate of a 95°F wet-bulb reading is around 110-120°F (43-49°C). (Desert climates have lower humidity, so sometimes the 95°F wet-bulb temperature is not reached until over 120°F.)
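If you’d rather not trust an online calculator, the conversion is easy enough to script yourself. Here’s a minimal sketch (in Python) using Stull’s 2011 empirical approximation, which sidesteps the pressure inputs by assuming sea-level pressure; the exact numbers matter less than seeing how hard humidity drives the wet-bulb reading:

```python
import math

def wet_bulb_c(temp_c, rh_percent):
    """Approximate wet-bulb temperature (deg C) from dry-bulb temperature
    (deg C) and relative humidity (%), per Stull (2011).
    Assumes sea-level pressure; reasonable for RH between ~5% and 99%."""
    t, rh = temp_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

def f_to_c(f):
    return (f - 32) * 5 / 9

def c_to_f(c):
    return c * 9 / 5 + 32

# The same 110 deg F day, at desert vs. swamp humidity:
for rh in (10, 40, 70):
    wb = wet_bulb_c(f_to_c(110), rh)
    print(f"110F at {rh}% RH -> wet-bulb {c_to_f(wb):.0f}F ({wb:.1f}C)")
```

Run it, and a 110°F afternoon comes out to a wet-bulb temperature of roughly 69°F at 10% humidity, 88°F at 40%, and past the fatal 95°F line at 70%.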

For context, let’s look at some maps:
Here we have the Average Summer Temperatures (which are always measured in dry-bulb unless otherwise stated), including the average highs…
and lows…
 Based on NOAA data. Maps from Weather.com
But as we know, this is not the whole picture. Because it’s the hottest days that matter, even if they are statistical outliers. So here is a map of the average highest temperature recorded in a typical summer:
Some places in that white patch are hotter than 118, but they've got to cap it somewhere
As we can see, vast swaths of the United States have day(s) during the summer that are incompatible with human life. Literally “so hot you could die”, and people do so with great regularity. And, of course, anomalous record-high heat can occur almost everywhere:
"Come to Lake Havasu, the hottest place in Arizona! Don't go outside!"
It’s not news that people die during heat waves, but what should be a sobering thought is the degree to which we depend on functioning air conditioning in order to literally not die. The excuse of “well, we have air conditioning” isn’t especially reassuring, because our power system is not designed to provide continual power to every single person on the grid. Excessive use caused, say, by a massive heat wave quickly overwhelms ordinary power grids. And while worrying about any single such event now may seem a little silly, over the long term you actually do have to deal with extraordinary events: earthquakes, hundred-year waves, and other rare but predictable catastrophes. The longer you plan on living, the more likely it is that you run into something out of the ordinary.

I would not be harping on this if it was merely an American problem:
Although they are, perhaps, disproportionately represented in comment sections, the number of people who insist the world is not warming (despite all the easily accessible evidence to the contrary) is extremely small. Even those who deny that the warming is caused by human activity rarely go that far. The bigger issue is that as the temperature increases, more and more of the world will become uninhabitable.

That angry dark red splotch that goes through Africa and the Middle East? That’s desert. There are two ways to make a desert, and the first is obvious — no rain. The second is less so — heat. Plants have enzymes, same as we do. Get them too hot, and they can’t grow. Heat also reduces the amount of precipitation, but this is fiddly and missing the point — the real problems come much sooner. I cannot be explicit enough: our fatal temperature from earlier assumes people are doing nothing but trying to stay cool. Heat stress, whether due to working outside, walking around, or just standing still in the sunlight, affects humans, plants, and animals alike. Some species are more sensitive than others, and there are small variations due to age and health. But, again, the fatal temperature is a hard stop. It’s not something you can fight, something you can just “power through”. It’s not even fighting biology, it’s fighting chemistry. And I can tell you exactly what happens when you fight chemistry: chemistry wins.

I mentioned our temperature range from earlier: 2°C. This is also the number at the heart of the Paris Climate Accord, whose goal is to prevent the world from warming by more than 2°C. Currently, no one believes that keeping to that number is possible; most estimates put the warming at at least 3°C, and if and when the permafrost melts, it’s anyone’s guess what happens next. In that scenario, even optimistic low-ball estimates come in at around 7°C, meaning a roughly 13°F average worldwide temperature increase (and of course cities will heat up much faster). For a sense of what that might entail, look at our heat maps from earlier, but this time add 10 or 15 degrees to it. We’re looking at the entire country being as hot as the hottest part of an Arizona summer, and that doesn’t even begin to cover what the rest of the world will look like. Buckle your seatbelts, folks, because this is going to get rough.

Monday, July 10, 2017

The Destruction of the American Cuisine

Alright, I’ve (finally) fallen behind schedule. My lack of knowledge of political philosophy, coupled with some “IRL” concerns, means I will not be delivering Part V of On Secession on time. And, since my other back-burner ideas are equally unwieldy, I’m short on options. Consider this one a filler post.


I have another interest, which has not appeared on this blog until now: Food. I’ve been cooking quite a bit recently, though by “cooking” I mostly mean preparing food (there’s relatively little heat applied). And somewhere between making bread, salad dressing, sushi, ice cream, and lox, I’ve realized something a little disturbing: we’ve been suckered.

For most of history, everyone* cooked. All the time, every meal. Food was life and death, especially in America, which was an unforgiving frontier for the initial settlers (the colonies had horrific survival rates). Much of the culinary knowledge the settlers brought with them from Europe proved insufficient, and even the most saccharine stories of Thanksgiving acknowledge that the Mayflower settlers were far up shit creek until the Native Americans helped them out.
*not literally everyone
I'm not seeing the Three Sisters here, but you get the point
Skipping forward a bit, we’ve got a rich American cuisine developing, with people eating indigenous animals like passenger pigeons, all sorts of gourds and corn, and fruits like cranberries and pawpaws. People are curing meats, making cheese, and canning fruit to last through the winter, as well as developing the regional cuisines and iconic dishes that everyone loves so much. Regular infusions of foods from other cultures and new immigrants meant that the American culinary landscape never stood still for too long.

So… What happened? If that rich American cuisine describes your current eating/cooking habits, please tell me where you forage for food. But it's not that way now, and it's not even close. 
Seriously, who thought this was a good idea?
It’s obvious what happened: ad agencies happened. Frozen food was the food of the future, and it was very cool. One thing led to another, and now you can buy pre-packaged balsamic vinaigrette in the supermarket. The basic advertising idea, that prepackaged food is a labor and time saver, is going strong, but the other big piece of the puzzle is convincing people that cooking is more difficult than it actually is. By separating us from the creation of our food, things that cost pennies to make at home can be sold for whatever it is they’re charging at Whole Foods. I grew up with canned whipped cream, even though whipped cream is literally just cream and sugar, whipped together. It’s not hard, it doesn’t take a significant amount of time, and it tastes hundreds of times better than the canned stuff.

Obviously, the entire fault can’t be laid at the feet of the ad agencies — it was also the fault of the supermarkets. Until the 1950s, a weekly shop would involve a trip to the butcher, the greengrocer, the bakery, and the dry goods store. The milk, of course, was delivered.
I prefer Wegmans, but it's the same concept
But local “mom and pop” shops were inexorably squeezed out by national chains that could benefit from economies of scale, and factory farms began feeding citizens across the country. Until, finally, the notion that you could go to a store in New York and buy something you couldn’t get in California came to seem archaic and quaint.

So, in effect, we’ve been removed from our food twice. First from the food itself, and second from the knowledgeable people who provided what we could no longer provide ourselves. Ask anyone whose parents were adults before 1950 whether they ever bought meat from a butcher they didn’t know. This change was in living memory.

And even speaking strictly in terms of public health, this change has not been positive. I’ve written about our problems with antibiotics before, but one of the major contributing factors is the overuse of antibiotics in animal feed. This is necessary if you’re going to cram hundreds of animals into a small space, which in turn is necessary if you’re going to provide cheap meat to supermarkets around the country. Similar issues crop up on the vegetable side (see “On Monocultures”). The mass production of food has given us calories and salt in abundance, along with homogeneity and malnutrition. But even if these problems were solved, we would still have others. Consider French cheeses:
Hint: the best one is Époisses
This is by no means a comprehensive list, nor are the regional differences limited to cheese. In every place, individual communities developed their own cuisines. And of course, it’s not just France. Every place around the world had things that made that region, town, or village unique, something that was theirs and theirs alone. It’s not limited to food, but since that’s what we’re talking about now, let’s stay on topic. Look at this map:
There are 4 distinct types of barbecue sauce in South Carolina, to say nothing of the individual variations within each type (it’s not like all mustard BBQ sauces are the same). And this is in a place 1/8th the size of France, without the advantage of a thousand years of tiny isolated villages working, living, growing, and cooking. The number of regional dishes in America was enormous… and most of them no longer exist.

That’s what we’ve lost. Because while the occasional dish rises out of local kitchens, the majority don’t. Everything is homogenized and flattened. Pawpaws don’t keep and can’t be shipped fresh, so you can’t get them in the supermarket. And there’s no reason to make your own ketchup — Heinz is fine, that can be your sauce base. The basic principles of Newspeak apply: you don’t miss what you don’t know. Thus, “American cuisine” came to be little more than hamburgers and hot dogs, with only the strangest and hardiest dishes surviving to the present day. Our rich cultural legacy, exchanged for frozen pizza.

There have been efforts to walk it back, with “eat local” campaigns, but the worst of the damage is already done. Much like folklore, once the stories are lost, they’re gone. Why would your grandma (or her grandma) bother to write down the recipe for that dish she made, when she just threw together what was handy? You kids sure loved it, though.



I’ve said enough for one day. While I’m nowhere near the level of DIY fanaticism of some, I’m trying a little harder not to buy things that can be made easily at home (and I’m not especially handy). For all those who want to play along, here are three easy recipes.
"One can make all kinds of interesting things, using simple household items"

Vanilla Gelato: This calls for whole milk, eggs, and sugar. I used vanilla extract that had been sitting in the back of my cupboard for years instead of an actual vanilla bean, but other flavors work fine. The step involving an ice cream maker can be replaced by taking the bowl of ice cream base out of the freezer every 10-20 minutes and stirring (this keeps the ice crystals from getting too big).

Carbonara Pasta: Pasta (which is also easy to make), egg, parmesan cheese (or an American imitator), pepper, and garlic or onion if you have it. Substitute bacon for pancetta, because it tastes better and is cheaper.

Crispy Skin Salmon: This one you’ll have to buy a piece of fish for; try your local fishmonger if you have one. Do what Gordon says in the video; you just need olive oil, salt, and pepper. I add lemon juice afterwards, and a little bit of honey during. Serve with whatever; I usually do either broccoli or a salad.


Next week, we’re back to normal (fingers crossed).