So how is everybody doing on the COVID-19 front? Here are cumulative COVID-19 cases for the ten largest states. I'm showing just the past few months so that the differences are visible to the human eye:

Total cases to date range from 8.9% of the population (Pennsylvania) to 10.4% (New York). California is flattening out at the moment, while every other state is continuing to rise.

I'm really unsure what lesson to take here. On the one hand, sure, some states are doing better than others. On the other hand, the differences aren't huge. To some extent I think that's because different state regulations have (a) been less different than you'd think, and (b) are playing second fiddle to local and corporate rules anyway. Texas can declare that no one needs to wear masks, but if restaurants and supermarkets continue to require them then people will wear masks.

UPDATE: I made a mistake in the original version of the chart, omitting Pennsylvania and adding New Jersey. Both the chart and the text have been corrected.

Peter Orszag says that I'm wrong about America's bridges: we really need to increase funding to repair and rebuild them if we want them to remain in decent shape. His strongest argument is that instead of looking at the past, we need to look at the future:

Consider the age of American bridges. Not surprisingly, bridge quality tends to diminish as a bridge ages, especially toward the end of its useful life. Today, the average bridge in the U.S. was built in 1975, making the average age 45 years. In 1992, the average age of a bridge was 34 years. And today a quarter of bridges are 60 years old, up from 15 percent in 1992. As the bridges grow older and a larger share of them approach the end of their useful lives, it will cost more to maintain or replace them than a backward-looking indicator would suggest. Without investment beyond current levels, bridge quality will decline.

I'll buy that.¹ I'm not actually opposed to spending more on roads and bridges, and I've long favored an increase in the gasoline tax to fund exactly that. So we're not that far apart, especially since President Biden's infrastructure plan includes only $115 billion for roads and bridges. That's $11 billion per year, which is hardly a king's ransom.

For what it's worth, my main gripe is really with the overall conventional wisdom that US infrastructure is falling to pieces. Some of this comes from the American Society of Civil Engineers, which is naturally in favor of lobbying for more civil engineering projects and tells us on an annual basis that our infrastructure is crumbling. I don't hold that against them, but I am perpetually surprised that their lobbying is accepted so credulously as some kind of unbiased assessment.

Beyond that, I suspect that our New York-centric national media plays a role here too. If you live in New York City, with its potholes and train failures and traffic jams and century-old water tunnels, infrastructure porn is probably a pretty easy sell. Outside of New York, however, things are not nearly so bad.

I suppose this is yet another example of my view that most things are not as bad as we're often led to believe. But I hold this view only because it's true. There are things that are getting worse, but for the most part we are victims of politicians who get more votes if they scare people and media folks who get more clicks if they scare people.³ This is why I'm instantly on red alert every time I read a story (or listen to a politician) telling us how bad something is without also telling us how it compares to previous years and/or other countries. That comparison is everything.

¹In fact, if you want a good example of this, just take a look at Italy. They built a ton of bridges right after World War II and most of them were designed for a 50-year service life.² Everything was hunky-dory through the aughts, but Italy hadn't maintained its bridges very well, and then, just when the need was greatest, the Great Recession left it unable to do so. The result is that a bridge system which had a pretty enviable record from 1950-2000 suddenly began killing people in large numbers.

²It's worth noting that this is for everyday bridges, the kind that are workhorses of daily commutes but that no one notices. The big, impressive bridges that cross scenic gorges and end up on postcards are generally designed to last much longer.

³Plus, of course, the usual problem that "Bridges Continue Not to Collapse" isn't much of a news story. If a bridge collapses, it's news. If a bridge stays intact, then it's just a bridge.

Why did primates and then humans develop a facility for advanced language? One theory is that it helped primate tribes defend themselves better against predators and other dangers. Before the development of language, here's an example of tribal coordination when a saber-toothed tiger drops out of a tree with lunch on its mind:

SOMEONE: Eek!

SOMEONE ELSE: Arrgh! [Kill it!]

At that point everyone panics and either runs away or attacks the tiger depending on their personal levels of bravery and stupidity. Result: Satiated tiger. Here's how it goes after the development of language:

SAM: Tiger!

TRIBAL LEADER EDWARD: Fred, grab that big stick and start waving it. Joe, take point. Bob and Andy, you're on flank. Charlie, go get some fire.

Result: Dead tiger.

Artist's conception of pre-language Stone Age man facing a saber-toothed tiger.

This is not a bad theory, as these kinds of theories go, but there's a better one based on the idea that the greatest threat to humans is other humans. In operational terms, it means that the development of language was mainly due to a desire to gossip about each other.

This must have been a powerful force. It didn't come for free, after all. It required changes to the larynx that made choking more likely, and it required the development of a large brain, which needed more energy and made childbirth more dangerous.

Nevertheless, for a dominance-hierarchy species like ours, knowing who's up and who's down was critically important. And affecting who's up and who's down was the difference between living a decent life and being tossed to the wolves as a small child. Naturally this started a linguistic arms race, since tribal dominance depended on ever-escalating subtlety and nuance among the gossipmongers. The side effect of all this was the eventual ability to construct pyramids and solve differential equations, but that was never the original plan.

Scientific illustration of the evolution of the larynx.

So time went by and language became ever more sophisticated, as did cooperation and betrayal. We invented writing, which provided new scope for gossip and backstabbing, and then movable type. Then there was the penny post and telephones and finally social media.

And this explains what social media is really for: bitching and moaning and talking shit about other people. There is nothing new about this, and it taps into the core motivation for developing language in the first place.

There's more, of course. The big difference between, say, telephones and Facebook is that Facebook is generally more open and blunt. But didn't I say that language had grown ever more sophisticated and subtle over the years? I did. So why is Facebook famous for being rude and stupid?

Answer: Because it's populated mainly by idiots. No one with any sense would do any serious scheming on Facebook even if they had all their privacy settings applied correctly. It's just too risky. The best and most serious communicators in the modern world use social media for occasional propaganda, but otherwise stick to telephones and encrypted emails and so forth.

AI reconstruction of typical social media users.

This is the basic story of social media. It seems like it should be the latest and greatest advance in human communication, but it's not. Just the opposite: It's designed to appeal to one of the core motivations of humankind, but only in the most brutal, unsubtle way. This makes it mostly a way to corral kids and unsophisticated communicators away from everyone else, which is why it seems like such a cesspool. It's how kids and idiots have always talked, but in the past we've mostly been able to ignore them.

And we take it seriously even when we shouldn't. Don't get me wrong: kids and idiots can cause a lot of harm. This is why staying abreast of social media is important. But it needs to be seen for what it is: a communication medium that appeals to our strongest human instinct—meanspirited gossip—but is mostly just a distraction from the real centers of power and sophistication.

And now I shall tweet out a link to this blog post.

There are certain topics that generate an almost infinite number of news articles that are essentially identical except for the single detail of who the main character is. The pandemic is an example: you can write a piece about how it affected the theater business; and the restaurant business; and the trucking industry; and Joe's Dry Cleaning—and they will all be basically the same: their customers went away and this caused them big problems. Racism works the same way: you can write a thousand stories about a thousand different industries and they're essentially all identical. The details will differ, but it will turn out that racism is responsible for the underrepresentation of Black/Hispanic/Female/etc. people in that industry.

We are now entering the same phase of the end-of-lockdown pandemic story. Guess what? It turns out that practically every industry cut back on production because their customers stopped buying stuff. And only now, for reasons that remain puzzling, are they realizing that the pandemic wasn't a permanent condition and they need to ramp up production. Why didn't they figure this out a few months ago when vaccines became widely available? Beats me. My unconsidered view is that it's because they're idiots, but I suppose there's more to it.

Anyway, we're now being deluged with stories along this line. Last night, for example, 60 Minutes ran a segment about the shortage of chips for cars and videogames and whatnot. And why is there a shortage of chips? Is it because we've outsourced everything to the wily Chinese folks on Taiwan? You'd think so after inhaling Lesley Stahl's inane reporting, except for the fact that she inadvertently allowed the chairman of Taiwanese chipmaker TSMC a brief moment to give the game away: "In March, 2020, as COVID paralyzed the U.S., car sales tumbled, leading automakers to cancel their chip orders. So TSMC stopped making them."

TSMC chairman Mark Liu

Oh. So it has nothing to do with Taiwanese fabs vs. American fabs or global supply constraints or any of that. Nor is it related to a possible invasion of Taiwan or the fact that Intel may or may not have made good decisions about its future business. It's because American car companies canceled their chip orders and never bothered to reinstate them. Then in December, when car sales "unexpectedly" began to rebound, they panicked and realized what they had done. You'd think these guys had never done an economic forecast or used an MRP system before in their lives.
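To make the MRP point a little more concrete, here's a minimal sketch of the kind of netting calculation any material requirements planning system does for you: given a demand forecast, current inventory, and a supplier lead time, it tells you when orders have to go out the door. The `plan_orders` function, the quantities, and the three-month lead time are all invented for illustration; this is not a claim about TSMC's or any automaker's actual systems.

```python
# Hypothetical illustration of basic MRP netting: when must orders be
# released, given a demand forecast and a supplier lead time?
# All numbers are made up for the sake of the example.

def plan_orders(forecast, on_hand, lead_time, safety_stock=0):
    """Return {release_period: order_qty} using simple lot-for-lot netting."""
    releases = {}
    projected = on_hand
    for period, demand in enumerate(forecast):
        projected -= demand
        if projected < safety_stock:
            shortfall = safety_stock - projected
            # An order must be released `lead_time` periods before it's needed.
            release = period - lead_time
            releases[release] = releases.get(release, 0) + shortfall
            projected += shortfall
    return releases

if __name__ == "__main__":
    # Months 0-3: pandemic slump in demand; months 4-7: demand rebounds.
    forecast = [20, 20, 20, 20, 80, 90, 100, 100]
    orders = plan_orders(forecast, on_hand=120, lead_time=3)
    for release, qty in sorted(orders.items()):
        print(f"release order for {qty} units in month {release}")
    # With a three-month lead time, covering the month-4 rebound requires
    # orders released around month 1, well before the recovery shows up in sales.
```

The point of the toy example is just that with long chip lead times, anyone running even this crude calculation against a forecast that assumed the pandemic would eventually end would have reinstated their orders months before the rebound actually hit.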

Anyway, be prepared for hundreds of stories like this. For each one, be careful to ignore the details and instead focus on the big picture. In about 90% of them, it will be the same: They didn't plan for the pandemic to ever end, and now they're paying the price.

If you sell an app for use on an iPhone, Apple demands a 30% cut of any revenue generated by the app. But is this legal?

Apple is being sued by Epic, the maker of the popular video game “Fortnite,” for allegedly using its control of its mobile operating system to stymie competition....Up for debate is how Apple allows apps to function on iPhones. The only way to install software on Apple’s mobile operating system, called iOS, is through the company’s App Store. Developers who make software for iOS must follow Apple’s rules and use its payment system, which charges a commission on every sale.

....The trial will determine whether Apple’s control over iOS is a monopoly, and whether Apple can use that control to force developers to use the App Store and its payment system. One possible outcome in the case is a very different smartphone landscape, in which the powerful computers in everyone’s pockets operate more like desktop computers, where any kind of software is allowed to exist.

The trial begins today. As it happens, Apple's market share of smartphones in the United States is about 50%. Does that qualify as a monopoly? I'm not sure, but it comes pretty close by anyone's measure.¹

One thing to keep in mind, because a lot of people don't seem to understand this, is that there's no law against being big. Nor is there any law against having a huge market share. Like it or not, though, there is a law that prevents big companies from abusing their bigness. A small company could charge whatever it wanted for apps on its phones and no one would have a case against it. But a big company has to be more careful. There are things a small company can do that a big company can't.

So the question is whether Apple is doing something that they shouldn't, even if it's something that a smaller company could get away with. But the peculiar thing about this case is that antitrust law is generally oriented toward preventing big companies from doing things that stifle competition. Spotify, for example, says that Apple's 30% charge makes it hard for them to compete against Apple's own music offerings. That's a fairly normal case.

But Epic is different. Even if they can show that Apple has a habit of stifling competition in some cases, Apple doesn't have a big presence in the gaming market and their 30% charge doesn't really do much to prevent competition against Apple's own offerings. It's just a way of hoovering up a bunch of money. So even if Apple is, in some sense, abusing its bigness, it doesn't seem like it's doing anything that makes it harder for small gaming companies to enter the market.

In any case, this trial is scheduled to last three weeks plus about a thousand years of appeals, so don't expect any quick decisions. Normally, I might predict that the companies will reach a settlement before then, but both CEOs seem inclined to treat this as something of a dick-measuring contest and are unlikely to give up. So this case might truly drag on nearly forever.

It's more than obvious now that early April represented our peak vaccination rate and we're now declining pretty steadily.

This whole thing is just such an ungodly shame. For all the endless griping about how badly we screwed up our response to the pandemic, nobody can complain about our vaccine development. The Chinese released the coronavirus genome on January 10; we had the first vaccines developed by January 12; and they were ready for distribution by December. That's just mind-boggling. Overall, we've also done a good job of distributing them, with something like half the country vaccinated within four months.

But now, just as we're on the brink of truly crushing the coronavirus, we can't quite take the final step. Masking and other countermeasures are being abandoned too quickly and we're now having trouble getting the other half of the country to get the vaccine. All we need is a few more weeks. That's it. And we can't quite do it.

We'll get there eventually. Everyone forgets this now, but vaccine hesitancy ran around 30% for the polio vaccine too, so response to the coronavirus vaccine is not unprecedented. I'm just not sure if that makes me feel any better.

A new study suggests that the CDC's pause in administering the Johnson & Johnson vaccine had little to no effect on attitudes toward vaccination. Here are the results of a survey that spans before, during, and after the pause:

Before the pause, 60% of the respondents had either gotten vaccinated or wanted to as soon as possible. After the pause, that number was 63%. The percentage who wanted to wait until "after most people I know" stayed the same, as did the number of people who flatly didn't want it.

The authors also break down the results by demographic group, and there's not a single group that's more vaccine hesitant after the pause. Their conclusion:

Our survey suggests that awareness of the J&J pause was extremely high. Despite this, vaccine hesitancy/resistance did not increase for responses received after the pause, and our analysis of repeat respondents suggests a small but systematic decrease, largely because a fair number of people who were vaccine hesitant in early April were vaccinated by late April. In short, it seems very unlikely that the pause had major negative effects on vaccine attitudes.

My update to Wednesday's post about the J&J pause points in the same direction. The immediate decline in vaccination rates after the pause was due mostly to a decline in J&J vaccines, as you'd expect, and the subsequent smaller decline in the overall vaccination rate was most likely because we had vaccinated nearly everyone who was enthusiastic and were now entering a new phase that required more work to persuade people to get shots.

For now, then, I've changed my mind. It appears as if the J&J pause, regardless of whether it was right or wrong, probably had little to no effect on vaccine resistance.