This is now officially one of the worst experiences of my life. Two of the top three have involved Neupogen, a white blood cell stimulant. In fairness, it worked spectacularly this time around (my white count went from 0.5 to 14.7 in less than 48 hours), but the pain and other side effects were horrible.

I'm better now, but this morning I was still super tired and a doctor decided for some reason that this might be due to neurological problems as opposed to, you know, not getting any sleep during the night. So a couple of hours later a nurse came in and told me I would be getting a steroid later in the day. This immediately raised my suspicions. "What kind of steroid?" Do I even have to tell you that it's dexamethasone, aka the Evil Dex?

I'm going to try to talk my way out of this. I'm genuinely not suffering any neurological problems, and I really don't want to be up all night again. We'll see.

In the meantime, enjoy this coronation video.

This is a bush poppy, I think. I took this photo on the path leading from Angeles Crest Highway to Colby Canyon Falls, which is both pretty and easy to get to. It will probably get even prettier when our massive snowmelt begins in earnest, but I will likely be housebound by then, so I'll never see it.

April 21, 2023 — Angeles National Forest, California

More about Clarence Thomas and his seemingly endless stream of gifts from Harlan Crow:

“This is way outside the norm. This is way in excess of anything I’ve seen,” said Richard Painter, former chief White House ethics lawyer for President George W. Bush, referring to the cascade of gifts over the years.

The ethics lawyer for George W. Bush thinks this is bad. But wait!

Justices also must report many gifts to their spouses and dependent children. The law’s definition of dependent child is narrow, however, and likely would not apply to Martin since Thomas was his legal guardian, not his parent. The best case for not disclosing Crow’s tuition payments would be to argue the gifts were to Martin, not Thomas, experts said.

Thomas was just his legal guardian, so it's all good.

This is obviously ridiculous. The money was plainly a gift to Thomas, since he's the one who normally would have paid the tuition fees. And it's not like he couldn't afford it:

Thomas has long been one of the less wealthy members of the Supreme Court. Still, when Martin was in high school, he and Ginni Thomas had income that put them comfortably in the top echelon of Americans.

In 2006 for example, the Thomases brought in more than $500,000 in income. The following year, they made more than $850,000 from Clarence Thomas’ salary from the court, Ginni Thomas’ pay from the Heritage Foundation and book payments for the justice’s memoir.

This really should be the last straw. Believe it or not, I've been a little bit on the fence until now about this whole affair, but not any longer. Even if it was all legal, Thomas should have reported it. The fact that he didn't means nothing good.

But Thomas doesn't give a shit. Only impeachment can force him out of office, and that's not going to happen. He could shoot someone on Fifth Avenue and he'd still be secure in his job.

Jesus. I'm out of it for a couple of days and it turns out that Harlan Crow paid something on the order of $150,000 in private school tuition for Clarence Thomas's grand-nephew, whom Thomas raised "like a son." Needless to say, none of this was reported.

The best explanation Crow could come up with was this: "Harlan Crow has long been passionate about the importance of quality education and giving back to those less fortunate, especially at-risk youth." I'm not sure how the son of a Supreme Court justice is "at risk," but whatever.

Damn. I wish Crow were passionate about finding a cure for multiple myeloma. Maybe then I could have seen a piece of that sweet Harlan Crow largess.

I spoke too soon on Sunday. Things were very definitely not back to normal.

I spent Monday back at the hotel, where things seemed OK at first. My temperature was elevated, but not in the danger zone, and by bedtime it was down to 99.2. The next morning we trotted over to the Day Hospital as usual, and they measured my temperature at 100.5, so I was admitted to the main hospital.

My temperature spiked as high as 103 on Tuesday, but it's been below 99 for all of today, so the worst is over. And there's no sign of neurological fuzziness, which is super good news.

I'm very, very tired, and this is the first time I've felt up to writing even a short post. Bottom line: everything is fine. Just a normal fever that most CAR-T patients get. More later.

April is too early for good Milky Way photography, but during my last trip to the desert I decided to try it out anyway. Right now the Milky Way doesn't rise above the horizon until about 3 am, and even then it's sort of dim. Still, I was out there anyway at 3 am, so why not give it a try?

In the past, I was never able to find good stacking software, but the software I bought for my general astrophotography is a superb stacker. So I pointed my camera toward the Milky Way, kept my exposures to 20 seconds so there wouldn't be any streaking, and took a couple of hundred exposures.

I was shocked at how well it turned out, especially since the individual exposures seemed to show nothing at all. As it happens, I did get some streaking, so next time (whenever that is) I'll use 10-second exposures. Or maybe I'll even haul out the teensy little equatorial tracking mount I bought a few years ago. Who knows?
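
For anyone curious why stacking rescues stars that are invisible in any single frame: the shot noise is random from frame to frame while the stars stay put, so averaging N aligned frames knocks the noise down by roughly the square root of N. Here's a toy sketch in Python (numpy only; this is purely illustrative and has nothing to do with the actual software I used) with a faint "star" that's buried in the noise of one frame but stands out clearly in a stack of 200.

import numpy as np

# Toy model: one faint star buried in per-frame noise.
rng = np.random.default_rng(1)
sky = np.zeros((100, 100))
sky[50, 50] = 1.0  # the star's true brightness

# 200 short "exposures," each swamped by random noise.
frames = [sky + rng.normal(scale=0.5, size=sky.shape) for _ in range(200)]

# Stacking is just averaging the aligned frames: the signal stays,
# the noise shrinks by about sqrt(200), roughly 14x.
stacked = np.mean(frames, axis=0)

# In one frame the star is only ~2x the noise level;
# in the stack it's closer to 28x.
print(frames[0][50, 50] / frames[0].std())
print(stacked[50, 50] / stacked.std())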

April 16, 2023 — Desert Center, California

This is a typical weekend crowd along the Seine checking out the famous green bookstalls. These days about half the stalls sell mostly the same mass-produced bits of tourist stuff, but the other half still have interesting specialties (music, art, Romanian poetry, whatever) and are fun to browse around in.

June 5, 2022 — Paris, France

In case you're interested, Bob Carpenter has a short technical explanation of how modern Large Language Models (like GPT-4) work:

In a nutshell, language modeling is the simple task of predicting the next subword (called a “token”) based on the previous sequence of subwords. The state of the art had stalled for years on n-gram models that use the previous n subwords (usually with n < 5). In 2017, a team of Google researchers released a paper titled “Attention Is All You Need,” which introduced the current state-of-the-art neural network architecture for language modeling. The breakthrough was in extending the context length into the thousands (GPT-3.5 uses 4K; GPT-4 has 8K and 32K models) with an attention model that figured out which parts of the context to concentrate on. The fundamental bottleneck is that computation is quadratic in context length (though it's all on GPU, so that's a massive number of flops for relatively low power).

There's more at the link. It's an interesting short read.
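
If the quadratic bottleneck is easier to see in code than in prose, here's a minimal numpy sketch of the scaled dot-product attention at the heart of that 2017 paper. The names and dimensions are illustrative toys, not anyone's production code; the point is just that the score matrix is n by n, so doubling the context length quadruples the work.

import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention over a context of n tokens.
    n, d = Q.shape
    # The n x n score matrix is the quadratic bottleneck: every
    # token attends to every other token in the context.
    scores = (Q @ K.T) / np.sqrt(d)
    # Softmax over each row turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # one weighted summary per token

rng = np.random.default_rng(0)
n, d = 8, 4  # toy context length and head dimension
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
print(attention(Q, K, V).shape)  # (8, 4)

Going from a 4K context to an 8K context makes that score matrix four times bigger, which is a big part of why the longer-context models are so much more expensive to run.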