Dimensional analysis and blasts from the past

It’s a summer of cleaning out a lot of clutter for me. So here’s something I meant to write last year. That would have been nice because the timing would have lined up with a certain cinematic event that a lot of people seemed to care about. But I got wildly distracted by moving from North Carolina to Maine, so… time to clean out some old, less-timely clutter.

There’s a really lovely book that somehow wound up on my shelf years ago. The book is Theoretical Concepts in Physics by Malcolm Longair. It definitely qualifies as a “physics textbook,” but it also has quite the narrative element as it takes the reader from Newton’s discoveries to modern cosmology. Really, it’s a delight. But despite the well-done, big-picture stuff, the one thing I remember most clearly is a tiny section on dimensional analysis. Longair recounts a story about G. I. Taylor annoying some authority figures by using crude dimensional analysis to estimate the yield of the blast at the Trinity test from photos published in a magazine. Two other things: (1) the blast yield was highly classified (hence the “annoyance of authority”); (2) that’s the same Taylor behind a famous video demonstrating the weirdness of low Reynolds number flow (see here). It’s dated, but it holds up well enough to spook even today’s students.

Trinity blast at 0.025 seconds after detonation (source)

The photos Taylor saw were probably just like the photo above. Note that there is a time stamp and a distance scale. That’s important, because those quantities let one “measure” the time t the blast has been developing (from, effectively, a point) and the size (or radius r) of the “spherical” shape. Yes, you have to treat the top part as part of a spherical shape and hope the interaction with the ground doesn’t affect that too much. But the thing with hope is you might as well see how far it takes you before you lose it.

That’s actually quite the complicated fluid dynamics problem. But dimensional analysis is powerful. Figuring that the density of air \rho might have something to do with the size (since the blast has to push out all of that surrounding air), one can fiddle around with just those three quantities (r, t, \rho) to obtain a combination that has dimensions of energy, E.

E\sim \rho\displaystyle r^{5}t^{-2}
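Where do those exponents come from? Write E \sim \rho\, r^{a} t^{b} and match dimensions on both sides:

```latex
[E] = \mathrm{kg}\,\mathrm{m}^{2}\,\mathrm{s}^{-2}, \qquad
[\rho\, r^{a} t^{b}] = \left(\mathrm{kg}\,\mathrm{m}^{-3}\right)\mathrm{m}^{a}\,\mathrm{s}^{b}
```

The mass dimension already matches. Length requires a - 3 = 2, so a = 5; time requires b = -2. That’s the only combination of these three quantities with units of energy.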

Following Taylor, we can get out a ruler. I estimated the diameter at 260\mbox{ m} (so r \approx 130\mbox{ m}), which almost certainly contains more precision than I’m justified to retain. This is an example where things are so crude that significant experimental effort is not rewarded—the theorist’s dream! The expansion time is t = 0.025\mbox{ s}, and the STP density of air is 1.2\mbox{ kg/m}^{3}. Put all that into E \sim \rho r^{5}/t^{2}, and you get E \sim 7\times 10^{13}\mbox{ J}. A number that big means we’re probably using silly units. We can take the old-school approach of looking up the conversion to kT of TNT (it’s 1\mbox{ kT} = 4.184\times 10^{12}\mbox{ J}) or just ask Google directly. Either way, we’re probably using Google, and we get E \sim 17\mbox{ kT}. The source on that photo estimates E \approx 20\mbox{ kT}. Not bad for the minuscule effort!
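The whole estimate fits in a few lines. Here’s a quick sketch using the numbers quoted above (taking r as half the ~260 m diameter):

```python
# Crude Trinity yield estimate via dimensional analysis: E ~ rho * r^5 / t^2.
rho = 1.2    # air density at STP, kg/m^3
r = 130.0    # blast radius, m (half the ~260 m diameter read off the photo)
t = 0.025    # time since detonation, s

E = rho * r**5 / t**2    # joules
kT = E / 4.184e12        # 1 kT of TNT = 4.184e12 J

print(f"E ~ {E:.1e} J ~ {kT:.0f} kT")
```

Running it reproduces the ~17 kT figure, against the ~20 kT quoted by the photo’s source.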

I think I saw this ten years ago. Fast forward a few years, and I saw this again in Zee’s entertaining Fly by Night Physics. There, Zee goes through the same sort of argument, arriving at the same order-of-magnitude result. But he also makes the very sound point that one would get a better estimate by performing regression on a series of measurements. After all, think about how many introductory labs eventually coerce the student into performing linear regression on some data set before yielding the “final” answer. If one goes to the source for that image above, you can click “next” or “previous” to scroll through shots at different times. That’s handy.

The Trinity photos are pretty famous, and Taylor beat us all to the punchline. However, a few years ago, Lawrence Livermore National Laboratory posted a nice collection of recently declassified, high-resolution test footage from the 1950s. Given that Wikipedia has an extensive list of US nuclear tests with estimated yields, one is well equipped to try out the method and see how it does in the wild.

Frame 15 from Tesla shot, Operation Teapot (source)

I spent a few minutes scrolling through the footage, and the Tesla shot from Operation Teapot (1955) was the first one I found that looked amenable to this kind of crude analysis. The good news is that the footage includes the frame number in the lower-right corner. The video comment also indicates the video was shot at “around 2,400 frames per second.” The ambiguity is ominous, but crude methods are robust to uncertainty. So we have a time stamp for each frame.
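With a frame number stamped on the footage and a nominal frame rate, each frame gets a time stamp. A minimal sketch, assuming the “around 2,400 frames per second” quoted in the video description:

```python
# Convert a frame number to a time since detonation, assuming a nominal
# frame rate of ~2400 fps (the rate quoted in the video description).
FPS = 2400.0

def frame_to_time(frame):
    """Time in seconds corresponding to a given frame number."""
    return frame / FPS

print(frame_to_time(15))  # frame 15 -> 0.00625 s
```

The “around” in the quoted frame rate feeds straight into the time axis, but for an order-of-magnitude yield estimate that slop is tolerable.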

The less-than-good news is that there’s no length scale handed to us like in the Trinity photos. Fortunately, the title frame just happens to divulge that the device was detonated from atop a 308-ft-tall tower. That’s about 90 m, and it corresponds to the distance between the ground and the blast center.

Title frame for Tesla shot, Operation Teapot (source). From here we get the length scale and the (approximate) yield.

So in principle, we can just grab a frame and use the appropriate r and t to estimate the blast yield. But since there are so many usable frames here, we can also take advantage of all the information, ultimately performing a fit to get a better estimate. This procedure has “tedious” written all over it, but there’s no reason one couldn’t do this “by hand” (like Taylor allegedly did) with each frame from the YouTube video. That’s one path. But I used LoggerPro because I happen to have it installed. Any primitive video analysis software is going to save some time.

In LoggerPro, the green line sets the scale (90 m), the yellow crosshairs show the origin, and the blue dot represents the point being tracked (edge of blast).

Back at HPU, we had a bunch of introductory labs involving video analysis in LoggerPro. It’s remarkably easy to use. After importing the video, one just sets the origin, scale, and frame rate before clicking on a point in each frame to track. It’s so easy, even I can do it and get decent results. The high-speed 1950s film quality leads to a lot of judgment calls about where the edge is, but the idea behind using all the points is that random errors are going to cancel out (roughly, hopefully). Also, it turns out that taking a screen recording of the YouTube video playing is “good enough,” but you’ll probably get some duplicated frames because the recording won’t be perfectly in phase with the playback. Quick and dirty for the win.

Once all the points are collected, LoggerPro (or any tool you use) should have a collection of t and r values. If you’re using LoggerPro, you can do the rest there or just export and use Excel or Python (or whatever is hip these days). Since we expect r^{5} \sim (E/\rho)t^{2}, a plot of r^{5} versus t^{2} should be linear. Plot it, and a strikingly linear relationship indeed emerges.
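The fit itself is a one-liner once the data are exported. Here’s a sketch with NumPy; the (t, r) pairs below are synthetic stand-ins generated from the expected scaling (with a little noise), not the actual LoggerPro measurements:

```python
import numpy as np

# Synthetic (t, r) pairs standing in for values clicked out of the footage,
# generated from r^5 = (E/rho) t^2 for a hypothetical yield, plus noise.
rng = np.random.default_rng(0)
E_true, rho = 2.0e13, 1.0                     # hypothetical E (J), air density
t = np.linspace(0.001, 0.01, 20)              # frame times, s
r = ((E_true / rho) * t**2) ** 0.2            # radii implied by the scaling, m
r *= 1 + 0.02 * rng.standard_normal(r.size)   # ~2% measurement noise

# Linear fit of r^5 against t^2: the slope estimates E/rho.
slope, intercept = np.polyfit(t**2, r**5, 1)
print(f"slope ~ {slope:.2e} m^5/s^2")
```

With real data the scatter is worse, but the least-squares slope is exactly the quantity the next step needs.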

Use your favorite tool to perform linear regression, and that slope should correspond to CE/\rho where C is some unknown (dimensionless) constant we must inherit as the cost of using dimensional analysis. We don’t know what C is, but generally it’s “of order one,” C \sim \mathcal{O}(1). That doesn’t mean it’s necessarily close to one. It could be \pi or 1/\sqrt{8} or any number of exotic constants. But it’s probably not going to be 10^{\pi} or 10^{-\sqrt{8}}. We’re looking for a crude estimate: is the yield a few kilotons or a few hundred kilotons? So really, we’re just estimating \mbox{slope} \sim E/\rho. Behold, the street-fighting approach.

In this case, the slope works out to 2\times 10^{13}\mbox{ m}^{5}/\mbox{s}^{2}. Converting this to energy using \rho_{0} \sim 1\mbox{ kg/m}^{3} gives E \sim 2\times 10^{13}\mbox{ J}, or \sim 5\mbox{ kT}. That’s compared to the quoted 7\mbox{ kT}, so the agreement is quite good. If anything, it suggests that we’re lucky and C \approx 1, at least for the conditions of these two tests.
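The conversion from slope to yield, taking C \sim 1, is just arithmetic:

```python
# Turn the fitted slope into a yield estimate, taking the unknown
# dimensionless constant C ~ 1.
slope = 2e13            # m^5/s^2, the value quoted from the fit
rho = 1.0               # kg/m^3, rough air density
E = rho * slope         # joules, since slope ~ E/rho when C ~ 1
kT = E / 4.184e12       # convert to kT of TNT

print(f"E ~ {E:.0e} J ~ {kT:.1f} kT")
```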

To be fair, I tried the same thing with another shot and got \sim 100\mbox{ kT} compared to a quoted value of 43\mbox{ kT}. I suppose it’s possible that the government underreported or miscalculated the yield, but certainly not to this extent. More likely, that dimensionless constant C is some function of parameters that doesn’t vary much from unity for low yields. At higher energies, other physics might become more relevant, leading to a different (apparently smaller) value for C. It’s possible one could comb through these videos and work out the effective C for a range of yields to see whether or not there’s a simple functional form. Unfortunately, most of the bigger yields seem to be air drops or near-surface detonations. In either case, there’s no easy way to set a length scale in the video. But there are a number of videos for other low-yield devices set off from towers, so it would still be interesting to see if C \approx 1 holds up for some of those.
