You are currently browsing the category archive for the ‘Modest Proposals’ category.
In our modern, tech-driven age one of the major subsets of monopoly is the proprietary data format. If you can arrange things so that the only way to access the data is to pay you for a way to read it, you can rake in the cash. The version people my age are most familiar with is the VHS vs. Betamax wars, in which Sony tried hard to make video tape inaccessible to anyone who didn’t pay them, but it was a standard tactic in the early days of computers. Right up through the early 1980s, all of the computer manufacturers and most providers of sophisticated programs made their data formats proprietary and incompatible with anyone else’s to the extent possible, and employed legions of lawyers to prevent others from creating conversion software that would let users escape “lock-in”.
Nowadays we’re seeing a repeat of that tactic in electronic book formats. The Sony rôle is taken by Amazon, with proprietary formats for the Kindle, and the VHS-equivalent is .epub, a semi-open standard being adopted by Amazon’s smaller competitors. Kindles won’t read .epub, and Amazon keeps the specifications of its own formats close to its vest and won’t license them (or won’t except at prohibitive cost), so competitive e-readers can’t load them. They have in common the .mobi format, a legacy of MobiPocket, but .mobi doesn’t have a lot of the features both manufacturers and users want in an electronic book.
Unfortunately for Amazon (and for anybody else seeking to establish a proprietary format) the world is full of programmers who learned from the early days of computers to hate that tactic with the heat of a thousand suns, and are prepared to put their coding effort where their hearts are. Conversion from Betamax to VHS and vice-versa required a complex, expensive machine; there weren’t many companies with the capability to develop and manufacture such a thing; the result was that it was easy to detect and deter violators of the Betamax licensing provisions. Conversion from any computer file format to any other requires… a computer and suitable software; computers are ubiquitous, and computer software requires programmers to stay up late and code; it’s virtually impossible to even detect a programmer punching keys in the basement of a house somewhere in Eastern Europe, let alone find out what he or she is working on so as to deter them from that attempt. The predictable result is conversion software, which is already starting to appear.
The three I’m familiar with are Calibre (note the spelling), Jutoh, and Scrivener. They each have a different focus: Calibre is primarily library management and conversion; Jutoh focuses on the .epub format, and is a way to edit files in that format more or less directly; and Scrivener, which has been popular on Apple for years and is now available for the PC, is primarily utility software for writers but can read and write a number of different formats. Calibre is free. The other two are payware, but the cost is within reach of anybody who can afford $200 for an ebook reader in the first place.
For the owner of an ebook reader who isn’t concerned with authoring, Calibre is the right choice. All of the readers come with (proprietary) library management software. Calibre aims to replace that, and can connect directly to most readers; its conversion capabilities are extensive, and are or can be made more or less transparent. Hook your Kindle to your PC, use Calibre to download and file a .epub from B&N, and transfer that to the reader. It’s all automatic; the only thing you’d notice is that it takes a little longer than a simple file transfer. It isn’t perfect — the format it’ll convert to is .mobi, not .azw or the newer Amazon format, so if the book is complex the result may be missing some features. We can expect that Amazon will field a legion of lawyers to make sure that continues to be the case, but if all you want is to read the book, the process works fine.
For authors who are already using a compatible workflow (which most are: Microsoft Word as the origin document) Jutoh works well. Its developers are directly connected to those working on .epub, so it probably has the most complete set of facilities for managing that format. Once again, its access to the Kindle depends primarily on the .mobi format, so some of the gee-whiz features may be missing. It’s easier for an author than Calibre, because Calibre’s focus on library management creates some clumsiness, especially for incremental development, i.e., editing and new versions.
Scrivener comes closer to matching my writing workflow, but it isn’t very useful as a general file-conversion utility. The capability is there and usable, but that isn’t what the program was designed for. Mac-using writers have raved about Scrivener for a long time now, and having it available for the Microsoft environment will probably attract a host of new users, but they’ll mostly be writers or wannabees. A person who just wants to read ebooks and doesn’t intend to write them will find it formidably complex; Not Recommended for such an individual. Again, its access to the Kindle is mainly via MobiPocket.
The common thread in all this is that Amazon is vigorously developing new, better, flashier proprietary formats, leaving the .mobi format as an orphan, and employing lawyers to enforce the full rigor of copyright law to freeze competitive e-readers and the developers of conversion software out of their revenue stream. If Jeff Bezos were to ask my advice (not bloody likely!) I would tell him to abandon that approach. The whole point of .epub is that it’s a relatively open, common format, and later versions have a good feature set. If it doesn’t have a feature Amazon wants (other than proprietary lock-in) the .epub developers would welcome assistance from Amazon’s highly competent army of programmers to incorporate it.
They won’t do that. Thanks to some very smart moves in other venues, Amazon is the Big Kahuna of the ebook, and sees their competitors as ankle-biters whose primary utility is that they’re useful as defenses when the FTC comes calling. “No, of course we’re not a monopoly, look at all the competitors we have!” No doubt their bean-counters are sniggering at the vain attempts to depose them from their perch, as the cash rolls in.
I would urge them to caution and avoidance of complacency. In the late Seventies Sony was the Big Kahuna of video tape, making money at a ferocious rate and prepared to deploy schools of legal sharks to maintain its position. It’s worth noting in that respect that Beta/Betamax/Betacord was, in fact, a superior format, with better definition and more satisfactory synchronization than VHS — but VHS was easier to build, and didn’t require substantial payment$ to Sony in order to deploy it. Smaller manufacturers, working on a shoestring to bring product to market, adopted VHS, and Betamax was eventually nibbled to death rather than either winning or going out in a blaze of glory. Beta was always the preferred option of professionals in the video business, and now that DVDs and Blu-Ray have more or less eliminated video tape from the consumer market, most of the surviving tape users are still working in Beta — but they’re a small market, and always have been. When tape was “live”, consumers went with VHS, and Sony failed at its attempt to lock everybody in to its revenue stream.
Something similar is very likely to happen in the ebook business, although the situations aren’t precisely parallel. Unlike what Sony did, at present Amazon isn’t charging a premium to users of its format(s), but if it achieves lock-in the clamor from the bean-counters to start doing that will be well-nigh irresistible. Even if they don’t, there are enough programmers out there who despise proprietary formats that the pressure to decode and reverse-engineer them will be equally great, and keeping the lid on via deploying platoons of copyright lawyers is a short-term solution at best, because if the decoding/encoding system escapes onto the Internet at any point the effort will have been futile.
Again, at the moment Amazon isn’t acting like a monopoly or near-monopoly, but there are a growing number of people who recall the near-inevitable consequences of the establishment of a monopoly and View With Alarm the growing power. One thing Bezos and company could do to at least partially head that off would be either to open their formats to competitors (and conversion software writers) or to adopt a relatively open standard. It would reassure people that, even if the Big Kahuna achieves full dominance in the market, others would continue to have access, and would go a long way toward decreasing any feelings of resentment that might result in either programmers or the legal system attacking them. Embrace .epub, Jeff. You’ll spend less for lawyers and programmers both, and get fewer people mad at you.
But since he won’t, conversion software is needed. It already exists, and more will come along. Users of e-readers should seek it out and support it, if for no other reason than to keep Amazon honest.
Moe Lane has a modest proposal:
Let’s combine a new CCC program with a student loan forgiveness deal: twice minimum wage, but every dime above minimum gets deducted automatically and goes to pay off your student loan principal.
Unfortunately it doesn’t go far enough, and therefore fails in its didactic purpose.
The Moonbat Conservation Corps shouldn’t actually get wages, as such. They should get a tent, a bedroll, and three squares a day, and at regular intervals (perhaps weekly) they should get $20 to squander on personal gratification. The rest of their (nominal) wage should go to repay the loan.
What they should not get is heating, air conditioning, electric power to charge their iWhatevers and game consoles, or the use of powered machinery, all of which (gasp! the horror!) emit CARBON DIOXIDE, THE RUINATION OF THE PLANET. Building a campfire should result in instant dismissal, with a Stripping of Insignia ceremony borrowed from “Danny Deever”. Their meals, of course, should conform to the nutritional advice du jour of the food-health nannies. Perhaps we could get Mr. Giuliani to act as Chief Nutritionist.
Their employment need not depend on the existence of potholes. Many of them could be used on the farms of the Midwest — a gang using short-handled hoes should be easily able to meet the EPA dust restrictions — and many more could be employed on treadmills, running electric generators to substitute for them nasty ol’ coal-fired power plants. A minority could be given rags and squeegees, which they could use to wipe the efficiency-robbing grime and/or snow deposits from solar panels, and an even smaller group might be technically inclined enough to be given climber’s straps and put to work repairing broken windmills, of which we already have an elegant sufficiency and will get more.
The only real problem would be finding overseers, but that would probably solve itself. Historical evidence suggests that the necessary lash-wielders can be drawn from the population of workers, with no extra costs involved. I suggest Al Gore as overall manager of the program, with Michael Moore as advisor on matters of justice and fair play. The rest of the posts can be easily filled by the normal process of bureaucratic hiring, although for verisimilitude we should probably require Southern accents for most of the managers. This can work!
A very nice graph from Robert Bryce at National Review. In Texas at least, the relationship is clear: When power is needed, there ain’t no wind. When wind is available, the need is minimal. This matches what I see when I drive by the two “wind turbines” my neighbor-up-the-road has installed. He originally intended to sell wind energy installations, and got a couple of customers, but I notice he’s taken the sign down —
However, all is not lost. What we need to do is put up a couple or three frickin’ huge nukes. Then we can run all those propellers in reverse. When it’s 110°F plus, a nice whole-state fan would be just the thing.
Liquefied Natural Gas is natural gas, the same stuff that’s piped to your house for heating and cooking, except that it’s been chilled down to -260°F (-160°C) so that it’s a liquid at normal atmospheric pressure.
Natural gas, in general, is a near-ideal fuel. We will never have the “hydrogen economy” because the molecule of hydrogen, which contains two atoms, is so small that it is almost impossible to store. Hydrogen is smaller than the spaces between molecules (or crystals) of anything you might make a tank out of, so keeping it is like storing marbles in a net-bag — inevitably some seeps out. If the tank is metal, on the way out the hydrogen bonds to the metal atoms to form hydrides, which are soft and brittle, not at all what you’d want to make a fuel tank out of. Natural gas is mostly methane, which is one carbon atom and four hydrogens. Most of the energy from burning natural gas comes from the hydrogen. The carbon atom serves as a ball and chain to keep the hydrogen safely confined.
The disadvantages of natural gas as a fuel primarily come down to density. At normal temperature and pressure it’s a gas, slightly heavier than air (no: methane is lighter than air. Thanks, Cajun), with an energy content so close to 1,000 BTU per cubic foot that the units are interchangeable until you get to fine details. Gasoline, by contrast, has over 800,000 BTU per cubic foot, which is why it is, so far, preferred as a vehicle fuel. Replacing a 16-gallon tank of gasoline with the same energy content of natural gas would require over 13,000 gallons of volume, not particularly practical — a big tanker truck might carry 5,500 to 9,000 gallons, and towing two of them with your car would be clumsy and inefficient. There are two ways of overcoming this problem: compression and liquefaction.
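That volume comparison is easy to sanity-check. A minimal sketch in Python, using the round figures from the text (both BTU values are approximations; the exact published numbers vary a bit by source):

```python
# Back-of-the-envelope check of the tank-volume comparison,
# using the text's round figures.
GAS_BTU_PER_FT3 = 1_000         # pipeline natural gas at atmospheric pressure
GASOLINE_BTU_PER_FT3 = 800_000  # "over 800,000" per the text
GALLONS_PER_FT3 = 7.48

tank_gallons = 16
tank_btu = (tank_gallons / GALLONS_PER_FT3) * GASOLINE_BTU_PER_FT3

# Volume of uncompressed natural gas holding the same energy:
gas_gallons = (tank_btu / GAS_BTU_PER_FT3) * GALLONS_PER_FT3
print(f"{gas_gallons:,.0f} gallons")  # 12,800 with these round numbers
```

With the round numbers this comes to 12,800 gallons; the “over 13,000” in the text follows because gasoline’s actual energy content runs somewhat above 800,000 BTU per cubic foot.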
Compression is simpler. Natural gas at 200 bar has 200 times the energy density, so storing it in a vehicle becomes less problematic. You still need four times the volume to replace gasoline entirely, which isn’t practical in a small car but may be doable in a van or a truck. A bigger problem is the pressure. 200 bar is 29 thousand pounds per square inch, or roughly a thousand times the pressure in your tires. The tanks have to be strong, which makes them heavy. Again, that’s a problem in a small car but may not be in a truck. It might also be a problem in accidents, but a tank strong enough to hold 200 bar is also strong enough that it won’t be easily punctured. CNG is becoming available, and there are vehicles manufactured to use it, but it seems to me to be only a step — a step in the right direction, but only a step nevertheless.
[Correction from Fred Abernathy of Harvard: 200 bar is 2,900 PSI, or a hundred times normal tire pressure. Thanks, Fred. Mumble grumble. I used to be fairly good at arithmetic]
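The corrected arithmetic, and the four-times-the-volume figure, check out with a couple of lines (the 14.5 PSI-per-bar and 30 PSI tire figures are the usual rough values):

```python
# Compression arithmetic from the paragraphs above.
PSI_PER_BAR = 14.5
pressure_psi = 200 * PSI_PER_BAR   # 2,900 PSI, as per the correction
tire_psi = 30                      # typical passenger-car tire
print(pressure_psi / tire_psi)     # ~97, i.e. roughly a hundred times

# Energy density scales with pressure (ideal-gas approximation):
cng_btu_per_ft3 = 1_000 * 200                   # 200,000 BTU/ft3 at 200 bar
volume_vs_gasoline = 800_000 / cng_btu_per_ft3  # 4.0 — "four times the volume"
print(volume_vs_gasoline)
```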
The more modern choice is liquefaction. This has two advantages. It results in even higher energy density — liquefied natural gas has over 600,000 BTU per cubic foot, only about a one-third penalty vis-à-vis gasoline. It also gets rid of the other stuff. “Natural” natural gas, straight from the ground, has all sorts of other compounds mixed in with it, and during the liquefying process they get separated out, so what’s left is almost pure methane. That’s important from a pollution point of view, since the other “stuff” has more carbon in it than methane, so it produces more CO2.
A tank for LNG needs to be strong, but not as strong as one for CNG. It also has to be insulated, and as a vehicle fuel this is possibly the biggest disadvantage of LNG: if it isn’t used right away it warms up and escapes. One of the target markets for LNG is over-the-road trucks, which fill up and go, so the fuel doesn’t stay in the tank long. I don’t drive much, so a tank of gas lasts me over a week, sometimes two. If my car ran on LNG it would need a chiller to keep the fuel cold when I’m not driving, which would have to run off the electricity in the house or use up some of the fuel. For a city dweller with natural gas piped to the house, that could be an advantage — the same gadget could liquefy the piped-in gas and store it, much more efficiently than battery charging. As a side benefit, in an LNG-fueled vehicle an air conditioner would be the very definition of “redundant”. If you have a store of liquid at -260°F, compressing and expanding Freon® to get cool would be totally unnecessary.
One huge advantage of natural gas as a fuel is its octane rating. That’s a complicated subject, made even more complex by past advertising campaigns that labeled high-octane fuel as “premium”. An engine with high compression is more efficient, and for high compression you need a high octane rating. Natural gasoline has a low octane rating, so it needs additives to make it usable in an efficient engine. When high-compression engines first started being produced that problem was quickly noted, and fuel to be used in the newer, better, more expensive engines was made available; the primary additive was tetraethyl lead, produced by the Ethyl Corporation and advertised to the skies as Newer Better Fancier — “premium”. High-octane gasoline actually has less energy than the cheap stuff! Nowadays, of course, pollution fighters are dead set against lead in any form, and the additives that replaced it, mostly various forms of alcohol, are either more expensive, reduce the energy content even more, or both, and in any case don’t increase the octane rating as much as lead did. Natural gas has an equivalent octane rating of about 130, higher than any form of gasoline including “purple racing gas”, and as a bonus its burning process doesn’t form as many oxides of nitrogen as other fuels do, so efficient engines are practical. They don’t even need catalytic reactors.
Another advantage of natural gas is that it’s here. The United States imports some natural gas (in LNG form, because that’s the most efficient way of transporting it) but it doesn’t need to, except where it’s more economical to buy it from an easily-exploited reservoir than to drill for it, or where population density plus safety concerns make pipelines less attractive. There’s natural gas almost everywhere down deep, and we keep finding reservoirs of it where it’s seeped upwards and been trapped in rock that’s less porous. There are also clathrates, blobs of natural gas mixed with other things found in the deep, high-pressure, cold oceans — it’s difficult and dangerous to retrieve them, but if we run short the technology could be developed, and clathrates are abundant, possibly even containing more methane than rock does. There is lots of natural gas, and exploiting it would reduce both pollution and dependence on exports.
So why aren’t we using natural gas more?
Well, we’re starting to — now that it’s become apparent that it’s abundant, and therefore cheap and likely to remain so, new fixed installations tend to go that way. Among other things, it’s the ideal fuel for “peaking” and “backing” plants designed to generate the electricity supposedly produced by windmills and solar panels. As a vehicle fuel it’s a bit more problematic because of the storage problem, which boils down to infrastructure and thence to history. Gasoline was originally an unwanted byproduct of drilling for oil; the original market was for lubricants, to replace whale oil and other animal fats, and for kerosene for oil lamps. The early invention of the Kettering spark system made gasoline-fueled, spark-ignition engines easier to build than Diesels, and the gasoline was there in quantity, despiséd by most and available for use. It’s relatively easy to handle and has high energy density, so the familiar setup of storage tanks and gas stations grew as the use of automobiles did.
The technology of the early Twentieth Century wasn’t up to achieving and maintaining the low temperatures and high pressures necessary for liquefying natural gas and storing the result, which is another reason they based the system on gasoline instead. Now, though, we have relatively efficient and inexpensive ways of doing that, many of them derived directly or indirectly from the space program — liquid nitrogen, at a temperature of -320°F, is considerably colder than LNG, and nowadays costs about as much as beer or gasoline. It’s time to look more closely at LNG as a vehicle fuel. The fact that the EPA isn’t clamoring for its use and offering subsidies to build up the infrastructure for it is conclusive evidence that they don’t know what they’re doing and/or have some agenda other than pollution and efficiency.
 This is one of the factors limiting the life of a nuclear fission reactor. Hydrogen is just a proton with an electron for company. A nuclear reactor produces free protons as a byproduct of its operation, and the protons latch on to electrons to form hydrogen — which promptly begins seeping through the piping and the reactor vessel, which are usually steel. The resulting hydrides make the steel brittle and likely to crack, so if it goes on very long the vessel and piping start breaking and it’s time to shut the thing down because it isn’t safe any more.
 The debate over whether carbon dioxide qualifies as a “pollutant” is not addressed here. The Law says it is, and that’s what we have to work with. There are also components of natural gas that don’t burn as efficiently, producing unburned hydrocarbons and oxides of nitrogen, which really are pollutants.
 It’s also possible to change the mix of petroleum products in gasoline to make a fuel with higher octane rating. This is being done in a small way — in most parts of the US it’s possible to buy “pure gasoline”, with no -ols in it — but it requires changing the refining process and produces less fuel per barrel of oil input, so it’s more expensive.
 The thing under the floorboards beneath your feet is a strong vessel in which a chemical reaction occurs, promoted by a catalyst. It is therefore a catalytic reactor, by definition. The term “catalytic converter” is a mealymouthed euphemism designed to keep Greenies and other ignoramuses from riffing on “well dayum, reactors is noocular, git that thang away from me!” Confuse the bastards. Use the right terminology — it’s even Yuropeen (and therefore sophisticated) to do that.
 Which is silly. LNG is delivered to places like Boston because of the perception that pipelines are dangerous, which isn’t a foolish concern — if they break it’s a problem (to put it mildly) — but a tanker full of liquefied natural gas contains a lot of energy in a small space, and if one should ever catch fire and explode in a harbor next to a city full of people, it’s going to make the Texas City disaster look like a squib.
 This seems to me strong support for “abiogenic petroleum”, a.k.a. “the Gold hypothesis” after Thomas Gold, the scientist who proposed it. To a very close approximation the Universe is made of hydrogen, and carbon is one of the main byproducts of the fusion process that makes stars hot and bright. Methane, natural gas, is the simplest combination of the two, and is even found in interstellar clouds. The Gold hypothesis is that when the Earth formed from a cloud of particles around the Sun, hydrogen and carbon were incorporated into it and formed methane; oil then occurs because heat and pressure forced methane molecules to combine into heavier fractions. This theory is pooh-poohed because we find microbes in oil, and it’s assumed that the microbes ate the carbon and produced petroleum; to me it seems equally reasonable to assume that the microbes found the oil and went “o yum, food!” (thus adding to the effects of heat and pressure to form heavier molecules). If the Gold hypothesis is true there is methane, and therefore oil, literally everywhere; there is no possibility of running out, because we’ll run out of oxygen to burn it with first.
Our belovéd Lege is thinking about raising the speed limit on certain roads here in Texas to 85 MPH. The proposal has passed the House, and now goes to the Senate for approval. It’s attracted a little attention nationally, but no huge splash — which makes sense; we already have Interstates out West with 80 MPH as the posted limit.
In a perfect world, posted speed limits would be a stupid regulation from a safety standpoint. The only rule that actually makes sense is “maintain a safe and reasonable speed,” which takes in all the factors — weather, vehicle and road conditions, and the capabilities of the driver. Posted speed limits encourage idiocies like people driving 60 MPH on an icy road in traffic; when the inevitable crunch occurs, they point at the sign and protest, “Hey, it’s 65 here! I was well under the limit!”
Unfortunately we don’t live in a perfect world. “Safe and reasonable” runs up against the fact that most people have truly lousy judgement of things like momentum, and don’t think much about vehicle maintenance beyond whether or not it starts in the morning; and it leaves the whole mess up to the discretion of individual police officers, which is a great way to get arbitrary oppression institutionalized. Posted speed limits are about the only suitable compromise, but they ought to be set according to reality, which as a rule they are not. I could easily show you several places where Texas “Farm to Market” tertiary roads cross the Interstate; the speed limit on the wide, flat, open, divided, limited-access four-lane is 65, and the sign on the narrow, near-shoulderless, two-lane road infested with farmers pulling hay balers at 10 MPH says 70 in the daytime. That sort of thing encourages people to ignore the limits, because they’re obviously set by people who don’t know what the f* they are doing.
Increasing the speed limit always draws the ire of well-meaning nannystaters, subtly or blatantly encouraged by people with skin in the game, predominantly local Justices of the Peace (who see their speeding-fine revenue vanishing) and insurance companies. The insurance companies point out, quite reasonably and truthfully, that higher speeds result in more accidents, and that even if that weren’t true the accidents that do happen will be more severe. Kinetic energy goes by the square of the speed, so a change from 80 to 85 MPH is a 6% increase in speed but a 13% increase in accident severity — for which they have to pay. The rational answer to that would be to let them price it out: people like me, who drive old cars with minimum insurance coverage, should have to sign in blood that they’d keep the speed down, and people with proper equipment and plenty of money should be able to pay for the privilege of going as fast as they like.
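Those percentages are just the square law at work, and take two lines to verify:

```python
# Severity scaling: the energy of a crash grows with the
# square of the speed, while the speed itself grows linearly.
v_old, v_new = 80, 85
speed_increase = v_new / v_old - 1            # 0.0625 -> ~6%
severity_increase = (v_new / v_old) ** 2 - 1  # 0.129  -> ~13%
print(f"{speed_increase:.1%} faster, {severity_increase:.1%} more energy")
```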
What I propose is an extension of that principle. There’s no particularly good reason to put a speed limit below 65 or 70 MPH on the open Interstate, especially out West where “open” is ironic understatement, but allowing speeds higher than that tends to increase accidents and accident severity because of the “icy road” phenomenon — people who don’t have good cars and/or aren’t good drivers will go that fast anyway, because they don’t have the knowledge and judgement to determine a “safe and reasonable” speed and trust Big Brother to put reasonable numbers on the sign. So set the posted limit relatively low, and introduce a new class of licensing.
Call it the “unlimited permit”, and model it on the concealed carry laws. The vehicle has to pass a real, stringent inspection for things like brakes, tires, steering accuracy, and the like, instead of the present “inspection” system, which amounts to “yup, all four wheels are there, that’ll be $14.50, please.” Cars that make the grade get a special license plate, perhaps with a different-colored background to make it easy for the police to distinguish them from the general ruck, or a distinctive medallion to attach to a standard tag. Drivers have to pass a serious course in how to go fast safely, including skidpad action and familiarity (perhaps using simulators) with how things go at high speeds on the highway. People who pass that course then pay a moderately exorbitant license fee, and get special placards to be displayed fore and aft for the edification of law enforcement, and which they’re required to remove when lesser drivers operate the vehicle. An “unlimited” driver in an “unlimited” vehicle then is subject to the safe and reasonable rule; how fast he or she can go is between him or her and the insurance company. Crucially, drivers with “U” permits would be held to a much higher standard than everybody else. Lapses in judgement like whipping around school buses at eighty or going 60 on an icy road would incur penalties much more severe than for drivers with lower grades of license, ranging from loss of unlimited privileges to jail time, because they’re supposed to know better than that.
People would go for it like coyotes after a deer carcass, especially those whose jobs keep them on the road a lot. From San Antonio to El Paso is over 550 miles of damned near nothing but flat straight road, and takes seven and a half hours under the present limits; cranking it up to 100+ cuts two full hours off that time, and that would be worth a lot to some. It could easily become a modest but significant source of revenue, especially if you extended it to people with out-of-state driver’s licenses. They’d still have to pass the inspection and the course, but could stop in at the Welcome Station, show the paperwork, and pick up their license medallions and placards, thereafter driving fast if they cared to. How much would the testing staff at Car & Driver pay for the privilege of driving Ferraris at full bore without having to deal with annoyed Department of Public Safety troopers? You tell me, but I’ll bet it’s a lot. It would, after all, be deductible as a business expense…
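The time savings is simple arithmetic. A sketch, taking the 550 miles and seven-and-a-half hours from the text and assuming a steady 100 MPH average, which is optimistic but illustrates the point:

```python
# Trip-time arithmetic for the San Antonio-El Paso example.
miles = 550
hours_now = 7.5             # quoted in the text for the present limits
hours_at_100 = miles / 100  # 5.5 hours at a steady 100 MPH
saved = hours_now - hours_at_100
print(f"{saved:.1f} hours saved")  # 2.0 hours
```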
The best effect of that, though, would be better driver education for more people. An unlimited permit would be like catnip to young drivers — at age 20 I would have jumped through some pretty tight and fiery hoops for one — but to get it, they have to pass the course in how to do it safely, including how to decide whether the car they propose to crank up to warp factor 9.2 is suitable for that application. That would be an enormous advance over the present system, in which a 16-year-old demonstrates the ability to keep it between the ditches and is handed a permit to do 80 (or, now, 85) in whatever car they can buy or borrow. It doesn’t take much road experience to realize that literally anything which would encourage people to learn more about driving, cars, and speed would be an improvement over the way we do it now.
Ain’t gonna happen, of course. But a fellow can dream.
Addendum: It occurs to me that requiring a vehicle inspection means the permit isn’t truly unlimited. Call the inspection-required form “S”, for “Speed”; the course for Unlimited includes how to inspect the vehicle for suitability, including accepting the liability incurred thereby. You wouldn’t get many of those, but the ones you did get would be really good drivers.
Rebels and insurgents need communications. One of the problems with cell phones and WiFi is that they depend on base stations — somewhere there has to be a transceiver that’s hooked to the Internet or the phone switching fabric (or both, or the two may be one and the same), and if you’re a rebel depending on electronic communications you’re SOL when the Big Guy shuts the towers down.
The Economist has an article on guerilla telecommunications, things people can do to restore access when Maximum Leader pulls the plug. Lots of them are pretty clever, and will get you “on the air” in unlikely circumstances. You still need a tower somewhere, though; maybe the neighboring country will provide. But what if the neighbors like your Maximum Leader, or are in cahoots?
A commenter at Moe Lane’s place notes
…a possible capability of the US military (specifically the Air Force’s EC-130J): forcibly offering telecommunications and internet access to restive, rebellious populations.
I don’t know whether that’s true or not, but it’s certainly plausible. Military equipment doesn’t shrink as fast as civilian stuff (for good reasons not germane here) but it does shrink, and the smaller, lighter, and more power-efficient it gets the more you can stuff on an airplane, including an EC-130J or an E-3 Sentry. But better yet…
The actual equipment at a cell phone base station is ridiculously small, even though they have plenty of room to spread out in those prefab concrete buildings. It’s a couple of “relay rack” cabinets, not all that different from what you’d see in your server room if you work for a medium-size company or bigger. How hard would it be to shrink it a bit and stuff it into a Global Hawk, or one of the other medium-to-large UAVs they’re building nowadays? Of course there’s always the chance that Maximum Leader’s goons will shoot it down, but that’s just an excuse for a real “no-fly” zone instead of the thinly-disguised air assault that’s currently going on in Libya — and the whole point of Global Hawk is that it’s way up high, hard to get to, and hard to find even if you can get your SU-27 up there.
Subscriptions, number assignment, and the like are problems, but I’ll bet they’re solvable. Maybe less-capable UAVs could airdrop bundles of valid SIM cards, like they did with leaflets in WWII. There are very likely other ways. Not every country in the world uses the American system of phones “locked” to a single provider.
The Fifties SF writer Eric Frank Russell wrote a whole series of stories about the disruptive effect of minor pinpricks in a monolithic system, notably Wasp and Next of Kin (The Space Willies). Charles Stross‘s novel Singularity Sky opens with the invaders dropping millions of free cell phones (and later much else, but that’s pretty disruptive by itself). If we want to give a big boost to rebels and insurgents, assuring them reliable telecommunications free of eavesdropping (by Maximum Leader & Co; of course we’d listen in, in order to keep current with the situation) would be one way to do it — and since it doesn’t involve horrid nasty guns, at least some of the leftoids ought to be able to tolerate it.
It could even be done privately, or semi-privately. Yeah, airplanes are expensive, but not that horribly so in a world where lots of people make $100K and above a year — we see shrieks about OMG a $2 million private jet!!!11!eleventy!!111!, but really, that’s about what Minimum Leader will spill on his vacation in South America. There’s no particular reason why a consortium of like-minded individuals couldn’t put together an approximation using a commercially available aircraft — anything reasonably suitable would already have an autopilot, and UAV software isn’t that hard if all it has to do is a “racetrack” over the affected area. Or, heck, hire the Russians. Aviation is one thing they do really well, and cheaply.
Go for it. The concept is free under the GPL.
Is there a musicologist in the house?
I could use some help (no, not money, though if you’re so inclined I will thank you sincerely). It’s clear that I’m not going to teach myself music theory over the Web in any reasonable time, and I have a music project I would dearly like to see done.
What I want is to re-score the old Mac Davis tune “Oh Lord It’s Hard To Be Humble” for brass band, J. P. Sousa version, then feed it to a synthesizer program like MuseScore and get a playable file. Possible? Reasonable? Pointers? If you think you might be able to do it, comment and I’ll contact you by email to tell you why I want it.
We need some logger rhythms. (We pause briefly while you ROFL.)
Generations of students have rejected logarithms because they’re just too hard. It used to be true, too — actually calculating a logarithm is extremely difficult, and has always been relegated to specialists who did little else. Those specialists created “log tables” that could be used to look up logarithms, but using those was an intricate task because the tables were necessarily made as short as possible, in order to keep from using up all the paper and ink in the world to print them, and finding the logarithm of the number you were looking at in the table took a good bit of error-prone ingenuity.
That’s no longer the case. Calculators abound, and in all but the cheapest getting the accurate logarithm is a matter of pushing a button. Call up the Windows Calculator, choose [View] –> Scientific, and what comes up has two buttons for logarithms, [log] and [ln]. Put in an ordinary number, press [log], and there’s the logarithm. Put in a logarithm and press [Inverse] [log], and there’s the ordinary number.
There’s a tsunami of numbers washing over us — budgets, test scores, polls, prices, and a thousand other things. The people who calculate and publish them are anxious to make points, and use all kinds of tricks to emphasize the differences they think are important. Logarithms can help you cut through the fog, because
Logarithms tell you if the difference is important.
The logarithms most people encounter in normal life are the Richter scale for earthquakes and the “decibel” scale for sound (and a lot of other things). Earthquakes are reported directly as logarithms, decibels are logarithms multiplied by ten to avoid using the decimal point. In both cases, the important fact is that a difference in the second digit doesn’t matter much, where a difference in the first digit matters a lot — the difference between a 5.2 earthquake and a 5.3 is barely detectable to the folk who encounter it, but the difference between a 5.2 and a 6.2 is really significant, and 7.2 breaks things and kills people; the difference between 82 decibels (abbreviated dB) and 83 dB is just barely noticeable, 92 dB goes from “loud” to “really annoying”, and 102 dB is “possible hearing loss”.
Logarithmic scales have a starting point or “origin”, but unlike plain numbers (“linear scales”) the origin point doesn’t matter for purposes of comparison between two numbers. The decibel scale was created with exactly that in mind. The researcher tested a lot of people, figured out the minimum loudness difference they needed to say one sound is louder than the other, and assigned that as 1 dB. Remember that decibels are logarithms multiplied by ten, and go to your calculator; put in 0.1 and press the [Inverse] and [log] buttons. The result is a long number, 1.2589 and a lot more digits. Round it off to 1.26; in order for most people to hear the difference in loudness, the actual difference in sound power has to be about 26%. The base point, or zero dB, was set at the softest sound most people could hear at all, but that’s arbitrary.
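That calculator exercise is easy to check in a couple of lines of Python; `math.log10` plays the part of the [log] button, and raising 10 to a power plays [Inverse] [log]. A quick sketch:

```python
import math

# Undo the log: 1 dB is a log difference of 0.1, so the power ratio is 10^0.1.
one_db_ratio = 10 ** 0.1          # about 1.2589
print(f"1 dB = about a {100 * (one_db_ratio - 1):.0f}% change in sound power")
# prints: 1 dB = about a 26% change in sound power
```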
The ratio in power between the minimum audible sound and permanent hearing damage is about 1,000,000,000,000 — one trillion. That’s a big number, cumbersome to use; what’s more important is that it can be misleading. A difference of 26% is 260,000,000,000 or 260 billion, which looks like a really big difference. Convert to dB, and it’s 120 dB (one trillion) versus 121 dB (one trillion 260 billion). Now the difference looks trivial, and it is. You’re already bleeding at the ears at 120 dB, and adding one more decibel is barely noticeable, either to you or to your doctor.
Now let’s apply that principle to the national debt and deficit cutting. The deficit is roughly one and a half trillion dollars. [log] gives 12.17, or about 122 dD (“decidebts”). The proposed budget cuts amount to about 60 billion; [log] gives 10.78, or about 108 dD. $60 billion sounds like a lot of money, and it is for you and me, but 122 – 108 = 14 dD, which doesn’t look like much — and it isn’t. An engineer would say that the smaller number is “down” or minus 14 dD from the larger one. Put -1.4 into your calculator, press [Inverse] [log]: 0.0398, or less than 4% of the basic problem. If you’re bleeding to death, stanching the wound by 4% won’t help you much.
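The same arithmetic, sketched in Python (the dollar figures are the round numbers from the text above, not official budget data):

```python
import math

def deci(x):
    """10 times the common logarithm: the 'decibel' trick applied to any number."""
    return 10 * math.log10(x)

deficit = 1.5e12          # roughly $1.5 trillion
cuts = 60e9               # roughly $60 billion in proposed cuts

dd_deficit = deci(deficit)        # about 121.8 dD
dd_cuts = deci(cuts)              # about 107.8 dD
gap = dd_deficit - dd_cuts        # about 14 dD "down"

# Undo the log to see the gap as a plain ratio: about 0.04, i.e. under 4%.
ratio = 10 ** (-gap / 10)
print(f"{dd_deficit:.0f} dD vs {dd_cuts:.0f} dD; cuts are {ratio:.1%} of the deficit")
# prints: 122 dD vs 108 dD; cuts are 4.0% of the deficit
```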
2009 4th Grade Math
White students: North Carolina 254, Texas 254, Virginia 251, Wisconsin 250, Georgia 247, South Carolina 245, (national average 248)
North Carolina 254, [log] = 2.405, 24 dE (“deci-educations”)
Texas 254, [log] = 2.405, 24 dE
Virginia 251, [log] = 2.399, 24 dE
Wisconsin 250, [log] = 2.398, 24 dE
Georgia 247, [log] = 2.392, 24 dE
South Carolina 245, [log] = 2.389, 24 dE
Nat’l Average 248, [log] = 2.394, 24 dE
How remarkable. There is no difference at any level that makes a difference.
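The whole table takes only a few lines of Python to verify:

```python
import math

# The 2009 4th-grade math scores from the list above.
scores = {
    "North Carolina": 254, "Texas": 254, "Virginia": 251, "Wisconsin": 250,
    "Georgia": 247, "South Carolina": 245, "National Average": 248,
}

# 10 * log10(score), rounded to whole "deci-educations".
de = {name: round(10 * math.log10(s)) for name, s in scores.items()}
print(sorted(set(de.values())))    # prints: [24], every entry rounds to 24 dE
```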
Try it with some of the numbers you encounter. It’s a great way to cut through the fog and see what (if any) real differences exist.
¹ They’re both “logarithms”, but done to different “bases”. They give different numbers when pushed, and the difference is hyper-important to mathematicians, but both make the same comparisons; pick one and stick to it, and you’ll be fine.
² Now replaced by the “Moment Magnitude” scale. The difference is important to people in the business, but when reading the paper or a blog it doesn’t matter.
³ Not really, but close enough to be useful. The Wikipedia article is much better if you care to wade through it, but it will just confuse a lot of people.
⁴ The nice way of saying “Ain’t that some shit?”
The caracoling in Mad City has drawn attention away from the launch of Space Shuttle Discovery on mission STS-133, delivering supplies to the Space Station, including the necessary parts to make the new room usable. That’s as it should be. In the original vision for the Shuttle it was supposed to be a utility vehicle for getting people and things into space, used so often that individual missions were relegated to brief mentions in the inside pages of the paper, next to the lingerie ads. As we all know, it didn’t work out that way. “Turnaround” for the orbiters has from the first been an elaborate, expensive matter, so much so that NASA was more or less forced to try to make each trip dramatic and newsworthy. Having the Shuttle shoved off the front page by political shenanigans fits the original idea much better.
I’ve been to Florida reasonably often, but somehow never managed to get to the Cape to see a rocket launch, though I did do the tour once when nothing was visibly happening. My brother John (RIP) went once, and was most impressed. (He was also a NASCAR fan, and loved loud noises.) Now it looks like I’ll never get the chance, at least for Shuttle. If this mission goes off without disasters Endeavour will do STS-134 in April; if they can find enough loose change behind the couch cushions Atlantis will do STS-135 sometime this summer. That’s it. No more Shuttle.
Many people are lamenting that. Chuck Gannon on Facebook wails, “what reason validates the decision (or apathy) that has let this capability slide away from us?” He concludes that it’s truly unfortunate, and many agree. For myself, I’m not so sure.
Let’s face facts squarely, boys and girls: Mercury/Apollo was a publicity stunt. It may very well have been a necessary political stunt; at that time, it wasn’t at all clear just how destructive of wealth a full-on socialist system would be, and many people took Nikita Khrushchev at his word. Khrushchev wasn’t threatening war, he was promising that the Politburo would deliver such prosperity to the Russian people and their fraternal socialist brethren that the West would be eclipsed by the tide. As part of the accompanying propaganda the USSR was doing very well at what can only be described as a Potemkin space program, with lots of visible spectacular achievements, and it was important for international prestige that the United States establish that it could do it bigger, better, and with flashier paint jobs. This was duly accomplished, and although the budget for it was huge by any other standard, in comparison to the GDP or even the Government budget of the United States it was trivial. The United States could go to the Moon on pocket change and walking-around money; the USSR never got there at all, despite depriving its people of many comforts in order to try.
It might have been better if American politicians of the time had noted that Nikita Sergeyevich was being squired around in a ’57 Packard with Cyrillic badges. The Soviet Union, from inception to end, was like the Western towns in old movies — tall imposing front, little better than huts behind the façades. There is no doubt the space program was a magnificent achievement, but it came at a cost that wasn’t measured in money and hasn’t really been accounted for to this day.
When 1960 rolled around the United States’s aeronautical industry was varied and vibrant; jet passenger planes were fully on-line, there were experiments with ducted-fan and Coanda-effect fliers, and the several manufacturers seemingly came out with something new every week. Space, too, was getting a lot of attention: the Air Force was experimenting with lifting bodies and beginning Dyna-Soar, X-15 missions were regular, and there were dozens if not hundreds of other projects going on. Of particular interest to me are Project Pluto and SNAP, attempts to use nuclear reactors instead of chemical rockets — no, it wouldn’t have irradiated anybody; the stuff coming out would’ve been water, which doesn’t get radioactive, although Pluto was a nasty concept. In engineering labs and “skonk works” from Long Island to San Diego, scientists and engineers and techs were thinking about space and how to get there, and many of them were building models and looking for funding for the real thing.
But we had to get to the Moon, and we had to do it right away, and all of that, all of it, got sacrificed to meet that goal. Mercury/Apollo was a huge challenge, barely doable with the technology of the time; NASA lured (or shanghaied) as many of the top people as possible for the crash program, and the contracts they let for the parts that weren’t within their capability brought most of the other aerospace manufacturers into the program — or destroyed them if they didn’t go along. A truly gigantic fraction of the total aerospace expertise in the United States got diverted to the official space program, and a sizeable chunk of Canada’s as well, all buoyed up on a flood of Federal funding. The other projects got miniaturized, slowed down, or ended totally. The only way to get to space was with a NASA meatball emblazoned somewhere.
I don’t believe many — possibly any — of the scientists and engineers and techs and support people realized what was going on at the beginning. They were all either starry-eyed at the prospect, or grumpy realists who were paid to make it work and would do so if possible, and almost all of them were susceptible to the romance of the proposition. The clues came early, and we all, scientists and fanboys alike, missed them. There was going to be a Moon base — no, too complicated to do in time; the guys would just go and come back. There was going to be a Space Station, probably not a huge wheel like in 2001 but building toward that — no, too slow and too expensive; we’ll just build a great big rocket to get them up there at once. The ultimate futility of Mercury/Apollo can be summarized in a single sentence: It left nothing behind that was usable later. Even the leftover Saturn rockets, towering achievements of power and grandeur, were too big and too expensive to use for launching probes, and were broken up or used as museum displays.
When the movie’s finished, the sets get broken up and trashed or stored so they can be re-painted for another one. That’s where we were in 1969. We had film and video tape of American footprints on the Moon, surrounded by junk that was of no use to anybody, and enormous installations on Earth that had no purpose but to support single-use equipment that had no purpose going forward. Meanwhile all the other projects, the ones intended for incremental achievement of things that might have been used over and over, were dead and gone, dusty relics in the backs of hangars and closets. It was the biggest, most extravagant movie ever made. Now it was done. What now?
Engineering and design departments at aerospace manufacturers, now missing the flood of money they’d adapted to, started cutting back, and one by one the manufacturers themselves started falling. Projects that had once been cheap and quick — Lockheed built the SR-71 from scratch in four years, and came in under budget — turned into long, drawn-out, often futile wastes. NASA had had to get a complex task done quickly, and do it with as few mistakes as possible; they set up a bureaucracy to support that, and procedures that had to be followed, including by contractors. Aerospace firms adopted them perforce, and (because they worked, at least at first) extended them to their other lines. In the Fifties, engineers would make up drawings and the shop would start bending tin. In the Seventies, it all had to be done with PERT charts and planning meetings and part certifications and inspections at every point, NASA style. The product wasn’t airplanes, it was reports about airplanes, reports that had to predict the future — and since that isn’t possible, it meant backtracking and re-doing and more reports.
Worse, from the point of view of the Senators and Congresscritters and their hangers-on, the flood of money dried up. Bureaucrats battling for budget in a permissive environment can find many creative ways of directing the funds they control as is needed to firm up support for their efforts; the funds weren’t there any longer, but the NASA bureaucracy was firmly in place, and like any bureaucracy was unbudgeable without atomic weapons, or at the very least a ninja raid. Within NASA there were still a majority of people who believed in space and space exploration, but they had learned The Procedure too well and had become tech-oriented bureaucrats. They set off to build the Space Shuttle, but ran into two roadblocks: The Procedure had become too cumbersome, and they had no Grand Story to tell.
A lot of the commentary about space laments the lack of a grand story, not least within NASA itself. They kept trying the publicity stunt approach ‘way past the point where it had succumbed to the Law of Diminishing Returns, trying to build a new Grand Story that would get them back to the glory days of Apollo and near-unlimited funding. Bluntly: It didn’t work, and it shouldn’t have. If space is useful at all — which I think it is, but there are those who don’t agree — it has to be utilitarian. It can’t depend on a Grand Story to keep it afloat. It’s just my job, five days a week, a rocket man, Elton John sang, and that’s the way it has to be, or it’s just a stunt, too expensive a stunt to do with public funds.
And there was The Procedure, which by then had become Holy Writ. NASA proposed the Space Transportation System, and the aerospace firms responded — with reams and reams of paper and stacks and piles of Vu-Graph® slides, because PowerPoint hadn’t been invented. Corporate bureaucrats delivered bureaucrap to NASA bureaucrats, who dutifully digested it and produced more bureaucrap. It was delightful — things were getting done! Look how busy we all are! — and it was expensive — hey, it takes a long time to do due diligence on something this complicated — and it took a while to realize that no metal was getting bent. Even the supportive Congresscritters looked at rooms of people filling out forms instead of hammering on shiny titanium or big roaring flames, and had a hard time supporting it. No Grand Story and nothing actually getting built translated into funds gradually drying up.
It was in this atmosphere that Shuttle was developed, and you can see the traces of Apollo in every step. It was supposed to be small, so we could have lots of them and use them often; no, it needs to be big for lots of plausible reasons. It was supposed to be flown, like an airplane; no, that’s too complicated, we need to spend a lot of money building computers to control it. It was supposed to be metal, with a disposable, renewable ablative coating; no, that doesn’t fit the ideal of “reusability”, so the belly has to be exotic ceramic. It was supposed to have a “flyback booster”, a big thing that carried it partway to orbit and came back to land and be used again; no, if the orbiter’s big the booster has to be gigantic and we can’t do that, so it got whittled down, first to a disposable booster, then to bigger (and more expensive) engines on the orbiter so the booster didn’t have to be so big, then finally to solid rockets that the builders promised (faithfully, cross my heart and hope to die) would be cheap and reliable. Echo answers hollow…
Meanwhile the little projects, the incremental projects, the Dyna-Soars and SNAPs, never got started again. NASA didn’t have a Big Story for the general public, but it had an internal Big Story: make shuttle work. If there was money for space, Shuttle got it. The scientists who wanted probes cried out with loud and plaintive voices, and got enough diverted to JPL, etc., to do some wonderful things, but they were sucking hind tit and knew (and resented) it. When Shuttle went operational, it got worse, not better. Every time it landed it had to be rebuilt from scratch, for all practical purposes — the super duper last-forever tiles, no two the same size or shape, had to be inspected one by one (without breaking them, mind you, and you can ruin one with your thumbnail), and often enough taken off and reglued, and that was just the tip of the iceberg. Cynics grumped that it would be cheaper to build a Saturn and throw it away every time, and I’m not sure they were wrong, but NASA had the bit in its teeth. Shuttle was reusable, and reusable was the only way to go, and they had a not-so-small army dedicated to making it reusable, broken or no, and every cent they could scrounge went to support that effort. Small incremental projects? We’ve got no time or money for that piddly shit.
Now it’s over. Instead of bitching about President Obama “yielding our dominance in space”, you should be breathing sighs of relief. (It’s not his fault anyway; the Bush Administration decreed it, and like everything Barry-O does that actually works, he just went along.) The Ares component of the Constellation program was a hopeless-from-the-start attempt to re-use Shuttle components in a new configuration and save all that lovely pork; the deader it is, the better off we are.
In the meantime, surprise surprise! Suddenly we have a various, vibrant aerospace industry, coming up with new and different ideas and trying them out. They’re mostly shoestring operations or sucking a little of the fat out of LockMart or Boeing, and none of them can manage the pure echoing thunder of a Shuttle launch, so they’re not going to be the lede on the seven o’clock news very often. They have funny names, Xcor and Bigelow and Armadillo (Armadillo!) and more. Some of the things they try work, and some of them go down in flames; when s*t happens, they try again. They all use The Procedure, but in the early, streamlined version that actually did put a man on the moon, not the cumbersome, all-paper-no-rockets version that turns Research and Development into alternate ass-kissing and ass-covering.
Bottom line: We just pissed away fifty years. It’s 1961, and there are dozens of little space-oriented things going on, some of which might work, some of which surely will not — but we won’t know which is which until we try them. The new guys don’t need or want a Grand Story; what they need and want is a trickle of help. If NASA went back to what it was in 1961, doing research available to all comers and helping choose (and fund) likely things to try, a fraction of the budget for Shuttle would keep the new little guys fat and happy. On the other hand, trying to shove everything they do through 300 E St. SW will strangle them in the cradle. Let’s not do that. When Discovery comes back, let’s shove a steel pylon through its guts and stand it up in front of the Air & Space Museum like the obsolete fighter planes that used to grace the entrances to Air Force Bases; Endeavour can go up similarly outside the Cape (superfluous vowel and all), and Atlantis can go to Pasadena, where the probe scientists can sneer at it.
It pisses me off. When I was thirteen I expected to be able to go to the Moon, or at least the Space Station, before I kicked the bucket, and that isn’t going to happen, because we have to start over. So let’s do start over, hmm? And do it right this time, bit by bit ’til it’s a business, not an extended Hollywood extravaganza with a Federal bureaucracy designing the sets. Ad astra per aspera! (per pecuniam was a dead end.)