A Daily Diatribe by a Pompous Git

Who is that fat bastard? A Sturm's Eye View, Guaranteed Free of Harmful, or Potentially Harmful Chemicals -- but Watch Out for the Ideas! Some of them are Contagious! 

A journal of sorts to record Jonathan Sturm's (and others') thoughts and observations on things worth thinking about. Feedback welcome, but be aware that unless you prominently say you want your communication kept private, I may publish it.


Monday 29 April 2002

The Pompous Git's energy levels have been at a low ebb recently. I could pretend that it was due to a very minor cold virus that affected me for most of last week, or the aches and pains that are the legacy of a misspent youth, but I suspect that would be self-deceiving. The Git's real problem is most likely the lack of focus now that The House of Steel is completed. It's not that there's any shortage of things to do: there's the cottage renovation to finish, firewood to cut, minor bits of finish on THoS, a client website to create, the House of Steel book to finish, the resurrection of the vegetable garden... All relatively trivial compared to the realisation of a thirty-year dream of building a house.

I just read Simon Singh's book, Fermat's Last Theorem, and suspect that the hero, Andrew Wiles, felt much the same as I do right now when he finally proved Fermat was correct. Doubtless he has found some other metaphorical mountain to climb and doubtless I will too. The real problem is choosing the next mountain: learning how to program a computer beyond having it print "Hello World" to the screen, coming to grips with differential calculus, becoming a Linux savant, learning to speak Japanese, advancing my musical skills beyond those of Dave Lister, going to Afghanistan to defuse land-mines... There are many options that come to mind, just nothing compelling at this point in time.


From the Completely Stupid Programmers Department at my ISP:

Usage Summary, as at 28 April 2002
Accumulated Connection Time 328 hours
Total Remaining Hours 72 hours
Current Daily Average Use 12 hours
Suggested Daily Average Use 36 hours

I really wish there were a way to get 36 hours in a typical day. This is an example of a lack of bounds-checking, not dissimilar to the data used to "prove" that anthropogenic Global Warming exists. Here are some data behind those graphs that IPCC and others use:


Station                   Date        Temperature
Bogurany, USSR            Jun 1982    110.8 C
Tamanrasset, Algeria      Jan 1995     96.0 C
Fort Hope, Canada         Aug 1919     99.9 C
Vitim, USSR               Jul 1990     78.6 C
Tomsk, USSR               Jan 1981     76.9 C
Minnsinsk, USSR           Sep 1982     72.9 C
                          Apr 1982     73.1 C
                          May 1980     80.3 C
Bemako, Mali              May 1995     50.0 C
Beirut, Lebanon           Feb 1868     93.2 C
Kidal, Mali               Apr 1972     50.7 C
                          Jun 1977     79.2 C
                          May 1977
Gasan Kuli                Nov 1977    109.4 C
                          Mar 1978     73.4 C
Hatanga/Khatanga, USSR    Dec 1979     28.8 C
Lhasa, China              Mar 1996     84.0 C
Djibouti, Somalia         Nov 1994     69.0 C
Balhas/Balkahash, USSR    Dec 1994     31.1 C
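The missing check is simple to state in code. Here is a minimal sketch in Python (purely illustrative; the bounds and the helper names are my assumptions, not anyone's production code) of the bounds-checking whose absence lets readings like 110.8 C through:

```python
# Hypothetical sanity filter of the kind evidently missing from both the
# ISP's billing code and the climate records above: reject values that
# fall outside physically plausible bounds before averaging or plotting.

# Rough bounds: no surface station has recorded much below -90 C
# (Vostok) or much above 57 C (Death Valley).
LOW, HIGH = -90.0, 57.0

def plausible_temperature(celsius):
    """True if the reading lies inside the plausible surface range."""
    return LOW <= celsius <= HIGH

def filter_records(records):
    """Split (station, value) pairs into accepted and rejected lists."""
    accepted, rejected = [], []
    for station, value in records:
        (accepted if plausible_temperature(value) else rejected).append((station, value))
    return accepted, rejected

records = [
    ("Bogurany USSR", 110.8),   # impossible, should be flagged
    ("Bemako Mali", 50.0),      # hot but plausible
    ("Hatanga USSR", 28.8),     # plausible
]
accepted, rejected = filter_records(records)
# rejected now holds the 110.8 C reading; the 36-hour "suggested daily
# use" above would fail an equivalent check against a 24-hour day.
```

The same one-line comparison against a known physical limit would have caught the ISP's 36-hour day before it reached a customer.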

Thought for the day:

This is my depressed stance. When you're depressed, it makes a lot of difference how you stand. The worst thing you can do is straighten up and hold your head high because then you'll start to feel better. If you're going to get any joy out of being depressed, you've got to stand like this.

Charlie Brown

Current Listening

Leonard Cohen -- Songs of Love and Hate


Tuesday 30 April 2002


Gear Magazine March 2000

In 1996 a scientist claimed he'd found a way to defeat AIDS. In the wave of euphoria that followed, a batch of new drugs flooded the market. Four years later, those drugs are wreaking unimaginable horror on the patients who dared to hope. What went wrong?

"Mathematical models suggest that patients caught early enough might be virus-free within two or three years."

David Ho, Time concluded, delivered "...what may be the most important fact about AIDS: it is not invincible."

Based largely on a single paper -- Ho's 1995 paper -- protease inhibitors received lightning-quick FDA approval and poured onto the market. The mass media declared AIDS to be "over," albeit with a question mark floating overhead. A new euphoria filled the air, and David Ho spawned a multibillion-dollar drug industry.

Amidst the excitement, something was overlooked.

Ho's mathematical model was wrong.

There's much more about what various researchers are describing as "the missing virus" here where I found Celia Farber's piece. It's far from the usual paranoia site that is all too easy to find on the web. Many of the writers are concerned academics.

I find disconcerting parallels with the Greenhouse Industry, where massive government funding goes only to supporters of the mainstream hypothesis, labelled a "consensus". And it's not only the government behaving this way here, but the drug industry as well. Many researchers had excellent reputations until they refused to "jump on the bandwagon".


Doubtless you will have read by now:

First quarter of 2002 is "warmest for a millennium" in New Scientist. The National Climatic Data Center reports: "The average temperature in March 2002 was 40.5 F. This was 1.8 F cooler than the 1895-2002 average, the 33rd coolest March in 108 years. The temperature trend for the period of record (1895 to present) is 0.1 degrees Fahrenheit per decade."


Karl Popper in Conjectures and Refutations:

1. It is easy to obtain confirmations, or verifications, for nearly every theory -- if we look for confirmations.

2. Confirmations should count only if they are the result of risky predictions; that is to say, if, unenlightened by the theory in question, we should have expected an event which was incompatible with the theory -- an event which would have refuted the theory.

3. Every "good" scientific theory is a prohibition: it forbids certain things to happen. The more a theory forbids, the better it is.

4. A theory which is not refutable by any conceivable event is non-scientific. Irrefutability is not a virtue of a theory (as people often think) but a vice.

5. Every genuine test of a theory is an attempt to falsify it, or to refute it. Testability is falsifiability; but there are degrees of testability: some theories are more testable, more exposed to refutation, than others; they take, as it were, greater risks.

6. Confirming evidence should not count except when it is the result of a genuine test of the theory; and this means that it can be presented as a serious but unsuccessful attempt to falsify the theory. (I now speak in such cases of "corroborating evidence.")

7. Some genuinely testable theories, when found to be false, are still upheld by their admirers -- for example by introducing ad hoc some auxiliary assumption, or by reinterpreting the theory ad hoc in such a way that it escapes refutation. Such a procedure is always possible, but it rescues the theory from refutation only at the price of destroying, or at least lowering, its scientific status. (I later described such a rescuing operation as a "conventionalist twist" or a "conventionalist stratagem.")

Thought for the day:

Science is what we have learned about how to keep from fooling ourselves.

Richard Feynman

Current Listening

Incredible String Band -- 5,000 Spirits Or the Layers of An Onion


Wednesday 1 May 2002

I'm off to the dentist shortly to have the holes left by the removal of two root canals filled. Meanwhile, some interesting reading here about The Pristine Myth, the concept of wilderness unsullied by the influence of mankind, at The Atlantic:

"Many people don't like putting things this baldly, but if there really has been very little "untouched" nature for 10,000 years then it is essentially impossible to go back -- conditions have changed too much. But many well-meaning people find it difficult to come out and say, for instance, "we want tall-grass prairie because we think it's really nice and we like it" -- especially when they're fighting economic forces. So they tend to invent standards, states putatively preferred by natural systems -- wilderness. It's like appealing to a deity, an ecological Ten Commandments that comes from some source outside the fallibly human. Yet if we truly can't return to pristine wilderness, then there's no way around it: we're in charge of deciding how, say, the prairies are going to look. Obviously we don't have absolute control, but we sure have a lot of influence."

Thought for the day:

The most common of all follies is to believe passionately in the palpably not true. It is the chief occupation of mankind.

H.L. Mencken

Current Listening

Spectrum -- Milesago


Thursday 2 May 2002

For many years I used to read each issue of Scientific American with considerable delight. It was exposure to the full gamut of science. It's one magazine that survived being thrown out during several purges of excessive paper and it was only a period of poverty that prevented the continuation of my monthly habit. When prosperity returned, I had a computer and 2400 bps modem to satiate my need for scientific news, so the SciAm habit never returned.

Yesterday, I purchased my first copy for a very long time and what a disappointment it is. Science, it appears, is misunderstood not just by the general public, but by the reporters of science.

Robert Morgan writes:

Apropos your recent discussions.

Survey finds few in U.S. understand science


Ironic that they highlight the fact that most Americans don't understand the scientific process but then ask true/false questions like "the universe began with a huge explosion".

Also interesting how leading the survey must have been, with questions asking how serious a problem is global warming.

And in a follow-up math test, CNN seems shocked that while 57% disagree that UFOs came to earth, 30% believe otherwise. I betcha about 10% are undecided.

The editorial of the March issue of SciAm is an exhortation on the necessity to control HIV, a virus that has yet to be clearly identified and the existence of which is more than somewhat doubtful. Just in case you didn't follow the link I provided on Tuesday, or were overwhelmed by the volume of material, let me summarise.

To establish that a virus causes a particular disease, the procedure is to first isolate the virus as a pure extract, then infect another organism with it that must subsequently display symptoms of the disease. This has not been done with HIV. The purported reason is that the virus is in such small numbers and is too fragile to survive the process of recovery using the usual procedures. On the other hand, it is said to be extraordinarily robust in surviving the procedures used to render blood products safe for use in transfusions. To complicate matters further, unlike any other STD we know of, AIDS passes in one direction only, in the sperm, or blood of an infected person. Unless you happen to live in Africa, where apparently it passes freely from sperm recipient to sperm donor. It would be a remarkable virus that could determine its behaviour from its locality.

AIDS is not what kills the AIDS sufferer; it is usually one of a range of well-known diseases: tuberculosis, pneumonia, Kaposi's Sarcoma etc. When the result of an antibody test that supposedly indicates the patient was previously exposed to HIV is positive, the patient is said to die from AIDS. If the result is negative, the patient is said to die from tuberculosis, pneumonia, Kaposi's Sarcoma etc. Usually, the presence of antibodies is taken as an indicator that the organism is now immune to the disease that stimulated their production. This is the principle behind inoculation, a well-accepted method of disease prevention.

There are alternative hypotheses as to what is happening here, but I do not propose to discuss them here. Suffice to say that science today appears to me more like a conservative corporate enterprise where orthodoxy and appeal to authority rules, rather than the vigorous debate that characterises real science.

Thought for the day:

Authority has every reason to fear the skeptic, for authority can rarely survive in the face of doubt.

Robert Lindner

Current Listening

The Folger Consort -- A Distant Mirror


Friday 3 May 2002

Found in my Inbox:

Water Level History of the U.S. Great Lakes (Reference Larson, G. and Schaetzl, R. 2001. Origin and evolution of the Great Lakes. Journal of Great Lakes Research 27: 518-546.) 

What was done 

As indicated by the title of their article, Larson and Schaetzl review what is known about the origin and evolution of the Great Lakes of North America: Lake Superior, Lake Huron-Michigan, Lake Erie and Lake Ontario. We report on their findings relative to one of the major concerns they discuss, namely, the worry that "increased evaporation under a possible greenhouse-enhanced climate, coupled with even more consumptive use of the Great Lakes waters, could lead to lower lake levels in the near future."

What was learned 

From graphs of lake level fluctuations of the Great Lakes from 1915 to 1998, we note that the lowest levels of the lakes occurred at about 1926 for Lake Superior, 1962 for Lake Huron-Michigan, 1933 for Lake Erie, and 1934 for Lake Ontario. We also note that the longest sustained period of high lake levels for all of the Great Lakes occurred over the last 30 years. In addition, lake levels at the end of the record are essentially the same as those at the beginning of the record.

What it means 

Climate alarmists worry - or claim they worry - that greenhouse-induced warming will dramatically lower the water levels of the Great Lakes. However, over what they claim to be the century that has exhibited the greatest warming of the entire past millennium, there has been no net change in the water level of any of the Great Lakes. In addition, over the past two decades of what they typically refer to as unprecedented warming, the four lakes have exhibited their greatest stability and highest water levels of the past century.

These observations fly in the face of all the climate alarmists' horror stories, suggesting that either the consequences they predict to follow on the heels of global warming are wrong or their global temperature history of the past millennium is wrong... or both are wrong. Based on their poor track record in representing reality, we lean towards the latter alternative.


And here's some very interesting data:

Global cooling?

You can read more of Hans Erren's research findings here, where I found the above graph, and here. Hans writes:

Here is a summary of my findings:

1: The currently used surface data of CRU and GHCN show homogeneity problems when compared with radiosonde data. In particular, the climatic shift of 1977 is not reflected in either dataset.


2: To test the claim that global warming is happening, the difference was plotted between surface radiosonde data and troposphere radiosonde data. Contrary to what greenhouse warming predicts -- a troposphere warming faster than the surface -- there is no warming up to 1991 and then a remarkably sharp cooling.


I use data from Angell: http://cdiac.esd.ornl.gov/trends/temp/angell/angell.html

3: To correct for volcanic eruptions, which cause cooling, their effect was removed from the series. The trend is then zero up to 1994, followed again by a sharp cooling.

http://members.lycos.nl/ErrenWijlens/co2/radiosondeco2diff.gif (volcanic data from http://climexp.knmi.nl/)

4: The sharp cooling after 1994 is attributed to the effect of stratospheric cooling, which also cools the lower-lying atmospheric layers. The data then reveal a slow rising trend, which is attributed to GHG warming. However, the CO2 forcing trend is four times larger than predicted by IPCC, which could be due to the very crude 3-layer atmosphere that was used (Tsurf=15C, Ttrop=-30C, Tstrat=-60C) and the use of an emissivity factor of 1.


The above method of data reduction is used in exploration geophysics to find mineral deposits from gravity observations. Removing what you know gives you something to drill into.
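The "remove what you know" reduction can be sketched in a few lines. This is a toy illustration only, not Erren's actual code, and the anomaly numbers below are invented rather than taken from the Angell dataset:

```python
# Toy version of the reduction described above: form the
# surface-minus-troposphere difference, then subtract a known signal
# (here a modelled volcanic cooling) so any residual trend stands out.

def difference_series(surface, troposphere):
    """Surface anomaly minus troposphere anomaly, year by year."""
    return [s - t for s, t in zip(surface, troposphere)]

def remove_known_signal(series, signal, scale=1.0):
    """Subtract a scaled known forcing from the series."""
    return [x - scale * f for x, f in zip(series, signal)]

# Invented anomalies (degrees C); year 2 contains a volcanic eruption.
surface     = [0.10, 0.12, 0.05, 0.11]
troposphere = [0.08, 0.10, 0.20, 0.09]
volcanic    = [0.00, 0.00, -0.17, 0.00]  # modelled cooling of the difference

diff = difference_series(surface, troposphere)   # shows a dip in year 2
residual = remove_known_signal(diff, volcanic)   # roughly flat, about 0.02
```

Once the known volcanic dip is subtracted, the residual series is nearly flat, which is exactly the "something to drill into" (or, here, the absence of one) that the exploration-geophysics analogy describes.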

I found the following nuggets:

The Global climate trend is dominated by the following factors

- Long-term oceanic temperature variations (like the Pacific Decadal Oscillation)
- Short-term oceanic oscillations (like the El Nino Southern Oscillation and North Atlantic Oscillation)
- Stratosphere temperature
- Volcanic cooling
- GHG warming

A nice comparison of PDO and ENSO can be found here: http://tao.atmos.washington.edu/pdo/img/pdo_enso_comp.gif Latest PDO index: ftp://ftp.atmos.washington.edu/mantua/pnw_impacts/INDICES/PDO.latest

Currently the effect of stratosphere cooling far exceeds the effect of GHG warming, which is acknowledged elsewhere (http://www-das.uwyo.edu/~geerts/cwx/notes/chap15/future_gcm.html): "Ozone depletion cools the lower stratosphere, troposphere and surface, steepening the tropospheric temperature lapse rate. ... If this interpretation is correct, during the next 5-10 years, as ozone-depletion levels out and perhaps reverses, warming of the upper troposphere by well-mixed greenhouse gases should become apparent."

So not currently!

Stratospheric cooling correlates nicely with ozone depletion, here is the longest historic record from Arosa Switzerland: http://www.iac.ethz.ch/en/research/chemie/tpeter/totozon.html

An estimate for ozone recovery is made by Drew Shindell (figure 2): http://www.giss.nasa.gov/research/intro/shindell_05/

source data for the graphs:

zipped in Excel 5.0/95 format http://members.lycos.nl/ErrenWijlens/co2/sondeco2pub50.zip

zipped in Excel 2000 format http://members.lycos.nl/ErrenWijlens/co2/sondeco2pub.zip

Hans Erren


The above had me re-reading Nigel Calder's The Weather Machine and the Threat of Ice, where he had us appalled at the imminent onset of an ice age. He was editor of New Scientist, the publication that now exhorts us to believe global warming, rather than an ice age, will cause what he wrote below.

"They tell us that the ice age could in principle start next summer... the odds shorten to something like 10-1 against. If even roughly correct, that is a very high risk indeed for an event that could easily kill two thousand million people by starvation and delete a dozen countries from the map."

Obviously, it doesn't matter what the disaster is, just so long as there is a disaster around the corner and we'd better put up with whatever it takes to prepare for, or overcome its effects.

Thought for the day:

All I ask is the chance to prove that money can't make me happy.

Spike Milligan

Current Listening

Mike Oldfield -- Hergest Ridge


Sunday 5 May 2002

A few weeks ago, I indulged in my annual what-if regarding a replacement computer. As it happens, I'm (almost) more than happy enough with my current machine, a first generation Slot A 700 MHz AMD K7. Almost a year ago, I speculated on the machine I would buy then, if I felt the need for a new computer. It was a dual processor Intel-based solution and if you want some insight in the types of components I buy and why, you can follow the link above. This time around, I looked at three dual-processor machines, two of my usual DIY sort, and a Dell.

First, a few words on why I would choose a dual-processor machine. Another machine would be handy at the moment for a number of reasons. I am doing quite a bit of audio work, transferring music from audio cassettes and vinyl to CD. Much of that material benefits from click and tape-hiss removal, and that requires enough compute power that a dedicated machine would be nice. I would also like to spend more time learning Linux, and dedicating the current machine to that would suit. Compiling Gentoo and its various bits takes much longer than having a cup of coffee, or two.

A dual-processor machine would be like gaining two computers at once. One of the wonders of Win2k on such a machine is that you can run an application, say VMware running another OS, or the audio processing, on the secondary processor while using the primary processor to get the usual things done. Also, switching between VMs and processors doesn't require the mechanical stuff I currently use to switch between tasks running on separate machines. The Sony monitor has two inputs (good) and I have two keyboards and two mice (clutter).
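The division of labour described here can be sketched with today's Python multiprocessing module (an anachronism for a Win2k box in 2002, and the denoise step is a made-up placeholder, but the shape of the idea is the same): hand the heavy job to worker processes the OS can schedule on the second CPU.

```python
# Sketch: push a compute-heavy job (a stand-in for click/hiss removal)
# onto worker processes that a dual-CPU machine can schedule on the
# other processor, leaving the first free for interactive work.
from multiprocessing import Pool

def denoise(chunk):
    """Placeholder for audio clean-up on one block of samples."""
    return [sample * 0.5 for sample in chunk]  # hypothetical DSP

if __name__ == "__main__":
    chunks = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]]
    with Pool(processes=2) as pool:      # one worker per CPU
        cleaned = pool.map(denoise, chunks)
```

The operating system does the scheduling; with two physical CPUs the two workers genuinely run at once, which is the whole attraction of SMP over the two-boxes-and-a-monitor-switch arrangement.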

When I looked at dual-processor solutions a year ago, there were two choices, both Intel. Intel, in its wisdom, has decreed it won't support dual-processing for the masses -- they insist you use Xeon processors. To use lowly Pentiums, even the P4, the only solutions are from alternative chipset makers like VIA. This time round, there were three solutions, as AMD have finally provided a chipset to enable their already multi-processor capable CPUs (even the K6 had that ability) to achieve their potential. AMD spent a long time developing and testing the chipset, much longer than the usual offerings from Intel, VIA, SiS and ALi. They wanted to be absolutely sure to get it right, and full marks for that! By all accounts from graphics professionals using AMD, they got it right.

For roughly the same price, around $A10,000, I could get either a dual AMD, or P III, but the P III solution has only 55% of the power. To achieve the same order of power as the AMD, I would need to purchase a Xeon machine and that wasn't available from my usual suppliers. So, I priced a machine from Dell and it came in at around twice that -- $A20,000. Note that these prices are for complete machines with fast SCSI hard disks, mid-range graphics cards, APC UPS and 21" Sony monitors.

It appears that AMD, having conquered the low ground, is making a serious assault on the high ground. Intel's attempts at forcing its customers to purchase RAM at twice the price, to purchase unnecessarily expensive processors to do multi-processing, and to run their chips at the speed Intel decrees rather than their full potential, have all worked in AMD's favour. Even Microsoft got in on the act, insisting that extra CPU licenses are required to utilise Intel's Hyper-Threading technology on their new Xeon chips. Either that, or take a 20-30% speed hit from being unable to use the full power of the CPUs. The new Xeons are detected as two CPUs and therefore count as two CPUs, even though you don't achieve what you would from a second separate CPU. Standard Win2k Pro automatically ignores the "virtual" CPU.

Rather surprisingly, consumers have resisted Intel's claim that it's only CPU clock frequency that counts when assessing processor power (and Internet access speed). Many of us thought that AMD's rejoinder of PR (performance rating) numbers, justified because AMD CPUs perform similarly at much lower clock frequencies, would fall on deaf ears. Perhaps this is due to the PR estimate being fairly conservative. Another likely factor is the use of much more realistic benchmarks by the likes of AnandTech and Tom's Hardware. While Intel showed us mere numbers, the independent testers showed us how machines perform running the software we use to get the job done. We learned that tasks like running MS Office didn't scale the same way as running Quake Arena or Photoshop, and AutoCAD was different again. Each kind of task shows different sensitivity to the various combinations of RAM, CPU grunt, memory access speed and bandwidth, hard disk speed and graphics acceleration.

Far from being mere "game-players", as a colleague with chronic Intelitis characterised the new AMD advocates the other day, it's graphics professionals that are decamping from Intel. 3DLabs, makers of high-end graphics cards for the likes of workstation manufacturers Silicon Graphics, Hewlett Packard, Compaq, IBM and Sun, have produced drivers optimised for AMD's 3DNow! Professional instructions, ignoring Intel's recent offerings. It would be a strange and wealthy game-player who would pay $US3,000 or more for a video card with only limited support for MS DirectX! Canopus, who manufacture broadcast-quality video editors, have turned to AMD for their $US25,000 CWS 100 workstation as well as their "bargain-basement" $US7,599.00 StormRack. Another workstation manufacturer explained the move away from Intel, saying: "We won't manufacture anything we wouldn't want to use ourselves".

Not every company expresses similar sentiments. Dell, for instance, due to their special relationship with Intel, have said they have no intention of providing AMD workstations or desktops. But they also seem to have caught the Intel disease that once was confined to IBM and Apple. Dell's recent offerings have included a minor modification to their motherboards so that replacing either the motherboard or the power supply with a non-Dell part smokes the motherboard.

Currently, AMD eschews the Intel/Apple/IBM/Dell "fuck-you!" school of marketing and seems to be on a roll. Customers benefit from near workstation performance at a desktop computer price and true workstation performance at half the price. I suspect that AMD's next offering will "Hammer" up the ante even further. Oh yes, if you are a game player looking at SMP, be aware that you will most likely need to disable one of your CPUs to play certain games, regardless of whether you purchase an Intel, or AMD based machine.

Thought for the day:

The industrial landscape is already littered with remains of once successful companies that could not adapt their strategic vision to altered conditions of competition.


Current Listening

Portishead -- Dummy 



Check out: 

Franklin & Friends, a website devoted to the village where the author lives: its culture, inhabitants, and more.

The DayNotes Gang for more daily musings on Life, the Universe and Things Computerish.

Jonathan Sturm 2002
