ESSAYS cont. ©


Ice Ages: Cause of Glaciation

A theoretical treatment


Since 1982 I have been working out the causative mechanism for the initiation of the glacial advances and retreats which have occurred over the last ~2 million years. I have shared some of these theoretical musings with George Kukla of Columbia's Lamont-Doherty lab. He believes that my concept of the causation lying with an atmospheric (meteorological) event is the only currently believable one. All attempts to model theoretical climatic scenarios, such as the Milankovitch cycles, have failed to produce any glaciation.

I believe the causation of the glacial masses (which, as we know, were not distributed around the North Pole in a symmetrical fashion, but were entirely confined to North America and Western Europe--Siberia was essentially ice free, although quite a bit closer to the pole) came about through a meteorological event: a storm of hemispheric proportions and cataclysmic intensity. I must warn you: the extreme and unusual weather being experienced everywhere in the world at this time is part of the build-up which leads into this "storm", which will result in the next period of ice.

The laws of nature governing the behavior of gases combine with conditions on the Earth to produce a very intense and violent cyclonic storm in the Arctic region of Canada only under special circumstances. These circumstances require that the Earth be at or near perihelion (day of the Earth's closest approach to the sun in its orbit) at the time of the northern winter solstice. The Earth must also be in a state of low glaciation, known as the interglacial period. During this period the sea levels are high, and this is one of the conditions which allow this cyclone to develop.

The transfer of heat between the Equator and the polar regions, a normal process, is the primary driving force for atmospheric storms of all kinds. This flow is greatest in the winter, and reaches a high intensity in the north in mid-December. Once the conditions outlined above are met, the atmosphere will begin to store energy in the form of wave motion, the highs and lows depicted on weather charts. When the stored energy reaches a critical stage, one of the normally present Arctic cold-core cyclones will accelerate until it completely takes over the circulation of the northern hemisphere for the remainder of the winter, approximately six weeks. The conditions within the northern hemisphere will resemble those described in the well known biblical tale of Noah's flood. Disruptive effects will be felt everywhere on the planet. It is doubtful that it is possible to survive this event within the flux area of the storm.

The northern hemisphere is not habitable during the storm's run. In fact the seas will run so high over the entire surface of the planet that no seacoast settlement will survive, even in the southern hemisphere. People far from the equator in Australia, even at higher elevations, will have difficulty due to the relentless snowfall and cold. Those in the highlands in the tropical north will have a good chance if they have sufficient heavy clothing, such as down ski parkas and the like. I wish that I could figure out exactly what the weather will look like at the beginning of the year in which the December event will occur, but I cannot believe that the storm will suddenly begin out of the blue. Events of this magnitude would seem to require a very vigorous and violent preamble, perhaps beginning with the (northern) winter preceding.

I will now attempt to outline the science involved so that the mechanism of this phenomenon can be understood by those who have a basic knowledge of science, in particular physics.




Briefly, the laws of physics as they apply to the gases in the atmosphere are not considered in the normal calculations of weather scientists. When making calculations in both climatological and meteorological atmospheric models, a statistical average of the gases' physical parameters is used. This simplifies the mathematics involved through the use of calculus, but overlooks the role of mass-specific phenomena such as centripetal force. Thus cyclonic circulations are not well understood by those who study them.
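To make the point about averages concrete, here is a small Python sketch (my own illustration, not part of the theory above) of the Maxwell-Boltzmann speed distribution for nitrogen at 0 C. It shows how widely the molecular speeds spread around the single average value the models use; all constants are standard physical values.

```python
import math

# Maxwell-Boltzmann speed distribution for N2 at 273 K.
# A single "average" speed hides the wide spread of molecular speeds
# that mass-specific effects act upon.
k_B = 1.380649e-23            # Boltzmann constant, J/K
m = 28.0e-3 / 6.02214076e23   # mass of one N2 molecule, kg
T = 273.15                    # temperature, K

def mb_pdf(v):
    """Probability density of molecular speed v (m/s)."""
    a = m / (2 * k_B * T)
    return 4 * math.pi * (a / math.pi) ** 1.5 * v**2 * math.exp(-a * v**2)

v_mean = math.sqrt(8 * k_B * T / (math.pi * m))  # mean speed
v_mp = math.sqrt(2 * k_B * T / m)                # most probable speed

print(f"mean speed: {v_mean:.0f} m/s, most probable: {v_mp:.0f} m/s")
# Plenty of molecules move at half the mean speed or less:
print(f"density at half the mean speed: {mb_pdf(v_mean / 2):.2e}")
```

The mean comes out in the neighbourhood of 450 m/s, yet the distribution has substantial weight well above and below it.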

I have discovered that most people don't want to know about anything which means that they must change aspects of their life if they are to survive, such as moving out of the northern hemisphere.

You might want to have a look at the article in Scientific American magazine, 1964, on the vortex, or Hilsch, tube. This unusual little device seems to have been first developed in France in the 1920s by a fellow named Ranque. Later it surfaced in Germany before WW2 and was attributed to a man named Rudolf Hilsch. The idea came to the US with the end of the war and has ever since been known as the Hilsch tube. The principle is simple: a source of compressed air, dry and at 8-10 atmospheres of pressure at room temperature, is admitted at the circumference of a small disk-shaped chamber. The chamber has openings on each of the plane surfaces. Let's assume a diameter of 1.2 cm and a thickness of 2 mm. On one face there is an opening of 9 mm; on the other the opening is 3 mm. Around the circumference there are up to six tangential orifices into which the compressed air is manifolded. The larger 9 mm opening is extended out into a tube 12 cm long, with a valve at the end. The smaller 3 mm opening exits into a tapered cone 3 cm long and 1 cm across at the end.

By adjusting the valve, the proportion of air leaving the chamber through each of the two exits can be varied. Temperature differences of greater than 90 C can be obtained; simultaneous outputs at +60 C and -40 C are attainable. Lower and higher absolute temperatures may be had with a reduction of volume and temperature differential. The point is, of course, that the expansion of gas must obey the gas laws, and there may only be a drop, NOT a rise, in temperature. The only explanation is that the vortex tube must be separating the fast and slow (hot and cold) molecules by means of centripetal force (read the explanation of heat separation in the third paragraph below). The low efficiency of the device as a refrigerator, its main commercial application, lends credibility to this conclusion. The US manufacturer of the tubes, which are used in industrial plants where there is a supply of compressed air and the cooling load is small, such as high-speed sewing machines, doesn't have a clue as to exactly how they work (Vortec Corp., Cincinnati, Ohio). I have several of these commercial tubes, in two sizes. The air must be extremely dry, or the tubes ice up in a few seconds. The overall efficiency is only 4% of a heat pump's.
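As a rough check on the gas-law point, here is a textbook Python calculation of the ideal-gas adiabatic expansion temperature for air let down from an assumed 9 atmospheres to 1 atmosphere. It is only a sketch of what "expansion may only produce a drop" means quantitatively, not a model of the vortex tube itself; the inlet conditions are my assumptions.

```python
# Ideal-gas adiabatic expansion: T2 = T1 * (p2/p1)**((gamma-1)/gamma).
# gamma = 1.4 is the standard value for diatomic air.
gamma = 1.4
T1 = 293.15        # assumed inlet temperature, K (20 C)
p1, p2 = 9.0, 1.0  # assumed inlet and outlet pressures, atm

T2 = T1 * (p2 / p1) ** ((gamma - 1) / gamma)
print(f"adiabatic outlet temperature: {T2:.0f} K ({T2 - 273.15:.0f} C)")
```

Pure adiabatic expansion from those conditions would land well below -100 C at both outlets; the tube instead delivers one hot stream and one cold one, which is the puzzle the centripetal-sorting explanation addresses.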

If a small mechanical device can sort the heat in air, then cyclonic circulations in the air must also accomplish the same thing. (The laws of physics work the same at all levels.) Thus the cyclone has a unique mechanism to catch and retain the most energetic component of the flowing air mass which makes up its structure. The heat then acts as a flywheel, storing the kinetic energy and permitting the cyclone to continue to turn. This is evident in the wall structure of cyclonic storms such as tornados and tropical cyclones.

There is a further interesting mechanism involved in the heat engine (which is what these great storms are): the Coriolis force. This causes winds to veer on the surface of the earth because the rotation takes the mass of air from one radius (latitude) to another, and therefore it involves the law of conservation of angular momentum. This force is strongest at the higher latitudes and is nonexistent at the equator. For this reason tropical storms always form at a point a little removed from the equator. The result of the effect is to make the flow increase and decrease in speed as it follows a circular path. This wobble produces pressure (sound) waves which propagate outward from the cyclone and are important in the transfer of heat from the water vapor in the air to the air itself.
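The angular-momentum bookkeeping can be sketched in a few lines of Python. This is the standard conservation argument (a parcel changing latitude changes its distance from the rotation axis, and so its eastward speed), offered only as an illustration of the mechanism named above; the latitudes chosen are arbitrary.

```python
import math

# Conservation of angular momentum for an air parcel displaced poleward:
# L = (Omega * R * cos(lat) + u) * R * cos(lat) is conserved, so a parcel
# starting at rest relative to the ground acquires an eastward wind u.
OMEGA = 7.292e-5   # Earth's rotation rate, rad/s
R = 6.371e6        # Earth's mean radius, m

def zonal_wind(lat0_deg, lat_deg):
    """Eastward wind (m/s) gained by a parcel at rest at latitude lat0
    after being moved to latitude lat, conserving angular momentum."""
    c0 = math.cos(math.radians(lat0_deg))
    c = math.cos(math.radians(lat_deg))
    return OMEGA * R * (c0**2 - c**2) / c

# A parcel moved from the equator to 10 degrees latitude:
print(f"{zonal_wind(0, 10):.1f} m/s eastward")
```

Even a modest 10-degree displacement yields winds of order 15 m/s, which is why poleward-moving air veers so strongly.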

The capture of heat is dependent upon the vorticity, or speed of rotation, of the cyclonic air mass, and this increases as the square of that vorticity according to the formula for centripetal force: f = (m x v^2)/r, where f is the force toward the center required to hold the particle in its path, m is the mass of the particle, v^2 is the velocity (vorticity) or speed of the particle, squared, and r is the radius of the particle's path around the center. From this you can see that the faster the particle (and speed is the same as temperature on the molecular level), the more force is required, so if the force is constant, the radius must increase by the SQUARE of the increase in speed. In practical terms this means that the hot molecules can't get to the center, and are trapped. This ability to sort and trap heat also means that the exhaust gases, which move to smaller radii, also by the rule of squares, can become extremely cold. So it is that intense thunderstorms produce copious amounts of hailstones, as do many tornados. One of the consequences of the centripetal sorting is that the electrical charges are likewise sorted, a positively charged species being lighter than the same species with a negative charge. Please note that the upper atmosphere has a significant net positive charge.
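The square-law scaling in the centripetal-force formula above can be demonstrated directly; the masses, speeds, and force in this little Python sketch are arbitrary illustration values.

```python
# Centripetal force f = m * v**2 / r. Holding f and m fixed and solving
# for r shows the orbit radius must grow as the SQUARE of the speed.
def radius_for_constant_force(m, v, f):
    """Radius r = m*v**2/f at which a particle of mass m and speed v
    is held on its path by a fixed centripetal force f."""
    return m * v**2 / f

m, f = 1.0, 100.0
r_slow = radius_for_constant_force(m, 10.0, f)  # slower ("cold") particle
r_fast = radius_for_constant_force(m, 20.0, f)  # twice as fast ("hot")
print(r_fast / r_slow)  # doubling the speed quadruples the required radius
```

Doubling v quadruples r, tripling v multiplies it by nine, and so on, which is the sense in which the fast (hot) molecules are pushed outward and the slow (cold) ones settle inward.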

Under exactly the right circumstances there can be initiated on the earth a cyclonic storm of such size and intensity that the entire northern half of the atmosphere is the storm's circulation. This storm is able to produce a large amount of liquefied air, and this liquid, falling to the surface, produces a heat debt which results in the glaciation. Evidence for the mist of liquid air is found in the mammoths dug out of a glacier in Russia in 1905, frozen so quickly that the contents of the intestinal tract had not fermented. There were fresh daisies in the stomach.

The geological evidence is the erosional effects observable on the islands of the northern hemisphere, such as the Hawaiian chain. The erosional cliffs of the island of Hawaii (Big Island) clearly show the age-related effects of severe winds and seas coming from the southwest at approximately 100,000 year intervals. These severe winds established eddy currents in the leeward side of the islands which resulted in the creation of the cliffs. As you travel northwestward along the northeastern coast, the elevation of the cliffs increases at each point where an approximately 100,000 year older lava flow is encountered. No such erosion is evident in similar islands of volcanic or other origin in the southern hemisphere. There is a set of maps with the ages assigned to the lava flows available from the US Geological Survey in Hawaii.

The storm locates itself in the region of Baffin Bay, where there is a polynya, or permanently unfrozen (relatively warm) area of water. This open area has been reported from the earliest days of arctic exploration, and the Eskimo have long used it as a winter hunting ground. The polynya is the result of the shallowness of Baffin Bay, which means that with the onset of winter darkness and the cooling and contraction of the Arctic Ocean, the water entering the Bay comes from the upper warm layers, skimmed off by the topography to provide heat within the Arctic Circle. The topography is a result of continental drift, and so we find that the ice cycle began about 2 million years ago, when the Arctic became a land-locked body of water.

The storm, which I call an ultracyclone, begins in mid-December, after there is sufficient energy present in the atmosphere. The storage method is dynamic: basically the waves which we perceive as the highs and lows of the synoptic weather chart. Ever since 1982 the atmosphere has been exhibiting a condition known as "unstable", rather like a public address system on the verge of feedback. In other words, the energy stored in atmospheric wave motion has been continually increasing, noticeable as complex highs and lows. These represent the shift to higher harmonics in the wave sequence. Thus the climate patterns are doing a dance without repeating the steps, contrary to historic experience. This is why the forecasts are so off the mark.

The run of the storm is about six weeks, until the first week in February, at which time sunlight returns to the spot where the storm is located, rendering the atmosphere opaque and stuffing up the exhaust, the thermodynamic sink into which the laws of thermodynamics require that a certain amount of the heat be lost. The major means of heat removal from the planet, however, depends upon the solar wind, a conductive plasma which is deflected by the Earth's magnetic field to brush the upper atmosphere. The plasma is concentrated by the flux into a magnetic tail trailing out away from the night side of the Earth. The Moon appears as negatively charged relative to the positive charge created by the storm, and the charge travels along the concentrated plasma stream to impact the Moon.

The charged particles act to connect the poles, and then establish a gradient along which the voltage is distributed in the fashion of the stack of pole plates in the afore-mentioned Van de Graaff accelerator. The particles which are the constituents of the solar wind plasma are simply the means of establishing the requisite voltage gradient to enable the ions to be accelerated by the potential difference between the storm's upper structure and the Moon's surface.

So, in another way of saying it, the plasma forms a sort of ladder for the ions, and neither aids nor hinders their passage. In fact, once the flow of ions commences it would completely swamp the solar plasma and provide its own "conductor", an effect which is also seen in lightning strikes, where the original ion path is replaced by the heavy lightning current, although the frequency of lightning strikes is in the hundreds of megahertz and the storm discharge would probably have a much lower frequency.

Thus the heat passes to the Moon, melting the areas men call the Maria. There are no features like this on the far side of the Moon, and very good photos exist from the Lunar Orbiters which quite plainly show the fusion of the surface, complete with submerged, or ghost, craters. Discharges of this nature tend to be bi-directional; in fact, lightning strikes sometimes exhibit hundreds of forward and reverse currents. Since the charge is ionized air, which has mass, the return strike will bring bits of lunar breccia entrained in the mass of air back to the Earth.

As they enter the atmosphere, the bits of moon rock melt and create deposits of meteorites known as tektites. An examination of the "strewn fields" characteristic of the deposits of tektites with a calculation based on continental drift, shows that their age (dated from the last melting) places the point of impact on the equator.

The window for the initiation of this storm is created after the ice melts off, which requires approximately 100,000 years. The event which sets the trigger is the juxtaposition of the northern winter solstice and the date of perihelion, the closest approach of the Earth to the Sun. At perihelion the maximum amount of heat of any time in the year is entering the Earth's atmosphere, at the equator. This day precesses at a rate of one day each 63 years. The coldest day within the arctic is a week or so after the winter solstice. The current date of perihelion (actually the day that the Earth-Moon center of mass, a point about 1600 km beneath the surface of the Earth on the side facing the Moon, most closely approaches the Sun during the Earth's passage around the Sun) fluctuates between the first and the fourth of January. These are exactly the conditions required. In fact the process has already begun, about 1960, if the movement of the Sahel in Africa can be used as a point of reference.
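The drift arithmetic can be sketched in Python. The drift rate of one day per 63 years comes from the text; the 12-day "window" (the span from the winter solstice around December 21 to a perihelion date in early January) is my own rough assumption for illustration.

```python
# Drift of the date of perihelion at the rate stated in the text:
# one calendar day per 63 years.
DRIFT_YEARS_PER_DAY = 63

def years_to_drift(days):
    """Years required for the perihelion date to shift by `days`."""
    return days * DRIFT_YEARS_PER_DAY

# Assumed ~12-day span between the winter solstice and early-January
# perihelion dates:
print(years_to_drift(12))  # 756
```

On these figures the perihelion-solstice alignment is a window several centuries wide, which is consistent with treating it as a trigger condition rather than a single-year event.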

As you can see, the cycle is a multiple of the tropical precession of the equinoxes, about 23,000 years, rather than the more commonly known sidereal precession of 25,750 years. The difference is that the tropical one subtracts the precessional motion of the Earth's major elliptical axis. It takes 4 precessional cycles to replace the heat lost during the storm by the freezing of water evaporated from the oceans (glaciation). Once the heat deficit is replaced, the glacial ice melts off in a few thousand years. Then the next cycle repeats the events. Thus the cycle is about 115,000 (5 times 23,000) years, which doesn't correspond to the changes in the obliquity of the orbit, but does synchronize with the precession. That similarity was a distraction for Milankovitch. No computer model has been shown to yield glaciation on a climatic basis.
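The two precession periods quoted above combine as rates (frequencies), not as periods, so the figures can be cross-checked in a few lines of Python. This is just a back-of-envelope consistency check of the numbers in the text; the implied axis-precession period it prints follows from those two figures alone.

```python
# Subtracting the sidereal precession rate from the tropical rate gives
# the implied precession period of the Earth's major elliptical axis.
T_tropical = 23_000   # years, tropical (climatic) precession, from the text
T_sidereal = 25_750   # years, sidereal precession, from the text

f_axis = 1 / T_tropical - 1 / T_sidereal   # difference of rates, 1/years
T_axis = 1 / f_axis
print(f"implied axis precession period: {T_axis:,.0f} years")

# And the glacial cycle length claimed in the text:
print(5 * T_tropical)  # 115000
```

The subtraction-of-rates form is the key point: periods themselves cannot simply be subtracted.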


I am an artist, not an engineer, so I may not have explained this thing in the most lucid fashion, but I have no doubts about it. I have been predicting the changes we are currently experiencing in the world's climate for some years now, but I still don't know how much further it has to go.

The current climatic drift is particularly disruptive to the world's agriculture. I think it will become increasingly difficult to produce enough food for the world's population. The reserves are at a historic low right now. So perhaps there will be fewer people left to contend with this destructive event when it does come.



Myths in Modern Times


Most of the people that know me, know that I am definitely predisposed towards the preservation of our environment. I support the idea of population control and the reduction of waste in industrial processes. In fact, I feel that there is no such thing as "waste". What we term "waste" in one process is actually the feedstock for some other process. I believe that we will have to demand that all new industrial plants utilize ALL of the raw materials which are taken in to produce usable products, with nothing rejected as waste (something which cannot be used without some further processing, or is a hazardous and unusable material).

I am concerned that there are certain agendas in the environmental movement which are untrue and divert a considerable amount of valuable resources from the real issues which need attention. These agendas benefit a small group at the expense of us all, and in some cases are downright detrimental.

Here are three major ones, which I could call the Great Myths of Modern Environmentalism.

1. The Great "Global Warming" Myth

The greenhouse effect is a myth. Extensive and complete measurements which show absolutely NO increase in the average global temperature have been taken over the entire surface of the planet by the Pan Global Temperature satellite, which follows a polar orbit. It and its replacements have been there since 1979. Measurements are taken at a consistent height above the surface, about 300 m, to avoid local variations in terrain. The change it measured is a constant, continuing decline in temperature of 0.01 degree C per year; thus the current global temperature is now a full fifth of a degree LOWER than it was 21 years ago. Perhaps the decrease is due to the melting of polar ice. Measurements showing a rise are taken exclusively from the temperate regions, and may reflect the transport of heat on its way to the polar regions. Quite simply, global warming does not exist. There are so many buffers in the atmosphere that it is highly unlikely ever to happen, even if the so-called "greenhouse gas" content were to increase hundreds of times over.

CO2 is the principal 'culprit' according to the eco-terrorists. The CO2 content in the atmosphere is only a very tiny amount, about 300 parts per million (0.03%). This CO2 stays in the air in equilibrium with the CO2 dissolved in the oceans. Since CO2 has a very steep curve of solubility in water, the amount found in the air is critically dependent upon the sea surface temperatures (cold falling rain is an excellent CO2 scrubber). World CO2 measurements have traditionally been based on the levels tested in the air at the Mauna Loa Observatory in Hawaii. The charts of the levels fluctuate seasonally, rising in the early summer and falling in the early winter. If the levels are compared to the actual sea surface temperature (SST) measurements taken at Hilo, which is at the base of Mauna Loa, the seasonal variations are seen to track exactly with the temperature. Even the gradual increase over time is duplicated in the temperature readings, as the average SST at Hilo has been rising in exact lock step with the rise in the Mauna Loa CO2 levels. (The charts of these measurements are available, making this a trivial exercise if you wish to verify my statements.)
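The temperature dependence of CO2 solubility can be quantified with the standard van 't Hoff form of Henry's law. The constants in this Python sketch are standard reference values (from Sander's compilation); the output is indicative only, a way of putting numbers on the "steep curve of solubility" mentioned above.

```python
import math

# Van 't Hoff temperature dependence of the Henry's-law solubility of
# CO2 in water: H(T) = H(298) * exp(C * (1/T - 1/298.15)).
H_298 = 3.3e-4     # Henry solubility at 298.15 K, mol/(m^3 Pa) (reference value)
C = 2400           # temperature-dependence parameter, K (reference value)

def henry(T):
    """Henry solubility of CO2 in water at temperature T (kelvin)."""
    return H_298 * math.exp(C * (1 / T - 1 / 298.15))

for t_c in (10, 20, 30):
    print(f"{t_c:>2d} C: {henry(273.15 + t_c):.2e} mol/(m^3 Pa)")
```

Between 10 C and 30 C the solubility falls by nearly half, so warmer surface water holds noticeably less CO2 and the equilibrium shifts toward the air.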

Burning fossil fuels is probably one of the most important aids to the life cycle on this oxygen-rich, carbon-poor planet that man can provide. Most of the primeval carbon is locked away in the oil and coal deposits formed over the ages by cell death of the phytoplankton (diatoms), which created the oxygen-rich environment by decarboxylating the CO2 in the primitive atmosphere. The limits placed on CO2 are unreasonable, impede the creation of wealth which benefits everyone, and are harmful to the plant life at the same time.

2. The Great "Hole in the Ozone" Myth.

Likewise another myth: the "ozone damage" story was a scam developed by DuPont in a push to outlaw Freon. The patent has run out, and since Freon is a totally stable compound, is the most ideal refrigerant known, AND is very cheap to manufacture, they needed to ban it so that they could sell a (patented) replacement HFC, which is nowhere near as good a refrigerant, called (get this!): Soma.

There is absolutely no verifiable data that the "destruction" reactions between chlorine and ozone actually occur anywhere other than in apparatus and under laboratory conditions.

There is no verifiable data that any significant portion of the atmosphere's chlorine is the result of the breakdown of CFC's (Freon is the principal member of this class of compounds). CFC's are so stable that they have been used in very successful fire-extinguishing systems.

Another factor is that there is no verifiable evidence that CFC's, which are very dense, heavy gases, have any way to make it out of the troposphere into the lower portion of the stratosphere. Nor is there any mechanism known to science which would transport these dense compounds to the upper levels of the stratosphere ("stratos" means layered; there is no convective mixing in this part of the atmosphere, because the temperature rises with increased height). It is in the uppermost layers of the stratosphere that the energy from the sun, in the form of very short wave UV and electrons in the solar wind, creates the ozone we need to protect us at the surface.

The "hole in the ozone" scam was attacked by a large group of prominent scientists in the so-called Heidelberg paper, with volcanologists as prominent signatories. Volcanos are the single largest source of atmospheric chlorine, measured in the tens of millions of metric tonnes per year. Man's total output of Freon never exceeded a hundred thousand pounds of equivalent chlorine, and of course most of that never gets into the atmosphere, since it is one of the most stable organic chemicals known.

3. The Great "Your Car Causes Smog" Myth

The last of the great modern environmental myths is that "cars cause smog". This has an extreme impact on all of us, in that the cost of manufacturing cars has risen to the point that many cars cost more than a home.

Ozone in the lower atmosphere is the cause of smog. This ozone is formed primarily by the decay of tritium (by beta particle emission: high-velocity electrons) in the atmosphere, and by lightning and electrical discharges such as corona on high-tension electrical transmission lines. The energetic electron is captured by an ordinary oxygen molecule (O2), which cannot exist as a stable molecule with an extra electron, and so splits into two very reactive oxygen atoms ("nascent" oxygen). These two reactive oxygen atoms each combine instantaneously with another normal molecule of oxygen to form two new molecules of ozone (O3). The only other source of ozone is very short wave UV, and the amount we receive from the sun doesn't penetrate to sea level (thanks to the ozone in the upper stratosphere). Those old-fashioned toilet-seat sterilizers that you occasionally run across in public restrooms produce a detectable amount of ozone; they aren't as common nowadays as they once were.
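The reaction chain just described can be summed up with a little atom bookkeeping: one O2 is split by the electron, and each of the two O atoms captures a further O2, for a net reaction of 3 O2 -> 2 O3. A tiny Python check of the balance, purely illustrative:

```python
# Atom bookkeeping for the ozone-formation chain:
#   e- + O2 -> 2 O          (one molecule split by the energetic electron)
#   O + O2 -> O3   (twice)  (each nascent atom captures another O2)
# Net: 3 O2 -> 2 O3.
def oxygen_atoms(species_counts):
    """Total O atoms given a dict of {formula: molecule count}."""
    atoms_per = {"O": 1, "O2": 2, "O3": 3}
    return sum(atoms_per[s] * n for s, n in species_counts.items())

reactants = {"O2": 3}   # one split + two captured
products = {"O3": 2}
assert oxygen_atoms(reactants) == oxygen_atoms(products) == 6
print("balanced: 3 O2 -> 2 O3")
```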

In real terms the smog induction is primarily and almost entirely dependent upon the presence of tritium, which is produced in large quantity by nuclear activity. Almost all reactors use either heavy water, deuterium oxide (which produces the most tritium, by capture of a single neutron), or ordinary water (which requires the capture of two neutrons: first to make deuterium, then to produce tritium from the deuterium) to enclose and absorb the neutrons escaping from the reactor. Also the storage of radioactive waste is done in pools of water. In addition to all this, the military makes tritium on purpose to use in "hydrogen" fusion bombs. All this tritium is chemically the same as hydrogen and escapes into the atmosphere quite easily. Usually the beta particle is of no concern, since it doesn't cause the sort of health damage that alpha particles, gamma rays and neutrons cause. Most of the concern with nuclear radiation is connected with gamma rays and neutrons.
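Tritium's beta decay follows the usual exponential law with a half-life of about 12.32 years (the standard reference value). A short Python sketch of how a given release persists, with arbitrary time points for illustration:

```python
# Exponential decay of tritium (half-life 12.32 years, standard value).
HALF_LIFE_YEARS = 12.32

def remaining_fraction(years):
    """Fraction of an initial tritium quantity remaining after `years`."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

for y in (1, 12.32, 50):
    print(f"after {y:>5} years: {remaining_fraction(y):.3f} remains")
```

The point of the sketch is that atmospheric releases do not vanish quickly: after a full decade, more than half of any release is still decaying in the air.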

There is absolutely NO way that ozone can be generated by any chemical or long-wave (visible) light-mediated reaction, and certainly not by any of the purported means the smog-control advocates present. Even the commercial generation of ozone for industrial purposes (due to its powerful oxidizing properties), must be done by bombarding oxygen with high-speed electrons.

Of course the explosion of bombs, even underground ones, produces the most tritium (and therefore ozone), and does so all at once. The most severe smog event in history occurred immediately after the only nuclear explosion ever to take place in the water, just after WW II. Almost everyone has seen the aerial photographs of all the war ships in the harbour and the monstrous column of water with its huge mushroom cloud. Almost immediately afterwards the burning of diesel fuel in "smudge pots" in the California orange groves was outlawed. The smoke was one of the most economical ways to prevent frost in the winters, and the smoke always dissipated within a few hours after sunrise. Until, that is, that underwater bomb test. I can verify this personally, as I lived within a few blocks of the orange groves in central Los Angeles during WW II (Park LaBrea).

To this day the intensity of a smog event is measured by the amount of ozone present in the air at the surface. It is known as the "Ozone Number".

Because the nuclear industry (and the military) find it convenient to distract people from the real cause of smog, we now have to pay almost as much for a new car as for a house. And you can't live in a car - not to mention that, unlike a piece of real estate, it is a very poor investment.

In reality ALL of man's hydrocarbon emissions total only about 4% of the total that is found in the air, most of which comes from plantlife, with some from oil seeps and volcanism. To burden us with a technology which in reality benefits slightly only those in cities (where there are few plants, so most hydrocarbons come from human activities) and greatly those in the nuclear industry (and people are placed in many dangers by this very dangerous enterprise), is a massive act of folly.



Analog vs. Digital


We commonly hear the remark that the digital sound of CD's is inferior, or "inaccurate", compared to the sound of an analog vinyl LP made from a purely analog master tape.

What is said is partially true. An unplayed pressing from a "converted master" (that is, an acetate master cut on a mastering lathe, which is plated, with that plating used to press the vinyl), assuming that the vinyl is first-quality virgin material, is very close to the master tape (note the phrase "very close": not the same). Due to the mechanical reality of the process of making the disc, there are artifacts which aren't in the original. That said, the real problem rears its ugly head: there isn't any stylus which can accurately trace the grooves in the plastic record. The cutter uses a stylus which has very sharp corners (not surprising, since its job is to cut the plastic master), and therefore creates a groove which only a like-shaped stylus could trace perfectly. Unfortunately such a shape would simply reform (overcut) the groove into a straight furrow, with no audio information remaining after its passing.

So you have a choice of two traditional stylus shapes to use for recovering the audio information from the grooves. One of these has a conical shape and is usually called "spherical", after the shape of the tip. This shape cannot come very close to following the movements of the cutter at any but the lowest frequencies. The other is a stylus with an elliptical cross section, used with the major axis placed across the groove in an attempt to follow the cutter a bit more closely, but it is still quite inaccurate at the higher frequencies. Worse yet, both styli cause serious damage to the surface of the plastic inside the grooves. The friction of the stylus in the groove, exacerbated by the downward pressure required to keep it in the groove, melts the plastic and so destroys the information on the sides of the groove. The damage is so severe (I examined a lot of records under the microscope in the days when I produced the Old and In The Way LP) that you can only play the record once with any sort of fidelity with the elliptical point, and no more than 3 times with a spherical/conical.
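The tracing problem can be given rough numbers. In this Python sketch, the tip radii (18 and 8 microns) are typical published figures for conical and elliptical styli, the 6 cm inner-groove radius is a typical LP figure, and the cutoff criterion (the stylus loses the groove once the recorded wavelength shrinks toward the tip radius) is a deliberately crude assumption, not an exact law.

```python
import math

# Tracing-loss sketch: groove speed falls toward the record's center,
# shrinking the recorded wavelength for a given frequency, until the
# stylus tip can no longer follow the groove wall.
RPM = 100 / 3   # 33 1/3 rpm

def groove_speed(radius_m):
    """Linear groove speed (m/s) at a given groove radius."""
    return 2 * math.pi * radius_m * RPM / 60

def max_traceable_freq(radius_m, tip_radius_m):
    """Rough upper frequency limit, assuming the cutoff wavelength
    equals the stylus tip radius (a crude assumption)."""
    return groove_speed(radius_m) / tip_radius_m

# Inner groove (~6 cm radius): conical 18 um tip vs elliptical 8 um minor tip.
print(f"conical:    {max_traceable_freq(0.06, 18e-6):,.0f} Hz")
print(f"elliptical: {max_traceable_freq(0.06, 8e-6):,.0f} Hz")
```

Even on this generous criterion the conical tip runs out well inside the audible band at the inner grooves, while the elliptical tip fares better, which matches the ranking given above.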

In absolute terms the reproduction of even the best set-up differs from the original recorded tape as much as or more than digital does, only in a different way, somewhat "sweeter" to the ear, but inaccurate nonetheless. Perhaps these people would be happier with a cassette made directly from the analog master, if such exists. In that case be sure that the cassette is either metal or genuine CrO2 tape, as the ferric formulations, including the high-bias types, won't hold the highs for more than a few months.

The above comments about tapes do not extend to the digital tape format known as DAT. The recording of digital information on a magnetic medium is not a permanent way to store information. The information on a DAT tape consists of very short wavelength square waves, and the tape has a serious problem with self-erasure. The effect is somewhat more subtle than the obvious loss of highs with analog tapes, but it is more difficult to deal with, since it is a broad-spectrum type of information damage. The rule with DAT is to always re-copy the tapes every 8 to 10 months to ensure that dropouts and data errors are minimised. Since this is obviously not going to be practical for most people, the only way to store any significant amount of digital music is by burning it onto a CD-R disk.

So what good is vinyl? Something that you can have sitting in your living room as a curio to amuse and impress your friends? Turntables for playing LP's that are any good cost a small fortune, partly due to the mechanical difficulties of servoing that much mass and making it silky smooth (a necessity to prevent rumble), but also due to the fact that only a very few extreme fanatics want them. The pickups are a real-world compromise in their mechanics, and very few are any good; those few set you back big bucks, as well as introducing additional changes in the sound versus the tape originals. In addition, the resurgence in vinyl pressings seems almost exclusively confined to the Rap/Techno market and is driven by the DJ's in that genre.

I agree that the sample rate chosen for encoding CD's is far too low for the best fidelity, and I sincerely hope that a data rate of 200 to 500k will emerge with the introduction of the DVD, but at least the CD will always play the same each and every time you play it. Why else do you think we use it? Don't get me wrong: I can hear a lot of things in music that has been digitized at 44.1k, and in fact I am sure I hear all of the artifacts (such as aliasing) that the most vocal critics of the digital media complain about. All things in the real world require some compromise.
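The aliasing mentioned above can be demonstrated numerically. This is a minimal sketch, not from the original essay: at the CD rate of 44.1 kHz, any frequency above the Nyquist limit of 22.05 kHz folds back into the audible band, which is why the anti-aliasing filters before the converter matter so much. Here a 30 kHz tone, once sampled, becomes indistinguishable from a 14.1 kHz tone.

```python
import numpy as np

fs = 44_100          # CD sample rate, in Hz
n = np.arange(2048)  # sample indices

# A 30 kHz tone lies above the Nyquist limit (fs/2 = 22.05 kHz), so
# sampling it at 44.1 kHz folds it down to an alias at fs - 30000 = 14.1 kHz.
tone_30k = np.sin(2 * np.pi * 30_000 * n / fs)
alias_14k = np.sin(2 * np.pi * 14_100 * n / fs)

# The two sampled signals are identical except for sign: after sampling,
# no processing can tell the 30 kHz tone apart from a 14.1 kHz one.
print(np.allclose(tone_30k, -alias_14k))  # True
```

This is why converters must filter out everything above half the sample rate before digitizing; the audible side effects of those steep filters are among the artifacts critics of 44.1k complain about.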

Perhaps eventually some clever engineer will come up with a crossed-laser, non-contact pickup head for reading the vinyl grooves without causing meltdown. Not an impossible task, given the state of the art, but to take advantage of this you will need records that have never been played even once with a conventional pickup.



Children's Television


The programming on so-called "children's television" is the single most societally destructive activity ever developed. Programs such as Sesame Street are passive and non-interactive. This is most dangerous to young children under the age of 8 years. The human child has no time to waste during these formative years. The only way children learn is while engaged in ACTIVE interaction with adults and other children. We are observing the rapid breakdown of the society around us. Many people recognize that it is somehow related to TV, but usually blame content. Content isn't the determining factor; it is the passive nature of the medium. Look at kids in front of the tube: glassy eyes fixed on the images. PASSIVE. Children have been tested, and it has been shown that they don't learn from Sesame Street. Except, of course, they learn to engage in TV-watching...

All children's television falls into this category, not just Sesame St. The cartoons, which many adults object to, aren't directly destructive to the kids because of the violence, but because during the time they are engaged in TV-watching they don't learn the things required to be a fully developed human. Later, when the kids are older, the violence might be looked on as acceptable behaviour. This would be due to the lack of socialisation from the earlier TV watching; normally socialised kids wouldn't accept it. If you remember, there were cases of so-called "feral children" discovered from time to time who were thought to have been nurtured by animals. These children were found to be unable to learn to talk or to interact with others in a normal human fashion.

There are "windows" of time in which certain skills must be learned by children, and if these windows close, the child will never learn those skills. So it is that we have all those kids out there who behave as though they were incapable of understanding how to live. They don't understand, and the scary thing is, it may no longer be possible for them to learn. Most people don't understand this. The nearest thing to a description of the effect is found in Marshall McLuhan's work. So most people think I am some sort of ratbag for being against children's TV. They have become dependent on it to support their lifestyle, like an addictive drug. I have given the whole matter a great deal of thought.

I was a TV broadcast engineer for many years, and still hold the highest class of license the US gov't issues. I am old enough that I first had a TV in my home when I was 13. NO kids in those days had what is now referred to as "dyslexia". Everybody could read. Of course some were much faster and better readers than others, but... EVERYONE could read. Nowadays they claim that up to 40% of kids in the US and more than 50% in Australia are extremely deficient in reading skills, and a significant number can't read at all.

Maria Montessori, who developed a complete structure for teaching based on careful observation of babies and small children, first noted the time slots, or windows, for learning different skills, and incorporated them into her system of schooling. The whole picture is rather McLuhanesque, in that it is the activity (or lack thereof), rather than the content, which is at the core of the problem. Most people are too taken in by content to understand the effect of the medium itself, which, as McLuhan pointed out, is totally independent of content (cf. "The Gutenberg Galaxy" and "The Medium is the Message").

Phil Lesh has a few words to share on this subject on his page at Dead Net.



About Names and their Choosing.


My opinion was, is, and probably always will be, that names should be words that are part of the language spoken by the person named. In this way names have great power and create images in the minds of those who hear them. This is one of the ancient ways of tribal mankind. To "know" the meaning of a name in the (foreign) language from which it came isn't enough; it is still a "name-noise" to those who hear it.

Unfortunately, today's naming conventions assign various name-sounds to people by custom, and so deprive them of a great deal of the power available through the magical nature of natural naming. For this reason I prefer people to call me Bear, and it is the reason that Starfinder and Redbird have the names they do. I don't think many people even consider the significance of names, even though in America everyone knows of Sitting Bull and other Indian names.

Most people follow the same process in choosing a name from the conventional "list" of accepted name-noises. You only have to look at nicknaming practices to see how basic the human need is for names with meaning in the tongue. No-one escapes the nickname. Even on IRC, everyone wants to use some sort of more natural name in the chat room. Didn't you ever wonder why?

It is just a little thing, but one whose neglect I believe is ultimately a loss to society: the deep-seated need for a name of power, a name with character and the ability to invoke feelings and thoughts, perhaps to describe a feature or salient personality characteristic. Perhaps an ability. Or something about aspirations. In the development of language, names evolved from the use of words to communicate, and naming is therefore very deeply embedded in the thing which makes us unique among animals: our speech. In many tribes the process of naming is a lengthy one, carried out several times during the person's growth and maturation into an adult. In others it is the duty of a clairvoyant shaman.

It is never trivial.


