* Entries include: JFK assassination, evolution of nervous system, domestication of the housecat, OA-X light turboprop attack aircraft for USAF, Scorpion small smart bomb, power from sewage, graphene electronics, genome conservation apparently unrelated to function, Chinese wind power, Genome 10K plan to obtain 10,000 genomes, schlock mockbuster movies, and antibiotic threat to our intestinal bacteria.
* NEWS COMMENTARY FOR JANUARY 2010: As discussed here in the past, while Barack Obama promised to shut down the terrorist detention camp at Guantanamo Bay ("Gitmo"), it was never going to be as easy a task as hoped. As reported by an article from BBC WORLD Online ("Yemen Al-Qaeda Link To Guantanamo Bay Prison" by Peter Taylor), the difficulties were highlighted by an attempted terrorist attack on a US jetliner over Detroit, Michigan, on Christmas Day 2009, when a young Nigerian named Umar Farouk Abdulmutallab tried to set off a bomb packed into his underwear. The bomb was a fizzle and he was the only one injured.
The Nigerian said he had been trained and dispatched by a Yemeni branch of the al-Qaeda terrorist network, named "Al-Qaeda In The Arabian Peninsula (AQAP)". What makes the matter troublesome for the Obama Administration relative to shutting down Gitmo is the fact that two of AQAP's founders, Said al-Shihri and Mohammed al-Awfi, were both formerly prisoners there. Several of AQAP's foot soldiers are Gitmo alumni as well.
The US had released these men at the request of Saudi Arabia; Saudi officials wanted their people back so they could be de-radicalised through a government-run program named "Care". A total of 120 Saudis were repatriated from Gitmo, with 111 run through Care; the other nine had been sent back home before Care had been set up. By all reports, Care is a fairly benign program and has proven effective in convincing the less committed "jihadis" of the error of their ways, with the Saudi government claiming a 90% success rate.
Unfortunately, that means a 10% failure rate, and ten of the 120 ran off to seek refuge in the wilds of Yemen. In Yemen, Said al-Shihri and Mohammed al-Awfi helped set up AQAP and then took part in the organization's launch video. The video was released on 22 January 2009, the day after Obama announced that Guantanamo was to be closed down by 22 January 2010. In the video, Mohammed al-Awfi savagely attacked the Saudi rehabilitation program, hinting that al-Qaeda believes Care is effective. Al-Qaeda has certainly reacted to it: last August, a suicide bomber attacked the Saudi deputy interior minister, Prince Mohammed Bin Nayef, with the bomber detonating explosives packed up his rectum. The minister survived.
However, Mohammed al-Awfi, apparently after encouragement from his wife and children, then returned to Saudi Arabia and gave himself up. He is now incarcerated in a luxury suite in Riyadh's top security prison and proving very cooperative with his interrogators. He came across as perfectly docile to Peter Taylor when interviewed for the BBC, though Taylor had little doubt that the prisoner's responses were scripted by his "keepers".
Other Gitmo alumni who signed up with AQAP have proven less docile. One, Youssef al-Shihri, infiltrated back into Saudi Arabia in October 2009 dressed in a woman's all-hiding burqa, to form up a terrorist cell with six other AQAP terrorists. Saudi security busted the cell, with Youssef al-Shihri and another jihadi killed in a gun battle; three explosive belts were found in their car, suggesting they were close to performing an operation. Said al-Shihri, who shared the spotlight with Mohammed al-Awfi in the AQAP video, is now second-in-command of AQAP, placing him on the Saudi "most wanted" list.
The Christmas Day attack on the jetliner over Detroit was an embarrassment to the Obama Administration, with critics claiming Obama was weak on terrorism and leading to declarations by the president of renewed vigilance. It is clear that Obama was sincere in wanting to shut down Gitmo, but the difficulty in doing so only seems to be increasing steadily. 22 January 2010 passed and Gitmo is still in business.
* After a disastrous earthquake in Haiti that led to massive casualties and a breakdown in civil order, international emergency response organizations converged on the Caribbean nation to help. Of course, the US military was one of the big players in the operation.
WIRED Online had an interesting note about how the military was assisted in the operation by the Pentagon's "Transnational Information Sharing Cooperation (TISC)" network, the latest iteration of what was once called the "Asia-Pacific Access Network (APAN)". TISC is not a battlefield network by any means; it's much more like a social networking software system, based on the public internet rather than on secure military networks. TISC provides tools like file-sharing, blogs, wikis, calendaring, and message boards to interconnect users.
TISC was developed by the Pentagon's "Defense Information Systems Agency (DISA)". The concept was to use local internet connections and a software architecture that was easy for naive users to learn, in order to aid collaboration between the US military and other players. Over 1,700 users were hooked up to TISC in the Haiti operation, many of them with civilian relief organizations. TISC allows the players to identify problems and direct resources appropriately. Despite the damage to the communications network in Haiti, DISA says that TISC has proven very robust.
* GIMMICKS & GADGETS: European Union (EU) officials have occasionally made noises about a concept to set up solar-power arrays in the Sahara to supply Europe with "green" electricity over high-voltage DC (HVDC) lines strung across the Mediterranean. Sounds like a blue-sky scheme, but according to the DISCOVER magazine online blogs, a "Desertec Industrial Initiative (DII)" has been set up to make it happen.
The DII was put together under the leadership of Munich Re of Germany, the world's largest "reinsurance" company -- that is, it provides insurance to other insurers. Apparently Munich Re company officials felt that investing in green power was good insurance for the future and persuaded twelve players, including European industrial giants like ABB and Siemens, to sign up. The DII is talking about a $400 billion USD investment to provide 500 gigawatts of electrical power, 15% of Europe's demand, from Sahara sun by 2050.
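As a quick sanity check on those figures, here is a back-of-the-envelope sketch using only the numbers quoted above; the variable names are illustrative, not from any DII document:

```python
# Back-of-the-envelope check on the Desertec numbers quoted above:
# a $400 billion USD investment for 500 gigawatts of generating capacity.

investment_usd = 400e9   # $400 billion USD
capacity_watts = 500e9   # 500 gigawatts

cost_per_watt = investment_usd / capacity_watts
print(f"Implied cost: ${cost_per_watt:.2f} per watt of capacity")
# prints "Implied cost: $0.80 per watt of capacity"
```

Under a dollar per watt of capacity sounds attractive, but that spending would be spread over a build-out running to 2050, so the figure is at best a rough gauge of plausibility.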
The group is looking at solar turbogenerator plants, using arrays of mirrors to heat a working fluid and drive a steam turbine. Heat could be stored in thermal reservoirs, for example molten salt, to keep the turbine running when the Sun goes down. In case of bad weather, the turbine could be kept spinning by a backup system driven with natural gas or biogas. The electric power would be transmitted to Europe via efficient HVDC trunk lines.
Morocco, Algeria, Tunisia, and Egypt have been in discussions with the DII and proposals are being floated for deals. If all goes well, the first desert plant may be running by 2015, though that's clearly optimistic.
* An article from BUSINESS WEEK reports that German engineering giant Siemens has carved out its niche in wind power. While the company has only 7% of the global wind power market -- well behind market leaders like Vestas and General Electric -- it dominates the new frontier of offshore wind power systems. Setting up offshore wind power farms is of course technically more difficult than setting them up on land, both in terms of the turbine installation and the power connections. Siemens has long been a major player in electrical power distribution and so has close ties with European electrical utilities; that translates into an advantage over competitors in setting up the power distribution link for offshore power.
Siemens got into the offshore wind power business in 2004 by acquiring a firm named Bonus Energy that built offshore wind turbines. Since that time, Siemens Wind Power has grown sevenfold, with revenues in the billions of dollars and profit margins running well above ten percent. One of the big current projects is a $4 billion USD deal to set up 500 offshore wind turbines for the Danish power utility DONG.
Still, at present offshore wind power capacity is only about 1% of that of land-based wind power. Offshore demands more elaborate power infrastructure and more reliable wind turbines, which means that offshore wind has an inherent cost penalty; it remains a viable business because it doesn't have the same worries about siting and public hostility as land-based wind power. Siemens wants to become number-three in global wind power by 2012, which means taking on the market leaders on their own ground -- while the market leaders are in turn eyeing Siemens' profitable niche in offshore wind power.
* Another article in BUSINESS WEEK discussed how Dutch electronics giant Philips is trying to exploit the company's early foothold in LED lighting technology. Although high-efficiency LED lights are still expensive relative to the current fluorescent standard -- LEDs were only 3% of the world lighting market in 2008 -- Philips knows the price of LEDs is on a strong downward ramp, and the efficiency of LEDs is likely to lead to their dominance. LEDs only make up about 10% of the Philips Lighting subsidiary's business at present, but the proportion is expected to rise to 80% by 2020.
Of course as LED prices go down and volumes go up, competition will increase while profit margins fall. Philips Lighting is trying to obtain an advantage for the future by addressing the full spectrum of customer lighting needs -- not merely components and fixtures, but systems, installations, and even financing. For example, Philips Lighting worked with Marriott Corporation to provide the lighting system for a 34-story hotel being built in Indianapolis, to open in 2011. Philips provided custom-designed LED fixtures, engineered the electrical and control system, calculated energy savings, and provided financial advice. However, although Philips is currently the boss in the LED lighting market, the competition -- Siemens, GE, Sharp, Samsung, and Cree -- is not idle and is guaranteed to give Philips a run for the money.
* As reported by a note in AAAS SCIENCE, two researchers at Rensselaer Polytechnic Institute (RPI) in New York State have come up with a fully biodegradable replacement for styrofoam packing material. They noticed how the tangled roots of mushrooms, the "mycelium", twined around forest debris and bound it together -- an observation that inspired the researchers to come up with a scheme to grow mushrooms on a bed of rice, cottonseed, hazelnut, and buckwheat hulls. The mycelia bind the hulls together into a lightweight composite that can be dried in an oven and cut into packing pieces as desired. The inventors call their bioproduct "EcoCradle"; they claim it is as light and cheap as polystyrene, while being fully biodegradable.
* OA-X IN THE WORKS: The Pentagon's effort to obtain a light turboprop attack aircraft, last mentioned here a few months back, apparently was not regarded with enthusiasm by all US Air Force (USAF) brass, the general perception being that Defense Secretary Bob Gates was pushing the program whether the blue-suiters liked it or not. However, an article on the proposal in AIR FORCE magazine ("The Light Attack Aircraft" by Marcus Weisgerber, January 2010) suggests resistance is fading out.
The operational tempo in Iraq and Afghanistan has been hard on the Air Force's F-15 and F-16 fighters, wearing them out much faster than projected. These aircraft aren't really well suited to the current mission, which involves flying extended patrols -- keeping an eye on things with the visible-light and infrared cameras in the fighter's targeting pod, and then performing a strike when needed. This is basically an "armed surveillance" function and something like an F-15 is major overkill for the job.
A light turboprop attack aircraft like the Hawker Beechcraft AT-6B or EMBRAER Super Tucano would more than pay for itself in such a role by eliminating the costly wear and tear on the F-15s and F-16s. In other words, the "military depreciation" in value of these expensive jets in current operations is more than the pricetag of a Super Tucano. The OA-X would not only be cheaper to buy and operate, it would have longer endurance as well, which would translate into still lower costs because it wouldn't need tanker flights to keep it in the air. Air Force brass pushing the program think that the OA-X could perform a number of other roles, such as search & rescue or homeland defense, and believe the program is more likely to be successful if it does. Since the Air National Guard is faced with a decline in numbers of aircraft, the OA-X would also help keep Guard pilots flying.
The USAF wants to obtain 15 OA-X aircraft in Fiscal Year 2011, with the type going into service as early as 2013, and the long-range buy to be up to a hundred machines. The fast-track program dictates an "off-the-shelf" aircraft, though with gear fitted as per USAF requirement. Specs include carriage of a Gatling-type machine gun, though the caliber wasn't mentioned, and fit of four stores pylons, each able to haul ordnance as heavy as a 225 kilogram (500 pound) guided bomb. The aircraft also needs to have a sensor / targeting system for observation and guiding smart munitions, as well as a defensive countermeasures suite -- one of the early objections of Air Force brass to the OA-X was that it would be too easy to shoot down. The spec does not actually dictate turboprop propulsion, but a turboprop is generally seen as more consistent with the spec than turbofan propulsion. There will likely be a "fly off" for selection of the winning aircraft.
* One of the parallel developments that would give the OA-X a considerable boost is the development of small smart weapons, last discussed here in June 2009. Small smart weapons mean lower cost and a significant reduction in the risk of killing innocent bystanders. They could be carried by smaller unmanned aerial vehicles (UAVs), and they could be hauled in numbers by the OA-X while allowing the aircraft to carry external fuel tanks as well, further extending its endurance.
Lockheed Martin has been working on a little smart bomb, the "Scorpion". It has a weight of about 10 kilograms (33 pounds), four popout tailfins, and a pivoting wing. The default target seeker system is laser homing, but the Scorpion is designed to support swap-in seeker modules based on other technology, such as millimeter wave radar. Lockheed Martin has designed a carrier that can accommodate three Scorpions on a Hellfire anti-tank missile launch rail, with the interface protocols emulating the Hellfire's. The Scorpion's form-factor is actually similar to that of an air-dropped parachute flare, and so the little bombs can also be carried in a four-tube SUU-25 pylon-mounted flare dispenser pod, with two bombs per tube for a total of eight bombs per pod.
Along roughly similar lines, General Dynamics has been working on a "roll controlled fixed canard (RCFC)" fuze that can be screwed into the nose of an 81 millimeter mortar round in place of a traditional fuze, providing it with GPS guidance backed up by an inertial guidance system, using moving fins on the fuze body for glide control. The RCFC fuze won't tolerate the shock of being fired out of a mortar; it is instead intended to adapt mortar rounds for carriage by small UAVs, the weight of the round with the fuze being only about 5 kilograms (11 pounds). There is the minor disadvantage that mortar rounds aren't designed for airdrop -- they don't have carriage lugs for hookup to a stores rack -- requiring the development of a clamp-type stores carriage system for the UAV.
* POWER FROM SEWAGE: The concept of using a "digester" system to convert animal manure into methane-rich "biogas", which is then burned to provide heat and electric power, has been discussed here in the past. In addition to producing energy, the process converts the methane that is normally generated by the decay of manure into carbon dioxide -- reducing the manure's effect on global warming, since methane is about 20 times more potent a greenhouse gas than carbon dioxide.
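The arithmetic behind that claim can be sketched as follows, assuming the article's factor of 20 for methane's greenhouse potency and standard molecular weights (CH4 = 16, CO2 = 44); the function names are illustrative:

```python
# Rough CO2-equivalent comparison for burning methane versus venting it.
# Assumes the article's factor of 20 for methane's greenhouse potency.

GWP_METHANE = 20.0  # warming potency relative to CO2, per the article

def co2_equivalent_vented(methane_kg):
    """CO2-equivalent of methane released directly to the atmosphere."""
    return methane_kg * GWP_METHANE

def co2_equivalent_burned(methane_kg):
    """CO2-equivalent after combustion: CH4 + 2 O2 -> CO2 + 2 H2O.
    Each 16 kg of CH4 yields 44 kg of CO2."""
    return methane_kg * (44.0 / 16.0)

if __name__ == "__main__":
    vented = co2_equivalent_vented(1000.0)   # one tonne of methane
    burned = co2_equivalent_burned(1000.0)
    print(f"vented: {vented:.0f} kg CO2e, burned: {burned:.0f} kg CO2e")
    # burning cuts the CO2-equivalent by a factor of about 7
```

In other words, even before counting the energy it yields, flaring or burning digester gas is a substantial greenhouse win over letting the methane escape.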
To no great surprise, as reported by THE ECONOMIST ("The Seat Of Power", 2 January 2010), there's a parallel effort in progress to do the same thing with human sewage. Germans already obtain power from 60% of their sewage, while the Czechs, British, and Dutch are close behind. One estimate claims that Britain will obtain power from 75% of the nation's sewage by the end of 2010, producing enough electricity for 350,000 homes. That's not energy independence by any means, but it's a hefty business in itself.
The critical path of the process is the production of as much methane as possible as cheaply as possible. A British concern named GENECO, associated with the Wessex Water utility company, has come up with a process in which the sewage is heated to 40 degrees Celsius (104 degrees Fahrenheit) for several days, and then transferred to a tank maintained at 35 degrees Celsius (95 degrees Fahrenheit). The two-stage process promotes the action of different and complementary strains of digestive bacteria to increase methane production by about 30%. It does require an input of energy for heating, but that can be obtained by tapping off a portion of the additional methane, making the process self-sustaining.
A team at the Fraunhofer Institute in Germany has taken a different approach, accelerating the digestion of sewage to completion in one week instead of the usual two by using pumps to stir the mix.
The Fraunhofer pump system is now in use in several dozen sewage treatment plants around Europe. The pumps do require energy, but they're only run a few hours a day. Researchers at the Tema Institute of Linkoping University in Sweden are also working on a stirring scheme, but they're using ultrasonics to perform the agitation. However, at present the ultrasonic system demands more energy than the process produces.
While the toilet does turn out to be a surprisingly impressive source of energy, there's also a substantial flow of food scraps and other wasted food that rivals or even exceeds the toilet's contribution. Processing waste food has become common at US sewage treatment plants, thanks to a collaboration between the US Department of Energy and the US Environmental Protection Agency to promote the measure. Unfortunately, digestion of waste food is currently discouraged in Britain by the fact that waste food has to be pasteurized before being digested. It is likely this rule will be amended to permit better use of this under-appreciated resource.
* THE TAMING OF THE CAT (2): How humans and cats came to live together is an interesting question. Many domestic animals originally lived in packs and herds, the dog being a significant example, and being social animals they don't find the society of humans too much of an adjustment. Cats, aside from lions, are notoriously unsociable and independent, operating in small familial groups at most. In addition, while most domesticated animals live partly or wholly on plant food, cats effectively only eat meat -- they don't even have an ability to taste sweets -- and meat is relatively expensive. Finally, while there's a debate over just how intelligent cats really are, at least compared to dogs, nobody disputes that they're hard to train. A dog used in movies can be trained to perform multiple tricks; training cats means obtaining several cats that look alike and teaching each of them one trick. The general impression is that it's hard to teach cats tricks not because they're stupid, but because they only do what they damn well feel like doing.
All that suggests, as cat lovers tend to believe, that cats decided to adopt us instead of the other way around. Humans simply presented opportunities for cats to exploit, or more specifically did so after the beginning of the agricultural revolution. Humans began to stockpile grain and other plant foods; those stockpiles attracted vermin, particularly the house mouse, Mus musculus domesticus, which originated on the Indian subcontinent. The most ancient known human grain stores, from Israel and dating back 10,000 years, include remains of the house mouse. The spread of the house mouse was entirely dependent on human agriculture, since the mouse could rarely compete with local rodents outside of human habitations.
Humans from the New Stone Age were not noted for their tidiness, with settlements including trash dumps, which encouraged mice. Of course, mice can breed rapidly, and under such conditions swelling mouse populations would attract predators. Cats would not only find the mice around human settlements an attraction, they would also scavenge from the trash dumps themselves. Those cats that were tolerant of, or even had a degree of interest in, humans were likely to find such an environment more congenial than would those that weren't. Humans, on their part, didn't have a problem with cats hanging around to eat mice, snakes, and other troublemakers.
Indeed, humans clearly became fond of cats -- or at least many did, since even today a good number of people absolutely despise the creatures. People began to take them in for companionship, at least of a sort, with the increasingly tamed cats demonstrating some degree of the features of domestication, most importantly "neoteny", the retaining of childish features in adult animals. This is well-known in dogs, who retain features as adults only found in the puppies of ancestral wolves, such as the upturned tail and puppylike behavior. In cats, this neoteny is exhibited in features that humans find "cute" and "babyish", such as big eyes, snub face, and round forehead.
But why was only the Middle Eastern lineage of the wildcat domesticated? It appears that European and Chinese wildcats are much more leery of humans and so were reluctant to approach settlements, even given the temptation of large numbers of mice. The Central Asian and southern African wildcats are friendlier, but agriculture took hold much later in those regions than it did in the Middle East, by which time the domestic cat had spread its range, roaming or being taken from settlement to settlement. [TO BE CONTINUED]
* THE KILLING OF JFK -- THE HSCA REPORT: Following the Warren Report, there were other US government investigations into the JFK assassination. In 1968, a panel under Attorney General Ramsey Clark reexamined various forensic evidence relating to the JFK assassination and generally backed up the Warren Report.
From 1976 to 1978, the US House of Representatives conducted its own investigation of the assassination of JFK, as well as that of the 1968 murder of civil rights activist Dr. Martin Luther King Jr. The "House Select Committee On Assassinations (HSCA)" published its report in 1979. The report was critical of the FBI, the CIA, and the Warren Commission; most importantly, while the HSCA report claimed that Oswald had in fact shot three rounds from his sniper's nest, it also said a fourth shot had been fired, presumably from the grassy knoll, but it had missed. The conclusion was that there had been a conspiracy, though the report didn't speculate on who was involved -- and in fact flatly said there was no real evidence that the CIA, FBI, or the Secret Service had taken part in any such conspiracy.
The HSCA's dramatic conclusion concerning the fourth shot was derived from a recording made by the Dallas Police Department of a radio transmission during the assassination, using a "Dictabelt", an electromechanical recording system commonly used in the contemporary "Dictaphone" system for taking dictation. It featured a needle scribing a celluloid belt. The recording was of a police radio mounted on a motorcycle; the radio had been inadvertently left in the transmit position, with police providing verbal time checks during the recording.
The recording did not actually include the sound of a fourth shot as such; it did feature an "impulse" that could be seen on spectral analysis of the audio that was interpreted as a shot. However, porn king Larry Flynt published the Dictabelt audio on a flexible phono record in his GALLERY magazine in July 1979. An Ohio rock drummer named Steve Barber listened to the recording and noted that during the interval in which the HSCA report said the shots were being fired, he could hear chatter among the law enforcement officers involved reacting to the shootings, placing the supposed shots a minute after the actual shootings took place.
There were further criticisms of the HSCA's interpretation of the recording from Dallas police officers who had been present at the assassination, saying the scenario of events envisioned by the HSCA simply did not correspond to what had happened. One revealing hint was that the motorcycle heard in the radio transmission was a tricycle motorcycle, not one of the two-wheelers accompanying the motorcade. The two types of vehicles had easily distinguished sounds, and a police dispatcher concluded the stuck radio was on a trike operated by an officer at the Dallas Trade Mart, where the motorcade was to end up.
In 1982, the US Justice Department set up a panel from the US National Academy of Sciences (NAS) to examine the acoustic evidence to see if it had any merit; the NAS report flatly rejected it. A counter-rebuttal was published by conspiracy theorists in 2001, leading to a volley of counter-counter rebuttals -- most prominently one released by ABC NEWS in 2003 that once again discredited the acoustic evidence. Another study released roughly at the same time and funded by the COURT TV channel showed that application of the same methodology to other acoustic samples, used as experimental controls, identified "gunshots" in random acoustic noise.
* Although conspiracy theorists still like to cling to the acoustic evidence, it isn't entirely helpful to them even if it's accepted as valid. The problem is that the HSCA said the fourth shot had missed, while conspiracy theorists had generally determined the existence of the "grassy knoll" shooter from JFK's wounds and the gruesome bullet impacts recorded on the Zapruder film, which of course implied that the second shooter scored hits. As far as the validity of the acoustic evidence goes, a disinterested observer might be forgiven for wondering if a recording that was of low quality to begin with, doesn't actually play back the audible sound of a shot, and which has produced nothing that resembles a consensus of opinion after over three decades of examination should be given much more credence than, say, supposed voices of the deceased in audio recordings.
Such is the story in general of the legacy of the JFK assassination -- claims and counterclaims traded until dramatic revelations simply disappear into a turbulent gray fog. There's a certain black humor in it that would be easier to appreciate if it wasn't so frustrating.
Since the HSCA and the Justice Department rebuttal of the acoustic evidence, there have been no formal US government investigations into the JFK assassination. There have been endless books, proclaiming that Kennedy was assassinated by the CIA, the FBI, the Mob, the Cubans, even by a secret committee directing US relations with space aliens. Probably the most elaborate private investigation of the assassination was a mock trial conducted in the UK by London Weekend Television. It featured high-profile American lawyers Vincent Bugliosi, for the prosecution, and Gerry Spence, for the defense, with an American judge presiding. Actual witnesses were brought from the US to testify in the trial. The jury found Oswald guilty. Bugliosi went on to continue his investigations into the assassination, publishing a book titled RECLAIMING HISTORY in 2007 that fleshed out the case against Oswald. [TO BE CONTINUED]
* SPACE NEWS: Space launches for December included:
* 5 DEC 09 / WIDEBAND GLOBAL SATCOM 3 -- A US Delta 4 booster was launched from Cape Canaveral to put the US Department of Defense's "Wideband Global/Gapfiller Satcom 3 (WGS 3)" geostationary comsat into space. WGS 3 was built by Boeing and was based on the company's BSS-702 comsat platform. The satellite had a launch mass of 5,805 kilograms (12,800 pounds), carried a payload of Ka-band and X-band transponders, and featured a spot beam capability. The satellite was placed in the geostationary slot at 12 degrees West longitude to support the US military European and African commands. The Delta 4 was in the "Medium+ (5,4)" configuration, with four solid rocket boosters.
* 09 DEC 09 / YAOGAN 7 -- A Chinese Long March 2D booster was launched from Jiuquan to put the "Yaogan 7" Earth observation satellite into orbit. While Chinese sources claimed it was a civil remote sensing satellite, it was believed to be a military reconnaissance satellite.
* 14 DEC 09 / GLONASS 730, 733, 734 -- A Russian Proton M Block DM2 booster was launched from Baikonur to put three GLONASS navigation satellites into orbit. They were designated "Cosmos 2456 / GLONASS 730", "Cosmos 2457 / GLONASS 733", and "Cosmos 2458 / GLONASS 734".
* 14 DEC 09 / WISE -- A US Delta 2 7320 booster was launched from Vandenberg AFB to put the NASA "Widefield Infrared Survey Explorer (WISE)" astronomy satellite into orbit. WISE had a launch mass of 662 kilograms (1,460 pounds) and carried a 40 centimeter (16 inch) wide-field infrared survey telescope, with a mirror made of gold-coated aluminum. The telescope featured an imaging system operating in four infrared bands and with a resolution of 4 million pixels. The telescope was fitted inside a cryostat containing solid hydrogen to cool it to very low temperature, permitting high sensitivity. Images were to be obtained every 11 seconds over a field of view 47 arc-minutes wide. The spacecraft itself was built by Ball Aerospace, with the instrument provided by the Space Dynamics Laboratory of Logan, Utah. The mission was planned to last for nine months, until the coolant gave out.
* 15 DEC 09 / YAOGAN 8, XIWANG 1 -- A Chinese Long March 4C booster was launched from Taiyuan to put the "Yaogan 8" Earth observation satellite into orbit. The flight also carried the 50 kilogram (110 pound) "Xiwang 1 (Hope 1)" amateur radio satellite as a secondary payload.
* 18 DEC 09 / HELIOS 2B -- An Ariane 5 ECA booster was launched from Kourou to put the French "Helios 2B" military surveillance satellite into a near-polar Sun-synchronous orbit. The primary payload of the 4,200 kilogram (9,260 pound) satellite was a high-resolution imaging system with both daylight and infrared capabilities. The spysat also carried a wide-field imager for mapping applications, with this instrument derived from a predecessor flown on the civilian "Spot 5" Earth remote sensing satellite. Helios 2B followed "Helios 2A", launched in 2004, and "Helios 1A", launched in 1995.
Helios 2B was built by EADS Astrium for the DGA, the French defense procurement agency, with Thales Alenia Space providing the high-res imaging system. While France provided most of the funding for the spysat, Belgium, Spain, Italy, and Greece each had a 2.5% stake in the program. The partners received intelligence from the satellite, with Italy providing France information in return from the Italian COSMO-SkyMed radar spysat system. Germany, though not a funding partner for Helios 2B, also received intelligence from the satellite, while providing France with intelligence from the German SAR-Lupe radar spysat system.
The six nations have plans for a unified military space surveillance system, the "Multinational Space-Based Imaging System (MUSIS)", with Sweden and Poland interested in joining. The current plan is for France to lead the optical-infrared segment of MUSIS, while Germany and Italy work on the radar segment. The French are also working on a commercial space imaging system named "Pleiades", with the first two satellites to be launched in 2010.
* 20 DEC 09 / SOYUZ TMA-17 (ISS) -- A Russian Soyuz booster was launched from Baikonur to put the "Soyuz TMA-17" manned space capsule into orbit on an International Space Station (ISS) support mission. The crew included Oleg Kotov of the RKA Russian Space Agency, Soichi Noguchi of the JAXA Japanese space agency, and Timothy Creamer of NASA. This was Kotov's second flight, the cosmonaut having already done a stint on the ISS in 2007, while Noguchi was on a NASA shuttle flight in 2005; Creamer was a rookie. Soyuz TMA-17 docked with the Russian Zarya module on 22 December 2009. The three spacefarers joined the ISS "Expedition 22" crew of commander Jeffrey Williams of NASA and Maxim Surayev of the RKA.
* 28 DEC 09 / DIRECTV 12 -- A Proton M Breeze M booster was launched from Baikonur to put the "DirecTV 12" direct-to-home TV geostationary comsat into orbit for the DirecTV company. DirecTV 12 was built by Boeing and was based on the company's BSS-702 comsat platform. The satellite had a launch mass of about 5,900 kilograms (13,000 pounds), carried a payload of 131 Ka-band transponders, featured two 2.8 meter (9.2 foot) dish antennas plus nine smaller dishes, and had a design life of 15 years. It was placed in the geostationary slot at 102.8 degrees West longitude to provide HDTV service to the continental US, Alaska, and Hawaii. It joined two similar DirecTV satellites, launched in 2007 and 2008.
* OTHER SPACE NEWS: In early December Richard Branson's Virgin Galactic commercial space company rolled out the first "WhiteKnightTwo" carrier aircraft for the firm's "SpaceShipTwo" passenger-carrying suborbital spacecraft. While Virgin Galactic is publicly focusing on the "space tourist" market, the company is not ignoring the low-cost satellite launch market, having begun studies of a "LauncherOne" booster to be carried aloft by the WhiteKnightTwo and capable of putting a 50 kilogram (110 pound) satellite into orbit for $2 million USD. The US Air Force's Operationally Responsive Space program has expressed strong interest in the LauncherOne concept.
* Once upon a time, Japan actually had two space agencies: the "National Space Development Agency (NASDA)", which was essentially a space technology organization, and the "Institute of Space & Astronautical Science (ISAS)", which was a space science organization. The two organizations had their own boosters and launch facilities. While some believed the dual approach had its advantages, the duplication of effort finally led in 2003 to the consolidation of Japan's government space activities in one organization, the "Japan Aerospace Exploration Agency (JAXA)".
Before ISAS became part of JAXA, its largest booster had been the medium-lift solid-fuel "M-V" launcher. It was only launched seven times in nine years, and unsurprisingly that led to a very high per-launch cost of about $80 million USD. Now JAXA is proposing to develop a replacement, the three-stage "Advanced Solid Rocket (ASR)", that would be able to put 1.2 tonnes (1.3 tons) into low Earth orbit. That's only about two-thirds the lift capacity of the M-V, but the ASR is planned to cost only about $30 million USD per launch.
Work on the ASR is now underway at Ishikawajima-Harima Heavy Industries LTD (IHI). The first stage is to be based on the solid rocket booster developed for the Japanese H-2 / H-2A booster, while the second stage is to be based on the third stage of the M-V, and the third stage is to be based on the fourth stage of the M-V. The upper stages will have some improvements and more thrust than their M-V predecessors. Avionics will be leveraged off the H-2A. The launch mass of the ASR will be about 90 tonnes (99 tons), compared to 140 tonnes (154 tons) for the M-V, and will have a height of 24 meters (78.7 feet), compared to 30.8 meters (101 feet) for the M-V.
The ASR will be designed to be easy to handle and fly, with considerable on-board autonomy greatly reducing the number of personnel needed to supervise a launch. Initial flight of the ASR is expected in 2012 or 2013. JAXA would like to then develop an even cheaper follow-on, with a launch cost of only about $20 million USD.
* GRAPHENE FOR THE FUTURE? A few decades ago, chemists and materials researchers began to get a handhold on the possibilities of graphite, the form of carbon based on sheets of hexagonal cells of carbon atoms like microscopic chicken wire. They discovered how the sheets could roll up into balls to create "buckminsterfullerene" or "buckyballs", and into tubes to form "carbon nanotubes". Five years ago, single-layer continuous sheets of graphite known as "graphene" became stars in the field.
There was some suspicion that graphene was a fad that would go away after its 15 minutes of fame, but as reported by an article from AAAS SCIENCE ("Carbon Sheets An Atom Thick Give Rise To Graphene Dreams" by Robert F. Service, 15 May 2009), graphene seems to be hanging on handily. Papers on graphene and Google hits on the subject have been proliferating at a rate that doesn't seem to be falling off. Graphene is a miracle material, a sheet only an atom thick that's incredibly strong, but as flexible as plastic wrap. It is also an excellent thermal and electrical conductor, with very high electron mobilities.
Since graphite is effectively piles of graphene sheets, it seemed possible in principle as far back as the 1980s to slice graphite and get single sheets, but researchers had trouble figuring out how to do more than obtain stacks of graphene about a hundred layers thick. In 2004, a team at the University of Manchester in the UK under physicists Andre Geim and Konstantin Novoselov, working with colleagues at the Institute Of Microelectronics Technology & High Purity Materials in Chernogolovka, Russia, reported they had figured out a way to obtain single graphene sheets. The technique turned out to be almost embarrassingly simple: Novoselov placed a fleck of graphite between two layers of cellophane tape, peeled them apart, and kept on doing it until he got down to a single layer.
Even in the initial 2004 paper, Geim had his eye on applications. Graphene had amazingly high electron mobilities, no doubt because of the regularity of the molecular arrangement and the lack of "defects" to interfere with electron conduction. Early experiments showed that it had electron mobilities an order of magnitude greater than that of silicon, the current solid-state workhorse, and even incrementally greater than that of gallium arsenide (GaAs), the solid-state material used in high-speed communications circuits. Later investigations with deeply cooled graphene were able to obtain electron mobilities two orders of magnitude greater than silicon.
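The "order of magnitude" comparisons can be put in rough numbers. The mobility values below are representative round figures chosen for illustration, not numbers from the article:

```python
# Representative electron mobilities in cm^2/(V*s); illustrative
# ballpark values only, not measurements cited by the article.
mobility = {
    "silicon": 1_400,
    "GaAs": 8_500,
    "graphene (room temp)": 15_000,
    "graphene (deeply cooled)": 200_000,
}

si = mobility["silicon"]
for material, mu in mobility.items():
    # Show each material's mobility as a multiple of silicon's.
    print(f"{material}: {mu:,} cm^2/(V*s) -- {mu / si:.0f}x silicon")
```

With figures like these, room-temperature graphene comes out roughly ten times faster than silicon, and cooled graphene more than a hundred times faster, matching the qualitative claims above.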
* Early on, the Manchester researchers were able to attach electrodes to graphene to create a crude "field effect transistor (FET)", and since that time the major focus on the material has been for its possible applications in solid-state electronics. Conventional silicon technology has advanced tremendously in the past decades, but now it is gradually running into diminishing returns, with reductions in size and improvements in performance demanding ever more effort. One possible way to improve on current transistor technology is to come up with a faster material for the "channel" that links the input and output electrodes of the transistor. Graphene looks like a good candidate since it conducts electrons so easily, but it has the problem that it's hard to shut off current flow. In digital electronics, transistors are used as ON/OFF switches, either allowing or shutting off the flow of current, but so far it's proven troublesome to figure out how to make graphene devices that pass substantially less current in the OFF state than they do in the ON state.
Graphene may be useful in the shorter term for analogue electronics, where transistors vary their current flow over a continuous range, in generally the same way as a volume knob on a stereo. IBM researchers have been able to fabricate graphene transistors for analogue applications that have good performance, though they're still not the equal of devices made of GaAs or other high-speed semiconductor materials. The researchers feel they can do better.
There's also work on getting graphene transistors to work as proper switches. It was known even before 2004 that ultranarrow strips of single-layer graphite had properties that made control of electron flow easier; researchers have made progress in fabricating such strips and building devices from them. However, nobody's built really practical graphene electronic devices just yet, much less tackled the next problem of how to mass-produce them.
* That leads to the troublesome issue of mass-producing graphene itself. While obtaining small samples of graphene by stripping two pieces of tape apart was workable for early lab studies, nobody in their right mind would consider that a practical approach for mass production. However, progress is being made in producing large-scale graphene sheets.
One approach has been to cook wafers of silicon carbide, with the silicon on the top boiling off and leaving a layer of graphene behind. The trouble with this scheme is that the graphene layer remains stuck to the silicon carbide wafer, there being no practical way to peel it off. Researchers have been able to use a "chemical vapor deposition" technique to lay down nickel thin films on top of the wafer, with the nickel layer then peeled off along with the graphene, and transferred to a plastic substrate. Workers elsewhere have been able to grow graphene on thin copper foils. There's optimism in the field that large-scale graphene production is at least possible.
However, few working in the field are guaranteeing that anything useful will come out of graphene research. Graphene is only one of a wide range of different materials being examined by materials researchers, and not all the materials are likely to actually live up to their promise. Despite that warning, for the time being graphene researchers remain very excited about the potential of their work.
* FALSE CLUES? It has long been known that the genome is loaded down with what's been called "junk DNA": along with the sets of genes that honestly perform functions, the majority of the genome doesn't seem to do anything. There's been some unhappiness with the term "junk", since though some of the genome clearly is nonfunctional -- for example, our genome contains the broken genetic sequences of a number of viruses, inserted by viral infections in the distant past -- there's a strong suspicion that some of the "junk" actually has a function of some sort.
One of the reasons this has been believed is because some supposedly nonfunctional sequences in the genome are "conserved". Mutations are always cropping up in the genome; if they do the organism some good, they're generally carried along to new generations, while if they do harm they gradually fade out of the gene pool. Truly nonfunctional DNA, so the thinking went, would mutate without any constraint, but if it had some function then selection pressures would conserve it.
That sounds plausible on the face of it, but nature has no concern for whether we think something is plausible or not. As reported by an article in AAAS SCIENCE ("Genomic Clues To DNA Treasure Sometimes Lead Nowhere" by Don Monroe, 10 July 2009), the reality is turning out to be subtler, in a particularly confusing way.
As a case in point, for several years Eddy Rubin, Len Pennacchio, and colleagues at the US Lawrence Berkeley National Laboratory (LBNL) in California have been examining genomes for "enhancer" sections that regulate gene activity. Finding enhancers is hard because they're not generally located near the genes they control, and so the LBNL researchers focused on "ultraconserved" sections of DNA about 200 base pairs long that were not identified with any known function and were common to rats, mice, and humans -- plus some others that were similar even in fish. Such conserved sequences obviously had some function, right?
The strategy seemed to work. When the researchers spliced the supposed enhancer sequences into mice, more than half turned on a corresponding "reporter" gene in particular tissues at specific developmental stages. So far so good, but then the LBNL team focused on four particularly promising candidates, knocking them out of the mouse genome to see what would happen. The expectation was that the deletions would be lethal, but to the surprise of the team, the deletions made no difference -- the mice seemed perfectly normal.
For another example, the most extensive data relating genome function and conservation come from the 2007 results of the pilot phase of an effort by the "Encyclopedia Of DNA Elements (ENCODE)" consortium, which focused on a selected 1% of the human genome. Of this sample, the ENCODE team estimated that about 5% is conserved to some degree, which is consistent with the generally assumed ratio for the human genome as a whole. Of this conserved fraction, the researchers found that no more than a third matches with protein-coding regions. Most of the remaining conserved DNA is actually transcribed into RNA by the cell, which suggested that it really does have some function. However, attempts to trace the action of these conserved sequences led to confusing results -- with about 15% of the conserved DNA showing no particular function at all.
On the other side of the coin, a decade ago Lawrence Hurst and Nick Smith, two researchers at the University of Bath in the UK, showed that a number of critical genes in mice -- if the mice lose these genes, they die -- had accumulated mutations at a rate indistinguishable from that of clearly nonfunctional genes. This was such an outrageous idea that Hurst and Smith were told it was baffling why they would even want to investigate the matter, and their results were generally ignored. Now other studies have found no correlation between the degree of conservation of a sequence and its function, even for clearly critical genes in yeast, and the idea is being taken much more seriously.
* The correspondence between genetic function and conservation seems to be muddled both ways, with investigators beginning to suspect not merely that conservation doesn't necessarily imply function, but that function doesn't necessarily imply conservation. The result is that researchers are taking a step back to get a better understanding of how the genome actually works. Obviously something has been misunderstood -- but what? Scientists are now in idea-generating mode and have come up with some interesting possibilities.
The notion that functional genes tend to be conserved hasn't been thrown on the scrapheap, but it has become apparent that the vision we had of the process was much too simplistic. There's something going on that makes sense, we just have to figure out what. As Einstein once put it: "Subtle is the Lord, but malicious he is not."
* THE TAMING OF THE CAT (1): It has always been a bit ambiguous whether the common housecat is actually a domesticated animal. To be sure, the housecat often pays its way on farms by catching mice, and is at least occasionally a source, though rarely a prized one, of meat and fur -- but sometimes it seems more like cats have domesticated us than the other way around.
As discussed in an article in SCIENTIFIC AMERICAN ("The Evolution Of House Cats" by Carlos A. Driscoll, Juliet Clutton-Brock, Andrew C. Kitchener & Stephen J. O'Brien, June 2009), we share our homes with an estimated 600 million housecats worldwide -- but how this arrangement came to be has long been unclear. It was long assumed that the Egyptians were the first to keep cats, from about 3,600 years ago; they certainly did keep cats at that time, but new discoveries now suggest cats had been living among humans well before that.
The modern housecat has generally been thought to be the descendant of the Old World wildcat, Felis silvestris, but there was some doubt. In addition, F. silvestris is widely distributed, from Scotland to South Africa and from Spain to Mongolia, with a considerable number of subspecies over that range. If the housecat did descend from the Old World wildcat, which subspecies was the founder? Were there multiple domestication events from different subspecies?
About a decade ago, a research effort was begun to obtain DNA samples from almost a thousand wildcats and domestic cats in southern Africa, Azerbaijan, Kazakhstan, Mongolia and the Middle East. The assumption was that local subspecies of wildcats had remained genetically stable over the past several thousand years, and so genetic comparison with housecats could reveal which wildcat subspecies were most closely related to the housecat.
The analysis, which was published in 2007, focused on two kinds of DNA commonly used in taxonomic analysis of mammals: DNA from the cellular organelles known as "mitochondria", and short, repetitive sequences of DNA from the cellular nucleus known as "microsatellites". Mitochondrial DNA is only inherited from a mother, not a father; microsatellites include contributions by both parents. The genetic patterns obtained were then organized by geography. The results revealed five "genetic clusters" or "lineages" -- breeds or races, if you like -- of Old World wildcats: the European wildcat (F. silvestris silvestris), the Near Eastern wildcat (F.s. lybica), the Central Asian wildcat (F.s. ornata), the southern African wildcat (F.s. cafra), and the Chinese desert cat (F.s. bieti).
The really interesting result of this analysis was that all domestic cats fell into the F.s. lybica group, which was as good a proof as anyone would like that cats were only domesticated once, in the Middle East.
But when did it happen? Geneticists can often figure out rough estimates of the time of divergence between different but related lineages of organisms by comparing the number of genetic changes between the two groups: the more changes, the longer ago the divergence. However, although such "genetic clocks" work well for determining relative ages, they don't work well for estimating times on as fine a granularity as 10,000 years, regarded as the very rough timeframe for cat domestication.
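The arithmetic behind such clocks also shows why they break down at short timescales; a minimal sketch, using a made-up substitution rate rather than any figure from the cat study:

```python
# Toy molecular clock: if mutations accumulate at a steady rate, the time
# since two lineages split is T = d / (2 * mu), where d is the fraction of
# sites that differ between the lineages and mu is the substitution rate
# per site per year. RATE below is an assumed illustrative value.
def divergence_time(diff_fraction, rate_per_site_per_year):
    """Estimated years since two lineages diverged, assuming a constant clock."""
    return diff_fraction / (2.0 * rate_per_site_per_year)

RATE = 1e-8   # substitutions per site per year, assumed

# A 2% sequence difference implies a split about a million years ago:
print(divergence_time(0.02, RATE))
# But a 10,000-year-old split corresponds to a difference of only 0.02% --
# a handful of sites in a typical gene, easily lost in random noise:
print(divergence_time(0.0002, RATE))
```

The deep splits between wildcat lineages leave plenty of signal; a domestication event a mere ten millennia ago barely registers.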
That means that we have to fall back on archeology to get a better idea of the history of the relationship between humans and cats. In 2004, Jean-Denis Vigne, an archaeologist at the National Museum of Natural History in Paris, and his colleagues reported a spectacular find in a grave on the Mediterranean island of Cyprus, dating back 9,500 years. The grave contained the skeleton of an adult human, buried with stone tools, seashells, and other items of presumable value -- including an adolescent cat. Cats were unlikely to have made it to an island on their own; it is more than merely suggestive to think that humans brought them to Cyprus, and valued their company enough to want to have one around after death. It appears that cats were keeping company with people long before they were rubbing ankles in Egyptian households. [TO BE CONTINUED]
* THE KILLING OF JFK -- THE WARREN REPORT: Although the Dallas Police Department had begun an investigation into the killing of President Kennedy after the assassination, the DPD's investigation was never completed, being overshadowed by an FBI investigation. The bureau's investigation was complete by 9 December 1963, less than three weeks later. The FBI report concluded that Oswald had acted alone, and that he had fired three rounds from his "sniper's nest" on the sixth floor of the Texas School Book Depository. The report claimed that the first and third rounds hit the president, while the second round hit Governor Connally.
The FBI report was criticized because it was issued so quickly, and because it paid little or no attention to the possibility of a conspiracy behind the assassination. However, a more thorough investigation had already been initiated when the FBI issued the report. On 29 November 1963, President Johnson had set up a special commission, headed by Chief Justice of the Supreme Court Earl Warren, to investigate the assassination of Kennedy. The commission also included Senators Richard Russell and John Sherman Cooper; Representatives Hale Boggs and Gerald Ford; Allen Dulles, former director of the CIA; and John J. McCloy, former president of the World Bank.
The "Warren Commission", as it was known, conducted its business mostly in closed sessions, though the sessions were not secret: they were closed simply for convenience, there having been no real pressure for open hearings, and full transcripts of the sessions were eventually published.
The commission presented its final report, 888 pages long, to President Johnson on 24 September 1964, with the report made available to the public on 27 September. The commission had examined the testimony or depositions of 552 witnesses and considered 3,100 items of evidence. The FBI report was used in the investigation, and the Bureau provided support for the commission's investigation. 26 supplementary volumes were published, with the materials examined by the commission placed in the US National Archives.
The Warren Commission report concluded, like the FBI report, that Oswald had acted alone and had fired three shots with his rifle. However, the Warren Report concluded that one shot had missed, one shot had hit both the president and Governor Connally, and a following shot had hit the president in the head. Even from the outset, there was widespread criticism of the Warren Report, with the critics claiming sloppy methodology, a failure to seriously confront the possibility of a conspiracy, and faulty interpretations of the evidence.
The critics in particular believed that there had been at least one other "shooter", most generally placed on a "grassy knoll" in Dealey Plaza to the front and right of the president's limousine. The critics were also suspicious of the fact that the commission's unreleased records were sealed for 75 years. Most of the records were released to the public in 1992, with a further increment released in 1998, and the rest to be released in 2017. Nothing particularly controversial has been found in the records released so far, and those familiar with the small remainder -- which were kept sealed because they contain confidential Kennedy family information -- say there is nothing particularly controversial in them, either.
The Warren Report would be bitterly attacked, slammed as a superficial job completed in haste, when it wasn't called a complete whitewash. However, the fact is that the Warren Commission spent over eight months on the job, interviewed hundreds of witnesses, and produced a massive amount of documentation, which is now available online for anyone who wants to inspect it. Few criminal cases are ever explored to a comparable level of detail. [TO BE CONTINUED]
* SCIENCE NOTES: An article from DISCOVER Online discussed how insects, generally stereotyped as biomachines with clockwork behavior, are turning out under investigation to be smarter than long assumed. Honeybees, for example, have a brain with less than a million neurons, as compared to the 80,000 million neurons of the human brain. However, experiments by Lars Chittka of Queen Mary University of London show honeybees can classify objects as "symmetrical" or "asymmetrical"; pick out objects based on selection criteria such as "same" or "different"; and navigate by reference to sets of landmarks.
In addition, honeybees, like their cousins the ants, are social creatures with an elaborate set of behaviors to support their life in a hive. A honeybee can perform simple communications with its siblings; fan its wings to provide ventilation for the hive; drive out or sting intruders; and hunt for food. Chittka points out that they have over 50 distinct behaviors, more than some vertebrates, and comments: "They are fantastically smart. Perhaps we are only amazed by this because we think small brains shouldn't be able to do it."
In the shadows behind Chittka's comment lies the growing perception that human consciousness, though still a marvel, may not be as "evolved" a feature as we would like to believe, leading to the question of whether the tiny mind of a honeybee may support its own tiny consciousness. For now, that question is unanswerable -- but having posed it, it stands as an interesting possibility.
* The science blogosphere was having fun over a recently-discovered genus of deep-sea worms known as Osedax. They're members of a family known as "beard worms", unusual in that the adult forms have no mouth, gut, or anus; they survive by hosting colonies of symbiotic bacteria that obtain nutrients through "roots" extended by the worms, with the bacteria metabolically supporting the worms.
The Osedax worms do the rest of the beard worms one better by specializing in digesting the bones of whales, or at least so it seems. That's not as preposterous as it sounds, since the food chain in the depths of the oceans is based on the fall of food organisms from the upper levels, what is called "marine snow". The fall of a whale has an impact comparable to centuries or even millennia of marine snow. The Osedax worms are also interesting because the males are much smaller than the females, with the males living in colonies inside the females.
Five species have been discovered in the genus. Genetic analysis suggests there may be many more, though the decline of whale populations may be exerting pressure on Osedax populations. The genetic analysis also suggests that the Osedax worms go back at least 45 million years, to the early days of whales, and possibly earlier -- back to the days of great marine reptiles?
Incidentally, not all are sure that the Osedax worms live entirely on whale bones. The worms appear to be distributed beneath all the deep oceans, and researchers studying them find they live perfectly well on cow or pig bones.
* A DISCOVER Online article proposed yet another potential scheme for sequestering carbon dioxide. Oman, a California-sized nation at the southeastern end of the Arabian Peninsula, has an enormous deposit of mantle rock known as "peridotite", about 350 kilometers long and 30 kilometers wide (about 220 x 19 miles). The interesting thing about peridotite is that when it's exposed to air, it soaks up carbon dioxide to form a reddish mineral named "listwanite".
Peridotite tailings from mines dug for other minerals have been shown to soak up CO2 rapidly at moderate temperatures. Estimates show that if the entire Omani deposit were mined and exposed to the air, it would soak up 4,000 years of human emissions. That of course isn't realistic -- besides, if we got rid of too much CO2, we might end up with global chilling -- but the "peridotite gambit" does seem to have possibilities. Even more intriguingly, peridotite deposits are common, being found in places like Greece, Papua New Guinea, and the US West Coast.
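The scale of that "4,000 years" claim can be roughly sized up; the annual-emissions figure below is an assumed round number, not one from the article:

```python
# Back-of-the-envelope sizing of the "4,000 years of emissions" claim.
# The annual-emissions figure is an assumed round number (~30 Gt CO2/year).
annual_emissions_gt = 30                 # gigatonnes of CO2 per year, assumed
years_of_emissions = 4_000
implied_capacity_gt = annual_emissions_gt * years_of_emissions
print(implied_capacity_gt)               # implied capacity in gigatonnes
```

Under that assumption, the deposit would have to absorb on the order of a hundred thousand gigatonnes of CO2, which gives a sense of why nobody proposes mining the whole thing.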
* A WIND BLOWS IN CHINA: China is confronted with a massive challenge in obtaining energy to support the modernization of its economy, and the Chinese government sees renewables as one of the means of obtaining that energy. According to an article from SCIENTIFIC AMERICAN Online ("The Answer To China's Future Energy Demands May Be Blowing In The Wind" by Sarah Wang), China currently has the world's fourth biggest installed wind power base, with a total capacity of more than 12 gigawatts (GW), a bit less than half of America's current wind capacity. However, that's seen as no more than a start. China has lots of wide open, relatively unpopulated spaces where the wind rarely dies down, and so there's massive room for expansion.
As an example of current efforts, the "Jiuquan wind power base" is expected to provide 10 GW by itself when it reaches planned peak capacity in 2020. It's being built in the "Gansu Corridor", a narrow and breezy natural passage cutting through the Gobi Desert, Qilian Mountains, and the Alashan Plateau. Six other big wind complexes have been authorized by the government.
China added a total of 6.25 GW of wind capacity in 2008, toward a government target of 100 GW by 2020. A recent study suggests that even 100 GW is thinking small: China could build a wind power infrastructure covering 500,000 square kilometers for $900 billion USD that would provide 25,000 GW. 500,000 square kilometers sounds like a lot, and it is, but China's land area is close to 10 million square kilometers, so that's only 5% of the land, and for the most part the wind farms would be compatible with farming and ranching.
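The land-fraction figure is easy to check; the area for China used here is a rounded approximation:

```python
# Sanity check: what fraction of China's land area is 500,000 square
# kilometers? China's land area is taken as a rounded 9.6 million km^2.
wind_area_km2 = 500_000
china_area_km2 = 9_600_000               # approximate, assumed round figure
fraction = wind_area_km2 / china_area_km2
print(f"{fraction:.1%}")                 # about 5%, as the article says
```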
However, a consideration of details of the "Chinese wind gold rush" suggests a bit of caution. One problem is the clumsiness of Chinese bureaucracy: since the government dictates targets for the installation of wind turbines but not for actual power production, many of the turbines that have been installed don't end up being used. Some wind farms have been built at locations where the wind isn't constant enough to make them pay. Another problem is the immaturity of China's wind turbine industry. Chinese wind turbines have a price tag only 70% of that of wind turbines found elsewhere and, partly thanks to government mandates, 75% of China's wind turbines are built domestically. Unfortunately, the quality of China's wind turbines is notoriously poor, with companies that were only founded a few years ago throwing together product with little or no testing, and shipping turbines that don't work.
The biggest problem is the Chinese electrical power grid. The inconsistency of wind power requires a "smart grid" that can rapidly and efficiently switch power from different sources over very long distances to make sure that major urban centers get consistent power. This is a big problem in the USA; it's a bigger problem in China, since the power grid is not as well developed. There's no sense in driving ahead on huge wind farms if there's no smart grid to deliver their product. Chinese officials believe that all the problems can be worked out, and remain confident that wind will become a major factor in China's energy supply. China needs much more energy to catch up with the West, and wind seems like one of the cheapest and cleanest options.
* THE ECONOMIST had more comments on China's green development efforts, pointing out their formidable scale, with the government funding a green stimulus package at the equivalent of hundreds of billions of dollars. The stimulus package provides money not only for wind power, but also to subsidize low-emission cars to help clean up China's notoriously smoggy big cities, as well as generously subsidize large solar-power projects. Of course, the Chinese government has more than one iron in the fire on the solar-power subsidies: China is the biggest producer of solar panels in the world, but most of the product is exported, and the industry is currently saddled with a glut of production in the face of reduced global demand. China's government wants to avoid an industry shakeout in reasonable hopes of better times later.
The stimulus package also includes elements that not all would agree were particularly green, with almost half devoted to a modernized train system. To be sure, trains are a much more energy-efficient means of transport than cars or trucks, but labeling an improved rail system as an environmental program might seem to be just getting doubled points for something that would be done anyway. Whatever -- governments play such games, why complain?
* GENOME 10K: Almost every month, the science news announces that the genome of yet another organism has been deciphered -- one of the latest being the genome of the codfish, churned out by two genome sequencing machines and funding of a half million dollars. According to an article from AAAS SCIENCE ("No Genome Left Behind" by Elizabeth Pennisi, 6 November 2009), a group of genome and museum experts believe that a genome a month is thinking small. They want to start a five-year project to decipher 10,000 vertebrate genomes, a sixth of the number of known vertebrate species, or roughly one genome from every genus across the vertebrates.
The new initiative, named "Genome 10K", amounts to little more than an organization with a goal right now, with no details of how the project is actually going to be implemented or where the funding is going to come from. Those involved in the project are still enthusiastic, pointing to the fact that when the Human Genome Project was launched, nobody had any clear idea of how to get that job done either, with critics declaring it hopeless. The way to Genome 10K is being paved by projects like the codfish sequencing, which was partly done as a demonstration of new sequencing technologies that could decipher complicated vertebrate genomes at low cost. Similar projects are in progress, and better tools are becoming available.
While the human genome was the most important goal of genome sequencing, once our genome had been published researchers were keen to track down other genomes for comparison. Five years ago, the US National Human Genome Research Institute (NHGRI) began to assemble a list of what would end up being 32 mammals and 24 other vertebrates seen as desirable candidates for genomic analysis.
The committee assembled by the NHGRI included David Haussler of the University of California at Santa Cruz and Stephen O'Brien of the US National Cancer Institute; later the two got together with Oliver Ryder of the San Diego Zoo, with the three of them cooking up ideas for a much more ambitious genome analysis program. In April 2009, they conducted a meeting involving about 50 participants from all over the world, with the participants identifying 10,000 candidate vertebrates for sequencing. The meeting then broke down into subgroups to identify frozen DNA samples that could be sequenced -- to finally conclude that there were samples for about 16,000 vertebrate species in the lab freezers of the world. It seemed the time was right to go ahead.
Advocates believe that Genome 10K will provide insights into human biology and the evolution of vertebrates; they also point out that such a catalogue of genomes would be a major tool to help conservation efforts. There remains the problem of how to get it done. True, gene sequencing machines have been getting cheaper and cheaper, but the cheap machines tend to be limited, only capable of deciphering short genome segments at a time. The short "read length" is particularly troublesome for large vertebrate genomes, and so teams working on Genome 10K have been focusing on techniques to combine low-cost sequencers with computer power to handle bigger genomes. Along with the codfish, various teams have tackled the panda bear, the stickleback fish, and the bush baby.
Even better technology is on the way, but what's on the horizon still doesn't seem powerful enough for Genome 10K. Haussler, O'Brien, and Ryder believe that the effort will demand that the cost of decoding a genome drop to about $2,500 USD, since they want the entire cost of the effort to come in at no more than $50 million USD. They want to keep costs down partly because nobody's identified major sources of funding yet; NHGRI is noncommittal, so the hope is that private philanthropists will step in, which will be more likely if the program doesn't cost too much. The Genome 10K project has also been criticized for not addressing the monster data management task associated with such an effort -- sequencing 10,000 genomes in five years will require archiving and making available a complete genome at least five times a day, every day.
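The throughput and cost figures quoted above hang together, as a bit of arithmetic shows. This is a back-of-the-envelope sketch; the split of the budget into sequencing versus everything else is my own inference, not something stated in the article:

```python
# Back-of-the-envelope check of the Genome 10K figures quoted above.
# Genome count, duration, budget ceiling, and per-genome sequencing cost
# are from the article; the budget split inference is mine.

GENOMES = 10_000              # target number of vertebrate genomes
YEARS = 5                     # planned project duration
BUDGET_USD = 50_000_000       # hoped-for total cost ceiling
SEQ_COST_PER_GENOME = 2_500   # target sequencing cost per genome

genomes_per_day = GENOMES / (YEARS * 365)
budget_per_genome = BUDGET_USD / GENOMES
sequencing_total = SEQ_COST_PER_GENOME * GENOMES

print(f"throughput: {genomes_per_day:.1f} genomes/day")    # ~5.5 per day
print(f"budget allows: ${budget_per_genome:,.0f}/genome")  # $5,000
print(f"sequencing alone: ${sequencing_total:,}")          # $25,000,000
```

The numbers are consistent: 10,000 genomes in five years does work out to about five and a half a day, every day, and a $2,500 sequencing cost would consume only $25 million, leaving the other half of the $50 million ceiling for assembly, archiving, and overhead.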
Still, the people pushing Genome 10K are confident that the obstacles can be overcome. Says Haussler: "We've got real momentum now ... I don't think it's a question of IF, but of WHEN."
* EVOLUTION OF THE NERVOUS SYSTEM (2): To hunt down the evolutionary origins of the nervous system, researchers have turned to one of the most primitive and ancient of multicellular organisms, the sponge. To be sure, there is really no such thing as a "living fossil" -- living organisms are cousins of each other -- but the sponge is seen as much like early multicellular organisms.
As mentioned in the previous installment, sponges don't have a real nervous system, but they have various parts of a kit that could be precursors of one. Analysis of the genome of the sponge Amphimedon queenslandica shows that it contains genes to code for proteins generally found on the receiving side of a neural synapse -- despite the fact that no synapses have been found in sponges. Obviously these proteins are serving other functions, though nobody has a clear idea of what. That hints at how neurons may have acquired basic components from earlier biosystems.
Further examination of this sponge shows that its larvae contain developmental genes that, when spliced into frog embryos or fruit fly larvae, lead to the growth of neurons. The researchers involved in these experiments suggest that the sponge cells that contain these genes, which are on the surface of the larvae, are a kind of "protoneuron", with a limited ability to sense where to "plant" themselves on the seafloor for their future sessile life. In addition, as also mentioned, some sponges can react to stimuli by generating action potentials: when a glass sponge detects sediment in the water it is filtering, an action potential sweeps over it and shuts down its filtering cilia. The response is slow, but that's exactly what would be expected for organisms with early "intermediate" schemes that would eventually lead to true nervous systems.
However, some biologists believe that the primitive nature of sponges is misleading, that the actual baseline for multicellular organisms is provided by the "ctenophores" (pronounced, more or less, "ten-oh-fours") or "comb jellies", which are organisms much like jellyfish but lack stinger cells. The comb jellies do have real neurons and a simple nerve network; if this suggestion is correct, and it is controversial, then the ancestors of sponges may have had a true nervous system as well and simply lost it.
Other biologists like to promote the "cnidarians" (pronounced, more or less, "nigh-derry-anns"), which include the true jellyfish, anemones, and corals, as the model for early multicellular organisms. Like the comb jellies, the cnidarians have neurons and simple nervous systems -- anemones have diffuse decentralized nervous systems, though some jellyfish do have bundles of nerves running around the base of the bell. Genetic analysis of some anemones shows that they have the genes to express a wide variety of neurotransmitters and matching receptors, meaning their neural action is sophisticated even if their overall nervous system organization is not.
* That leads to consideration of the evolution of the nervous system itself as opposed to the neurons. The general consensus is that early nervous systems were diffuse, simple webs of neurons, and that they gradually evolved towards centralized nervous systems, with a cluster of neurons at the front of the body -- the brain -- and a nerve cord that provided a central "bus" for neural connections to the brain. There appear to be developmental genes common to widely different animals, from fruitflies to earthworms to us, that direct such an organization of the nervous system, suggesting that this pattern was established in a distant common ancestor to modern "bilateral" animals.
However, some researchers suspect that's too glib. There are invertebrates, the "hemichordates", that have most of these developmental genes, but have diffuse webs for nervous systems. Possibly the developmental genes are useful but not essential for a centralized nervous system -- or possibly some animals with centralized nervous systems gradually lost them, a process that biologists tend to find bewildering, feeling as if nature is playing devious tricks on them.
That's hardly the end of the list of the questions about the evolution of the nervous system, either. Where did the myelin sheath that speeds electrical conduction come from? How about the emergence of specialized neurons, like the "glial" cells that tend to be heavily represented in the more elaborate neural systems?
And then there's the huge question of how brains capable of sophisticated behaviors arose. There was long a tendency in evolutionary biology to read a sequential progression into the emergence of intelligence, with humans as the end product of a lineage going back to the flatworms, but that's misleading. To be sure, humans are obviously vastly more intelligent than flatworms, but evolution operates far more as a branching tree than in a straight line. One particularly interesting question is: how many different ways are there to build complex brains? Like the eye, brains may well have evolved along distinctly different paths that resulted in distinctly different capabilities.
That opens up the interesting if purely abstract question of what would happen if we ever made contact with sentient organisms from distant star systems. The likelihood that their brain organization would have close resemblance to our own seems very low. However, in the absence of any communications from distant worlds, we're stuck with the challenge of understanding the operational basis of our own brain -- and that seems formidable enough. [END OF SERIES]
* THE KILLING OF JFK -- THE MURDER OF A PRESIDENT: The assassination of US President John F. Kennedy (JFK) in Dallas, Texas, on 22 November 1963 remains a matter of controversy even in the 21st century. The official story is that JFK was killed by a solitary assassin, but there are persistent claims of a conspiracy. This series discusses the Kennedy assassination and examines the claims for conspiracy.
It is worthwhile to start with what is unarguably known about the JFK assassination. The president planned a tour because Texans had not voted enthusiastically for him in the 1960 presidential elections, despite the fact that Texas power politician Lyndon Baines Johnson was his running mate. Kennedy felt he needed to cultivate Texas and in particular Texan politicians to support his bid for reelection in 1964. The president decided on the visit in a meeting with Vice-President Johnson and Texas Governor John Connally on 5 June 1963. The president was to make appearances in five Texan cities.
The president's visit to Dallas, the fourth stop and planned for November 22, was announced by the Dallas newspapers on 26 September 1963. On 24 October 1963, Adlai Stevenson, the American ambassador to the United Nations (UN), visited Dallas, only to end up being roughed up by Right-wing protesters, even being hit over the head with a protest sign. The incident raised worries about security for the president's visits, both with the Dallas Police Department (DPD) and the Secret Service.
In Dallas, the president was to pass through the city on a public motorcade for a luncheon speech at the Dallas Trade Mart. The route of the motorcade was described in the Dallas papers on 19 November 1963, and a map of the route published to allow the public to observe the visit. The DPD and the Secret Service worked together on security.
The president and his entourage flew in to San Antonio on 21 November, to finally end the day at the Texas Hotel in Fort Worth that evening. The next morning, 22 November, it was raining. However, by the time the presidential jetliner AIR FORCE ONE had performed the absurdly brief jump from Carswell Air Force Base near Fort Worth to Love Field in neighboring Dallas -- arriving at the airport was judged more "presidential" -- the sky had cleared, turning pleasantly sunny for the motorcade through Dallas.
The president, Governor Connally, and their wives rode in an open-top 1961 Lincoln Continental limousine. At the time, the White House didn't have a car with an armored top, though efforts were being made to obtain one. Vice President Johnson was also in the motorcade, two cars back from the president's limousine. Just before 12:30 PM, the limousine rolled into Dealey Plaza in downtown Dallas and passed the seven-floor Texas School Book Depository building. Several shots rang out; both the president and the governor were hit. The limousine driver rushed the wounded occupants to Dallas's Parkland Memorial Hospital.
A garment manufacturer named Abraham Zapruder happened to be filming the president with an 8 millimeter home movie camera while the shots were being fired. The 26.6 second film would prove to be the most famous amateur movie in history. Police were informed by witnesses near or in the Texas School Book Depository that the shots appeared to have come from the upper floors of the building, and the police sealed off the building. One of the witnesses, Howard Brennan, was able to provide a general description of the shooter, which the police then broadcast as an all-points bulletin. A bolt-action Italian Mannlicher-Carcano 6.5 millimeter rifle and three spent rifle cartridges were found on the sixth floor of the depository. One of the supervisors at the depository told police that one of his employees, Lee Harvey Oswald, was missing.
At about 1:15 PM, a DPD officer named Jaydee Tippit -- he wrote his first name as "J.D." but they weren't initials -- spotted a pedestrian matching the description from the all-points broadcast, and pulled up his patrol car. When Tippit got out, the man shot Tippit three times and then pumped a fourth bullet into the fallen policeman's head, killing him. The killer was seen by about ten witnesses, either in the act of killing the policeman or fleeing the scene with a pistol in his hand. Shortly afterward, Oswald was arrested after fleeing into the nearby Texas Theater; he pulled a pistol on the police, but they managed to subdue him. He was hauled off to police headquarters.
* Meanwhile, at Parkland Memorial Hospital, doctors were dealing with the president and the governor. Although the emergency room staff focused on a wound to the throat, they soon realized that the president had a massive head wound and there was no saving him. He was given last rites by a priest and pronounced dead at 1:00 PM. Governor Connally was seriously wounded and taken to emergency surgery; the governor would survive and recover.
The president's death was officially announced to the public at 1:33 PM. A little after 2:00 PM, the president's body was taken in a coffin to Love Field, where it was loaded into the rear of AIR FORCE ONE; a row of seats had been removed to accommodate it. By Texas law, the body was supposed to remain in Texas until it had been examined by a Dallas coroner, but the Secret Service took it anyway. At 2:38 PM, just before the departure of AIR FORCE ONE, Lyndon Johnson took the oath of office and became president of the United States.
AIR FORCE ONE landed at Andrews Air Force Base near Washington DC and the president's body was taken to Bethesda Naval Hospital, where a team of doctors performed an autopsy that lasted until about 11:00 PM. The president's body was embalmed and prepared for the funeral during the dark hours of the morning. Kennedy was laid to rest in Arlington National Cemetery on Monday, 25 November; the funeral was attended by representatives of 90 nations and watched on television by a collectively stunned nation.
* Oswald was not among the viewers. He had been arraigned for shooting the president and Tippit, but under repeated interrogations Oswald denied shooting anyone, claiming he was "just a patsy", that the police were trying to pin a crime on him he didn't commit. Representatives of the Secret Service and the US Federal Bureau of Investigation (FBI) also participated in the interrogations. On the morning of Sunday, 24 November 1963, Oswald was to be transferred from police headquarters to the Dallas county jail. However, as officers were moving him to an armored van at 11:21 AM, a Dallas nightclub owner named Jack Ruby dashed up to Oswald, shoved a revolver into his abdomen, and put one bullet into him. The shooting was caught live on national television. Oswald was rushed to Parkland Memorial Hospital, where he was pronounced dead at 1:07 PM.
Oswald's death meant that he was never tried for his alleged crimes. Ruby was charged with murder and judged guilty in a Dallas court on 14 March 1964; he was sentenced to death. His lawyers appealed the conviction, saying there was no way Ruby would have received a fair trial in a Dallas court, and the Texas courts agreed. Ruby was to be retried in February 1967, but he died of cancer on 3 January 1967.
* Those are the facts of the JFK assassination that nobody really disputes. All the rest have been the subject of endless argument. Did Lee Harvey Oswald actually kill JFK? If Oswald did kill the president, was he operating completely on his own as a "lone gunman" or "lone nut", or was he part of a wider conspiracy? If there was a wider conspiracy, then who was involved? Since JFK's assassination, huge numbers of books have been written to show there was a conspiracy -- and the list of those who could have been involved in the alleged conspiracies gets longer and longer. [TO BE CONTINUED]
* GIMMICKS & GADGETS: As discussed on WIRED Online, European Union automotive technology researchers have proposed a concept they call a "road train", in which cars, trucks, and buses can electronically link up on the freeway. In principle, it works like this:
The road train promises improved fuel efficiency and better utilization of congested freeways. The technology to implement it is available, and it requires no major modification of existing highway infrastructure. The lead vehicle, along with its control capability, might well be equipped with advanced sensor and communications systems for accident avoidance and traffic management that would be too expensive for most passenger cars; the control system in the cars in the train would be able to anticipate problems and provide a warning or a degree of automatic corrective actions.
The obvious objection to the idea is the potential for disastrous pile-ups, and the closely-related instinctive reluctance of drivers to give up control of their vehicles. Of course, such a scheme would demand strict qualification -- but the current highway system is without argument hazardous, and the end result might well be that the road train is statistically safer than going it alone, however unsettling it might seem.
* Also as discussed in WIRED Online, researchers at the Massachusetts Institute of Technology (MIT) have been working with car-maker Audi on a "dashboard robot" for cars that they call the "Affective Intelligent Driving Agent (AIDA)" that would monitor the driver's habits and routes along with driving conditions, providing real-time traffic information, useful advice, and warnings. AIDA is more than a mere smart system built into the car, however; it has a moving head, something like a streamlined version of that of Disney's WALL-E robot, that sticks up from the dashboard and turns to communicate interactively with the driver or passengers. AIDA's "face" is a flat-panel display that can provide cartoony expressions, or display data as needed.
AIDA, on the face of it, seems a much better idea than the EU road train -- distracting, possibly, but not much more or less than a passenger, and certainly paying its way by helping process information useful to the driver. AIDA is a bit reminiscent of the "persocoms" discussed here a few years back. It gives a vision of the future that shows the error in the ideas of trying to build a robot that thinks like a human -- AIDA only deals with a restricted set of circumstances and doesn't need general-purpose smarts -- or trying to build a fully autonomous car -- for the time being, maybe one that just offloads part of the task will be plenty good enough. Right now, however, nobody needs to ask just how much it's going to cost.
* Finally, in other automotive news, WIRED Online reports that the USA is seeing an upsurge in the use of passenger buses for inter-city travel -- for example the "DC2NY" bus service, which shuttles between Washington DC and New York City, a distance of about 370 kilometers (230 miles). On a bus, a trip of that length can take up to five hours, depending on traffic, but the bus has been growing in popularity anyway. That's not just due to its low cost, but also because airport security has become such an obnoxious pain, and in particular because bus lines like DC2NY have wired up their vehicles to provide free wi-fi, allowing passengers to surf the internet down the highway.
Internet-enabling a bus costs about $5,000 USD, but the payoff is quick. In many cases, people are taking the bus as an alternative to driving, preferring to sit behind the keyboard of their laptop instead of the wheel of their car. Digitally-enabled trains are also seeing an increase in customer traffic.
* MOCKBUSTERS: As illustrated by an article from WIRED magazine ("Now Playing: Cheap & Schlocky Blockbuster Ripoffs" by Brian Raftery), parasitism is almost as inevitable as death and taxes, or as the old nursery rhyme has it:
Big fleas have little fleas, Upon their backs to bite 'em, And little fleas have lesser fleas, and so, ad infinitum.
This principle is revealed by a visit to a video rental outlet, where an inspection of the DVD racks shows that when Hollywood blockbusters like TRANSFORMERS appear, they promptly acquire strange "shadows" with names like TRANSMORPHERS, whose DVD case artwork just seems to scream CHEESE! CHEAPNESS! SCHLOCK!
Welcome to the land of "mockbusters", the low-budget direct-to-DVD movies that ghost Hollywood blockbuster releases. The field, such as it is, is dominated by a studio named "the Asylum", which has been churning out low-budget derivative videos for over six years, with titles like THE TERMINATORS, MEGA SHARK VERSUS GIANT OCTOPUS, 30,000 LEAGUES UNDER THE SEA, and THE DAY THE EARTH STOPPED. The Asylum's latest effort is SHERLOCK HOLMES, shadowing the Robert Downey JR blockbuster of the same name -- Holmes is in the public domain now, nobody has exclusive rights to him -- and featuring tacky robots, hookers, cheap digital effects, has-been actors, and incomprehensible scripts, all at a cost of about a half million USD per movie.
It might seem that even a cool half million would be too much for the schlock market to bear, but in reality the Asylum pulled in $5 million USD of revenue in 2009. OK, that's almost pocket change for a big-name star or a major studio, but for the rest of us five megabucks is fairly good money. The Asylum is even growing, having moved to a new studio in Burbank, California, flush with contracts for twenty new movies to feed the SyFy network and other NBC cable channels.
* The Asylum is the brainchild of Paul Bales, David Michael Latt, and David Rimawi, all now forty-somethings. Latt and Rimawi founded the studio in 1997, with Bales later arriving to replace a partner who had left. Originally the Asylum distributed arthouse films, but then the indie movie craze dried up. The Asylum moved on to direct-to-video horror, but then the major studios began to release their own horror titles and the market got flooded.
In 2004, Steven Spielberg was working up the release of his Tom Cruise WAR OF THE WORLDS remake. The Asylum decided to make their own version -- once again, Wells' original novel was in the public domain, anybody could play -- and somewhat to their surprise, it did well in the rental market, particularly at Blockbuster video. The Asylum was onto a good thing and decided to keep on going. The studio didn't determine what movies to rip off; they set up an arrangement with distributors, who would tell them what movies to make, with the Asylum then throwing a movie together in about four months. It's not too hard to be profitable when the production costs are so low, and the Asylum's only marketing overhead is to come up with garish DVD package art.
Rimawi coyly says the Asylum is "a little audacious" about the way the studio's movies tie into blockbuster releases, but points out correctly that it's not a new practice. Schlock movie producer Roger Corman did much the same for decades. Hits like JAWS, STAR WARS, and E.T. all were ghosted by B-flic derivatives. The Asylum is just being methodical about it. The people who run the Asylum admit they're not making art, but they're making money and having fun. Besides, they're keeping starving actors fed, and people rent the damn videos. Where's the problem?
* It certainly is hard to figure out who does rent the videos, but the promotion that accompanies a major blockbuster simply generates an audience demand of sorts for the mockbuster shadow -- and thanks to the similarity in titles, the mockbuster usually ends up sitting right next to the blockbuster on the racks, giving the shadow a higher profile. To be sure, a little mistaken identity does happen, but if that were the driver for the business, video rental outlets would be flooded with complaints. That's not happening. People just want to kill some time, they can't find anything more interesting, they're a bit curious, and so they pick up the DVDs. The Asylum's flics may be low budget, but as far as the story and script go, they're not any schlockier than some bigtime Hollywood hits like TRANSFORMERS. They just don't have Megan Fox.
The Asylum does face a challenge as video rentals are increasingly displaced by movie downloads, which means shifting the distribution channel. On the plus side, the studio's relative economic success is allowing the management to increase the budget a bit, and thanks to the current economic downturn -- it's an ill wind that blows nobody good -- they've got a better pick of Hollywood talent. There's a market for the product, and the Asylum can churn it out.
As for myself, however, though I have been slightly curious about mockbusters on the DVD rental racks, I've never been curious enough to pay any money to rent one, and in fact I wouldn't expend the time to watch one if somebody gave it to me. I'm not against trash, but I am selective about the kind of trash I like.
* COLLATERAL DAMAGE: We have a balanced relationship with vast numbers of bacteria that live in our digestive tract and elsewhere in or on our bodies -- in fact, we live with ten times more foreign cells than the human cells in our body, though the foreign cells are much smaller. Many of these bacteria cause us no harm, and some of them seem to be necessary to maintain our health. As reported by an article on SCIENTIFIC AMERICAN Online ("Bugs Inside" by Katherine Harmon), our overuse of antibiotics is threatening to wipe out some of these cohabitating bacterial species, with potentially unpleasant results for us.
For example, yeast infections are often kept in check by our resident bacteria, with the ironic result that antibiotics can promote yeast infections by killing off the bacteria that would normally stop them. Similarly, although the gastric bacterium Helicobacter pylori has been pegged as a cause of stomach cancers, its decline seems to be promoting various "reflux" disorders of the gastrointestinal tract by altering our body pH and hormone balances. It has also been demonstrated that children who are positive for H. pylori have reduced incidence of childhood allergies like asthma and skin rashes. It is suspected that the absence of the bacterium may even be contributing to obesity and diabetes.
H. pylori is actually much better understood than our other bacterial hitchhikers -- with the unpleasant implication that other, less well understood species inhabiting our gut may also be disappearing, with uncertain effects on our health. Vaccines that target the strep bacteria that can cause pneumonia do seem to be effective in reducing the incidence of that disease, but strep bacteria are a fairly normal and usually benign component of healthy individuals. The strep bacteria seem to restrain less benign staph bacteria, and so the vaccine against strep seems to be encouraging staph infections -- a circumstance made all the worse because staph is notorious for its increasing resistance to antibiotics.
Squeaky cleanliness is a virtue in a hospital since sick people can be vulnerable to deadly opportunistic infections, but it is neither possible nor desirable to wall ourselves off from the sea of microbes around us. It is becoming obvious that we need to attain a more natural balance with our microbial environment. Standing up for bacteria does sound a bit strange, of course, and those who suggest the need for such a natural balance have been accused of encouraging parents to let their children eat dirt. Putting it that way is a bit over the top, but some researchers believe the time may come when children will be tested for their bacterial hitchhikers, and given appropriate doses of those that they need, but are lacking. The advocates know that sounds alarming, and so they carefully refer to such treatments as "vaccines" ... one could make a legitimate case that they actually are.
For now, the idea of balancing our microbial load remains science fiction because we have such a poor understanding of the "microecology" defined by the human body. The Human Microbiome Project -- mentioned here some months back -- is now trying to inventory the species involved in that microecology and characterize their genomes. However, nobody sees that effort as any more than just a start to the complicated job of figuring out the interactions of those organisms. As the saying goes, once we've got into a can of worms, the only way to contain them is to get a bigger can.
* EVOLUTION OF THE NERVOUS SYSTEM (1): The nervous system was a significant innovation in the history of life, leading to the rise of animals with increasingly elaborate behaviors and rising levels of consciousness. The nervous system was not an inevitable development; nothing in evolution occurs as a plan for the future, and most of the organisms on this planet -- single-celled life-forms, fungi, and plants -- get along fine without a nervous system.
As reported by an article in AAAS SCIENCE ("On The Origin Of The Nervous System" by Greg Miller, 3 July 2009), the nervous system has clearly evolved from simple nerve networks to more elaborate structures such as the human brain. Although no sponge has a nervous system, some can react to stimuli in a sluggish fashion; jellyfish have a nerve network, but no central node or brain; while flatworms have a simple brain that would prove to be the basis for much more elaborate brains, like our own. It has been observed that some of the key molecular elements of the nervous system can be found in single-celled organisms, suggesting that the nervous system was based in good evolutionary fashion on elements that originally served some entirely different purpose. The consensus is that the nervous system arose very early in the evolution of multicellular animals, more than 600 million years ago, and that circuits of interconnected neurons arose soon afterward -- first as diffuse webs, then as a centralized brain with a hierarchical nervous system.
However, the details of how the nervous system evolved remain poorly understood. Exactly how did the first nerve cells arise? And what were the paths of evolution of the nervous systems that followed? Humans, even scientists, have a tendency to read a nice neat linear progression of forms into evolution when it doesn't actually work in any such tidy fashion. In reality, evolution produces unending branches of divergence, with many of the branches dying out. We're not actually even certain that there was a single origin of the nervous system -- after all, the eye has evolved along many different parallel paths, so why would we assume the nervous system hasn't?
* That question may seem a little over the top, but in fact neurons -- nerve cells -- come in many varieties. Neurons are easily recognized but a bit hard to neatly describe. They all share "directionality", receiving information at one end and transmitting it to the other. Another defining feature is "electrical excitability": a neuron regulates the flow of ions across its surface membrane to conduct electrical impulses. Nearly all neurons form "synapses" -- gaps between cells bridged by chemical neurotransmitters -- while many have branches called "dendrites" that receive synaptic inputs, and long "axons" to transmit outgoing signals.
Once organized in circuits, neurons permitted much more sophisticated behavior in animals. Electrical signaling by neurons is faster and more specific than the slow diffusion of chemical signals, allowing quick detection and a coordinated response to threats and opportunities. With a few refinements, the nervous system can remember the past and identify patterns in events to allow prediction of the future.
Precisely how neurons themselves got started, however, remains mysterious. In 1970, George Mackie of the University of Victoria in Canada suggested that the starting material might have been something along the lines of the sheet of material that makes up the bell of a jellyfish. Cells in the sheet can both detect physical contact and then contract in response; Mackie suggested that, in the distant past, such "multifunctional" cells led to two distinct types of cells, with sensory cells on the sheet's surface and muscle cells underneath. Initially, Mackie suggested, cells in the two layers touched, with ions passing through pores in the cell membranes to conduct electrical signals between them. Further evolutionary adaptation led to an increase in spacing between the two types of cells, with axons emerging to bridge the gap. Eventually this process led to the modern nervous system.
There's much to like in Mackie's scenario and it still remains popular, but there are other scenarios, not necessarily mutually exclusive. As one researcher puts it: "Neurons may have appeared in multiple lineages in relatively short time." The fact that different types of neurons have similar features may be traced to their emergence from components that appeared before them. Voltage-gated ion channels -- tiny pores that control the flow of ions across a neuron's surface membrane to produce electrical signals -- can be found in bacteria and archaeans. And in the 1960s, researchers found that when the single-celled organism Paramecium caudatum bumps into an obstacle, a voltage change sweeps from one end to the other, much like the "action potentials" that transmit a signal down a neuron. This electrical blip reverses the beat of the organism's cilia, allowing it to change direction.
Paramecium doesn't have a nervous system -- it can't have nerve cells, since it only consists of one cell to begin with -- but its single cell includes neuron-like electrical excitability in its multifunctional repertoire. Few doubt that the basic elements of neuron operation well predate the origin of neurons. [TO BE CONTINUED]
* ANOTHER MONTH: I've mentioned in the past that I have an old laptop participating in the BOINC distributed scientific computing system, in which people "donate" time on their computers to help solve scientific problems. DISCOVERY CHANNEL Online reported that one BOINC contributor named Brad Niesluchowski was known as a particularly heavy contributor to the BOINC "SETI@home" application, which crunches sky survey radio data to see if it contains alien communication signals.
Niesluchowski's contribution was so huge, in fact, that it was puzzling how he had access to such massive computing horsepower. The answer turned out to be simple: he worked for a school district in Arizona and had set up BOINC software on all the PCs in the district. Obviously, Niesluchowski had some problems with thinking out the consequences of his actions; his poor judgement of course extended to other areas, for example downloading porn on his work computer. He ended up being investigated, and the investigation revealed his clandestine BOINC network.
For what it's worth, the school district claimed that the additional power used by BOINC and the wear and tear involved cost well over a million dollars. Niesluchowski was asked to resign.
* I spent some time this last month in troubleshooting sessions, which I find fun. I had a plastic illuminated Santa out in front of my house, connected to a power socket in the back through a long extension cord looped around the house and hooked up through one of those cheap (but surprisingly effective) electromechanical timer modules. Although I'd tried to protect the connections from wet weather, snow shorted them out and the Santa went dark.
I poked around and found the socket out back was dead. However, the breaker for the socket in the breaker box in the garage was ON. I went so far as to pull the socket and check the wires; they were dead. They were hefty wires, the breaker would trip well before the wires would go -- what gives?
I ended up staring intently at the breaker box and then, either by hunch or sheer chance, noticed that the power socket panel in the garage near the circuit breaker box had its own breaker buttons, just like a panel in a bathroom does. The breaker button was out; I shoved it in and checked the outside socket. It was hot, all good. I would never have guessed there was a second breaker in series with the one in the breaker box.
I kluged up better weather protection for the cabling with a plastic kitty litter bucket I'd found and some other items. Following this exercise, I got into a couple of other household repair tasks I'd had in the queue for a while. I find that "handiness" is about half skill, half tools; not being entirely certain of the skill factor, I never hesitate to buy a tool if I think I have a use for it. However, even then Murphy's Law applies in its variations: no matter how many wrenches you own, you'll still find out you don't have one that's the right size.
* In the "news from the fringe" category from last month, in the pre-dawn morning of 9 December, much of Norway was treated to a spectacular lightshow in the form of a transitory spiral climbing into the black sky, leaving a blue trail behind it. There was considerable international puzzlement over the object. One conjecture was that it was a Russian missile launch failure -- which was exactly what I had thought almost instantly on reading about the incident, recalling a US Trident sub-launched ballistic missile (SLBM) failure from years ago where the missile popped out of the water and did a triple spiral, until the range safety officer blew it up.
The Russians later admitted it was a third-stage failure of a test launch of a new Bulava SLBM. The exhaust system apparently ended up venting to the side, throwing the stage into a spiral. Despite this revelation, the tinfoil-hat brigade played up the incident, claiming it was an alien visitor, a wormhole, or an Illuminati global mind control effort -- and that the Russians had to be performing a "cover-up" with their "ridiculously unbelievable" missile story, that the blue trail was "completely unexplainable".
The actual explanation of the trail was interesting. Modern SLBMs use solid fuel, which effectively consists of a binder, for example butadiene-type synthetic rubber, mixed with aluminum powder and ammonium perchlorate oxidizer. The aluminum powder burns to produce aluminum oxide crystals -- effectively particulate-sized sapphires. The spiral appeared not long before sunrise, and the blue trail was sunlight reflecting off the trail of sapphires left by the missile upper stage above the atmosphere. I wish I'd seen it -- it would've been thoroughly unforgettable.