* This weblog serves as an "online notebook" for comments on current events, interesting items I run across, and the occasional musing. It promotes no particular ideology. Remarks may be left on the site comment board; all sensible feedback is welcome.
* NEW PC ADVENTURES 2016: Although I have two notebook computers that run Windows 10, I was chafing because my old Compaq tower desktop couldn't run it -- primarily because I wanted to run Win10 apps, and the old desktop couldn't be upgraded from Win7. Besides, it was six years old, and had seen plenty of hard use; I didn't have confidence that it would stay running much longer.
I had established an austere budget and hadn't the money to upgrade right away, but on reviewing my finances over the past few years, I decided I could revise my budget to be more flexible. I still had to wait a few months to meet my savings target, but I finally broke down and bought an Acer ACX-703 mini-tower PC for about $350 USD, including an extra 4 GB RAM card. Although it cost effectively the same as the Compaq, it was substantially more powerful, with a 2.41 GHz quad-core CPU compared to a 2 GHz dual-core CPU, a 1 TB hard drive instead of 500 GB, and (with the RAM card) 8 GB of RAM instead of 4 GB.
I knew that getting the new PC set up would be troublesome -- and it was, knocking a full day out of my work schedule. The first problem, a big one, was that I put in the extra 4 GB RAM card, and the PC wouldn't boot. I yanked the extra RAM card and continued with the install, while trying to troubleshoot the RAM card installation at intervals. To make a long painful story short, after a few days of faffing around, I finally despaired of getting the thing to work -- and decided, in desperation, to switch out the RAM card the PC came with and plug the extra RAM card in its place. It booted. I played another hunch, putting the default RAM card in the auxiliary socket, and it booted fine. I'm an experienced trouble-shooter; this sort of thing doesn't surprise me too much.
That was the worst problem. Since I'd already come up with a procedure for porting my desktop system to my two notebooks, most of the rest of the installation was straightforward, if time-consuming. Where it fell down was in trying to get the Win10 email app to work. I had been having problems with the Windows Live emailer on Win7, and figured the new Mail app on Win10 would clean things up.
I was 100% wrong. The Win10 app is very braindead, delegating maintenance of email address lists to another app, which was very hard to use. I decided to look around for a free email utility, and quickly found Mozilla Thunderbird. It's been around for a long time, but I had never thought to look for it. In any case, I downloaded and installed it, and it worked perfectly well.
Another problem was that Windows Explorer didn't have any real capability to set colors, with everything in black and white. I found that hard on my eyes, so I again looked around for a utility, and found the "QTTabBar" toolbar. It was easy to install, but configuring it was a bit tricky. Text seemed to be disappearing from Windows Explorer panels I hadn't selected, but it turned out that the background color I had set, gray, was the same as the de-selected font color. I changed the font color, and all was well. I also had to make sure I set "Compatible Folder View" in the QTTabBar setup menu before I set colors, and then reboot the PC.
OK, once I had the system from my old PC set up on the new desktop, I wanted to add to it the few Win10 apps I had already installed on my notebooks, particularly the Lexis Audio Editor. That led me to the next problem: over time, I had, by bumbling, set up two different accounts in the Windows Store. I got hung up on conflicts between the two accounts, and could not get account information from one to the other. That problem ended up solving itself; somehow, all my account information finally migrated to my current account, likely because my fussing around gave the system a clue to make the connection, and I didn't have to worry about the old account any more.
The last big problem in installation was getting my wireless laserjet printer to work. It was painful, until I realized the installation program on the CD that came with the printer was broken; I didn't recall it was broken before, possibly it just didn't like Win10. I downloaded the installation program from online, and got the printer to work in short order.
Other than various tweaking, and making sure all three of my Win10 PCs were in a common state, that was the end of the installation. The new PC is very zippy, by the way. There was the problem of what to do with the old PC; after some exasperation, I managed to do a clean install of Win7 on it, wiping out all my old files, and added all the OS updates I could find. That done, I donated the computer to a shop that handed old but workable PCs off to poor folks.
I did find that the keyboard that came with the Acer was uncomfortably small -- nice for occasional use, not so good for full-time use -- so I kept the Logitech keyboard I had been using with the Compaq, and donated the original Compaq keyboard along with the old desktop. So, now I'm cooking with Win10 -- as well as Android on my new BLU smartphone -- and don't think I'll need more serious computing hardware any time soon. I am thinking of getting a Win10 hybrid tablet with detachable keyboard, but there's no rush; when my budget permits, I might get an older "remainder", if only as a toy to play with. Or maybe I'll get a ChromeBook ... I'll see.
* GENOMICS & BEER: The genomics revolution, still in its early days, continues to expand its domain rapidly. As reported by an article from Nature.com ("Ale Genomics: How Humans Tamed Beer Yeast" by Ewen Callaway, 08 September 2016), geneticists have now traced the evolution of the yeasts used to make beer. By sequencing the genomes of nearly 200 modern strains of brewer's yeast, the research reveals how, over hundreds of years, humans transformed the wild fungus Saccharomyces cerevisiae into a range of strains tuned for particular brews.
Yeast generates the alcohol and CO2 bubbles in beer by fermenting sugar, while generating hundreds of chemicals that lend flavours such as banana or cloves to a drink. Brewing yeasts differ in their production of such metabolites, and in other traits such as their tolerance to alcohol.
A team led by geneticist Kevin Verstrepen, of the University of Leuven and the Flanders Institute of Biotechnology in Belgium, sequenced the genomes of 157 S. cerevisiae strains used to make ale and other fermented products, including wine, sake, and bread. The evolutionary tree of the yeast strains revealed distinct families of yeast used for making wine, bread, and sake -- along with two distantly related groups of ale yeast, including strains from Belgium, Germany, Britain, and the United States. Verstrepen's team is now using genomics to churn out new strains of beer yeast.
Beer has been around for a long time, with a 5,000-year-old Sumerian tablet showing its group consumption, while pots of similar age from western Iran and northern China hold residues of beer ingredients, including barley and fermentation by-products. Verstrepen originally expected the ancestors of modern brewing yeasts to date back thousands of years -- but it turned out that modern yeast strains only became established in the late sixteenth and early seventeenth centuries. Earlier beer production, it seems, was based on yeasts that were not carried over in production over the long term.
This, Verstrepen says, coincides with a period in Europe when beer-making moved from homes to pubs and monasteries. He believes early professional brewers took yeast with them when they moved around Europe and even to the New World: US beer strains, for instance, are closely related to British strains. Brewers did not actually isolate yeast strains until much later, in the late nineteenth century -- but Verstrepen thinks they may have inadvertently shaped the genomes of yeast by brewing each new batch of beer on top of the dregs of the last one. Through this practice, brewers might have slowly selected yeast strains that perform well and produce desirable flavors.
An independent study by a team under Jose Paulo Sampaio -- an evolutionary geneticist at the New University of Lisbon -- came to many of the same conclusions as those found by Verstrepen's team, after sequencing 28 beer yeast strains. However, skeptics have suggested that Verstrepen's team envisioned an evolutionary rate 50 times faster than that found in other studies. Verstrepen replies that yeast does in fact mutate rapidly when exposed to alcohol, which is something of a toxic waste product to the fungus.
Although all the industrial yeasts bore signs of human influence, the beer yeast genomes were the most dramatically altered. Beer-making strains carry variations and duplications in genes involved in consuming maltose and maltotriose, the main sugars in beer. Most of the beer yeasts had variations that limit production of 4-vinyl guaiacol (4-VG), which generates clove and smoke flavors that many beer drinkers dislike. One exception was yeast used in German wheat beers called "Hefeweizens", which typically smell of cloves. The genomes of these strains contain stretches of DNA, including the genes that make 4-VG, that seem to originate from wine yeast. Verstrepen thinks that these strains are hybrids of yeasts used to make ales and wines.
As for future strains, Verstrepen's lab has come up with a new strain that features high alcohol tolerance, but does not produce 4-VG. The lab has also come up with a genetically modified yeast with very high levels of a chemical that yields the taste of bananas -- though Verstrepen has been careful not to give GM variants to brewers just yet.
However, the CRISPR-Cas9 genetic modification technique is so easy to use that home brewers with access to a biotech lab -- and home brewers tend to be a dedicated lot -- might well start tinkering with GM yeasts on their own. If the yeasts themselves were carefully filtered out of the brew, the beer would not, at least by US rules, even have to be identified as a GM product. Not to worry; if it happens, it won't be right away.
* 21ST-CENTURY ECONOMICS (2): Although the free-market model has done much to raise a good proportion of the world's population out of poverty, it did suffer a major setback with the economic collapse of 2007. As discussed in an editorial by Buttonwood, THE ECONOMIST's rotating financial columnist ("Advancing, Not Retreating", 8 August 2015), the Left believe that the crisis shows capitalism is in fact outmoded, and that modern information technology will undermine it further.
In a book titled POSTCAPITALISM, author Paul Mason suggests that capitalism will give way to the "sharing economy", in which outright ownership of goods is de-emphasized, with consumers sharing rides or temporary lodgings through coordinated networking. Long-time radical Jeremy Rifkin, in his book THE ZERO MARGINAL COST SOCIETY, talks even more grandly about "the internet of things, the collaborative commons and the eclipse of capitalism."
However, on closer inspection, 21st-century information technology seems to be democratizing capitalism more than shutting it down. After all, sharing systems are simply a way of streamlining transactions between providers and renters, and can only in a loose sense be regarded as "communalization" of resources. As discussed here in 2015, the internet has revolutionized the selling of used books, while ebooks are turning traditional publishing on its head. Similarly, online photo-sharing sites have revolutionized the business of photography, as discussed here in 2012.
The example of ebooks and photo-sharing shows that the new technology does undermine traditional business models -- but the expectation is that money will be made in the new order eventually. Rightly so, because if a project doesn't make some sort of money, it won't persist. It might even end up being a lot of money; Google started as a free internet-search business, but has become a corporate monster. While the economy still remains heavily based on physical goods, it has become ever more based on "software" of all sorts, ranging from operating systems to applications software to games to video and audio entertainment. The new business models also can, in some cases, reinforce the old -- if downloading entertainment is cheap, people are still willing to pay top dollar for live performances.
Another new-economy effect is that the old idea of lifetime employment is fading. More people will follow "portfolio careers", switching from one employer, or even industry, to another as the economy changes. They will have to keep on picking up skills, and monitor the flow of information to find opportunities to sell those skills. Long-term employment is becoming less important; people will take jobs as they become available, then move on to the next.
That instability means workers cannot rely as much as their parents did on a pension plan, instead contributing to a retirement account. That places much more burden, not only on their prudence, but also on the management of their retirement account in the context of the global economy. Information systems will help, but they will also provide an opportunity for fraud. The future, in short, holds both promises and insecurities. The insecurities could push society to the Left; but that doesn't seem to be happening in the new economy. [TO BE CONTINUED]
* THE COLD WAR (130): President Eisenhower couldn't do much in his last weeks in office but tie up loose ends. One was to deliver a farewell address, which was broadcast to the nation on 17 January. It was one of Eisenhower's most memorable speeches:
Three days from now, after half a century in the service of our country, I shall lay down the responsibilities of office as, in traditional and solemn ceremony, the authority of the Presidency is vested in my successor ...
We now stand ten years past the midpoint of a century that has witnessed four major wars among great nations. Three of these involved our own country. Despite these holocausts, America is today the strongest, the most influential, and most productive nation in the world. Understandably proud of this pre-eminence, we yet realize that America's leadership and prestige depend, not merely upon our unmatched material progress, riches, and military strength, but on how we use our power in the interests of world peace and human betterment.
Throughout America's adventure in free government, our basic purposes have been to keep the peace, to foster progress in human achievement, and to enhance liberty, dignity, and integrity among peoples and among nations. To strive for less would be unworthy of a free and religious people. Any failure traceable to arrogance, or our lack of comprehension, or readiness to sacrifice would inflict upon us grievous hurt, both at home and abroad.
Progress toward these noble goals is persistently threatened by the conflict now engulfing the world ... We face a hostile ideology global in scope, atheistic in character, ruthless in purpose, and insidious in method. Unhappily, the danger it poses promises to be of indefinite duration. To meet it successfully, there is called for, not so much the emotional and transitory sacrifices of crisis, but rather those which enable us to carry forward steadily, surely, and without complaint the burdens of a prolonged and complex struggle with liberty the stake ...
Crises there will continue to be. In meeting them, whether foreign or domestic, great or small, there is a recurring temptation to feel that some spectacular and costly action could become the miraculous solution to all current difficulties ... But each proposal must be weighed in the light of a broader consideration: the need to maintain balance in and among national programs, balance between the private and the public economy, balance between the cost and hoped-for advantages, balance between the clearly necessary and the comfortably desirable, balance between our essential requirements as a nation and the duties imposed by the nation upon the individual, balance between actions of the moment and the national welfare of the future ...
A vital element in keeping the peace is our military establishment. Our arms must be mighty, ready for instant action, so that no potential aggressor may be tempted to risk his own destruction. Our military organization today bears little relation to that known by any of my predecessors in peacetime, or, indeed, by the fighting men of World War II or Korea.
Until the latest of our world conflicts, the United States had no armaments industry ... But we can no longer risk emergency improvisation of national defense. We have been compelled to create a permanent armaments industry of vast proportions. Added to this, three and a half million men and women are directly engaged in the defense establishment ...
Now this conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence -- economic, political, even spiritual -- is felt in every city, every Statehouse, every office of the Federal government ... In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist ...
Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades. In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.
Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers. The prospect of domination of the nation's scholars by Federal employment, project allocations, and the power of money is ever present -- and is gravely to be regarded.
Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite. It is the task of statesmanship to mold, to balance, and to integrate these and other forces, new and old, within the principles of our democratic system -- ever aiming toward the supreme goals of our free society.
Another factor in maintaining balance involves the element of time. As we peer into society's future, we -- you and I, and our government -- must avoid the impulse to live only for today, plundering for our own ease and convenience the precious resources of tomorrow. We cannot mortgage the material assets of our grandchildren without risking the loss also of their political and spiritual heritage. We want democracy to survive for all generations to come, not to become the insolvent phantom of tomorrow ...
Disarmament, with mutual honor and confidence, is a continuing imperative. Together we must learn how to compose differences, not with arms, but with intellect and decent purpose. Because this need is so sharp and apparent, I confess that I lay down my official responsibilities in this field with a definite sense of disappointment ... I wish I could say tonight that a lasting peace is in sight.
Happily, I can say that war has been avoided. Steady progress toward our ultimate goal has been made. But so much remains to be done ... Now, on Friday noon, I am to become a private citizen. I am proud to do so. I look forward to it. Thank you, and good night.
The next day, 18 January, Eisenhower had his 193rd and final presidential press conference, praising Congress, wishing Kennedy well, and emphasizing some of the things he had said the night before. When asked about his warnings against the excessive influence of the "military-industrial complex", the president commented:
... some of this misuse of influence and power could come about unwittingly but just by the very nature of the thing. When you see almost every one of [US] magazines, no matter what they are advertising, has a picture of the Titan missile or the Atlas or solid fuel or other things, there is becoming a great influence, almost an insidious penetration of our own minds that the only thing this country is engaged in is weaponry and missiles. And, I'll tell you we just can't afford to do that.
Eisenhower also managed to inject a bit of humor:
REPORTER: Mr. President, have you come to a firm decision on the value of the ... no third-term amendment?
PRESIDENT: A funny thing, ever since this election the Republicans have been asking me this. [Laughter]
If Eisenhower had ever considered a third term, he hadn't taken the idea seriously; two terms were plenty for him. The dispersal of power in the US political system, in which even the president is only in charge to a restricted and contested extent, tends to erode the will to cling to office. It was time to retire, with Eisenhower leaving behind careers in the military and government, to be remembered as one of the most popular presidents ever.
[ED: END OF SERIES. I do intend to eventually come up with a COLD WAR document, but it has become clear that it is unrealistic to persist in weekly postings of it for the next five years or so. It is simply unmanageable, and so is being discontinued.]
* SCIENCE NOTES: As discussed by an article from THE GUARDIAN ("Antibacterial Soaps Banned In US Amid Claims They Do More Harm Than Good" by Alan Yuhas, 2 September 2016), in early September the US Food & Drug Administration (FDA) banned antibacterial soaps. Dr. Janet Woodcock, director of the FDA's Center for Drug Evaluation & Research, said that certain antimicrobial soaps may not actually serve any health benefits at all, declaring in a statement: "Consumers may think antibacterial washes are more effective at preventing the spread of germs, but we have no scientific evidence that they are any better than plain soap and water. In fact, some data suggests that antibacterial ingredients may do more harm than good over the long term."
The new rule applies to any soap or antiseptic product that has one or more of 19 chemical compounds, including triclocarban, which is often found in bar soaps, and triclosan, often in liquid soaps. It does not affect alcohol-based hand sanitizers and wipes, which the FDA is still investigating, or certain healthcare products meant specifically for clinical settings. The FDA has given manufacturers a year to change their products or pull them off shelves.
The FDA had been considering the rule since 2013, in the wake of studies suggesting the chemicals might affect human hormones, or change natural resistance to bacteria. Manufacturers were asked to back up their health claims, but they were unable to provide relevant data, if they answered at all.
Triclosan can be found in 93% of liquid soaps labeled "antibacterial" or "antimicrobial", according to the FDA, though some companies, including Proctor & Gamble, have already begun phasing the chemical out of products. There are partial triclosan bans in the European Union and Minnesota, but the chemical remains common in toothpaste, as it is believed effective against the bacteria that cause gum disease.
Professor Patrick McNamara, who has conducted research on antimicrobials, says he believes "there is no added benefit to having these antimicrobial chemicals in soaps".
He added: "After these chemicals are used in our homes, they go down the drain to wastewater treatment plants and eventually to the environment, where they can select for antibiotic resistance genes. In short, triclosan and triclocarban present a risk towards propagation of antibiotic resistance. Since they do not offer added benefits when washing hands, their use is not worth their environmental risk."
* In related news, as per an article in THE GUARDIAN ("UN Agrees To Fight 'The Biggest Threat To Modern Medicine': Antibiotic Resistance" by Amanda Holpuch, 21 September 2016), the entire membership of the United Nations has signed a declaration committing them to work to head off antibiotic resistance. This was only the fourth health issue to be the subject of a UN general assembly meeting, the other three having been HIV-AIDS, non-communicable diseases, and Ebola.
It is estimated that more than 700,000 people die each year due to drug-resistant infections, though the true toll could be much higher because of inadequate global tracking of the problem. Awareness has risen over the past decade, thanks to studies of the issue -- as well as advocacy by health officials, such as Britain's chief medical officer, Sally Davies, who said: "Drug-resistant infections are firmly on the global agenda but now the real work begins. We need governments, the pharmaceutical industry, health professionals and the agricultural sector to follow through on their commitments to save modern medicine."
Signatories to the UN declaration committed to encouraging innovation in antibiotic development, increasing public awareness of the threat, and developing surveillance and regulatory systems on the use and sales of antimicrobial medicine for humans and animals. In two years, groups including UN agencies will provide an update on the superbug fight to the UN secretary general.
* As reported by NATURE.com ("France Launches Massive Meteor-Spotting Network" by Traci Watson, 10 June 2016), on 28 May French researchers formally launched an unprecedented campaign to catch meteors, an effort that will rely on thousands of volunteers to comb the ground for bits of space rock. The "Fireball Recovery & Inter-Planetary Observation Network (FRIPON)" already includes 68 cameras that scan the skies for shooting stars, with 100 cameras to be in place by the end of 2016, making it one of the biggest and densest meteor-spotting networks in the world.
Meteorites are effectively bits of the Solar System that fall to Earth, where they can provide insights into the origins, evolution, and current state of the Solar System. FRIPON's goal is the collection of one tracked meteorite per year from the French landscape. In contrast, the similar but smaller Spanish Meteor Network has only claimed two meteorites in the past 12 years.
The French network's cameras are very densely and evenly spaced, sitting roughly 70-80 kilometers apart at laboratories, science museums, and other facilities, giving enough coverage to permit reasonable estimates of where meteors land. FRIPON is also fully connected and automated, the first such network to be so. If a camera spots a meteor track, it sends a message to a central computer in Paris. If two or more cameras spot the fireball, FRIPON scientists receive an email describing where it was seen. Eventually, the email will also include the meteorite's probable landing zone, pinpointing it to an area roughly 1x10 kilometers in size.
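The two-camera confirmation rule described above is straightforward to illustrate. The following is a purely hypothetical Python sketch -- not FRIPON's actual software; the event IDs and station names are invented -- showing how sightings might be grouped per fireball and flagged only once at least two distinct stations have reported it:

```python
# Toy sketch of a two-camera confirmation rule: individual camera
# sightings are grouped by fireball event, and an event is only
# "confirmed" (triggering the notification email) once two or more
# distinct stations have reported it.

from collections import defaultdict

CONFIRMATION_THRESHOLD = 2  # distinct cameras needed before notifying

def confirmed_events(sightings):
    """sightings: list of (event_id, station) pairs.
    Returns {event_id: set_of_stations} for confirmed events only."""
    by_event = defaultdict(set)
    for event_id, station in sightings:
        by_event[event_id].add(station)
    return {eid: stations for eid, stations in by_event.items()
            if len(stations) >= CONFIRMATION_THRESHOLD}

reports = [("fireball-1", "Paris"), ("fireball-1", "Orleans"),
           ("fireball-2", "Lyon")]
print(confirmed_events(reports))  # only "fireball-1" is confirmed
```

In the real network, of course, a confirmed detection also carries timing and trajectory data, from which the probable landing zone is computed.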
Once the landing zone has been boxed in, it gets down to mobilizing volunteers to search the box. Scientists will do the searches initially, but the expectation is that an army of interested citizens will be recruited to search for meteorites. That means a fairly big army, since experience shows that only about one in a thousand volunteers will show up for a search; the objective is accordingly to get hundreds of thousands of volunteers. Searches are likely to be most difficult in the extensive forests of northern France -- but FRIPON organizers believe the exercise will still pay off. It shouldn't be hard to improve on tradition: only one meteorite per decade was recovered in France during the 20th century.
* GOING CHROME: As discussed by an article from WIRED Online blogs ("How Chromebooks Are About to Totally Transform Laptop Design" by David Pierce, 9 September 2016), when Google introduced its first Chromebook laptop in December 2010, the response was not entirely enthusiastic. The initial "Chromium" model was heavy, rubbery, and black; although Google officials admitted the hardware was mostly for evaluation of the software, there was also a lack of enthusiasm for the software. "Chrome Operating System"? What operating system? It was just a laptop that ran a web browser, and that was it.
Okay, if this was madness, there was definitely a method to it. Chromebooks outsold Macs for the first time in the first quarter of 2016, and according to Google, US schools buy more Chromebooks than all other devices combined -- putting a big dent in Apple's long-standing domination of the educational market. Chrome laptops are poised for further growth, now that they can run all the millions of apps in the Google Play Store, with the apps working just as they do on Android phones. Google officials have been making a pitch to businesses for using Chromebooks, and the company is releasing a new generation of Chromebooks.
Chrome essentially started out in 2006. At that time, Google was working on Windows apps -- yes, back then Google supported Windows, generating a toolbar plugin for Internet Explorer, and a Desktop Search app that indexed a PC's contents. Windows seemed like a pain, however; for example, it could take minutes for a PC to boot up. Why all this bother, if all a user wanted to do was run apps? A few people on the Google development team started hacking around on an old netbook computer in the office, trying to strip out everything that wasn't really needed. It didn't take them that long to come up with a Linux-based system that booted in ten seconds.
And so Chrome was born. Felix Lin, now Google's VP for all things Chromebook, came to the company to, in effect, turn the Google Chrome browser into an OS that would make computing more accessible to more people. That didn't just mean making computers cheaper; it meant tearing out the settings, options, icons, buttons, and toolbars that kept a large part of the world's population from using them.
When the first Chromebooks hit the streets in 2010, the computer-literate were appalled. Where were provisions to check out the system properties? How were power settings established? They weren't real computers; Chrome was mocked for being "just a web browser".
But why, Chrome developers asked in reply, did people need to control their power settings? Most people could get along fine with the defaults; and to the extent they wanted control, the OS could provide a smart assistant to do the job. People could do all they wanted with the web browser, with storage handled in the online cloud -- where files wouldn't be lost if a PC went down, and users could access their personal resources from any Chromebook. Indeed, they could get at those resources from any PC with any OS, as long as it had a web browser.
Schools quickly jumped on board: they liked the fact that Chromebooks were cheap, easy to administer, and worked well for multi-user environments. Businesses then decided they liked the idea as well. The Chrome gang is now out to convert everyone else.
They did miss a trick, however; the Google Chrome and Android efforts were disjoint, and, it seems, to a degree competitive; the Chrome team was caught flat-footed when Android took off. However, the two groups did work for the same company, and the Chrome group quickly saw that Android, or at least Android apps, were a gold mine for Chrome. A Chrome engineer threw together a virtual environment that allows Android apps to work on Chrome. With more work, it hit the streets -- and now Chrome is not "just a web browser", it's a big smartphone. It doesn't require much computer literacy to use a web browser; it doesn't require any more to use a smartphone.
A Chromebook lacks many of the items found in smartphones -- GPS, a near-field interface, a fingerprint reader, and so on. Some of those bells and whistles are irrelevant on a laptop, but some are necessary to get apps to work. Not a problem, at least over the long run: Google maintains a precise spec on what a laptop must do to be called a "Chromebook", and the document is clearly edging towards requiring that future Chromebooks have GPS, NFC, a fingerprint reader, and so on.
Will Chrome take over? The computer-literate are still leery of an OS that doesn't want to give them control, and leaves them heavily dependent on online resources. There are, however, vast numbers of people who don't care, and who find the lower cost and simplicity of Chromebooks irresistible selling points. Besides, even though Android is idiot-proof, or at least tries to be, it does have a fairly comprehensive system of settings -- and it's not hard for those who like more control to find apps, such as file managers and shells, to get their hands dirtier. Since Chrome is now more or less Android for the bigger boys, what applies to Android is likely to apply to Chrome.
What makes Chrome particularly attractive is that small Chromebooks are very cheap, cheaper than ordinary smartphones. It's too early to say the future is Chrome; but neither is it possible to rule it out.
* GETTING HOTTER: As discussed in an article from THE GUARDIAN ("NASA: Earth Is Warming At A Pace 'Unprecedented In 1,000 Years'" by Oliver Milman, 30 August 2016), Gavin Schmidt -- director of the US National Aeronautics & Space Administration's (NASA) Goddard Institute for Space Studies -- announced that the Earth is warming at a rate not experienced within the past 1,000 years, at least, making it "very unlikely" that the world will stay within a crucial temperature limit agreed by nations just last year.
Although 2016 has not yet ended, it is clear it is breaking temperature records, with the average global temperature peaking at 1.38 degrees Celsius (2.5 degrees Fahrenheit) above levels experienced in the 19th century, disturbingly close to the 1.5C (2.7F) limit agreed in the landmark Paris climate accord. July was the warmest month since modern record keeping began in 1880, with each month since October 2015 setting a new high mark for heat.
Schmidt says that records of temperature that go back far further, obtained from analysis of ice cores and sediments, suggest that the warming of recent decades is out of step with any period over the past millennium:
In the last 30 years we've really moved into exceptional territory. It's unprecedented in 1,000 years. There's no period that has the trend seen in the 20th century in terms of the [rise in temperatures]. Maintaining temperatures below the 1.5C guardrail requires significant and very rapid cuts in carbon dioxide emissions or co-ordinated geo-engineering. That is very unlikely. We are not even yet making emissions cuts commensurate with keeping warming below 2C.
Temperature reconstructions by NASA, using work from its sister agency, the US National Oceanic & Atmospheric Administration, found that the global temperature typically rose by between 4C and 7C over a period of 5,000 years as the world moved out of ice ages. The temperature rise clocked up over the past century is around 10 times faster than this previous rate of warming. The increasing pace of warming means that the world will heat up at a rate "at least" 20 times faster than the historical average over the coming 100 years.
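The "10 times faster" figure is easy to sanity-check with back-of-the-envelope arithmetic. The numbers below just restate what's quoted above -- the midpoint of the 4C-to-7C range, and a rough 1C of warming over the past century -- so this is a sketch of the comparison, not NASA's actual calculation:

```python
# Rough check of the "10 times faster" claim, using the figures quoted above.
deglaciation_rise_c = 5.0        # midpoint of the 4-7 C range
deglaciation_years = 5000.0
historical_rate = deglaciation_rise_c / deglaciation_years   # ~0.001 C/year

modern_rise_c = 1.0              # approximate rise over the past century
modern_years = 100.0
modern_rate = modern_rise_c / modern_years                   # 0.01 C/year

print(round(modern_rate / historical_rate))  # → 10
```

Which lines up with the article's claim; a further 20-fold increase would put the coming century at around 0.2C per decade or more.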
To gauge temperatures before modern record-keeping, climate researchers use proxies taken from ancient layers of glacier ice, ocean sediments and rock. They can gauge greenhouse gas levels stretching back more than 800,000 years, but the certainty around the composition of previous climates is stronger within the past 1,000 years. While it's still difficult to compare a single year to another prior to the 19th century, a NASA reconstruction shows that the pace of temperature increase over recent decades outstrips anything that has occurred since the year 500 CE.
* As discussed in an item from Reuters ("Coastal Land Expands As Construction Outpaces Sea Level Rise" by Alistair Doyle, 25 August 2016), according to the Dutch research group Deltares, our planet has gained coastal land equivalent in size to Jamaica over the past 30 years, with human construction outpacing erosion caused by rising sea levels.
The increase in coastal area was due in large part to the expansion of ports off China, construction of luxury resorts off Dubai, and land reclamation in the Netherlands. A total of 33,700 square kilometers of land was added, while sea-level rise and erosion claimed 20,135 square kilometers, with locales such as Vietnam and the Mississippi delta in the US showing high rates of loss. Using satellite data and Google Earth, a report from Deltares says coastal regions have gained a net 13,565 square kilometers (5,237 square miles) of land since 1985, roughly the size of Jamaica.
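The gain and loss figures net out exactly to the reported total, as a quick bit of arithmetic shows -- this is just a check of the numbers quoted above, with nothing added:

```python
# Check that the gains and losses quoted above net out to the reported figure.
gained_sq_km = 33700    # land added by construction and reclamation
lost_sq_km = 20135      # land lost to sea-level rise and erosion
net_sq_km = gained_sq_km - lost_sq_km

print(net_sq_km)  # → 13565, the net gain reported by Deltares
```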
According to Fedor Baart of Deltares: "We expected on average the coast to shrink ... as sea level has risen. But the coasts are actually growing. We have a huge engineering power. [Off China] the coastline all the way from Hong Kong to the Yellow Sea has almost been redesigned."
It is estimated that the seas rose 20 centimeters (8 inches) over the past century. While this news was greeted with jubilation by climate-change deniers, the facts remain that the seas are rising; the rate is likely to increase; the ultimate effect will be the deep inundation of the Earth's coastal areas; and the ability of humans to keep out the sea is likely to fall behind as sea-level rise accelerates.
* ANOTHER MONTH: For those who like to be geeky with style, the website Fun.com sells superhero suits, in several options.
They're not badly priced at about $250 USD each; usually, such specialty items cost an arm and a leg. If I saw one I liked, I might buy it for Halloween, if I came into a bit of extra money. I was tempted by the Iron Suit -- but it's really not my style.
* Rayouf al-Humedhi, a 15-year-old Saudi girl living in Germany, has proposed making another sort of statement, suggesting that the Unicode Consortium -- a non-profit corporation that reviews and develops new "emojis" -- support a "headscarf" emoji. The idea gained the backing of Alexis Ohanian, co-founder of the online discussion forum Reddit. A member of a Unicode subcommittee replied to her suggestion, offering to help her draft a formal proposal. If approved, her emoji will be available in 2017.
Oddly, despite the fact that there's plenty of prejudice against Muslims on this side of the Pond, only soreheads would object to the idea of a headscarf emoji. It is more controversial in Europe, some seeing the headscarf as a sign of Islamic backwardness, even terroristic. One wonders what seems the stranger notion: that people would object, or that there will come a time when nobody will object.
* I started contributing computer time to the BOINC distributed-supercomputing effort back in 2010, originally running BOINC on an old Windows XP notebook. It started having hangups due to disk errors, so I switched over to my Samsung Galaxy Android tablet, which only has a flash drive. I've been letting it run 24:7:365 since then, and more or less forgot about BOINC.
After setting up a new Win10 PC, I got to thinking I might want to get BOINC back under control. I checked the Galaxy tablet, and found that of the five different projects I had chosen to run, two were still clicking away, one was idle, one had been inadvertently disconnected, and one had ended. I shrugged, cleaned things up, and decided to add more projects.
That turned out to be tricky, though having been through it before, it wasn't a complete surprise. BOINC is an umbrella scheme for projects from different organizations, so adding projects to run means signing up for each one of them. I went through my existing accounts, made sure they had consistent usernames and passwords, and then started to add more accounts.
This is supposedly a bit simpler nowadays, because BOINC has added a "BOINC Account Manager (BAM)" page, where one can select a set of projects and sign up for them as a block. However, this quickly led to further difficulties, since I found out that some of the projects wouldn't run on an Android platform like my tablet, at least at the present time. I tried to shut down my accounts for those projects, but it turns out that's simply impossible to do, it seems because the project management systems are too stupid to accommodate it -- even though I wasn't running, and could not run, the projects.
OK, so I would just forget about them -- except for the fact that BAM still kept the listing for an inactive project. OK, nuts: I'll just set up a link list to the projects, and forget about BAM, since it only adds complications. The lesson was that BOINC was put together by academics who were, for the most part, not professional programmers, and it's not comparable to a salable product.
What else might I have expected? In any case, along with my Galaxy tablet, I decided to install BOINC on my new BLU / Amazon smartphone. I don't use it more than a small fraction of the time, so it might as well be getting something done when I'm not using it. I leave it plugged into a USB charger, and it sits there, crunching away day and night.
* In September, I inventory things that need to be fixed or updated -- sort of a carry-over from the "back to school" tradition -- with one task being to replace batteries in flashlights and such. No use hanging on to them until they're dead; replace them once a year, and have flashlights that are likely to work.
I have a large police-baton flashlight that I've had for decades and keep in my car. I wanted to replace the batteries, but while I was pulling out the old ones, I got to thinking: shouldn't somebody be making LED replacements for incandescent flashlight bulbs?
Although Walmart didn't have any such thing, they were easy to find on Amazon.com. I bought a high-intensity LED bulb; it was on the expensive side, but it's something that will probably outlive me. It does seem brighter than the old lamp, and I'm sure it puts less drain on the batteries. Everything old is new again.
* Thanks to one reader for a donation to support the websites last month. That is very much appreciated.