tsujigiri

The editorial comments of Chris and James, covering the news, science, religion, politics and culture.

"I'd take the awe of understanding over the awe of ignorance any day." -Douglas Adams

Friday, April 29, 2005

Your skills are great, but...

I have had some conversations recently about the psychology and sociology of Computer Science and peripheral fields. Much of this conversation has centered on the "End-to-End Research Vision" for "making the world (of communications) a better place." A friend of mine, who does research in networking protocols, elaborated on the identities of the report's authors. Apparently these guys are well known for inventing various protocols, algorithms for network congestion control, and other networking bits and pieces. They apparently perceive these contributions to be of Nobel proportions. My friend admits that all of these protocols and algorithms are essentially arbitrary, derived from little or no underlying theory, and evaluated mostly through simulations and empirical data.

This might shed some light on the psychology of Computer Science practitioners. In many or most cases, they produce arbitrary algorithms which they might derive out of the clear blue sky. When things work out well, the designers/programmers are said to have "skills." But these are not ordinary scientific skills. They are magic guru skills. In the mind of the Great Inventor, there can be only one inventor of this or that world-shaking algorithm. The one who devised it did so through an innate talent, and it could not have been produced by anyone else.

In the case of the End-to-End group, it seems that "world-shaking" is equivalent to "deployed ubiquitously in manufactured systems." This makes their claim to greatness somewhat specious, since commercial deployment depends mostly on "right-place, right-time," not on some inherent perfection of the design. In the case of networking, ubiquity also arises from the propagation of a few narrow standards into millions of devices. Their claim to fame is at least in part due to the luck of the draw. Theirs were not monumental technical innovations; they were business successes.

I pulled this random sample off the web-o-Minkowski-space today, in which the author explicitly suggests that "programming skills" are innate and perhaps genetic. The real goal of the article is to explain a supposed finding that "90% of people think they are of above average intelligence." I think the author's reasoning involves some really wild underlying assumptions ranging from genetics to epistemology:

People will polarize towards the jobs they are good at. If you are born with amazing computer skills, you are likely to get a computer-related job. If you have an amazing ability to play football, you may well end up playing a lot of football.

Now, most people aren't going to want to spend the rest of their lives worrying about their inadequacy in certain areas. Thus they will tell themselves that their skills are important. After all, they (the skills) got them (the person) where they are today. They may disregard other skills, rather like PC gamers who don't own consoles will rarely say that yes, a console is overall a superior platform.

Anyway, here's my point: People often judge other people's intelligence in the same category as their own, and vice-versa.

For example, I think my skills (programming) are better than some other guy's (memorising the winners of the last 25 years' Boulton Wanderers games), because I've been just fine up until now without knowing what he knows. But I've used *my* skills a lot.

I am confidently above average in the programming league, because most people can't program. I'm right at the bottom of the Accounting Skills league. I class myself as above-average, because I'm an above-average programmer.

In conclusion: Different people value different things. By my own standards, I am above average. By yours, I could be really dumb.

The triumph of reason!

The Rumor Mill

I've commented a little bit about weblogs as "rumor mills." For the past few months, I've been following the online rumor mill surrounding my own work. Shortly after I finished graduate school, my lab produced a press release involving my work. We tried to keep everything as accurate as possible, but the work is specialized enough that it is difficult to explain even to experts.

As the report spread from press to press around the world (my favorite was a tech magazine in Vietnam), it inevitably moved onto blogs and online fora. I actually responded to one post which (for reasons unknown) appeared in a forum titled "Mini-ITX form factor" (the mini-ITX is a low-power, small form-factor PC motherboard). As the story spreads, adjectives get deleted and the story gets a little out of control. For example:

The Informatics Circle of Research Excellence (iCORE) High-Capacity Digital Communications Laboratory researchers have designed a computer chip that uses around 100 times less energy than current market-leading chips. The iCORE Processor, developed by Dave Nguyen and Chris Winstead, former engineering graduates of the University of Alberta uses new analog processing technology that is currently used by Winstead to build the largest analog decoder chip ever fabricated.

It employs a new method of processing digital data via methods of analog decoding, the iCORE chip uses extremely low levels of power to execute its detection algorithm. No other chip has been recorded to function with such a low level of energy. With this low level of energy consumption, theoretically a cellphone could run for a full year on a single charge.

[Emphasis Added]

I've seen a lot of little twists like that. It should read "No other iterative decoder chip...". The distinction is important. Some articles almost make it sound like we've created a new manufacturing technology for low-power chips.

As I collected these rumor-mill examples, I was reminded of the very point I've been making for years: the web is not a rumor mill. Here I am: the source. Direct to the public. I may not be as widely read, but I can be found by anyone who looks. And my comments are as much published as anything else on the web. I've even spotted peripheral nodes on the "rumor tree" and responded directly.

Thus in my own work I see clear evidence that the blog is open to instant correction by the very sources of information. The press, however, is a closed peer-to-peer chain of transmission and mutation; a true rumor mill.

Tragedy of the Commons

I guess it isn't really a tragedy so much as an amusement. The text below is from the Wikipedia entry on "Atomic Theory." Note the discordant voice of the last paragraph, which suggests a different author.

The importance of this theory cannot be overstated. Arguably, the atomic theory is the single most important theory in the history of science, with wide-ranging implications it holds for both pure and applied science. John Dalton, 17-18th century British chemist, is the scientist credited with this titanic discovery.

The entirety of modern chemistry (and biochemistry) is based upon the theory that all matter is made up of atoms of different elements which cannot be transmuted by chemical means. In turn, chemistry has allowed for the development of the pharmaceutical industry, the petrochemical industry, and many others.

Much of thermodynamics is understandable in terms of kinetic theory, whereby gases are considered to be made up of either atoms or molecules, behaving in accordance with Newton's laws of motion. This was, in turn, a large driving force behind the industrial revolution.

Indeed, many macroscopic properties of matter are best understood in terms of atoms. Other examples include friction, material science and semiconductor theory. The latter is particularly important, as it is the foundation of electronics.

In same case the study of a property at atomic level is very complex and easier result are obtained with a study at bigger scale. This does not means that atomic theory does not work in these case. The problem is the mathematical complexities given by treating such problem with the atomic theory. Till nowdays there are no case where atomic theory does not work, there are only case in which the result is easier obtained, in the limit of the wanted approximation, with easier theories. Dispite of that it may be of some worthness to point out that a general vision should always kept and considered and consider the world or the entire universe only as series of atoms is reductive.

Addendum: browsing the "History" tab of Wikipedia revisions is also entertaining. For some reason, the above last paragraph has survived many revisions without being polished. I guess the edits are mostly aimed at things like this:

The existence of atoms was first proposed by Greek philosophers such as Democritus, Leucippus, and the Epicureans, but without any real way to be sure, the concept disappeared until it was revived by Rudjer Boscovich in the 18th century, and after that applied to chemistry by John Dalton. YEA Nigga!!! Rudjer Boscovich based his theory on classical mechanics and published it in 1758

Thursday, April 28, 2005

What REALLY happened



Stolen from mathowie's photo stream.