Friday, November 27, 2009

What's wrong with the Indian left?

Well, this for example. The article is about Kian Tajbakhsh, an Iranian-American (ie, a dual citizen of those countries) social scientist who studied in the UK and the US, earned his Ph.D. at Columbia University, and worked with many respected organisations; he chose to move back to Iran, and was arrested there and charged with espionage. Reportedly he is likely to be executed. The writer of the above article adds:

Those of us in India who have been consistently anti-imperialist and critical of the US, and who respect Iran’s anti-US imperialist position, have been deeply disturbed by the Iranian regime’s crushing of the pro-democracy protests and its attempts to characterize these massive uprisings as fomented by the US. It’s tragically ironic that the US should be dubbed as “pro-democracy” by the Iranian regime!

I wonder why the writer did not ask herself the obvious question: if Iran is the land of the free and it is ironic to call the US "pro-democracy", why, one may ask, do so many Iranians (and nationals of so many other countries) want to make their homes in the US, and why do so few Americans (or Indians or anyone else) want to immigrate to Iran? Does she think Tajbakhsh (whom she calls an "Iranian patriot") chose to move to Iran because he approved of the Iranian regime, or thought it better than what the US offers (even in the Bush era)? Do Burmese patriots, or North Korean patriots, approve of their rulers too?

Well, at least the writer is disturbed by the recent crackdown -- unlike fellow travellers of the Soviet Union who justified the atrocities there until Stalin's genocidal excesses became impossible to ignore (and, in many cases, even later).

Monday, November 23, 2009

Useless skills

Long ago, an American and I were counting something (I forget what; probably occurrences of some motif in a sequence, or something like that), and when the American looked at my hand, he said "hey, that's a clever way to keep count!" Rather than using fingers individually, I was using the lines that separate the phalanges on the fingers, plus the tips of the fingers, to count up to 16 on each hand, or 32 in total. I've always done that, and in India I'm not the only one -- I think it goes back to Vedic times (in particular, I think I was taught some such thing as part of some ritual or other, when I was a child). So it didn't occur to me that anyone else would be surprised by it.

Can one go higher than 32? I didn't think much about it until I read this User Friendly strip. By representing 1 with a raised finger, and 0 with a lowered finger, and using both hands, one could in principle go up to 1023. There are two catches: one has to be familiar with binary numbers, and the fingers have to move very independently. The second is the bigger problem for me.
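
For concreteness, here is a toy sketch of the binary scheme in Python (which finger carries which bit is my own arbitrary convention, not anything from the strip):

    # Toy sketch of binary finger-counting: each of the ten fingers is one
    # bit (raised = 1, lowered = 0), so two hands cover 0 to 2**10 - 1 = 1023.

    def finger_pattern(n):
        """Map n to ten finger positions, least significant bit first."""
        if not 0 <= n <= 1023:
            raise ValueError("ten fingers can only hold 0..1023")
        return ["raised" if (n >> i) & 1 else "lowered" for i in range(10)]

    print(finger_pattern(5))  # fingers 0 and 2 raised: 1 + 4 = 5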

But the other day I realised that one can go up to 255 (256 values, counting zero) very easily, by modifying my phalange technique: count from zero to 15 (rather than one to 16) on each hand, but instead of adding the two hands together, use one hand as the 16s place -- that is, use the hexadecimal system. One still needs to be comfortable with hex, but that is a useful skill for anyone who programs computers.
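
In code, the two hands simply form a two-digit hexadecimal number -- a minimal sketch, with names of my own invention:

    # Two-hand hex counting: each hand holds 0-15 on its phalange lines
    # and fingertips; one hand is the 16s place, the other the 1s place.

    def count_value(sixteens_hand, ones_hand):
        assert 0 <= sixteens_hand <= 15 and 0 <= ones_hand <= 15
        return 16 * sixteens_hand + ones_hand

    def increment(sixteens_hand, ones_hand):
        """Advance the count by one, carrying into the other hand when
        the ones hand is full -- exactly like adding 1 in hex."""
        ones_hand += 1
        if ones_hand == 16:
            ones_hand, sixteens_hand = 0, sixteens_hand + 1
        return sixteens_hand, ones_hand

    print(count_value(11, 4))  # 0xB4 = 180; the maximum is 0xFF = 255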

I haven't actually started counting that way yet, but next time I need to count a number that is likely to be much greater than 32 but less than 256, I'll give it a try.

In theory, with sufficient independence of finger movements, one could count to 255 on one hand, as follows: use the phalanges as the low hex digit, and use the binary readout of the finger positions (raised/lowered, omitting the thumb) as the next digit. With 16 inter-phalange lines and fingertips, and 16 possible combinations of raised/lowered fingers, that is 256 values -- 0 to 255 -- on one hand. Combining the two hands, one could then count up to 65,535. But that is certainly too much for my level of digital dexterity, or of mental arithmetic. (The latter would have been so much easier if the world had standardised on base 16 to start with. Using base 10 was a huge mistake, but it is now one of those suboptimal choices that are frozen and irreversible.)
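
A sketch of that one-hand scheme (again with hypothetical names of my own): each hand packs two hex digits, so two hands form a four-digit hex number:

    # One hand as a two-digit hex number: the phalange count (0-15) is
    # the low digit, the raised-finger pattern (0-15, thumb unused) the
    # high digit. One hand: 0-255. Two hands: 256*left + right, 0-65535.

    def hand_value(finger_bits, phalange_count):
        assert 0 <= finger_bits <= 15 and 0 <= phalange_count <= 15
        return 16 * finger_bits + phalange_count

    def two_hands(left_hand, right_hand):
        """Left hand as the high byte, right as the low (a convention)."""
        return 256 * left_hand + right_hand

    print(two_hands(hand_value(15, 15), hand_value(15, 15)))  # 65535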




UPDATE 24/11: Though I linked the Wikipedia article on hex above, I hadn't read it. From this section, it seems I'm not the first to think of counting to 255 this way. It doesn't name an originator, but says counting on phalanges is common "in South Asia and elsewhere". It credits Arthur C. Clarke with the idea of counting to 1023 in binary.

Saturday, November 21, 2009

Macs and developers

Here are two articles about Apple, both by seasoned software experts ("hackers" in the original, good sense of the term).

The first is an old (2000) article by Jordan Hubbard, one of the founding developers of the FreeBSD project. FreeBSD is a Unix-like operating system that was once almost as popular as Linux. It evolved from BSD, which was originally a set of enhancements, developed at Berkeley, to AT&T's original Unix operating system; over time BSD became a nearly full-fledged operating system in its own right, and in the early 1990s it gave rise to several operating systems that achieved considerable respect among the cognoscenti. (Among other things, the network protocol that governs the Internet, and several widely used pieces of internet software, came from BSD.) Apple borrowed heavily from FreeBSD (and continues to credit it on their developer page) in creating Mac OS X, which is a fairly traditional Unix system under its shiny hood. Hubbard's article is a review of a beta version of OS X, written from the perspective of a Unix enthusiast and FreeBSD developer. He liked OS X so much that, shortly afterwards, he left the FreeBSD project and went to work for Apple (as did several others).

The second is a recent essay by Paul Graham, perhaps best known as an evangelist of the Lisp programming language, and a very influential online essayist. It is prompted less by Apple's computers than by the iPhone, and the way Apple has handled the iPhone "App Store" leads him to claim that they "don't understand software". Moreover, he argues that the loss of goodwill among talented programmers (ie, people like Hubbard, who were attracted to Apple ten years ago by the vision of a solid, Unix-based, user-friendly desktop system) will cost Apple dearly in the future.




My previous post on Linux "just working", compared to the Mac, was titled partly in jest, though the point about the different development models was serious. There is no question that the Mac offers a more user-friendly interface for beginners than the best Linux systems, includes a better set of applications (especially the multimedia stuff), and -- perhaps most crucially -- allows you to install most well-known open-source Linux/Unix applications via third-party efforts like the Fink project.

Macs also have a far better reputation for stability than Windows, and one comparable to Linux's. Here, however, we have to consider the great advantage Apple has in this respect: Apple builds the hardware, and can ensure that it works optimally with its software. With external peripherals (such as the HP printer/scanner I mentioned in my previous post), things aren't so certain. Linux and Windows, meanwhile, work fine on the vast majority of PCs out there. Microsoft works closely with every important PC hardware vendor to ensure this. Linux is more volunteer-driven, but these days several of the major manufacturers -- notably Intel and IBM, but also many others -- fund Linux development, employ individual developers, and make the specifications for their hardware available. Still, testing every possible combination of hardware devices is impossible. Computer makers will generally test that their machines work with previous releases of Windows, but Microsoft cannot ensure that all existing computers will work with Windows 7; and Linux receives little testing support from manufacturers. So it is remarkable that these systems work as well as they do.

Apple has received brickbats over the years for their proprietorial control of their hardware and their refusal to permit installation of Mac OS X on third-party systems. I personally think the world would have been much worse off if Apple, and not Microsoft, had dominated the personal computer industry. But Steve Jobs, Apple's CEO, is a rare example of a hard-headed businessman who also understands the intricacies of software. (Bill Gates is another. But Gates was never interested in the hardware side of things, so MS-DOS and, later, Microsoft Windows drove the widespread availability of cheap, "commodity" hardware, which in turn spurred the growth of Linux and other free operating systems. But I digress.)

Many people these days forget -- or are not even aware -- that Apple went through dark days for nearly a decade from the mid-1980s, after power struggles between founder Steve Jobs and CEO John Sculley (a former Pepsi executive hired by Jobs) resulted in Jobs being ousted from the company. Sculley's tenure was not a resounding success. Jobs was not much more commercially successful with his next venture, NeXT, but it was a highly influential platform in the academic community (among other things, it was the platform Tim Berners-Lee used to develop the World Wide Web). After Apple bought NeXT in 1997, Jobs returned and soon became CEO of the unified company, and Apple's fortunes have soared since. On the hardware side, he brought out revamped Macintosh computers and laptops, the iPod audio player, and the iPhone; on the internet side, he launched iTunes, the online music store that (together with the iPod) changed the music industry; and on the software side, he decided to scrap the "classic" MacOS entirely and develop a new system, Mac OS X, from scratch. Well, not quite from scratch: as noted above, it uses FreeBSD (and the Mach microkernel, and much other free/open-source software) at its core; and its graphical interface and underlying infrastructure are heavily based on the NeXTstep system developed by his previous company. Today, OS X powers not only Apple's computers, but also their iPhone and iPod Touch handheld devices.

Two of Jobs' software-side decisions are particularly interesting. When choosing an underlying "base" operating system for OS X, he did not go with Linux, which was then (as now) the best-known free version of Unix: he instead picked FreeBSD, which was lesser-known but highly respected by those in the know. And some years later, when Apple decided to develop a web browser, they chose as a base not Mozilla (whose ancestor was Netscape, the dominant browser of the 1990s, and whose Firefox browser is widely used today), but Konqueror, a browser used only by a small subsection of the Linux community. Specifically, they used the HTML and JavaScript engines of that browser, on top of which they built their own interface. They also released their modified version of the HTML engine as the open-source project WebKit, and it is used by several other projects, including Google Chrome.

In both cases, the decision to spurn the well-known name in favour of a much lesser one was justified by pointing to the smaller, cleaner, more maintainable source code; and in both cases, the result has been wildly successful. I wonder how many CEOs would have taken such gambles on relatively "unknown" software -- indeed, in the case of OS X, betting the future of the company on it. Google, in contrast, chose Linux, the "safe" option, as the base for their operating systems (Android and Chrome OS); and though they chose WebKit for their Chrome browser, they did so only after it had already been "proven" by Apple.

Small wonder, then, that Apple's share price takes a hit every time worries about Jobs' health surface.

Though he co-founded Apple, Jobs was never a hacker (that was Steve Wozniak), but he understood software and software hackers. Which is what makes Paul Graham's article above a bit worrying. Has Jobs lost touch with the "hacker ethic"? Or has he lost full control of his company, perhaps due to his still-mysterious health issues?

Wednesday, November 18, 2009

Linux "just works"!

We just got two new toys: a Mac Mini (the basic model), and a printer/scanner (HP Deskjet). The Mac -- both the tiny little machine and the OS that it runs -- looks spectacular, even though the peripherals are all non-Apple (Dell monitor, Logitech keyboard and mouse).

The printer came with a Mac OS driver CD; we inserted it, installed the drivers, and... it didn't work. The scanner program quit with an "unknown error", and offered to send a bug report to Apple. We suspect that a newer driver is required: the CD says it is for Mac OS X 10.3, 10.4 and 10.5, but the Mini has 10.6 (Snow Leopard).

So I tried plugging the printer into my Linux laptop. No driver downloads were needed, because the HP-supported Linux printer drivers were already installed. I then tried scanning an image: the scanning program (xsane) detected the scanner, and it Just Worked.

I expect the updated Mac drivers will work too. If not, it's over to customer support. But the interesting thing to me is this: unlike many Linux drivers, which are developed by third parties, the HP drivers are from HP -- as "official" as things ever get on Linux. But the way it is done is very different. HP's Mac (and Windows) drivers are binary "blobs" designed for a particular OS and its particular driver model; when the OS is updated, the driver can break. Windows Vista was notorious for malfunctioning peripherals for this reason; apparently Snow Leopard is not immune. Linux drivers are (mostly) provided as source code and compiled against the kernel source, which is a constantly evolving object (essentially no two Linux distributions use exactly the same kernel, and the kernels they use are not the "official" kernels released by Linus Torvalds). So the user gets the HP drivers not from HP's website or installation media, but bundled with the OS itself -- and they work.

Linux is sometimes criticised for evolving too rapidly, making it extremely difficult for device manufacturers to supply binary "drivers". The Linux developers respond that it is better for the companies to work with the kernel community, open-source their drivers, and, if possible, get them included in the "mainline" kernel. That way, first, users will not need to download drivers or run "installation CDs" to get their hardware to work; and second, the drivers will generally keep working. (Here's a position statement from the Linux developers.)

Based on my one data point (I don't use Windows and this is my first experience with a Mac), I'd say the Linux developers have a point.

Wednesday, November 11, 2009

"The pilot is the weak link"

According to a new book by William Langewiesche, reviewed here by the NYT, the true hero of this year's incident in which a US Airways plane landed safely on the Hudson river after losing both its engines was not the pilot, Chesley "Sully" Sullenberger. It was Bernard Ziegler, the Frenchman who perfected the "fly-by-wire" technology used by Airbus. Langewiesche asserts that the Airbus was nearly capable of landing itself even after losing its engines, and that while Sullenberger made the right choice in heading for the river, the landing itself required only moderate skill and any decent pilot could have done it.

It is an interesting claim, because the other headline-making air accident this year was the loss of Air France flight 447 on June 1, en route from Rio de Janeiro to Paris. On that occasion, there were several suggestions that fly-by-wire, and the lack of manual pilot overrides on Airbus aircraft (in contrast to Boeing), were responsible.

Langewiesche further claims, credibly, that being an airline pilot is such a monotonous job that the best and brightest no longer want to do it. (Some pilots may find unusual ways to alleviate that boredom.) Michael Moore says that pilots in the US are so poorly paid today that many of them work second jobs. No wonder so many foreign pilots now work for Indian carriers, which continue to pay well; Air India, in fact, pays expatriate pilots more than Indians, and their annual bonuses (up to $15,000) are comparable to the total annual pay ($17,000) of some pilots at major US airlines, if Moore is correct.

Air India, of course, is in a financial crisis, as -- to a lesser extent -- are all Indian carriers; so such generous pay may not last long. But good pilots are still required, even if Airbus planes require little skill to fly.

Unless Airbus invents a pilotless aircraft (Boeing doesn't seem very impressed with fly-by-wire), I think interesting times are ahead for the airline industry, and for passengers.

Tuesday, November 10, 2009

CSIR, bitten by the one it fed

Suppose you head the country's largest and most important scientific organisation. You know that, despite some very bright spots, it has grown creaky and bureaucratic over several decades. Being a dynamic, go-getting scientist, you have several ideas about what is to be done. One of them is to set up a new department aimed at streamlining the commercialisation of new technologies and establishing better links with industry. Would you hire this guy? Would you offer him a job the very first time you met him?

The current director-general of the CSIR apparently did. And there, in my opinion, began the trouble that has since accounted for so much column-space and blog-energy. Good summaries, and links, are on Abi's blog: here and here (some of the comments are interesting too).

V A Shiva, also called Shiva Ayyadurai, seems to have had an interesting career. He is not a career scientist, but has bachelor's and master's degrees in electrical engineering and computer science, visual studies and theoretical mechanics from MIT; and recently, apparently, he earned a Ph.D. in systems biology from the same institution. In between, he has mainly been in what we like to call the IT sector, primarily running an e-mail provider, EchoMail. Read his biography for more.

The DG of CSIR, Prof S K Brahmachari, is a smarter man than I am, and no doubt a better and more experienced judge of others' CVs and abilities. But it appears to me that Shiva Ayyadurai has few notable academic achievements, and that his primary commercial undertaking, EchoMail, is not exactly a household name. He seems prone to bombast: for example, he claims to have created "one of the world's first e-mail systems" in 1979, but e-mail has been around since the 1960s. Nevertheless, a thorough interview and review of the man's abilities and accomplishments might have led Prof Brahmachari to conclude that he was the right person to head his pet project, CSIR-Tech. But was that done, or was it an instantaneous decision, as Shiva himself suggests?

Having been hired, at a generous salary, he was apparently asked to produce a report on the functioning of the CSIR and future improvements. This he did, and that is when all hell broke loose, and the CSIR terminated his appointment (the CSIR claims that he was not employed in a permanent position, only hired on contract, and there are also claims that he was asking for too much money). Shiva went ballistic, complaining to everyone in the media who would listen that he was being victimised for his genuine and well-meaning criticisms of the organisation. He claims also to have written to the Prime Minister. What the PM thinks of it, we don't know.

So what did he say? The Deccan Herald excerpts the report here. I have seen the full report but do not think it is worth "leaking": it seems hastily put together, is unprofessional and often personal in tone, identifies obvious problems that I'm sure are well known to all CSIR scientists, and prescribes remedies that would be within the province of a first-year MBA student. Nevertheless, do read the DH link for its entertainment value, if you like. As Abi asks, if this is Shiva's opinion of the man who hired him, why does he want to keep the job -- and, having aired such an opinion, why should he expect to keep it?

As some commenters on Abi's blog suggest, maybe he was already told not to expect to be hired in a permanent position, and his report was his way of venting his grievance at the CSIR DG. Which makes it even more unprofessional. Regardless of the truth or otherwise in his observations, I don't think the report will now be taken seriously, nor should it be.

But that shouldn't distract from two key issues. First, exactly what sort of position was Shiva hired for in the first place, why, and what was his mandate? Second, the need to reform and streamline the CSIR remains: what does the DG plan to do in this regard? I suspect the CSIR DG made a mistake (caused by over-eagerness to "get things done") in hiring V A Shiva, and knows it; he should now make amends -- first, by coming clean on exactly what happened, and second, by making sure the effort to reform and modernise the CSIR is not sidetracked.

Monday, November 09, 2009

This is it: review from a non-fan

Two days ago we saw "This is it", the film of the rehearsals for Michael Jackson's planned last concert series that never took place.

I was never much of a MJ fan -- "Thriller" happened when I was under 10, and by the time "Bad" happened he was already being viewed as a bit of a joke, with his skin-bleaching and plastic surgery and oxygen tents and whatnot. By the time I started listening to rock and blues (and, later, jazz), the Michael Jackson brand of pop seemed too tame.

After he died, my view was coloured by articles like this one. Supposedly the guy was skeletal, unable to sing let alone dance, and was being kept alive only by insane quantities of medication; even if he had lived to begin the scheduled 50-concert series, he could not have survived that ordeal.

So it was a surprise to see Jackson in the movie. Thin he certainly was; whether he was skeletal was harder to say. But the rest?

He could sing, and did sing. He wasn't lip-syncing. His voice was a bit different from the old days, still high and child-like but somewhat thicker (an improvement, in my opinion). He talked frequently about needing to "preserve his voice", but it sounded more like hypochondria than a real problem -- no doubt it was a bigger problem than even he knew, but it did not show in the performances. "I just can't stop loving you", in particular, ended in an extraordinary extended bluesy call-and-response sequence between him and a female singer that would not have sounded out-of-place on a 1950s album from Chess Records, and showed some improvisational ability that I had never associated with him.

He could dance. Not like a 20-year-old, but better than most 50-year-olds, and certainly not like someone who only had weeks to live.

He was in control. Directing the choreography, the film-editing, and the musicians with authority -- telling the lighting and video people to "watch his cue", telling them that he would sense the video changes without needing to see them, telling his musicians to prolong a pause and "let it simmer"...

And the musicians were outstanding. If I had expected MJ to lip-sync his performances, I had also expected him to use recorded music, like most other pop singers these days. But no, he had a small, tight band -- two guitars, bass, keyboard, drums -- and while what they played wasn't too different from his recordings, it sounded much punchier and more intimate. The bass was funky -- I'd never noticed a bass in MJ's music before. The lead guitarist, a young woman called Orianthi Panagaris, ripped it up, not missing a step even when MJ was dancing in her face and all over her guitar. "Black or White" climaxed with a guitar duel between her and another guitarist. I found I could relate to the music: I could hear Motown and the blues in it, which I never had before -- perhaps because I had never listened very closely, or perhaps because the recordings were over-produced.

Kenny Ortega chose to put together a raw montage from the rehearsals, consisting of complete or nearly-complete songs, some shots of the team planning the performance, some interviews with the crew, and nothing else. It is obvious that an enormous effort must go into an MJ show -- let alone a 50-night run -- but the sheer scale of it all hadn't really come home to me before. Nor had the level of commitment and enthusiasm of the performers and crew, and their interaction with MJ, who was like a god to most of them. It must have been absolutely shattering for them when Jackson died a week before the concerts. But this movie has brought them to a wider audience than they could have hoped for. I expect to hear more of Orianthi Panagaris, in particular.

Monday, November 02, 2009

A truck on a pedestrian bridge

Chennai has three waterways running through it, one of which is the Buckingham Canal -- once an elegant canal on which barges transported people and goods, now essentially an open sewer. But the canal divides the city, especially its southern parts: there are very few motorable crossings south of the Adyar river. The first major one is at Sardar Patel Road, the next at Tidel Park about 2 km further south, and the next at Shozhinganallur nearly 10 km beyond that.

Between the first two of these, there was a narrow pedestrian bridge, which served as the main route for residents of our institute's hostel and guest house -- and of several others in the neighbourhood -- to reach the commercial areas of Indira Nagar, Adyar and Besant Nagar on the other side. The bridge was designed for pedestrians and cyclists; motorbikes were always frequent users, but recently auto-rickshaws and even cars had been using it heavily. Whenever the police barricaded it so that only pedestrians could use it, the barriers were removed.

Early on Sunday morning, a truck laden with bricks tried to use it. This was the result.

Thanks to one truck driver who had no idea what his vehicle weighed, and to the several inconsiderate souls who kept removing those barricades, residents of this mainly academic neighbourhood will no longer be able to walk (or cycle) across to the commercial area on the other side. And when I cycle to work, which is fairly often, I will need to use the busy main road rather than the quiet inner roads I earlier favoured.