Wednesday, July 12, 2006

The changing face of computer science

People seem to have different ideas about computer science. To some, it is all about algorithms and data structures, while others swear by operating systems and networking. For me, computer science is more about formalizing abstractions through research and analysis. It is (or rather, should be) a healthy mix of the various disciplines that have to do with computing.

I was recently going through Eric Raymond's The Art of Unix Programming. The history of Unix, described in vivid detail by Eric, makes for fascinating reading. Also worth reading are the origins and history of the hacker culture, which grew hand in hand with the Unix culture.

A point worth pondering is the relatively dismal state of computer science research in today's world. Traditionally, computer science has been nurtured in academia; the most radical ideas and groundbreaking technologies in computing have been bred in universities and technology institutes. Computing concepts in industry that seem so obvious today evolved through years of dedicated and focused research at the likes of MIT and Carnegie Mellon.

It is quite true that the computer industry (commercialization, in other words) is the dominant face of computer science today. But that should not be a lame excuse for the stagnation of computer science research. And as almost everyone knows, long-term intellectual stagnation is not good for society.

For the most part, we ourselves are to blame. Salaries for computing professionals grow more lucrative by the day, and the demand is there too. People fresh out of college think of getting high-paying jobs and settling into the rhythm of their professional lives. After all, who wants to keep "withering" in academia?

1 Comment:

At 3:57 PM, Anonymous Anonymous said...

It is often said that no new invention, however technologically advanced and complex, will sell unless it has a convincing application. How else can one explain the failure of all those electronically wired shirts that were invented nearly 10 years ago and have yet to make inroads into regular markets?
In this regard, industry has a very important role to play. Industries are in direct, daily contact with consumers, tracking consumer patterns, behaviors, demands, and so on. This gives them the opportunity to define the future of their business, and in doing so, the future of their products and ultimately of technology. While digressing from computer science research, I strongly feel it was their sensing of the need and their foresight that led companies like Philips and Google to develop DVD and search-engine technology, and not vice versa. Applied research cannot be anything but need-based. True, much of this research takes place in universities, but where does it originate? The DOD, Bell Labs, IBM.
Where is applied research taking us today? Perhaps IIT Bombay's focus on low-cost computing, or the emerging standards for pervasive computing, are pointers. Needless to say, these ideas probably originated in young minds at computer companies. Are university researchers then the vanguard of advancement in the field of computer science? I don't think so.

 


Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 2.5 License.