Showing posts with label ramblings. Show all posts

February 15, 2011

Kindle for a skeptic

I like the smell of books. I like the carefree grip of a paperback, and the sense of accomplishment as the lighter side gets progressively heavier and vice-versa. I love having a bookshelf, with a solemn procession of books with shiny covers, adding to a collection of fond memories. Which is why the e-book revolution was, like Facebook, part of the technology basket that I stubbornly refused to embrace.

Until the wife broke our no-gift pact to buy me a Kindle 3 on Valentine's Day.

Notwithstanding my personal misgivings about ebook readers, the Kindle is a gorgeous gadget. The third iteration of this iconic reader is practically calling out to be held, switched on, and read. Everything about it seems natural somehow - the weight, the matte finish, and even the color of the print on the device (a dull brown rather than a bright white that could strain the eye). I couldn't wait to get started.

But about five minutes into exploring the new device, I felt a familiar sense of foreboding. Everything about the Kindle was designed to sell me ebooks from Amazon.com, and I wasn't yet ready to ditch their dead-tree cousins. All I wanted to do was upload some of the PDF files I had with me onto the reader and test it out. Instead I had to spend time figuring out the difference between the @kindle.com and @free.kindle.com addresses. Then understand formats, and what could be converted and what couldn't. And most irritatingly, what I was going to be charged if I used Whispernet as opposed to a USB cable to throw files onto my Kindle. This is precisely what I intensely dislike about these ‘simplified’ approaches to using something I just bought. I bet the idea was beautifully designed by an engineer, monetized by a suit, stratified by a marketer and documented by someone that did not give any hoots. Yes, in the end it works, but it is not pretty trying to get answers to silly questions when all you want to do is take it for a test read.

There are other, less annoying idiosyncrasies that take a little getting used to. The biggest is the screen refresh when you flip a page. The entire screen goes black, then the letters first appear as holes, and then the holes and print are reversed. Sounds annoying? Well, it actually is. And then there is the interface. For a reader with such an abnormally large number of keys, navigating feels clunky. Think BlackBerry-style interfaces without a similar unity of purpose. As I said, these are annoyances. And once you manage to suppress the reflex of reaching out with the second hand to flip pages, things look up quickly.

The screen, though quirky, is wonderful for reading. The choice of font (Caecilia) is very eye-friendly. You can customize the font size to your heart's content. And it really can be read in direct sunlight, as long as you are not holding it to directly reflect the sun itself. As a reading device, the Kindle is astonishingly well suited.

For me however, the value of the Kindle is not in replacing the books on my shelf, but in extending my own sphere of reading. I am and will continue to be nostalgic about my paperbacks. But there are things that a Kindle can do that no dead-tree book can - be alive. Signing up to Instapaper.com and setting it to send daily reading digests makes the Kindle the perfect way to read those long articles. Get Calibre and stop fretting about formats. And most importantly, I can now get to services like getabstract.com, click a button and actually have the ability to catch up on the thousands of books people seem to write every day.

Now if only I could tweet about all the reading I do. Oh wait, I can.

December 05, 2010

Snow from a distance

Growing up in the tropics, certain earthly processes were alien to me. Like having four distinct seasons, as opposed to the rainy hot and the regular hot. Or like living with snow. Everything I knew about snow was thanks to the propensity of Bollywood to choreograph elaborate dance scenes in exotic locales, and have the hero and his romantic interest roll around in the snow. Here are some things I never knew till I came to live in Wisconsin.

Snow can get really dirty. Yes, the first snow is indeed pure and white. But once on the roads and the parking lots, it quickly gathers grime. And shows it. What remains a couple of days after the snow showers is a grimy cold mix of mud, ice and grease. Very unlike the pure white of the postcards.

Snow is, uncomfortably, wet. Everyone has seen the image of the girl catching the first snowflake and blowing it free. What happens in reality is that the moment a snowflake hits your skin it melts. Leaving you damp, cold and miserable.

Snow is heavy. It isn't ethereal powdery stuff waiting for the first breeze to float by. Instead scraping a 4-inch deposit off your car makes you wish you had spent more time at the gym.

Snow can get really hard and slippery. As feet pound the powdery stuff and sunlight melts it a little, everything freezes back like an impromptu skating rink. Inviting unsuspecting passers-by to make fools of themselves, just like at a regular skating rink.

Snow never really occurs looking like those beautiful hexagonal crystals. Well, it probably does, but the crystals are always too small to see. And if you try to catch one (see above), all you end up is a little wetter and a little colder.

For me, the first snowfall was definitely a wonderful occasion. So was the second. And maybe the third. But very soon, the slushy roads, the slippery sidewalks and the scraping of the car made me realize the truth. Snow is just like rain, but colder, sticks around for an awful lot longer and makes you wish you had just seen it from a distance.

May 24, 2010

How LOST it should have been

Yesterday was the big LOST finale. For me and the wife, it was a major letdown. She insisted she needed some closure after six years' worth of episodes. So, well, here is the alternate end to LOST, the way it should have been. Read more after the break.

April 22, 2010

In defence of whole words

Whole words are like whole grains. The difference may be just bran - but bran contains fiber. And we all know what happens when there isn't enough fiber in our diet - posterior occlusion.

The twitter lifestyle has taken its toll on thoughts as well as words. When it is acceptable to yell ideas to the whole world in 140 characters or less, there is little need to keep words whole. What started off as a measure of relief for cramped fingers on ancient keyboards, has now morphed into a dissonant assault on language sovereignty.

Da tru stmts v make wid da cripld words, do lil 2 convy da msg, let alone beauty. Instead dey tnd 2 frstrait n slo u dn. ur readin speed goes down, n da pleasure of readin disappears. When you tend to favor the shortened words 2 regular words, it directly impacts ur credibility. It is easy to judge a book by the cover. And the cover you present the most when you write is the words you choose. Choose the word, and nothing but the whole word.

I make no claims to even remotely being a purist. I have been known to occasionally dabble in the forbidden myself. But let's make a pledge, you and I:

If you have different keys for B, C & A,
we keep the L-O-Ls at bay.

SMS speak and l33t speak are a precious commodity; use them sparingly, like expensive china.

Chirp in the comments if you agree.

May 08, 2009

Talking to myself

When I re-started the blog a few days ago, one thing that played on my mind was the audience, or lack of it. Over the last few years, I had been incessantly indoctrinated about the need to tailor messages to an audience. Writing without an audience, as with this blog, seemed a throwback to the old days and the old ways.


It took a bit of an effort to shake that feeling. What helped was the time I spent re-designing the website, hosting the blog on my domain, and learning a bit of XHTML, CSS and JavaScript to jazz up the place. Nothing is better than a bit of creativity and mental challenge to drive clarity.


That is when I realized that the joy in what I did on this blog was for me and me alone. Writing is something I liked to do, and this provided a means for me to instinctively fulfill that urge. The website provided a means to bring together, in one place, stuff I did over time. There was a definite satisfaction in seeing a project come to fruition, and that beats having to pander (!?) to an audience.


As Krishna says in the Bhagavad Gita:


To action alone hast thou a right and never at all to its fruits; let not the fruits of action be thy motive; neither let there be in thee any attachment to inaction.


It is the joy of writing and working on my website that is my biggest gain. And thus I go, talking to myself.

September 11, 2005

The Brand Name

For those of us who have been brought up on a diet of management jargon or clothing catalogues, the word 'brand' invokes images of name labels, glitzy ramps and press conferences. We believe that branding is a third-party activity, which we then use to show off to our friends.


Many of us miss the universal meaning of the word brand. A brand is the sum total of images and feelings that a word or image invokes in us. The reason a brand becomes powerful is that it attaches a lot more information to this one word or image than the image could possibly hold on its own. And many a time, such association is not necessarily logical - and this lack of reason is the source of the asymmetrical power of the brand.


Take, for example, the branding that each and every one of us does every day as we profile the people around us. "He is a Libran - never decides". "She stays out late, god only knows what for". "He is fun and intelligent, must be gay". Profiling is one of the fundamental effects of branding that we constantly overlook in our daily lives. Such profiling was probably built over time, with constant reinforcement, giving these words, images and ideas such a hold over our thought process that we not only let them rule our actions, but we frequently fail to see through to the profiles we carry.


Take another example - probably the biggest brand on earth - God. No one has seen him. No one has interviewed him. But everyone knows everything that is to be known about him. No, don't get me wrong. I am not against theology - just that this entire study seems to be a massive exercise in brand building.


This post is not about theology, or a delve into the ideas that make up god. Rather it is about some other brands that we carry along with us. The people that we meet.


I have been carrying around a brand that recently came to the fore again. This man was a professor of mine at one point in time. During that period, my interactions with him brought out ideas that I live by to this day. And a by-product of this was that I have a brand image of this person, his ideas and his thought process. So when I found out that he had published a book, there wasn't even the slightest hesitation in my mind as I clicked through the steps online to get it delivered home. I knew it would be a while before I would return home to read it, but there was no hesitation in ordering the book on the spot. The power of the brand image.


The book reached home and my mom read it. And she loved it like she loved no other book till now. We spent an inordinate amount of time talking about it and reflecting on the book. I realised that I was indeed talking about the book and the ideas therein without ever having seen the book. The power of the brand - the ideas that the brand carries along.


Makes you pause and reflect - when you form such powerful impressions of others, you wonder what your brand says about you. Of course, everyone wants to be known for their plusses - but the brand is a whole lot different. It is not just your features, it is the entire feeling that your recollection evokes in the person. What is it? A warm fuzzy feeling? A cold dread? Silent admiration? Masked disgust? What does your brand speak about you?


warm regards, and hello if you came here via the IIMC website


-- nrk
You might also want to take a dekko here.

February 16, 2005

Freedom to Share

The thing about the thought process, for me, is that it does not go forward unless there is an IO port attached to it. In other words, if I ever have to think a thread through, I need to either be talking or typing or wriggling toes in the language of the three-toed sloth.

So, as it happened, I knew deep down that I felt quite strongly about music, piracy, sharing and the like. However, I never got around to putting the ideas down, and never realised how much they affected me.

So, we were walking home after a particularly indifferent movie, and this topic came up. So the thought process ran. When you are faced with having to justify yourself to a particularly sharp and acid person, your brain wakes up cells that were never used to being awake, cleans up cobwebs and cranks up the storage devices. And the results are particularly consistent. Till you plan on putting things on paper, when you start to choke under self-doubt.

But, this is my blog, so what the hell.

Let's start with a question. When HBO tells you to "Say no to piracy" and be HIP and whatnot - what exactly are they telling you? Are they telling you to stop buying stuff you think is pirated off the vendor on the corner, or are they telling you to stop downloading stuff from the internet, or are they asking you not to watch a movie with your friend? HBO in its clips seems to say the first thing, the MPAA in their lawsuits seem to say the second, and companies like Microsoft with their DRM ideas seem to say the third.

On the whole, they are pretty confused about it, but ask any one of them, and they will tell you to respect "Intellectual Rights". Granted, I want to respect intellectual rights - but will that mean they will stop treating me as a thief?

That is the crux of my problem with the entire anti-piracy thing. I have already been labelled a thief. Read the small print written by some lawyer on the CD you bought. You did not buy music, you bought some rights to do something with the music - like listen to it. And whenever you do anything with the CD you bought that has not been explicitly okayed by the small print, you are a thief. You go to a friend's party and play it in his player loudly - you are a thief. You rip the CD to play in your HD player, flash player, car-mp3 player and a backup copy - you are a thief. And for a CD you supposedly bought, you don't own anything about the CD, not the songs, not the artwork, only the plastic. And unless proven otherwise you are a thief.

I would still consent to being labelled a thief if music and movies were priced any cheaper. Technology has driven down the cost of music and movies. Songs can now be recorded and mixed at home on a laptop. You no longer need expensive recording studios and their costly help in making music. But music is still priced the same. Artist costs and their profits are shrinking, but the end price is still the same. And who is getting an increasing slice of the pie - the RIAA, the MPAA and the rest of them.

One could have argued a decade ago that these associations were very important and worth more than their cut of the music prices. They helped artists get launched, they paid for recording up front, they helped advertise, they promoted and most importantly they distributed. But today, that is no longer crucial. The growth of the Internet has put a critical question mark on the reason for the existence of such associations. Downloading costs next to nothing, and the Internet looks after advertising - and does a much better job than the RIAA ever could. The Internet breaks even faster, helps artists make money faster, reaches out to all kinds of audiences, and does not differentiate between a super-hit or a hit or a moderate hit or a niche hit. It does not differentiate between rock or pop or hip hop or techno. It does not differentiate based on age, religion, region or sex. And the MPAA/RIAA are not happy.

Not only are they not happy, but they don't want to accept this change. Making costs are dropping, selling costs are dropping, shipping costs are dropping, advertising costs are dropping, reviewing costs are dropping, but the CDs are still priced the same. And all users are thieves.

This is why I don't care for their message against piracy. Of course I respect the artist's rights, of course I want to pay him. But I want to pay only the artist. I want to pay a small and reasonable amount. I want to pay for songs I like, not entire albums. I want to share them with friends, and when they like a song, I want them to fully own it by paying a similar small and reasonable amount. And most importantly, I don't like people calling me a thief.

Next post - mechanism for such a world.

etc etc long live robin hood
-- ravi

November 19, 2004

New Blog templates

It is wonderful how a company makes a difference to a product. For all that Dilbert says about companies - and I am sure I almost always agree - in its own way the company is an indispensable part of getting something done. Yeah, it will always be slower than a motivated individual, but it will always be better than the majority of us randomly spending time, who might effectively cancel each other out.



And I slowly start to believe that companies actually have a character of their own that rubs off very explicitly on their employees. Just a few days back I was writing to a group of friends from way back in college. And they commented on how differently I think. They asked me if my education had anything to do with it. No. The company did.



Eerie, unacceptable, grossly inappropriate, but true.



You might be wondering how this ties up to the heading. Well, I was looking through some of the options of Blogger and somehow I felt that Google was behind some of them. It may be the layout, the style, the wording - I really don't know. Additionally, when Blogger came out, it gave an awesome set of tools to make your own template. But a majority of us out here have neither the patience nor the expertise to make good templates for ourselves. We would depend to a great extent on templates given by Blogger. And check out the awesome set of templates that are available now. I *think* Google has something to do with it.



holding on to the me in the company

- ravi

February 10, 2004

Metacomm Pollution

I was sitting having a sandwich and cold coffee, reading BBCi over my T610, when this idea struck me. It started with someone making some sort of wearable computer intended for a human, but one that a robot could also use. Somehow building a robot that used stuff made for humans seemed very funny to me. I thought about it. The reason I found it funny was that it seemed very dumb to make bots use interfaces developed for us.



Think about it: the reason we have these vague interfaces for humans is that we as a species are "interface constrained". All our interfaces for communicating with the outside world are low on bandwidth and inefficient to boot. Why would you want to build a bot that uses a mouse? You would rather connect to the mouse port and allow the bot to talk to it directly. Better still, you would rather there was no port at all, which would take care of the low bandwidth and the restricted data that can be transferred through a mouse interface.



Now think about it. We as humans are restricted by the rate at which we communicate. Remember how the peripherals of a computer were the reason the processor was never fully loaded? Are we not analogous? It is our rate of data interaction with our surroundings that prevents full-fledged thinking. Take a look at what I am doing - I got this idea in under 10 seconds, and for the last 5 minutes, I have not even come close to explaining the core concept.



Humans are like computers with extremely slow data exchange interfaces and without multitasking.



Life in the modern world is driven by nothing other than this aspect of the human. Humans had basic wants like food, clothing and shelter. These were met. Then came some higher needs, which were also met. However, as his needs were met, man started facing the effects of his twin drawbacks.



He started finding it difficult to concentrate, to be productive. He found it difficult to think and talk at the same time. He found it difficult to communicate with others, put ideas across, get ideas understood and accepted. This has meant that man has now stopped communicating ideas and started communicating about the need to communicate better. Sort of like meta-communicating.



Meta communication is slowly but surely clogging up the modern world. The effects of meta communication pollution are visible all over the place. Management is now the most meta-communication-polluted of all human practices. All the buzzwords, all the terminology, all the talk carries little substance and is all hot air. However, it is understood. And reused. Thus meta communication takes precedence over communication when it comes to management. The how's, the font sizes, the styles, the dress of the presenter. All these take precedence over the content, which is relegated to the backstage. Typical of the meta-communication-polluted day in the life of a manager.



Let's get a word in. Metacomm is short for meta communication, or communication about communication. It refers to all communication that is not directly related to the communication that has to be done. Metacomm also refers to all activities that are an overhead to the action being performed.



The next most metacomm-polluted environment is the new-fangled "process oriented" software development company. Software development is an art and a science. When it is treated like manufacturing, a number of steps are needed to ensure that this metaphor works - metacomm. The CMM levels, the ISO certification - all refer to process orientation and, more importantly, spew out metacomm.



I could go on, but I will not. Metacomm is the next big evil that humans have to face. In fact it is the biggest evil in the recorded history of mankind. Information overload with metacomm pollution. Degrading lifestyles, deteriorating minds, stressful environments.



Metacomm pollution is the mother of all future shocks that Toffler could ever imagine. And believe it or not, it is here.



And I have a feeling mother nature knows about it too!



More on mother nature, up next.



- ravi

January 13, 2004

Intelligence

Is a question that has troubled me. Why is that a question? The question is, what is it?



People frequently confuse knowledge with intelligence. People confuse wit with intelligence. Or don't they? Where does knowledge end, and intelligence start?



Consider this: when we sit here, discussing stuff, what part of the discussion is intelligence? If I tell you something you don't know, is that intelligence? Or is that knowledge? When I give you a fresh perspective, is it because I can process stuff faster or because I have had this experience before? And this knowledge itself. How much of it is conscious and how much of it sub-conscious?



If knowledge can be sub-conscious, then where is the line between intelligence and knowledge?



Look at an example. Someone tells you that a deal he is involved in is going well because of something very attractive that the opponent is offering. Suddenly warning bells start ringing. And you start to suspect that something might be wrong.



Why did the warning bells ring? Is it only because you are stupendously more intelligent than he is, that you saw a mole where he did not? Or was it that you have been given a raw deal before? Or is it because you have been brought up in an environment that made it difficult to trust people (knowledge again)? Or was it that you have heard of something like this before? How much of it is original knowledge, and how much is original processing, or intelligence?



The reason I am talking about this is that I had taken a general IQ test somewhere. And I scored a 136, which according to the scale meant I was up there with Einstein. Which of course is not true. Many of the questions in the test I was able to ace because I had worked with those types of questions before. So for me it was a cakewalk. That does not mean I am a genius.



If it is difficult to separate intelligence from knowledge, why are we trying? What is this IQ test all about? Finally, shouldn't we have different semantics to deal with this difference?



keep thinking :),

- ravi

December 18, 2003

Careful what you wish for...

A very popular line of thought in the FOSS camp is that we desperately need a shot of the corporate for true success - a successful business based on FOSS, a successful business partner, a corporate contributor, a successful corporate desktop. A favourite pastime for moderates and arm-chair supporters in the FOSS camp is to smugly wallow in the success of products, services, initiatives and companies that depend on the FOSS output.



I used to do the same too, but I believe it is now time to review this mentality. For now we are faced with the grave danger of having the voice of FOSS hijacked - by the corporate PR team.



FOSS and products like Linux have been coming of age. Common people, unlike the ones that read Slashdot, are beginning to sit up and take notice. The media has now stopped making moving pieces on FOSS and Linux and has started 'reporting' news in the same breath as IBM and Microsoft. FOSS is moving from being a newsworthy oddity to plain vanilla news.



But where do the news crews get their news from? When the common man and the common editor did not understand FOSS and Linux, the reporters who wrote needed to "understand" the phenomena for themselves and then report. But now everyone understands it, and all they need is quotes. And these quotes will now come from someone who has "credibility" with popular press - the corporate entities.



RedHat will now be a bigger "authority" on Linux than Torvalds. IBM will "understand" usability better than Ximian. Sun's views on Linux will be "expert" views. And then Google will start spilling out sponsored links for searches related to FOSS.



And this is the next big hurdle that the FOSS camp will have to overcome - to gain a media presence, a credible link to the world, and not lose its voice to its corporate cousins.



This is bigger than any of the problems faced by Open Source and Free Software today - bigger than Microsoft or SCO. If a company makes an irresponsible remark, then the company will labour to correct it, because it will directly be affected by it. But with FOSS, as usual, you have new problems, unheard of before. Companies will now be free to make comments today, but there will be no one to retract those comments. Bruce Perens may counter SCO today; but for how long, and how many can be countered this way?



How can we go about giving FOSS a voice of its own?

- ravikiran n.

January 08, 2003

The weakening of the written word

With the explosion of the Information Age, there has been a great hoopla built about the easy accessibility of information. The great Information Divide it seems has been conquered. And Information is available to everyone and at everyplace. But what has probably been lost is the fact that this easy access to information has actually lessened the impact of information.

No, I am not talking about the Information overload that is causing people to spend lots of time just trying to find the information that is relevant to them. I am also not talking about the increase of information availability leading to people broadening themselves, speaking with reference to knowledge, and not gaining a sufficiently deep understanding at the same time.

What I am talking about is the relative weakening of the written word vis-a-vis the spoken word. It seems as though we have come full circle, from the days before writing ever existed, when the only word was the spoken word. Now the Information age is restacking the odds for the spoken word - the word of the expert.

The reason is this. The explosion of the new era has driven down the costs of information disbursal - and the costs of information generation. Anyone can sit in front of a computer and generate information - something like what I am doing now. This has meant that there is no longer the automatic disincentive which once ensured that only those who actually had knowledge embarked upon disbursal. When a book was released, there was a certain assurance of quality associated with it. Though this has been declining with the decrease in the cost of publishing in recent times, the information explosion has been one of the last nails in its coffin, so to speak.

The typical manifestation of this is seen in all sorts of situations. One is the proliferation of impersonation sites. These include sources of information that are not bona fide, either by design or by accident. Those by design include the hoax sites, hoax email chains and so on. Those by accident include all the personal information sites, which include but are not restricted to blog sites, information discussion fora, fun focus sites, ask-a-question sites etc. And the information is anything from health, to technology, to personal blacklisted email domains. These information sources have such a low signal-to-noise ratio that it is becoming increasingly difficult to separate signal from noise.

This has led to a rapid disillusioning of the information seeker. "I know this is true, I read it on the Internet" does not hold much water anymore. Once bitten, twice shy, users are rapidly switching to not trusting the Internet for their information needs. Some who can actually separate the signal from the noise are profiting, while for a lot of others it is either mistrust or increasing exposure to quoting the wrong information.

Technology has responded too, typically. A number of methods have come out which try to understand and review information. The volume of the Internet is so huge that it was deemed impossible to manually classify information. Hence a number of automatic, technical methods of information classification came up. Some, it seems, succeeded - like the omnipresent Google. However, as tends to happen with technology, somewhere the users got a whiff of it and got into the act of meta-information manipulation. These moves are slowly making data-quality prediction using technology difficult. The circle now is complete - it is back to man and manual methods to classify information. Back to the expert. Maybe technically it is the published word, but it is as good as the spoken word - the word of the expert.

A number of models which do this are currently in vogue. About.com's initiative is one such effort. It aims to manually pick the best sites of information for all the information needs of Internet users. The other model is that of peer and continuous evaluation. Sites like Experts-Exchange and Slashdot are typical of this method, where the experts are the users and where these experts cull out the best of the information available. Hence the relative weakening of the written word vis-a-vis the spoken word.

haffun

~!nrk

November 29, 2002

Source Perfect

I was reading this article and its prequel that was posted on /. As the title of the story suggests, the author makes the point that all software source should be open. That is, programs should not be sold without bundling the source that was used to produce them. The point he is trying to make is that, just as buildings and bridges do not hide what they are made of, so should software not hide what it was built from. He does advocate crippled source, to make sure that people don't recompile and all that, the idea still being that only by making source open can one actually make sure that people don't write sloppy code.

I like that idea - the point that source should be open to make sure that programmers do their work properly, and don't hide behind the compiler to excuse bad code. But the idea as he has presented it is not, in my view, viable. The reasons are simple.

  1. The analogy between buildings and software is not correct and does not hold. Firstly, seeing a building or a bridge is not the same as seeing source; the better analogy is between a blueprint and a building. Secondly, what one can do with source one cannot do with a blueprint - reuse a part of it, copy it ad infinitum, and so on.
  2. Even if we do make it open, who is going to check it? How qualified will he be to look at source that does not even compile and tell you whether it is good code or bad? When was the last time you read source code and judged it?

But coming back to what I was saying. I do find myself agreeing with the idea that source should be made available. Only then can we get some sort of responsibility into the building of source, the lack of which is a major flaw in the entire process of software building - one which I believe is fundamentally creating problems in the software (read IT) world. So this is what I suggest: some sort of third-party certification. Just as HTML is checked for adherence to standards, code should also be checked for adherence to standards, and companies should be able to proclaim that their software is "Source Perfect". I don't really know if such an idea is already lying around, but it sure is worth trying.

Of course, this has its drawbacks. The standards to check adherence against - that is the need of the hour. We need to define what good software is. Everyone knows the properties of good software; we should be able to standardise them and make them platform- and implementation-independent. Then we can take the first step towards a world of software that is "Source Perfect".
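Mechanically, a "Source Perfect" check could work much like an HTML validator: run an agreed set of rules over the source and report every violation. Here is a minimal sketch; the rule set and all names are invented for illustration, a real certification body would publish the standard.

```python
# Minimal "Source Perfect" style checker, in the spirit of an HTML
# validator. The rules below are placeholders for an agreed standard.
RULES = [
    ("long-line", lambda line: len(line) > 79),
    ("tab-indent", lambda line: line.startswith("\t")),
    ("trailing-space", lambda line: line != line.rstrip()),
]

def check_source(source):
    """Return a list of (line_number, rule_name) violations."""
    violations = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for name, broken in RULES:
            if broken(line):
                violations.append((lineno, name))
    return violations

code = "def f():\n\treturn 1   \n"
print(check_source(code))  # the second line breaks two rules
```

A certifier would stamp code "Source Perfect" only when `check_source` (run with the standard rule set) returns an empty list.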

We had a recent meeting, for some work, where we met alumni who were 25 years our seniors. I was making a presentation to them and said, "And that is the reason I think I can safely say that we might have one of the best websites in the world". To which the answer was: "That is precisely the problem with you new generation. Have faith. Say 'Ours _is_ the best website in the world'"

Amen to that

~!nrk

November 22, 2002

What was that again?

hi

just a thought

if

a. ppl can control it

b. it gives u an easy temporary escape route for sometime as compared to the tougher alternative of goin thru a process and maintaining ur cool...and by getting away from painful situations, a person can concentrate on the main things on hand..that else get bogged down due to worryign abt them..

c. taken infrequently, shldnt do grt physical damage in short run, and in the long run all of us are anywys dead..

if one can use it to buy some time to deal with life so that other imp things can be done more efficiently, then

why shouldnt a person drink/smoke?...

I have never done this before - I mean, posting other people's private stuff on my blog. Ah well, everything has a first, right? And this is fine, I guess, simply because there is little personal about it.

I don't smoke or drink. In the words of people who know me, I have the best dope profile: I have long hair, work on computers, read and believe in 0wnz0red, listen to hard rock and death metal, and front a scowl for a neutral face. That meant no one ever fully believed me when I told them I don't really do the dope. That explains, I hope, the italicized post above to the third-party reader.

The answer to the question is just this - no, he should not stop drinking or smoking.

The world has people, and people are humans. Humans have consciousness, and since time immemorial this consciousness has been in trouble. There is pressure, there is pain. These negative feelings have been the bane of conscious thought ever since the first guy figured out that rolling is different from dragging. And humans have taken a zillion routes to dulling this all-pervasive, all-powerful consciousness. They drank, smoked, doped, took drugs, injections, morphine; invented GOD, started religion, invented prayers; formed associations, formed institutions, established schools and colleges. All designed with one objective in mind: to dim the consciousness and dull the intellect. No, don't take me at face value. You think prayers are different from drinking? Think again. What do you do when you are unhappy, or otherwise feeling down? Have you ever heard about the healing effects of auto-suggestion? Or the narcotic effects of the same? Ever seen Fight Club? Do you know what Oxygen is? My dear reader, the whole life structure you have been exposed to is one designed to not let you run at full capacity. Music - ah well, this is one of the most powerful narcotics available to all of humanity. What are the only things that don't depend on language, customs, place of birth, colour of skin or sex? Music, sex, narcotics, religion. See the similarity?

Why? I have no idea. But this just makes me feel that we were not meant to be born on earth at all... But that is another story.

Back to the point of discussion. There is nothing fundamentally evil about smoking and drinking. Okay, there is the angle of health, right? I mean, you will die sooner if you smoke and all that. Well, here is the deal: you think the other solutions are any less deadly? Religion kills inventiveness, kills the spirit, kills innovation, kills motivation - and lets you live longer for what? Music eats into time and halves your effective ability to focus. Society - one of the biggest ills of recent times - does everything that religion does and worse, and makes you feel good about it. So how are drinking and smoking different? They probably shorten your life, but what if they allow you to make your life more productive?

Okay, there are exceptions to all rules, including this one, and the exceptions can be found both ways. So let's forget that for a moment and focus on the thought at hand.

There is no good or bad about smoking and drinking. There ought to be no reason why you should not do them and still stay in a society and practice religion. But the decision not to smoke or drink should stem from what you feel about losing control over yourself. Do you really want to lose it? Then go ahead; you will not be worse off for it. And that is the closest you will get to truth, coz there is none of that out there.

~!nrk

ps: no thought is arbit, it is all in the mind after all.

October 26, 2002

Music

I love Metal. Metal music. I really don't want to bother about definitions or try to tell you what *I* think Metal music is or ought to be. That is not how I work. Rather, I will just try to make a little distinction for those of you who are not necessarily into this kind of music.

The entire scene of Rock and Metal is not really clean. Once you start delving into it, it gets murkier and horribly convoluted, the convolution starting with the nomenclature. There are a variety of rocks, metals and other stuff all over the place, each band promoting yet another variety of its own to stand out. I have neither the patience nor the capacity to go into all of these. Primarily, look at three kinds of music - rock, metal and alternative - and that is more than enough for you to enjoy music in all its glory.

Rock basically stems from the early rock-and-roll bands - the Beatles with their amazing (now dated) attitude towards life. It progresses with time, becoming more modern and richer, with groups like the Rolling Stones, Led Zeppelin, Van Halen and AC/DC, and peters out at groups like Guns N' Roses. Here you see attitude, a common aspiration set, and musically an increasing dependence on guitar strings and distortion, with vocals decreasing in volume and increasing in pitch. The use of technology also increases across this set. The themes are predominantly emotional, with love hanging in big time. "She" is almost always there; so are misunderstandings and broken hearts. Of course you also find traces of other stuff - fear, animosity and the like - but that is more attitude than anything else. Rock has been one of the biggest influences on the lifestyle of a large number of people over a particularly long period of time.

Metal is the big brother of Rock. Whatever Rock can do, Metal can do, only harder. Metal is the noisier, faster and richer cousin of rock, so much so that it almost becomes a disincentive to a majority of Rock fans: "GN'R is fine, but I just cannot listen to anything harder". Metal, in its numerous hard and noxious forms - hard, thrash, neo, industrial, etc. - focusses on topics different from the traditional ones. Aliens, Satan, God, the Sandman, fear, anger, jealousy and other baser and more powerful influences are found in plenty. Guitars are particularly caustic, vocals throaty, lyrics explicit - and sex is not the only reason. Metallica, Megadeth, Pantera, Slayer and other incredibly powerful bands form the line-up here, reaching down to bands like Clown and Gwar that make words like 'music' a distant stretch of the imagination. Metal is more about making a statement - not to those around you, but to yourself. While rock shaped how people lived, ate, drove and danced, metal shaped the way people thought and felt. "She" and love are mostly missing.

Alternative is all else - in between, in the flanks, all around. Floyd, Linkin Park, Amorphis and so on and so forth. Probably all other forms like Gothic, Grunge, Rap and the lot can be dumped into this cauldron. Alternative is what you do when your emotions are too raw and need either a time-out or healing. "She" might hang around, either to be dumped or slaughtered. Alternative is everything from bad-ass to everything sensible about music.

This is good. But the problem is not to tell a Rock fan what I feel about Metal, nor to tell an Alternative fan about the stirrings of a Gothic symphony. Rather, it is to address the mire that thrives in what we call the pop and the classical. Pop is all that rock is not: infested with sugary femmes fatales and boyish kids with neither the originality nor the substance to move from the disco floor to the home, from the lips to the heart. It is not that I am against pop. It is just that I hate it.

Classical is everything pop and rock are not. Classical is what I call all other forms of music, especially indigenous music evolved over the centuries. Music where everything is not just black or white. Music that realises what it really is, and the incredible power it holds over the listener. Music that is single-minded in its pursuit of replicating the world in the air waves. Music that told you where to go, not what is or what can be. That is what I refer to as Classical.

Classical music is a really mature form. Evolved over time, it is aware of its potency. It knows what music does to the emotions, and knows how to manipulate and affect them. Only, it is simple. Classical normally deals with music in its elements. It deals with singular, basic emotions. It knows happiness and sadness. It knows peace. But it does not care about higher emotions, nor does it concern itself with the baser forms of emotion.

Rock and Metal fundamentally differ from it in this respect. These are considerably richer forms of music - richer in the amount of emotional content they can carry. They can tell a story and not just evoke an emotion. They can evoke fear, anger, angst, helplessness, anxiety. They are emotions from one mature intelligence to another. They deal with the adult and not the child.

People are put off heavier forms of music because they are noisy, because they will affect the ears and cause deafness. Hey, don't do that! Give Metal a chance. If it is loudness that concerns you, just turn the volume down - but listen to harder stuff. If you thought noise was not for you, you are sorely mistaken. Nothing can replicate you as powerfully and as completely as metal or rock. While classical can make you yearn for what you are not, metal can show you what you are. Metal is noise, but noise is not bad, or silly, or a cause for deafness.

People tell me that Metal really hits them when they are drunk, when their defences are down. If you want more proof than that that metal talks to your heart and your soul, you probably are better off listening to Britney Spears singing about how beautiful she is.

Don't discount something just because it is noisier.

Give metal a chance.

Ever wondered what would happen if you were all alone in a city? Everything else existing, but you the only living person in the entire city. How would that be?

I am thinking I will write a little fiction in the days to come. Am kinda fed up with all this serious stuff. The first part of the story is coming right up.

~!nrk

October 25, 2002

Objectivism

"Let's think objectively" is a common ring you hear, especially when things are not going the way the speaker wants them to. "Objectively speaking..." said that very important person when asked about his views on the oil embargo. "It is time people rose above their narrow considerations and thought objectively," said a great saint who was dressed like what seemed like a saint.

So I decided I would do it. Objectivity in everything I do. Be it oil embargoes or just basic thinking. But I failed.

How would you go about thinking objectively? Someone said, "Do it with math. There is nothing more objective than math. There is nothing more objective than cold numbers which know not what the user wants." That sounded good. But I really wanted to objectively evaluate which of two girls in my class was more beautiful. Objectively, I assigned scores to various aspects and summed them up. I got an answer that demanded a recount. So I objectively did the exercise all over again. A second recount became a necessity, for the results just got flipped. This continued for some time, until I realized that the answer actually depended on the time of day. As time passed, relative beauty changed. Objectively, I concluded that beauty depends on the time of day.

Then someone asked me if the actress in the new Bond movie was beautiful, and objectively I asked him the time of day. And he hit me on the head with a rolled copy of Filmfare magazine - quite objectively.

When I was a kid, some people at the place I studied showed a remarkable lack of imagination. "If I become Prime Minister for a day" was a favourite topic for essay writing - and the course was not FNT101 Introduction to Fantasy. Last I knew, the people at that place were still that unimaginative, and they still have that for a topic.

And I could never do justice to that topic, simply because I could never do justice if it actually happened. Governmental role has been criticized so many times and so severely that this would not even matter. The most common grouse has been that it is always partisan and never neutral - or, in other words, never objective.

But every theory dealing with either human behavior or human action is based on the fundamental assumption that humans are selfish bigots who don't care two hoots about what happens to the rest of society. Surprisingly enough, that is true.

The point is: WAKE UP PEOPLE, THERE IS NO SUCH THING AS OBJECTIVITY. I am sick of people using objectivity as an excuse to do anything they feel like. I am sick of people complaining about the lack of objectivity in others. I am sick of not understanding why other people cannot be as objective as me. This is my wakeup call. There is no such thing as objectivity. WAKE UP.

I love compl(I)ments, with an I and not complements which has an E instead of an I.

so long and thanks for the compliments

~!nrk

October 04, 2002

CATB Revisited

This is in continuation of an earlier post, where I talked about The Cathedral and the Bazaar (CATB), which I believe is one of the most powerful essays of its kind. In this post I want to talk about an experience which makes the Bazaar a more meaningful idea.

Okay, the basics first. CATB talks about two forms of software development - the cathedral, or proprietary, model (the Microsofts of this world) and the bazaar, or open source and related, models (the people behind GNU, Linux etc.). In comparing the two models, several ideas have been thrown in - economic, psychological and so on. Here is another insight.

I was talking to a friend yesterday who had worked in an infotech company called Infosys. During the discussion we almost immediately agreed that she was not a person who hated technology - until she joined Infosys. Something happened at Infosys that forever changed her opinion about technology. This somehow brought on deja vu for me. I remembered having the same discussion with more than one person earlier. Suddenly it struck me that the reason why ex-Infoscions were shying away from technology could actually lie with their stay at Infosys, and might not be a coincidence after all. I decided to find out more.

It seems that Infosys, during the days of the boom, used its USP of a cheap Indian workforce to the hilt. Infosys periodically hired some of the best talent in the country by throwing enormous amounts of money at them. In fact, Infosys became such a phenomenon that any engineer, irrespective of discipline, wanted to end up with a job at Infosys. So much so that there actually were fears of a resource crunch in the other engineering disciplines in the country if Infosys continued on its path. Fortunately, the bubble burst - but that is another story.

Now all these engineers - intelligent people, mind you - who were 'bought' by Infosys were taken to its imposing zoos, housed in A/C cubicles and given crap as work. Infosys followed standard "cathedral" models of software development. Small groups were given specific tasks. Each individual was further given smaller, meaningless, cog-like projects. The work primarily consisted either of repetitive 'copy-paste' of existing code, or of testing and recommending changes to other code authors. Work was always in small modules.

There was no real development - no one developed a sensible module; everyone attacked a small, very very focussed I/O situation. There was hence no learning, either of the programming language, or of the logic of the problem, or of hints towards the solution. Further, thanks to sizeable code archives, development was little more than copy-paste. It generally ended up as dreary, repetitive work - but someone had to do it, and someone human. Not only was there no learning, but the reward for doing something well was repetition of the same job, over and over again. In the name of specialisation, absolutely no job rotation was possible, at least not enough to retain interest. And as for having ownership of the code you wrote - hah, forget it.

Another class of operations was testing and bug-fixing. The testing section was another nightmare. Testing is not generally relished even by the builders of code themselves; imagine having to test code that has been _assembled_ by another person. Sheer boredom - no quality work was possible. Still more stagnation in the entire process.

Bug-fixing was worse. Yeah, it was. Bug-fixing naturally involved more than one person - the tester and the actual developer - and relations between the twain always managed to deteriorate. This meant that bug-fixing was never done with a view to improving code or performance, but just to impress peers or escape their observation. All the wrong reasons, for the most critical of operations.

And this is a typical model of software development. So what did it result in?

First of all, it made the people involved _hate_ code. And not just code - the hatred extended to just about any technical issue. Considering that most of these people were engineers from established institutes, such hatred was no mean feat.

Secondly, the sheer monotony of the jobs resulted in very high turnover in the organization. The high pay helped, but not for long, and not in the current environment. And needless to say, this did not help the quality of the code in any positive way.

Third, the code suffered, and the costs skyrocketed. Some of the prices that Infosys quotes almost make me jump out of my skin. And this is not restricted to Infosys alone. Ever looked at the prices of software? If you are a developer, you will know how high they *really* are. Have you ever looked at the quality and functionality of most software and wondered why it costs so much? Well, here is the reason: incredibly costly man-hours. And the model is to blame.

The best part, of course, is that I have not even touched the great sink called maintenance and services. That is another story for another day. But looking at the basic development model alone, doesn't it look incredibly inefficient? Compare this with a plausible model using open source components and a bazaar style of development. Can the Cathedral ever match the prices of the Bazaar?

No, never. And once the customers realise that, the Cathedral will find survival difficult.

Signing off as in the last post - long live the bazaar.

~!nrk

October 02, 2002

Consciousness

Okay, here I am, back. This is the first time I am doing a second post in the same day, but I just cannot help it, for I have discovered an incredible piece on the net. The page talks about some excerpts from the book "The User Illusion" by Tor Norretranders (translated by Jonathan Sydenham; Viking Penguin, 1998; ISBN 0-670-87579-1; 467 pages). And I think it is amazing. You ought to read that book. As the author of the page said, if it does not excite you, check your vital signs. In the excerpt, the author of the page talks about the term "user illusion", which is again very similar to the concept of the "sphere of perception" that we had talked about earlier. Amazing, really. Googling a lot now. Will add more to this post as and when.

Update: Check out the first link that Google threw up. Some more reading of the book without buying it. I _am_ going to buy it, but this is it till then. Check this out...

Shorthand: conscious self = "I"; unconscious self = "me"

...

(Ref: The Inner Game of Tennis. "When you short-circuit the mind by giving it an ‘overload’ of things to deal with, it has so many things to attend to that it no longer has time to worry. The "I" checks out and lets the "me" check in.)

...

Spirituality merely involves taking your own life seriously by getting to know yourself and your potential. This is no trivial matter, for there are quite a few unpleasant surprises in most of us. The dominant psychological problem of modern culture is that its members do not want to accept that there is a Me beyond the I. The Me is everything the I cannot accept: It is unpredictable, disorderly, willful, quick, and powerful.

From the Amazon.com editorial review of the book "The Inner Game of Tennis" (paperback, 122 pages; revised edition, May 1997; Random House (Paper); ISBN 0679778314)...

A phenomenon when first published in 1972, the Inner Game was a real revelation. Instead of serving up technique, it concentrated on the fact that, as Gallwey wrote, "Every game is composed of two parts, an outer game and an inner game." The former is played against opponents, and is filled with lots of contradictory advice; the latter is played not against, but within the mind of the player, and its principal obstacles are self-doubt and anxiety. Gallwey's revolutionary thinking, built on a foundation of Zen thinking and humanistic psychology, was really a primer on how to get out of your own way to let your best game emerge. It was sports psychology before the two words were pressed against each other and codified into an accepted discipline.

Primed to Discover

Did you ever notice that after you have just learnt a new word, it seems to pop up all over the place? And you wonder how you made sense of stuff in the era before you learnt the word. I have been watching myself watch new words pop out all the time, and have been wondering. It seems this is not just a problem with me; in fact, it is a documented phenomenon. When you learn a new word, though it drops out of your consciousness, your sub-consciousness is 'primed' to tag it. This is a way in which the sub-consciousness tries to cement your learning through repetition: by tagging words to pop out, it ensures that you consciously learn them.

Remind me to thank my sub-consciousness the next time I meet it.

Hey, how would it be to meet my own sub-consciousness? Remember the sphere of perception we had defined earlier - the set of all inputs that a person actually learns from. Well, I guess we ought to add the term 'consciously' to that now. And this also calls for us to rectify the entire setup of the learning process that we talked about.

In an earlier post, I had talked about the two differentiators that make the difference in the way people learn and develop. Taking the model of neural networks, there are two primary differences: the amount of input accepted and the learning rate on these inputs. The amount of input accepted was tagged with the term "sphere of perception". Now we ought to make slight changes to it. The sphere of perception also includes that part of "sub-conscious learning" that is relayed back to conscious levels. Although our definition itself does not change, what changes is the kind of inputs available under the definition. It seems that the sub-consciousness is a really aware entity, and it probably gets a lot more from its sphere of perception than the average human does.

Hmm, and where do the twain meet? Dreams? Okay, more broadly, sleep. Hmm... sounds interesting. Imagine a person who did not sleep at all. Well, that person would not be in touch with his sub-consciousness. And what if one day he finally falls asleep? That is the day he can actually meet his fellow (sub)consciousness. Wouldn't that be great? I know there are a lot of wrinkles, but those can be ironed out. And what cannot be, we will submit to a "temporary suspension of disbelief".

Hmm, so what will that meeting be like? Let's say meeting your sub-consciousness allows you to form opinions. Maybe that is the final stretch that man has not tapped. I mean, think about it. The meeting of the consciousness with the sub-consciousness could be the fertile ground for "imagination" - the ability to "generate" new ideas out of nowhere. Maybe this ability is not just a matter of conscious effort. Maybe it is nothing more than a meeting between the consciousness and the sub-consciousness, followed by an exchange of ideas. It actually makes sense, in an obtuse sort of way.

Let's see if I can put this in the form of a fictional piece. But for that I will have to sleep and see what my sub-consciousness has to say.

~!nrk

September 29, 2002

Data, Information and Knowledge

This, in short, is the brief history of the universe. I am not writing this post out of being overly fascinated with the book whose title sounds obscenely like the quip I just quipped; in fact, I think I read that book a long, long time ago. I am writing this because I have an alternate view of the universe - a view that breaks everything down to a point on a line whose ends are data and knowledge, with information lying in the middle. I know I am getting a little too abstract here, but I hope things will become clearer as we go along.

When I used to study science, something struck me as very odd. Physics, especially of the variety normally taught in high schools, breaks the universe down into two sections: the physical universe and the laws that govern it. What struck me as odd is that god actually defined such a cute little dichotomy in his world. Just as we have data and instructions, male and female, good and bad, we have matter and laws. Okay - matter, energy and all that dark crap too, but basically the tangibles and the intangibles. Okay, this is not quite true either, but... wow, this is tough, getting the definition right. Basically, the problem with god and his universe boils down to this: how did he come up with something that exists, and then throw away a hell of a lot for us to discover? Why all this segregation? Why this duality? Why was matter there for all of us to see, and the rest - the relationships, the laws etcetera - left for us to discover?

But then think about it. What was matter? Ask someone in the dark ages (dark ages NOT defined as the time before the computer) and their *ologists will tell you that it is nothing but a combination of air, water, earth, fire and something else. Somewhere down the timeline, people would tell you that matter was made of unbreakable balls called atomz. Then people went berserk. Matter was made of all sorts of strange, mystical and mythical substances which, incidentally, no one can see, but which ought to be there for matter to make sense.

So what was different about matter in the dark ages and now? Nothing. It is the same old matter, burnt and forged into different shapes, but still the same old matter. What changed is information - the information known to man - and this knowledge has changed the way people look at matter. If this information had not been available, a lot of people in Nagasaki would have been nth-generation residents, instead of what they are now. Matter has changed because of what matter is to different people. To the ordinary man, matter is nothing more than earth and air. Hence what is important about matter is not matter itself, but information about matter. What we see as matter is nothing but the extract of the information conveyed to us by our various input devices.

We will now look at a totally different way of seeing the universe - a way in which there is no difference between the various units of matter, energy, ideas, minds and every other thing in the universe. This unified way of looking at the universe will help us define the entire universe on a one-dimensional scale rating information content. This will then give us a powerful way of dealing with many problems through a vastly simplified, unified methodology.

But before this we need to set up some basic framework. We postulate the existence of three different types of entities in this universe. The first is the Data Source, or DS. The Data Source is characterised by the fact that it contains data; it owes its existence to the data it contains. There is no restriction on the data it contains. Of course, we haven't defined what data itself is - but we deliberately leave it undefined, since it will be defined globally by the circumstance under view, and moreover it cannot be defined in isolation from the other units under consideration. The second entity we postulate is the Data Acquirer, or DA. The Data Acquirer can query the DS for data through the third entity, known as a Data Transfer Medium, or DTM.

Given these basic units, we define some terms. The first is the Data Completeness (DC) of an entity. DC is defined as the relative content of data of a particular kind in a particular Data Source; hence DC is defined for a DS and a data type. For example, a DS has 100% DC about itself: any DS can answer any question about itself, so its DC is complete. Note that DC is independent of the query for data, the way the query is designed, the DA itself, or the DTM used for delivery. The actual response of the DS to a query is a function of the ability and capacity of the DTM and the DA.

This leads to an interesting and obvious statement: any DS is 100% Data Complete with respect to its own data. In fact, any entity which is 100% DC with respect to the data of any DS is virtually indistinguishable from that DS itself, because the said entity can answer any question about the DS. This means that no DA can distinguish between the impersonating entity and the Data Source itself.

Now, DC by itself does not give a powerful medium for expressing data relationships, since it is fully defined by the data type and the DS. So we define another term, the Relative Data Completeness, or RDC. RDC is defined as the relative completeness of data given the DS, the DA and the DTM. For example, a still photograph has an RDC of close to 100% for the original static setup, given that the DA is perceiving it with sight as the only source of data input. The moment the DTM expands to include, say, touch, the photograph no longer has 100% RDC.

The RDC therefore gives a powerful medium to express the quality of data relationships between the DS, the DTM and the DA. We will dwell more on various examples for these terms in later posts.

Data is always handled in packets called observations. This observation is not the observation defined for an experiment; an observation here is a taggable block of data. Observations differ from one another in their quality, and are generally substantiated by data. The amount of data represented by an observation is its relative richness. Richness of an observation is defined on a scale called the Linear Scale of Information, or LSI. Data is one end of the LSI scale, knowledge is the other extreme, and information lies in between. Data are the small individual pieces of information that border on indivisibility; knowledge is completeness of data. An entity which has 100% Data Completeness (DC) is perfectly knowledgeable and can in fact replace the DS itself, while an entity having a Relative Data Completeness (RDC) of 100% appears, for a particular DTM and DA, to be the DS itself.
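The definitions above can be restated as a small sketch - my own encoding of these terms, with invented names, not part of the framework itself. A DS is modelled as a bag of facts keyed by data type, a DTM as the set of channels it carries, and RDC as the fraction of the original's data, over those channels, that an impersonating entity can answer for.

```python
# Sketch of the DS / DA / DTM vocabulary defined above.
# The dict encoding and the function name are my own invention.
def rdc(impostor, original, channels):
    """Relative Data Completeness of `impostor` standing in for
    `original`, given a DTM that carries only `channels`."""
    wanted = {k: v for k, v in original.items() if k in channels}
    total = sum(len(v) for v in wanted.values())
    matched = sum(len(v & impostor.get(k, set())) for k, v in wanted.items())
    return matched / total if total else 1.0

# The photograph example: a still photo of a static scene.
scene = {"sight": {"shapes", "colours"}, "touch": {"texture"}}
photo = {"sight": {"shapes", "colours"}}

print(rdc(photo, scene, {"sight"}))           # 1.0 - indistinguishable by sight alone
print(rdc(photo, scene, {"sight", "touch"}))  # drops below 1.0 once touch counts
```

Note how an RDC of 1.0 makes the photograph indistinguishable from the scene for a sight-only DTM, matching the impersonation statement above; widening the DTM to touch exposes it.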

We will stop this round of definitions here. Check back soon for more data and information on these terms.

tada for now

~!nrk