This came to me when I saw some students carrying their CPUs (the desktop towers, that is) to the presentation for a course. We have this course on databases, and for it the students have to do a project - a program. The platform could be anything - ASP, VB, VC++, whatever - as long as it used a database to function. And at the end of the term, the students taking the course have to make a presentation.
For the presentation, the students had to carry their CPUs to the professor. Why? Basically, the architecture of the project required the students to connect to a remote database. This was generally a painful process - we are not doing Computer Science here, and more often than not, a working connection is established through a lot of trial and error. By the time a connection is established and a connection string is finalized, a number of changes would have been made to the machine the student is working on, and the student would not be in a position to replicate them on another machine.
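To see why a finished connection string ties the program to one machine, consider a sketch of one. Connection strings of that era (ADO/ODBC style) bundle machine-specific details - the installed driver, the provider, the server name - together with the logical ones. The string and the list of "machine-specific" keys below are illustrative assumptions for this sketch, not details from the actual course projects:

```python
# Keys that, in this sketch, tie a connection string to one machine's setup.
# This list is an assumption for illustration.
MACHINE_SPECIFIC_KEYS = {"provider", "driver", "server"}

def split_connection_string(conn_str):
    """Parse a 'Key=Value;Key=Value' style connection string into a dict."""
    parts = {}
    for piece in conn_str.split(";"):
        if "=" in piece:
            key, value = piece.split("=", 1)
            parts[key.strip().lower()] = value.strip()
    return parts

def machine_specific(parts):
    """Return only the entries that break when the program moves machines."""
    return {k: v for k, v in parts.items() if k in MACHINE_SPECIFIC_KEYS}

# A hypothetical connection string of the kind a student might end up with:
conn = "Provider=MSDASQL;Driver={SQL Server};Server=LAB-PC-07;Database=Library;UID=student"
parts = split_connection_string(conn)
print(machine_specific(parts))
```

The Provider, Driver and Server entries are exactly the parts that depend on what was installed and configured on that one machine through all the trial and error - which is why the executable alone was not enough to carry around.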
Thus, these database projects used to fail regularly if the student just carried the program around as code, as an executable (remember, the DAOs and ADOs required were part of the OS), or in any other format. This almost forced the program to run only on the machine it had been written on. So we saw people carrying CPUs to and from the prof's room for the presentation. The idea was that all that needed to be added was power, and the program would run.
What do we have as part of a computer system? An application and data. Right? Wrong. There is also a context. The context is the executing environment of the application. In modern computing, this context is defined by the OS to an extent, and is realised through DLLs, APIs and so on. The idea being that entities not inherent to a particular application should be outsourced and maintained by another party, or by the OS. But look carefully: there are a lot of cases where third-party tools are installed only for a particular program.
Let's look at some examples from the Windows world. Plugins are one such set - say Photoshop plugins, Internet Explorer plugins, or Acrobat Reader plugins. In most scenarios, the application (Acrobat) and the data (the .pdf file) alone constitute the complete context. But in some other scenarios, the plugin is also a part of the context. Without it, having both the application and the data would be effectively useless.
Let's look at another example: codecs. Say you have an AVI file. An AVI file can encode its video and audio streams in different formats, and the application requires codecs to understand the two streams. Now what good is a great movie (data) in your media player (application) without the codec?
The above examples illustrate the need for an execution context in addition to the application and the data. Now let's look at what a context sandbox is.
A context sandbox is that minimum amount of information which will allow an action to be performed on a secondary machine, when the action is currently being performed as such on a primary machine.
There are some qualifications to be stated here:
- It is assumed that the primary and secondary machines are fundamentally capable of performing the action. In other words, no definable context sandbox can exist that lets your washing machine play your favourite movie.
- Information will be assumed to mean only that relating to software. Software will also be loosely defined as a sum of data and instructions. This means that information such as "Go get a life, buy another mp3 player" is not a context sandbox for a primary machine that is an mp3 player.
- Quality of performance is not an issue we will be dealing with here. Fundamental capability does not promise quality of performance.
- A machine is defined as the sum of all units that allow performance of a particular task. This includes hardware, software and any other environmental factors, including power, temperature etc.
- Performance "as such" implies performance without change to the machine - the machine, of course, being as defined above.
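The definition can be sketched in code. If we crudely model a machine's context as the set of software components it carries (a big simplification of "information", and only valid under the first qualification - the secondary machine must already be fundamentally capable), then the context sandbox for an action is roughly the components the action needs that the secondary machine lacks. All the component names below are illustrative assumptions:

```python
def context_sandbox(required, secondary_machine):
    """The minimum set of components that must travel to the secondary
    machine so the action can be performed there 'as such'.
    Both arguments are sets of component names (a crude model)."""
    return required - secondary_machine

# Illustrative sets (assumed, not from the post): playing a movie needs
# the player, the file, and the codec; the secondary machine has only
# the player installed.
required = {"player.exe", "movie.avi", "divx_codec.dll"}
secondary = {"player.exe", "some_other.dll"}

print(context_sandbox(required, secondary))
```

Under this toy model, the movie itself plus the codec must travel with it - which matches the AVI example above, and the students' problem was precisely that nobody knew what belonged in `required`, so they shipped the whole machine instead.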
So that is the idea. We will look more into ramifications of it in future posts.
Watched Memento today. Really kewl movie. This is the second time. Nothing new was learnt, but I spent some time on the nuances of how much overlap the screenplay writer allowed between scenes. Really well thought out.