Thursday, June 25, 2009
device status table
02. interrupt and trap
Interrupts and Traps. A great deal of the kernel consists of code that is invoked as the result of an interrupt or a trap.
While the words "interrupt" and "trap" are often used interchangeably in the context of operating systems, there is a distinct difference between the two.
An interrupt is a CPU event that is triggered by some external device.
A trap is a CPU event that is triggered by a program. Traps are sometimes called software interrupts. They can be deliberately triggered by a special instruction, or they may be triggered by an illegal instruction or an attempt to access a restricted resource.
When an interrupt is triggered by an external device, the hardware saves the status of the currently executing process, switches to kernel mode, and enters a routine in the kernel.
This routine is a first-level interrupt handler. It can either service the interrupt itself or wake up a process that has been waiting for the interrupt to occur.
When the handler finishes, it usually causes the CPU to resume the process that was interrupted. However, the operating system may schedule another process instead.
When an executing process requests a service from the kernel using a trap, the process status information is saved, the CPU is placed in kernel mode, and control passes to code in the kernel.
This kernel code is called the system service dispatcher. It examines parameters set before the trap was triggered, often information in specific CPU registers, to determine what action is required. Control then passes to the code that performs the desired action.
When the service is finished, control is returned to either the process that triggered the trap or some other process.
Traps can also be triggered by a fault. In this case the usual action is to terminate the offending process. It is possible on some systems for applications to register handlers that will be invoked when certain conditions occur, such as division by zero.
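For example, on a POSIX system such as x86 Linux, an integer division by zero traps and is delivered to the process as SIGFPE; a minimal sketch of registering a handler for it might look like this (on other architectures the division may not trap at all):

#include <signal.h>
#include <stdio.h>
#include <unistd.h>

/* Invoked by the kernel when the divide-by-zero trap is delivered as SIGFPE. */
static void fpe_handler(int sig)
{
    (void)sig;
    /* write() is async-signal-safe; exit rather than return, because
       returning would re-execute the faulting instruction. */
    const char msg[] = "caught SIGFPE: division by zero\n";
    write(STDERR_FILENO, msg, sizeof(msg) - 1);
    _exit(1);
}

int main(void)
{
    signal(SIGFPE, fpe_handler);   /* register the handler with the kernel */

    volatile int zero = 0;
    int x = 1 / zero;              /* hardware trap -> kernel -> SIGFPE */
    printf("never reached: %d\n", x);
    return 0;
}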
01.bootstrap program
In computing, booting is a bootstrapping process that starts the operating system when the user turns on a computer system. Most computer systems can execute only code found in memory (ROM or RAM); modern operating systems are mostly stored on hard disk drives, LiveCDs, and USB flash drives. Just after a computer has been turned on, it does not have an operating system in memory. The computer's hardware alone cannot perform complicated actions of the operating system, such as loading a program from disk on its own, so a seemingly irresolvable paradox is created: to load the operating system into memory, one appears to need an operating system already installed.
hardware protection
http://informatik.unibas.ch/lehre/ws06/cs201/_Downloads/cs201-osc-svc-2up.pdf
storage hierarchy
In computer science, a cache (pronounced /kæʃ/) is a collection of data duplicating original values stored elsewhere or computed earlier, where the original data is expensive to fetch (owing to longer access time) or to compute, compared to the cost of reading the cache. In other words, a cache is a temporary storage area where frequently accessed data can be stored for rapid access. Once the data is stored in the cache, it can be used in the future by accessing the cached copy rather than re-fetching or recomputing the original data.
A cache has proven to be extremely effective in many areas of computing because access patterns in typical computer applications have locality of reference. There are several kinds of locality; the most relevant here is data that is accessed close together in time (temporal locality). The data might or might not be located physically close together (spatial locality).
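As a rough illustration of the idea (not tied to any particular hardware cache), here is a minimal sketch in C of a small direct-mapped software cache placed in front of an expensive computation; the function expensive() and the cache size are arbitrary choices for the example:

#include <stdio.h>

#define CACHE_SIZE 64

/* A tiny direct-mapped cache in front of an "expensive" function of one
   integer key; expensive() stands in for a slow computation or fetch. */
static int  cache_key[CACHE_SIZE];
static long cache_val[CACHE_SIZE];
static int  cache_full[CACHE_SIZE];

static long expensive(int n)
{
    long sum = 0;
    for (int i = 0; i <= n; i++)
        sum += (long)i * i;
    return sum;
}

static long cached(int n)
{
    int slot = (unsigned)n % CACHE_SIZE;
    if (cache_full[slot] && cache_key[slot] == n)
        return cache_val[slot];     /* hit: reuse the stored copy */
    long v = expensive(n);          /* miss: compute and remember the result */
    cache_key[slot]  = n;
    cache_val[slot]  = v;
    cache_full[slot] = 1;
    return v;
}

int main(void)
{
    printf("%ld\n", cached(1000));  /* first call: miss, computed */
    printf("%ld\n", cached(1000));  /* second call: hit, served from cache */
    return 0;
}

Repeated requests for the same key are exactly the temporal locality that makes caching pay off.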
caching, coherency and consistency
Cache coherency problems can arise when more than one processor refers to the same data. Assuming each processor has cached a piece of data, what happens if one processor modifies its copy of the data? The other processor now has a stale copy of the data in its cache.
Cache coherency and consistency define the action of the processors to maintain coherence. More precisely, coherency defines what value is returned on a read, and consistency defines when it is available.
Unlike other Cray systems, cache coherency on Cray X1 systems is supported by a directory-based hardware protocol. This protocol, together with a rich set of synchronization instructions, provides different levels of memory consistency.
Processors may cache memory from their local node only; references to memory on other nodes are not cached. However, while only local data is cached, the entire machine is kept coherent in accordance with the memory consistency model. Remote reads will obtain the latest “dirty” data from another processor's cache, and remote writes will update or invalidate lines in another processor's cache. Thus, the whole machine is kept coherent.
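The same read-value versus visibility distinction shows up at the programming level. Below is a minimal sketch, assuming C11 atomics and POSIX threads (compile with -pthread): release/acquire ordering ensures that by the time the consumer observes the flag, it also sees the payload written before it. The variable and function names are made up for the example.

#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>

/* Shared data plus a "ready" flag. Without the atomic ordering, each core
   could keep working with a stale copy; release/acquire tells the compiler
   and hardware to publish the payload before the flag becomes visible. */
static int payload;
static atomic_int ready;

static void *producer(void *arg)
{
    (void)arg;
    payload = 42;                                            /* write the data  */
    atomic_store_explicit(&ready, 1, memory_order_release);  /* then publish it */
    return NULL;
}

static void *consumer(void *arg)
{
    (void)arg;
    while (!atomic_load_explicit(&ready, memory_order_acquire))
        ;                                                /* spin until visible */
    printf("consumer sees payload = %d\n", payload);     /* guaranteed to be 42 */
    return NULL;
}

int main(void)
{
    pthread_t p, c;
    pthread_create(&c, NULL, consumer, NULL);
    pthread_create(&p, NULL, producer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}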
storage structure
• Main Memory
– the only large storage medium that the CPU can access directly.
• Magnetic Disks
– rigid metal or glass platters covered with magnetic recording material.
– Disk surface is logically divided into tracks, which are subdivided into sectors (see the addressing sketch after this list).
– The disk controller determines the logical interaction between the device and the computer.
Moving Head Mechanism (figure)
• Magnetic Tapes
Magnetic tape is a medium for magnetic recording, generally consisting of a thin magnetizable coating on a long and narrow strip of plastic. Nearly all recording tape is of this type, whether used for recording audio or video or for computer data storage. It was originally developed in Germany, based on the concept of magnetic wire recording. Devices that record and play back audio and video using magnetic tape are generally called tape recorders and video tape recorders respectively. A device that stores computer data on magnetic tape can be called a tape drive, a tape unit, or a streamer.
Magnetic tape revolutionized the broadcast and recording industries. In an age when all radio (and later television) was live, it allowed programming to be prerecorded. In a time when gramophone records were recorded in one take, it allowed recordings to be created in multiple stages and easily mixed and edited with minimal loss in quality between generations. It is also one of the key enabling technologies in the development of modern computers: magnetic tape allowed massive amounts of data to be stored in computers for long periods of time and rapidly accessed when needed.
Today, many other technologies can perform the functions of magnetic tape, and in many cases they are replacing it. Despite this, innovation in the technology continues and tape is still widely used.
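Returning to the magnetic-disk bullets above, here is a small sketch of the classic cylinder-head-sector (CHS) to logical block address (LBA) conversion, which shows how tracks and sectors give each block a linear address. The geometry constants are made-up example values, not those of any particular drive.

#include <stdio.h>

/* Example geometry (made up for illustration): 16 heads per cylinder,
   63 sectors per track. Sectors are conventionally numbered from 1. */
#define HEADS_PER_CYLINDER 16
#define SECTORS_PER_TRACK  63

/* Classic CHS -> LBA mapping used by older disk interfaces. */
static unsigned long chs_to_lba(unsigned long c, unsigned long h, unsigned long s)
{
    return (c * HEADS_PER_CYLINDER + h) * SECTORS_PER_TRACK + (s - 1);
}

int main(void)
{
    /* Cylinder 2, head 3, sector 4 of the example geometry. */
    printf("LBA = %lu\n", chs_to_lba(2, 3, 4));   /* (2*16 + 3)*63 + 3 = 2208 */
    return 0;
}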
Tuesday, June 23, 2009
4.user mode
1.
A different use of the term bootstrapping is using a compiler to compile itself: a small part of the compiler for a new programming language is first written in an existing language, and that partial compiler is then used to compile further parts of the compiler written in the new language. This solves the "chicken and egg" causality dilemma.
3.
6. direct memory access
Without DMA, using programmed input/output (PIO) mode for communication with peripheral devices, or load/store instructions in the case of multicore chips, the CPU is typically fully occupied for the entire duration of the read or write operation and is thus unavailable to perform other work. With DMA, the CPU initiates the transfer, does other operations while the transfer is in progress, and receives an interrupt from the DMA controller once the operation is complete. This is especially useful in real-time computing applications where stalling behind concurrent operations is unacceptable. Another, related application area is various forms of stream processing, where it is essential to have data processing and transfer happen in parallel in order to achieve sufficient throughput.
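The division of labour is easiest to see in a toy sketch. The code below is purely conceptual: real DMA is programmed through device registers inside a driver, so dma_start() and dma_complete_interrupt() are made-up stand-ins for handing a transfer descriptor to the controller and for the completion interrupt it raises.

#include <stdio.h>
#include <string.h>

static char device_buffer[256] = "data arriving from a peripheral";
static char main_memory[256];
static volatile int transfer_done;

/* Stand-in for the completion interrupt raised by the DMA controller. */
static void dma_complete_interrupt(void)
{
    transfer_done = 1;
}

/* Stand-in for programming the controller with source, destination, length.
   In reality the controller, not the CPU, moves the bytes. */
static void dma_start(void *dst, const void *src, size_t len)
{
    memcpy(dst, src, len);       /* modelled here as an instantaneous copy */
    dma_complete_interrupt();
}

int main(void)
{
    dma_start(main_memory, device_buffer, sizeof(device_buffer));
    /* ...the CPU would do unrelated work here while the transfer runs... */
    if (transfer_done)
        printf("transfer finished: %s\n", main_memory);
    return 0;
}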
7. difference between RAM and DRAM
08. main memory
The computer can manipulate only data that is in main memory. Therefore, every program you execute and every file you access must be copied from a storage device into main memory. The amount of main memory on a computer is crucial because it determines how many programs can be executed at one time and how much data can be readily available to a program.
Because computers often have too little main memory to hold all the data they need, computer engineers invented a technique called swapping, in which portions of data are copied into main memory as they are needed. Swapping occurs when there is no room in memory for needed data. When one portion of data is copied into memory, an equal-sized portion is copied (swapped) out to make room.
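A minimal sketch of that swap-in/swap-out idea, assuming a pretend memory of only three page-sized portions and a first-in-first-out choice of which resident portion to copy out (both assumptions are arbitrary for the example):

#include <stdio.h>

#define FRAMES 3   /* pretend main memory holds only three pages */

/* When memory is full, the oldest resident page is swapped out to make
   room for the newly requested one (FIFO replacement). */
int main(void)
{
    int frames[FRAMES];
    int used = 0, next_victim = 0;
    int requests[] = {1, 2, 3, 4, 2, 5};   /* pages a program touches */

    for (size_t i = 0; i < sizeof(requests) / sizeof(requests[0]); i++) {
        int page = requests[i], resident = 0;
        for (int f = 0; f < used; f++)
            if (frames[f] == page) resident = 1;
        if (resident) {
            printf("page %d already in memory\n", page);
        } else if (used < FRAMES) {
            frames[used++] = page;
            printf("page %d loaded into a free frame\n", page);
        } else {
            printf("page %d swapped in, page %d swapped out\n",
                   page, frames[next_victim]);
            frames[next_victim] = page;
            next_victim = (next_victim + 1) % FRAMES;
        }
    }
    return 0;
}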
Now, most PCs come with a minimum of 32 megabytes of main memory. You can usually increase the amount of memory by inserting extra memory in the form of chips.
9. magnetic disk
10. storage hierarchy
VERY SLOW
Punch cards (obsolete)
Punched paper tape (obsolete)
FASTER
Bubble memory
Floppy disks
MUCH FASTER
Magnetic tape
Optical discs (CD-ROM, DVD-ROM, MO, etc.)
Magnetic disks with movable heads
Magnetic disks with fixed heads (obsolete)
Low-speed bulk memory
FASTEST
Flash memory
Main memory
Cache memory
Microcode
Registers