Thursday 26 January 2012

Difference between Buffer and Cache

 

Buffer

 
Buffering is the process of holding data in a region of memory while it is transported from one place to another. The region of memory that holds the data is called a buffer. Buffering is used when there is a difference between the speed at which data is received and the speed at which it is processed.

The buffer allows each device or process to operate without being held up by the other. For a buffer to be effective, its size needs to be chosen carefully by the designer. Like a cache, a buffer is a "midpoint holding place", but it exists not so much to accelerate an activity as to coordinate separate activities. The term is used not only in programming but in hardware as well. In programming, buffering sometimes holds data away from its final destination so that it can be edited or processed before being moved to a regular file or database.
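The idea above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (the class name and `capacity` parameter are inventions for this example): writes accumulate in an in-memory buffer and are transferred to the slower destination in batches, so the fast producer is not held up by each individual slow write.

```python
# A minimal sketch of write buffering: items accumulate in an in-memory
# buffer and are flushed to the (slower) destination only when the
# buffer fills, turning many small transfers into a few batched ones.
class BufferedWriter:
    def __init__(self, destination, capacity=4):
        self.destination = destination  # e.g. a list standing in for a slow device
        self.capacity = capacity        # buffer size chosen by the designer
        self.buffer = []

    def write(self, item):
        self.buffer.append(item)
        if len(self.buffer) >= self.capacity:
            self.flush()

    def flush(self):
        # One batched transfer instead of many individual ones.
        self.destination.extend(self.buffer)
        self.buffer.clear()

out = []
w = BufferedWriter(out, capacity=3)
for ch in "abcde":
    w.write(ch)
print(out)   # only the first full batch has been flushed: ['a', 'b', 'c']
w.flush()    # drain whatever remains
print(out)   # ['a', 'b', 'c', 'd', 'e']
```

Note how the destination only sees data in batches: the `capacity` chosen here is exactly the design decision the paragraph above refers to.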
 

Cache


Caching is the process of storing data in a separate place, called the cache, so that it can be accessed faster if the same data is requested again in the future. When data is requested, the cache is checked first to see whether it already contains that data. Cache memory is a type of random access memory (RAM) that the microprocessor can access more quickly than regular RAM. As the microprocessor processes data, it looks first in the cache; if it finds the data there from a previous read, it does not need to perform the more time-consuming read from the larger main memory.
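The check-the-cache-first pattern described above can be sketched in Python. This is a simplified illustration, not real cache-memory hardware: a dictionary stands in for the cache, a slow function stands in for main memory, and the hit/miss counters are invented for this example to make the behavior visible.

```python
# A minimal sketch of caching: before doing an expensive lookup,
# check a dictionary that remembers results of previous requests.
cache = {}
stats = {"hits": 0, "misses": 0}

def slow_fetch(n):
    # Stands in for an expensive operation (main-memory read,
    # disk access, network call, ...).
    return n * n

def cached_fetch(n):
    if n in cache:                 # cache hit: skip the slow path
        stats["hits"] += 1
        return cache[n]
    stats["misses"] += 1           # cache miss: fetch and remember
    cache[n] = slow_fetch(n)
    return cache[n]

print([cached_fetch(n) for n in (3, 4, 3, 3)])  # [9, 16, 9, 9]
print(stats)                                    # {'hits': 2, 'misses': 2}
```

The first request for each value misses and pays the full cost; every repeat request hits the cache, which is exactly why repeated accesses to the same data become faster.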

In the CPU, caching is used to improve performance by reducing the time taken to fetch data from main memory. In web browsers, web caching is used to store responses from previous visits to websites in order to make subsequent visits faster.
