Also, cache memory has the smallest size, the highest cost, and the lowest access time.
A cache stores copies of the data from the most frequently used main memory locations. As long as most memory accesses hit cached locations, the average latency of memory accesses will be closer to the cache latency than to the latency of main memory. In simple terms, it is faster to access data present in the cache than in main memory or on disk: the CPU first searches for data in the cache, and if the data is not there it looks in main memory and then on disk.
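To make that search order concrete, here is a tiny Python sketch of the idea. The CACHE, MAIN_MEMORY, and DISK dictionaries and the read() helper are made-up illustrations of the lookup order, not how real hardware works:

```python
# Minimal sketch of the lookup order described above:
# the CPU checks the cache first, then main memory, then disk.
# All names here are illustrative stand-ins, not real hardware structures.

CACHE = {"0x10": 42}                         # smallest, fastest
MAIN_MEMORY = {"0x10": 42, "0x20": 7}        # larger, slower
DISK = {"0x10": 42, "0x20": 7, "0x30": 99}   # largest, slowest

def read(address):
    """Return the value at `address`, searching the fastest storage first."""
    if address in CACHE:
        return CACHE[address]        # cache hit: fastest path
    if address in MAIN_MEMORY:
        value = MAIN_MEMORY[address]
        CACHE[address] = value       # keep a copy in the cache for next time
        return value
    value = DISK[address]            # last resort: slowest path
    MAIN_MEMORY[address] = value
    CACHE[address] = value
    return value

print(read("0x30"))  # misses the cache and RAM, found on disk
print(read("0x30"))  # now a cache hit
```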
manzione
2016-12-11 16:38:49 UTC
Characteristics Of Cache Memory
ajai
2010-02-05 04:04:09 UTC
Hi
A CPU cache is a cache used by the central processing unit of a computer to reduce the average time to access memory. The cache is a smaller, faster memory which stores copies of the data from the most frequently used main memory locations. As long as most memory accesses are cached memory locations, the average latency of memory accesses will be closer to the cache latency than to the latency of main memory.
When the processor needs to read from or write to a location in main memory, it first checks whether a copy of that data is in the cache. If so, the processor immediately reads from or writes to the cache, which is much faster than reading from or writing to main memory.
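To see why the average access time gets pulled toward the cache latency, here is a rough back-of-the-envelope calculation in Python. The 1 ns, 100 ns, and 95% figures are just assumed round numbers, not measurements of any particular CPU:

```python
# Rough illustration of the "average latency" claim above.
# All latencies and the hit rate are made-up, round numbers.

cache_latency_ns = 1      # assumed cache access time
memory_latency_ns = 100   # assumed main-memory access time
hit_rate = 0.95           # assumed fraction of accesses served by the cache

average_latency = (hit_rate * cache_latency_ns
                   + (1 - hit_rate) * memory_latency_ns)

print(f"average access time = {average_latency:.1f} ns")  # about 6.0 ns
# Even with only 95% hits, the average (about 6 ns) is far closer to the
# 1 ns cache latency than to the 100 ns main-memory latency.
```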
2016-04-04 05:45:45 UTC
In computer engineering, a cache (pronounced /kæʃ/ kash in US and /keɪʃ/ kaysh in Aust/NZ) is a component that improves performance by transparently storing data such that future requests for that data can be served faster. The data that is stored within a cache might be values that have been computed earlier or duplicates of original values that are stored elsewhere. If requested data is contained in the cache (cache hit), this request can be served by simply reading the cache, which is comparably faster. Otherwise (cache miss), the data has to be recomputed or fetched from its original storage location, which is comparably slower. Hence, the more requests that can be served from the cache, the better the overall system performance is.
As opposed to a buffer, which is managed explicitly by a client, a cache stores data transparently: this means that a client who is requesting data from a system is not aware that the cache exists, which is the origin of the name cache (from French "cacher", to conceal). To be cost efficient and to enable an efficient lookup of data, caches are comparably small.
Nevertheless, caches have proven extremely effective in many areas of computing because access patterns in typical computer applications have locality of reference. References exhibit temporal locality if data is requested again that has been recently requested already. References exhibit spatial locality if data is requested that is physically stored close to data that has been requested already.
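One small way to see temporal locality paying off is Python's built-in functools.lru_cache, used below as a software stand-in for a hardware cache. The expensive_lookup function and the request stream are purely illustrative:

```python
# Sketch of how temporal locality makes caching effective, using Python's
# functools.lru_cache as a stand-in for a small, fixed-size cache.

from functools import lru_cache

@lru_cache(maxsize=4)          # tiny cache; like hardware caches, it is small
def expensive_lookup(address):
    # Pretend this is a slow fetch from main memory / original storage.
    return address * 2

# A request stream with temporal locality: recently requested addresses
# are requested again soon afterwards.
for addr in [1, 2, 1, 1, 3, 2, 1]:
    expensive_lookup(addr)

print(expensive_lookup.cache_info())
# CacheInfo(hits=4, misses=3, maxsize=4, currsize=3)
```

Out of seven requests, only the three first-time accesses miss; the repeats are served from the cache.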
2010-02-05 04:01:45 UTC
Cache memory is random access memory (RAM) that a computer microprocessor can access more quickly than it can access regular RAM. As the microprocessor processes data, it looks first in the cache memory and if it finds the data there (from a previous reading of data), it does not have to do the more time-consuming reading of data from larger memory.
Cache memory is sometimes described in levels of closeness and accessibility to the microprocessor. An L1 cache is on the same chip as the microprocessor. (For example, the PowerPC 601 processor has a 32 kilobyte level-1 cache built into its chip.) L2 is usually a separate static RAM (SRAM) chip. The main RAM is usually a dynamic RAM (DRAM) chip.
In addition to cache memory, one can think of RAM itself as a cache for hard disk storage, since all of RAM's contents come from the hard disk: initially when you turn your computer on and load the operating system (you are loading it into RAM), and later as you start new applications and access new data. RAM can also contain a special area called a disk cache that holds the data most recently read from the hard disk.
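Here is a minimal sketch of that disk-cache idea in Python. The _disk_cache dictionary and read_file_cached helper are hypothetical names for illustration, not an actual operating system facility:

```python
# Minimal sketch of a disk cache: keep recently read file contents in RAM
# so repeated reads of the same file skip the hard disk.
# The cache structure and function name here are illustrative only.

_disk_cache = {}

def read_file_cached(path):
    """Return the file's bytes, reading from disk only on the first access."""
    if path not in _disk_cache:           # cache miss: go to the hard disk
        with open(path, "rb") as f:
            _disk_cache[path] = f.read()  # keep a copy in RAM
    return _disk_cache[path]              # cache hit on later calls
```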
Malima
2015-06-05 00:24:30 UTC
Cache is a very high-speed semiconductor memory that speeds up the CPU; it acts as a buffer between the CPU and main memory.
2010-02-05 11:18:32 UTC
It will speed up PC performance.
This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.