Hey guys, in this blog I am going to explain cache memory.
Cache is a small amount of high-speed random-access memory (RAM) built directly into the processor. It is used to temporarily hold data and instructions that the processor is likely to reuse. This allows for faster processing, as the processor does not have to wait for the data and instructions to be fetched from main RAM.
Cache memory is fast and expensive. There are three general cache levels:
L1 cache, or primary cache, is extremely fast but relatively small, and is usually embedded in the processor chip as CPU cache.
L2 cache, or secondary cache, often has more capacity than L1. L2 cache may be embedded on the CPU, or it can sit on a separate chip with a high-speed alternative system bus connecting the cache and the CPU.
L3 cache is specialized memory developed to improve the performance of L1 and L2. L1 and L2 can be significantly faster than L3, though L3 is usually still about double the speed of DRAM. Each core can have its own dedicated L1 and L2 caches, while all cores share a single L3 cache. If the L3 cache holds a frequently referenced instruction, it is usually promoted to a higher cache level.
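To make the lookup order concrete, here is a toy Python sketch of a multi-level cache: the CPU checks L1 first, then L2, then L3, and only falls back to DRAM on a full miss, promoting the data toward L1 on the way back. The dictionaries and the promote-on-hit policy are illustrative assumptions, not real hardware behavior.

```python
def lookup(address, l1, l2, l3, dram):
    """Return (value, level_found), promoting data toward L1 on a hit."""
    if address in l1:
        return l1[address], "L1"
    if address in l2:
        l1[address] = l2[address]          # promote to the faster level
        return l2[address], "L2"
    if address in l3:
        l2[address] = l1[address] = l3[address]
        return l3[address], "L3"
    value = dram[address]                  # slowest path: main memory
    l3[address] = l2[address] = l1[address] = value
    return value, "DRAM"

dram = {0x10: "data"}
l1, l2, l3 = {}, {}, {}
print(lookup(0x10, l1, l2, l3, dram))  # first access: served from DRAM
print(lookup(0x10, l1, l2, l3, dram))  # second access: now an L1 hit
```

The second lookup finds the address in L1 because the first (slow) access copied it into every level, which is exactly why repeated accesses get faster.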
In the past, L1, L2 and L3 caches were implemented using a combination of processor and motherboard components. More recently, the trend has been toward consolidating all three levels of memory caching on the CPU itself.
Implementing flash or more dynamic RAM (DRAM) on a system won't increase cache memory. This can be confusing, since the terms memory caching (hard disk buffering) and cache memory are often used interchangeably. Memory caching, using DRAM or flash to buffer disk reads, is meant to improve storage I/O by holding frequently referenced data in a buffer ahead of slower magnetic disk or tape. Cache memory, on the other hand, provides read buffering for the CPU.
Performance –
Cache memory is important because it improves the efficiency of data retrieval. It stores program instructions and data that are used repeatedly in the operation of programs, or information that the CPU is likely to need next. The processor can access this information more quickly from the cache than from main memory, and fast access to these instructions increases the overall speed of the program.
Aside from its main function of improving performance, cache memory is a valuable resource for evaluating a computer's overall performance. Users can do this by looking at the cache's hit-to-miss ratio. A cache hit is an instance in which the system successfully retrieves data from the cache. A cache miss is when the system looks for the data in the cache, can't find it, and has to look somewhere else instead. In some cases, users can improve the hit-to-miss ratio by adjusting the cache memory block size, i.e. the size of the data units stored.
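The hit-to-miss ratio described above can be measured directly. Below is a minimal sketch that counts hits and misses in a small LRU (least-recently-used) cache; the capacity and the access pattern are made-up values for illustration, and real CPU caches use hardware replacement policies rather than this exact scheme.

```python
from collections import OrderedDict

class LRUCache:
    """Tiny LRU cache that tracks its own hit and miss counts."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()
        self.hits = 0
        self.misses = 0

    def access(self, key):
        if key in self.data:
            self.hits += 1
            self.data.move_to_end(key)         # mark as most recently used
        else:
            self.misses += 1
            if len(self.data) >= self.capacity:
                self.data.popitem(last=False)  # evict least recently used
            self.data[key] = True

    def hit_ratio(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0

cache = LRUCache(capacity=3)
for key in [1, 2, 3, 1, 1, 2, 4, 5, 1]:
    cache.access(key)
print(f"hits={cache.hits} misses={cache.misses} ratio={cache.hit_ratio():.2f}")
```

Notice how the repeated early accesses to keys 1 and 2 score hits, while new keys evict old ones and cause misses; a larger capacity (or an access pattern with more reuse) would push the ratio up.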
Cache vs. virtual memory –
A computer's DRAM is limited, and its cache memory is much smaller still. Memory can run short when a large program or multiple applications are executing, so the operating system constructs virtual memory to compensate for the shortage of physical memory.
The OS does this by transferring inactive data from DRAM to disk storage. This method expands the virtual address space by forming contiguous addresses that hold both a program and its data, using active memory in DRAM and inactive memory on HDDs.
Virtual memory allows a computer to run larger programs, or many programs at the same time, with each program acting as though it had unlimited memory.
In order to convert virtual memory into physical memory, the OS splits memory into page files or swap files that contain a specific number of addresses. Those pages are kept on disk, and when they are needed, the OS copies them into main memory and converts each virtual memory address to a physical location. A memory management unit (MMU) is in charge of these translations.
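The translation step the MMU performs can be sketched in a few lines. The 4 KiB page size below is a common real-world value, but the page-table contents and the function name are invented for illustration; real MMUs do this in hardware with the help of a TLB.

```python
PAGE_SIZE = 4096  # 4 KiB pages, a common page size

# Hypothetical page table: virtual page number -> physical frame number.
page_table = {0: 5, 1: 2, 2: 7}

def translate(virtual_address):
    """Split a virtual address into (page, offset) and map it to a physical address."""
    page_number = virtual_address // PAGE_SIZE
    offset = virtual_address % PAGE_SIZE
    if page_number not in page_table:
        # Page fault: the OS would now copy the page from disk into DRAM.
        raise KeyError(f"page fault on virtual page {page_number}")
    frame = page_table[page_number]
    return frame * PAGE_SIZE + offset

print(hex(translate(0x1234)))  # virtual page 1 maps to physical frame 2
```

Here the virtual address 0x1234 lies in virtual page 1 (offset 0x234), which the table maps to physical frame 2, so the physical address is 2 * 4096 + 0x234 = 0x2234. An address whose page is missing from the table raises the simulated page fault instead.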