Memory pools, also called fixed-size block allocation, use pools of preallocated memory to provide dynamic memory allocation. Dynamic memory allocation can also be achieved with techniques such as malloc and C++'s operator new; although these implementations are established and reliable, they suffer from fragmentation because of variable block sizes, and their unpredictable allocation times make them unsuitable for real-time systems. A more efficient solution is to preallocate a number of memory blocks of the same size, called a memory pool. The application can allocate, access, and free blocks, represented by handles, at run time.
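For illustration, the following C sketch (the names pool_init, pool_alloc, pool_free, and the block size and count are hypothetical, not taken from any particular library) preallocates a fixed number of equally sized blocks and threads a free list through them, so allocating and freeing a block are constant-time operations that cause no fragmentation.

```c
#include <stddef.h>
#include <stdint.h>

#define BLOCK_SIZE  64   /* payload size of every block in the pool (assumed) */
#define BLOCK_COUNT 128  /* number of blocks preallocated up front (assumed)  */

/* Each block doubles as a free-list node while it is unused. */
typedef union block {
    union block *next;
    uint8_t      data[BLOCK_SIZE];
} block_t;

static block_t  pool_storage[BLOCK_COUNT];  /* the preallocated pool */
static block_t *free_list = NULL;

/* Link every block into the free list once, e.g. at start-up. */
void pool_init(void)
{
    for (size_t i = 0; i < BLOCK_COUNT; ++i) {
        pool_storage[i].next = free_list;
        free_list = &pool_storage[i];
    }
}

/* O(1): pop a block from the free list, or NULL if the pool is exhausted. */
void *pool_alloc(void)
{
    block_t *b = free_list;
    if (b != NULL)
        free_list = b->next;
    return b;
}

/* O(1): push the block back onto the free list for reuse. */
void pool_free(void *p)
{
    block_t *b = p;
    b->next = free_list;
    free_list = b;
}
```

Because every block has the same size, a freed block can simply be pushed back onto the free list and handed out again by the next allocation, which is why no fragmentation arises.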
Many real-time operating systems use memory pools, such as the Transaction Processing Facility.
Some systems, like the web server Nginx, use the term memory pool to refer to a group of variable-size allocations which can later be deallocated all at once. This is also known as a region; see region-based memory management.
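A minimal sketch of such a region is shown below; the names region_create, region_alloc, and region_destroy are hypothetical and are not Nginx's actual API. Individual allocations only advance an offset inside one large buffer, and the whole group of allocations is released in a single call.

```c
#include <stdlib.h>
#include <stddef.h>

/* A region: one large buffer plus an offset that only grows. */
typedef struct {
    unsigned char *buf;
    size_t         size;
    size_t         used;
} region_t;

region_t *region_create(size_t size)
{
    region_t *r = malloc(sizeof *r);
    if (r == NULL) return NULL;
    r->buf = malloc(size);
    if (r->buf == NULL) { free(r); return NULL; }
    r->size = size;
    r->used = 0;
    return r;
}

/* Bump allocation: no per-allocation bookkeeping and no individual free. */
void *region_alloc(region_t *r, size_t n)
{
    /* Round the request up so every allocation stays pointer-aligned. */
    size_t aligned = (n + sizeof(void *) - 1) & ~(sizeof(void *) - 1);
    if (r->used + aligned > r->size)
        return NULL;
    void *p = r->buf + r->used;
    r->used += aligned;
    return p;
}

/* Everything allocated from the region is released at once. */
void region_destroy(region_t *r)
{
    free(r->buf);
    free(r);
}
```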
A simple memory pool module can allocate, for example, three pools at compile time with block sizes optimized for the application deploying the module. The application can allocate, access, and free memory through the following interface: an allocation function that returns a handle to a free block, an access function that yields a pointer for a given handle, and a free function that releases the block a handle refers to. The handle can be implemented with an unsigned int. The module can interpret the handle internally by dividing it into a pool index, a memory block index, and a version. The pool and memory block index allow fast access to the corresponding block, while the version, which is incremented at each new allocation, allows detection of handles whose memory block has already been freed (caused by handles being retained too long). One possible handle layout is sketched below.
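In the following C sketch the bit widths (4-bit pool index, 16-bit block index, 12-bit version) and the function names are assumptions chosen for illustration, not a standard layout.

```c
#include <stdint.h>
#include <stdbool.h>

typedef uint32_t handle_t;

/* Assumed layout: bits 28-31 pool index, bits 12-27 block index,
 * bits 0-11 allocation version. */
static handle_t handle_make(uint32_t pool, uint32_t block, uint32_t version)
{
    return ((pool & 0xFu) << 28) | ((block & 0xFFFFu) << 12) | (version & 0xFFFu);
}

static uint32_t handle_pool(handle_t h)    { return (h >> 28) & 0xFu; }
static uint32_t handle_block(handle_t h)   { return (h >> 12) & 0xFFFFu; }
static uint32_t handle_version(handle_t h) { return h & 0xFFFu; }

/* The pool module records the version it stamped on each block at its
 * last allocation; a handle whose version no longer matches refers to a
 * block that has since been freed and possibly reused. */
static bool handle_is_current(handle_t h, uint32_t stored_version)
{
    return handle_version(h) == (stored_version & 0xFFFu);
}
```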
Benefits

Drawbacks