The space complexity of an algorithm or a data structure is the amount of memory space required to solve an instance of the computational problem as a function of characteristics of the input. It is the memory required by an algorithm until it executes completely.[1] This includes the memory space used by its inputs, called input space, and any other (auxiliary) memory it uses during execution, which is called auxiliary space.
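The distinction between input space and auxiliary space can be illustrated with a short Python sketch (the function names are illustrative, not standard):

```python
def reverse_in_place(arr):
    """Reverse a list using O(1) auxiliary space.

    The list itself occupies O(n) input space, but beyond that the
    function needs only two index variables, so its auxiliary space
    complexity is O(1).
    """
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]
        left += 1
        right -= 1
    return arr


def reverse_with_copy(arr):
    """Reverse by building a new list: O(n) auxiliary space."""
    return arr[::-1]
```

Both functions compute the same result, but only the first does so with constant auxiliary space.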
Similar to time complexity, space complexity is often expressed asymptotically in big O notation, such as O(n), O(n log n), O(n^α), O(2^n), etc., where n is a characteristic of the input influencing space complexity.
Analogously to time complexity classes DTIME(f(n)) and NTIME(f(n)), the complexity classes DSPACE(f(n)) and NSPACE(f(n)) are the sets of languages that are decidable by deterministic (respectively, non-deterministic) Turing machines that use O(f(n)) space. The complexity classes PSPACE and NPSPACE allow f to be any polynomial, analogously to P and NP. That is, PSPACE = ⋃c DSPACE(n^c) and NPSPACE = ⋃c NSPACE(n^c), where the unions range over all positive integers c.
The space hierarchy theorem states that, for all space-constructible functions f(n), there exists a problem that can be solved by a machine with O(f(n)) memory space, but cannot be solved by a machine with asymptotically less than f(n) space.
The following containments between complexity classes hold:[2] DTIME(f(n)) ⊆ DSPACE(f(n)) ⊆ NSPACE(f(n)) ⊆ DTIME(2^O(f(n))).
Furthermore, Savitch's theorem gives the reverse containment that if f ∈ Ω(log n), then NSPACE(f(n)) ⊆ DSPACE((f(n))^2).
As a direct corollary, PSPACE = NPSPACE. This result is surprising because it suggests that non-determinism can reduce the space necessary to solve a problem only by a small amount. In contrast, the exponential time hypothesis conjectures that for time complexity, there can be an exponential gap between deterministic and non-deterministic complexity.
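The idea behind Savitch's theorem can be sketched as a recursive reachability test: rather than guessing a whole path nondeterministically, a deterministic machine searches for a midpoint and recurses on both halves. The Python below is a toy illustration of that recursion, not a Turing-machine construction; Python's call stack stands in for the machine's work tape:

```python
def reachable(vertices, adj, u, v, k):
    """Is there a path of length at most 2**k from u to v?

    The recursive midpoint search at the heart of Savitch's theorem:
    each recursion level stores only one candidate midpoint instead of
    a whole path.  The recursion depth is O(log n) and each frame holds
    O(log n) bits, which is the source of the O((log n)^2) deterministic
    space bound for graph reachability.
    """
    if k == 0:
        # Base case: a path of length at most 1 is either u == v
        # or a single edge.
        return u == v or v in adj.get(u, set())
    return any(
        reachable(vertices, adj, u, mid, k - 1)
        and reachable(vertices, adj, mid, v, k - 1)
        for mid in vertices  # try every vertex as a midpoint
    )
```

For a graph with n vertices, calling with k = ⌈log2 n⌉ suffices, since any simple path has length at most n − 1.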
The Immerman–Szelepcsényi theorem states that, again for f ∈ Ω(log n), NSPACE(f(n)) is closed under complementation. This shows another qualitative difference between time and space complexity classes, as nondeterministic time complexity classes are not believed to be closed under complementation; for instance, it is conjectured that NP ≠ co-NP.[3][4]
L or LOGSPACE is the set of problems that can be solved by a deterministic Turing machine using only O(log n) memory space with regard to input size. Even a single counter that can index the entire n-bit input requires log n space, so LOGSPACE algorithms can maintain only a constant number of counters or other variables of similar bit complexity.
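A minimal sketch of this style of algorithm: deciding whether a bit-string contains more 1s than 0s using a single counter. (The function name is illustrative.)

```python
def majority_ones(bits):
    """Decide whether a bit-string contains more 1s than 0s.

    Only one integer counter is kept.  For an n-bit input the counter's
    value stays within [-n, n], so it fits in O(log n) bits: a constant
    number of variables, each of logarithmic size, in LOGSPACE style.
    """
    balance = 0                  # the single counter, O(log n) bits
    for b in bits:               # read-only pass over the input
        balance += 1 if b == '1' else -1
    return balance > 0
```

The input is only read, never modified, matching the read-only input tape in the formal model.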
LOGSPACE and other sub-linear space bounds are useful when processing large data sets that cannot fit into a computer's RAM. They are related to streaming algorithms, but restrict only how much memory can be used, while streaming algorithms impose further constraints on how the input is fed into the algorithm. This class also sees use in the field of pseudorandomness and derandomization, where researchers consider the open problem of whether L = RL.[5][6]
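A classic example of a one-pass algorithm with constant memory is the Boyer–Moore majority vote, shown here as a small Python sketch:

```python
def majority_candidate(stream):
    """Boyer-Moore majority vote: one pass, two variables.

    If some element occurs in more than half the positions of the
    stream, it is returned.  Space usage is a constant number of
    variables regardless of the stream's length, the kind of
    sub-linear space bound streaming algorithms aim for.  If no
    majority element exists, the returned candidate is arbitrary
    and a second verification pass would be needed.
    """
    candidate, count = None, 0
    for item in stream:
        if count == 0:
            candidate, count = item, 1
        elif item == candidate:
            count += 1
        else:
            count -= 1
    return candidate
```

Because the input is consumed one item at a time and never revisited, this also satisfies the stricter access constraint of the streaming model.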
The corresponding nondeterministic space complexity class is NL.
The term auxiliary space refers to space other than that consumed by the input. Auxiliary space complexity could be formally defined in terms of a Turing machine with a separate input tape which cannot be written to, only read, and a conventional working tape which can be written to. The auxiliary space complexity is then defined (and analyzed) via the working tape. For example, consider the depth-first search of a balanced binary tree with n nodes: its auxiliary space complexity is Θ(log n).
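The depth-first-search example can be sketched in Python; here the recursion stack plays the role of the working tape, while the tree itself is the read-only input:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Node:
    value: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None


def dfs(node, visit):
    """Preorder depth-first traversal of a binary tree.

    The only auxiliary memory is the recursion stack, whose depth
    equals the height of the tree.  For a balanced binary tree with
    n nodes that height is Theta(log n), hence the Theta(log n)
    auxiliary space complexity; the tree itself counts as input space.
    """
    if node is None:
        return
    visit(node.value)
    dfs(node.left, visit)
    dfs(node.right, visit)
```

For an unbalanced tree the height, and thus the auxiliary space, can degrade to Θ(n), which is why the balance assumption matters.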