- See: Computational Complexity Analysis Task.
- In computational complexity theory, an algorithm is said to take linear time, or O(n) time, if the asymptotic upper bound on its running time is proportional to the size of the input, usually denoted n.
- Informally speaking, the running time increases linearly with the size of the input. For example, a procedure that adds up all elements of a list requires time proportional to the length of the list. This description is slightly inaccurate, since the running time can deviate significantly from exact proportionality, especially for small values of n. For more information, see the article on big O notation.
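The list-summing example above can be sketched as a single pass that does constant work per element, so the total work grows in proportion to the list's length:

```python
def list_sum(items):
    """Sum all elements of a list in O(n) time."""
    total = 0
    for x in items:  # visits each of the n elements exactly once
        total += x   # constant work per element
    return total

print(list_sum([1, 2, 3, 4]))  # 10
```

Doubling the length of the list roughly doubles the number of loop iterations, which is exactly the linear-growth behavior described above.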
- Linear time is often viewed as a desirable attribute for an algorithm. Much research has been invested in creating algorithms that exhibit (nearly) linear time or better. This research includes both software and hardware methods. In the case of hardware, some algorithms that, mathematically speaking, can never achieve linear time under standard models of computation can nevertheless run in linear time. Several hardware technologies exploit parallelism to achieve this; an example is content-addressable memory.
- The concept of linear time is used in string-matching algorithms such as the Boyer–Moore algorithm and Ukkonen's algorithm.
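To illustrate the string-matching connection, here is a minimal sketch of the Boyer–Moore–Horspool simplification of Boyer–Moore (it uses only the bad-character shift rule, not the full algorithm named above), which runs in linear time on average; the function name is an illustrative choice:

```python
def horspool_search(text, pattern):
    """Find the first occurrence of pattern in text, or return -1.

    Boyer-Moore-Horspool sketch: on a mismatch, slide the pattern
    forward using a precomputed bad-character shift table, so the
    search is linear in len(text) on average.
    """
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    # For each pattern character (except the last), record how far the
    # pattern may slide when that character is aligned with its end.
    shift = {pattern[i]: m - 1 - i for i in range(m - 1)}
    i = 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            return i
        # Shift by the table entry for the text character currently
        # aligned with the pattern's last position (m if absent).
        i += shift.get(text[i + m - 1], m)
    return -1

print(horspool_search("xxabcy", "abc"))  # 2
```

The shift table lets the search skip over stretches of text that cannot contain a match, which is the key idea behind the Boyer–Moore family's speed in practice.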