Computational Complexity Performance Measure


A Computational Complexity Performance Measure is a measure of the amount of computational resources required for running an algorithm.


2018

  • (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Computational_complexity Retrieved:2018-4-1.
    • In computer science, the computational complexity, or simply complexity, of an algorithm is the amount of resources required for running it. The computational complexity of a problem is the minimum of the complexities of all possible algorithms for this problem (including algorithms that are not yet known).

      As the amount of needed resources varies with the input, the complexity is generally expressed as a function n → f(n), where n is the size of the input and f(n) is either the worst-case complexity, that is, the maximum of the amount of resources needed over all inputs of size n, or the average-case complexity, that is, the average of the amount of resources over all inputs of size n.
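
For illustration (added here, not part of the quoted passage), a minimal Python sketch of linear search in which the counted resource is the number of comparisons: the worst case over inputs of size n is n comparisons, while the average over the n possible target positions is (n + 1)/2.

```python
def linear_search_comparisons(items, target):
    """Scan left to right; return (index, number of comparisons made)."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

# Worst case for inputs of size n: the target is last (or absent), costing n comparisons.
# Average case over the n inputs whose target sits at positions 0..n-1:
# (1 + 2 + ... + n) / n = (n + 1) / 2 comparisons.
n = 1000
items = list(range(n))
_, worst_case = linear_search_comparisons(items, n - 1)
average_case = sum(linear_search_comparisons(items, t)[1] for t in items) / n
print(worst_case, average_case)   # 1000 500.5
```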

      When the nature of the resources is not explicitly given, the resource is usually taken to be the time needed for running the algorithm, and one talks of time complexity. However, the actual running time depends on the computer that is used, so the time is generally expressed as the number of elementary operations needed, which are assumed to take constant time on a given computer and to change only by a constant factor when one changes computers.
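
To make the operation-counting view concrete, a minimal sketch assuming insertion sort and the convention that each comparison and each element move counts as one elementary operation (the function name and the counting convention are illustrative, not from the source); the count depends only on the algorithm and the input, not on the machine running it.

```python
def insertion_sort_op_count(values):
    """Sort a copy of values, counting comparisons and element moves."""
    a = list(values)
    ops = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            ops += 1              # one comparison of a[j] with key
            if a[j] > key:
                a[j + 1] = a[j]   # one element move
                ops += 1
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, ops

# Faster hardware changes the running time by a constant factor,
# but the number of elementary operations stays the same.
sorted_values, ops = insertion_sort_op_count([5, 2, 4, 6, 1, 3])
print(sorted_values, ops)   # [1, 2, 3, 4, 5, 6] 21
```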

      Otherwise, the resource that is considered is typically the size of the memory that is needed, and one talks of space complexity.
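
A minimal sketch contrasting space complexity for two ways of reversing a list, counting only auxiliary memory beyond the input: the in-place version uses O(1) extra space, while building a reversed copy uses O(n).

```python
def reverse_in_place(a):
    """O(1) auxiliary space: only two indices and a temporary value are kept."""
    i, j = 0, len(a) - 1
    while i < j:
        a[i], a[j] = a[j], a[i]
        i += 1
        j -= 1
    return a

def reverse_copy(a):
    """O(n) auxiliary space: a second list of size n is allocated."""
    return [a[k] for k in range(len(a) - 1, -1, -1)]

data = [1, 2, 3, 4, 5]
print(reverse_in_place(list(data)))  # [5, 4, 3, 2, 1]
print(reverse_copy(data))            # [5, 4, 3, 2, 1]
```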

      The study of the complexity of explicitly given algorithms is called analysis of algorithms, while the study of the complexity of problems is called computational complexity theory. The two areas are closely related, since the complexity of an algorithm is always an upper bound on the complexity of the problem solved by that algorithm.
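
As a concrete instance of this upper-bound relationship (an added illustration; C_sort denotes, for this example only, the complexity of comparison-based sorting): merge sort sorts any n items using O(n log n) comparisons, so

\[ C_{\text{sort}}(n) \;\le\; T_{\text{merge sort}}(n) = O(n \log n), \]

and the classical Ω(n log n) lower bound for comparison-based sorting shows that this particular upper bound is tight.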