Big-O Notation Definition
A theoretical measure of the cost of executing an algorithm, usually the time or memory needed, given the problem size n, which is typically the number of items. It is a mathematical notation used to describe the asymptotic behavior of functions; more precisely, it describes an asymptotic upper bound on the magnitude of a function in terms of another, usually simpler, function. (Adapted from NIST.)
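
The upper-bound idea can be stated formally. The following is the standard textbook formulation in LaTeX notation; the constants c and n_0 are the usual quantifiers and are not taken from the source:

    f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ge 0 \text{ such that } |f(n)| \le c\,|g(n)| \text{ for all } n \ge n_0.

As a concrete illustration (a minimal sketch, not from the source; the function name linear_search and the sample list are hypothetical), a linear search over n items performs at most n comparisons, so its running time is O(n):

    def linear_search(items, target):
        # Worst case: target is absent and all n items are examined,
        # so the number of comparisons grows linearly with n, i.e. O(n).
        for index, item in enumerate(items):
            if item == target:
                return index
        return -1

    # Usage: searching a list of n = 5 items takes at most 5 comparisons.
    print(linear_search([4, 8, 15, 16, 23], 16))  # prints 3

Note that O(n) bounds only the worst case here; if the target happens to be the first item, the search finishes after a single comparison.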