algorithm - How to calculate the complexity of a "not so simple" program?

Tags: algorithm, asymptotic-complexity


I know how to calculate a program's complexity by counting how many times each line executes, as long as the code only involves variable declarations or some simple loops (i.e. the linear case).

But in some cases I see lines of code that behave logarithmically, exponentially, cubically, and so on, and I'm wondering how to work out the complexity in those cases.

Best Answer

The recursion-tree figure for merge sort breaks down its runtime analysis as follows: we split the original array into two arrays of size n/2 each. Then we merge those arrays, an operation that merges n elements and thus takes Θ(n) time. By this argument, the cost of each row is Θ(n). The number of rows in this diagram, also called the depth of the recursion tree, is log(n); the reasoning is exactly the same as the one we used when analyzing the complexity of binary search. We have log(n) rows and each of them costs Θ(n), therefore the complexity of mergeSort is Θ(n * log(n)).
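
To make the recursion-tree argument concrete, here is a minimal top-down merge sort sketch in Python (an illustrative implementation, not code from the original answer): each call splits the list in half, and each level of the recursion merges a total of n elements in Θ(n) time across roughly log2(n) levels.

def merge_sort(a):
    # Base case: 0 or 1 elements are already sorted.
    if len(a) <= 1:
        return a
    # Split into two halves of size about n/2 each.
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge the two sorted halves: Theta(n) work per level of the tree.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]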


Simple programs can be analyzed by counting the nested loops of the program. A single loop over n items yields f(n) = n. A loop within a loop yields f(n) = n^2. A loop within a loop within a loop yields f(n) = n^3.
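
An illustrative sketch (an assumed example, not from the original answer) that counts the basic operations of one, two, and three nested loops, matching f(n) = n, f(n) = n^2 and f(n) = n^3:

n = 100

count1 = 0
for i in range(n):              # single loop: n iterations
    count1 += 1

count2 = 0
for i in range(n):
    for j in range(n):          # loop within a loop: n * n iterations
        count2 += 1

count3 = 0
for i in range(n):
    for j in range(n):
        for k in range(n):      # triple nesting: n^3 iterations
            count3 += 1

print(count1, count2, count3)   # 100 10000 1000000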

Given a series of for loops that are sequential, the slowest of them determines the asymptotic behavior of the program. Two nested loops followed by a single loop is asymptotically the same as the nested loops alone, because the nested loops dominate the simple loop.
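
As a sketch of that dominance argument (hypothetical code that counts steps rather than doing real work): the two nested loops contribute n^2 steps and the trailing single loop only n more, so the total of n^2 + n is still Θ(n^2).

def work(n):
    steps = 0
    for i in range(n):
        for j in range(n):      # nested loops: n^2 steps, the dominant term
            steps += 1
    for k in range(n):          # single loop afterwards: only n more steps
        steps += 1
    return steps                # n^2 + n, which is Theta(n^2)

print(work(1000))               # 1001000 -- barely more than n^2 = 1000000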


i = 1
while i < n:       # n is the input size
    i = i * 2      # i doubles on every pass: 1, 2, 4, 8, ...

The running time is logarithmic, O(log n): i doubles on every pass, so after k iterations i = 2^k, and the loop stops as soon as 2^k >= n, i.e. after about log2(n) iterations.
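
A quick empirical check (an illustrative sketch, not part of the original answer) that counts the iterations of the doubling loop for a few values of n and compares them with ceil(log2(n)):

import math

for n in (8, 1024, 10**6):
    i, steps = 1, 0
    while i < n:
        i *= 2
        steps += 1
    print(n, steps, math.ceil(math.log2(n)))   # steps equals ceil(log2(n))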

A useful link: https://cs.stackexchange.com/questions/192/how-to-come-up-with-the-runtime-of-algorithms

Hope this helps!

Regarding algorithm - How to calculate the complexity of a "not so simple" program?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/26583229/

Related articles:

algorithm - Prove that f(n) = o(g(n)) implies 2^f(n) = o(2^g(n))

javascript - Not getting the binary search algorithm right

algorithm - Running time of nested loops

complexity-theory - Can an operation with O(1) amortized time take O(n^2) time in the worst case?

algorithm - Is there a numerically stable version of the polygon centroid-finding algorithm?

.net - Performance characteristics of ImmutableList<T>

complexity-theory - Asymptotic analysis of three nested for loops

c - How can I prove an algorithm correct using the initialization-maintenance-termination technique?

arrays - Getting an order map of array elements

python - How to assign and manage a priority mechanism