Main topic of week 6: complexity analysis, as introduced in week 4.
A function's time complexity is the sum of the time complexities of each function call it makes. The notion of a "child call" is most interesting when the function calls itself, meaning the function is recursive. In that case, the function's time complexity is usually expressed in terms of a quantity such as the length of a list or string. If each recursive call is made on a strictly shorter (or, depending on the setup, longer) list or string, the function makes progress towards the base case, and we can design our recursive time complexity T(n) in terms of that length. One thing worth mentioning here is that this complexity analysis is based on the worst-case scenario, which simplifies the calculation quite a bit.
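As a concrete illustration (my own example, not from lecture): a recursive function whose call shrinks the input by exactly one element gives a recurrence of the form T(n) = T(n - 1) + c.

```python
def rec_sum(lst, i=0):
    """Sum lst[i:] recursively.

    Let n = len(lst) - i. Each call does a constant amount of work
    and recurses with n reduced by one, so the worst-case recurrence
    is T(n) = T(n - 1) + c with T(0) = c', which unwinds to Theta(n).
    """
    if i == len(lst):   # base case: nothing left to sum
        return 0
    # recursive call on one fewer element: progress toward the base case
    return lst[i] + rec_sum(lst, i + 1)
```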
Idea of how to prove bounds for a "non-continuous" recursive function. The issue is that the length of the list in a recursive function may not strictly decrease or increase by one with each child call. In mergeSort(), for example, the list considered in each child call has half the length of the list in its parent call, so our time complexity function T(n) is only directly defined on a series of discrete points n. To prove a bound anyway, we first prove it for lists of length n = 2^k (in the mergeSort() example), since then k decreases by exactly one with each recursive call. Next we prove that T is monotonic, meaning either non-increasing or non-decreasing. Finally we extend the bound to every possible length lying between 2^k and 2^(k+1).
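A minimal sketch of mergeSort() (assuming the usual textbook merge step), with the halving that causes the discrete points annotated:

```python
def merge(left, right):
    """Merge two sorted lists in O(len(left) + len(right)) time."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    result.extend(left[i:])   # append whatever remains
    result.extend(right[j:])
    return result

def merge_sort(lst):
    """Sort lst.

    Each call recurses on two halves of roughly n // 2 elements and
    merges them in linear time, so for n = 2^k the recurrence is
    T(n) = 2 * T(n / 2) + c * n, which solves to Theta(n log n).
    """
    if len(lst) <= 1:          # base case: already sorted
        return lst
    mid = len(lst) // 2
    # each child call gets a list of (roughly) half the parent's length
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))
```

The extension step then works as follows: for any n with 2^k <= n <= 2^(k+1), monotonicity gives T(2^k) <= T(n) <= T(2^(k+1)), and since both endpoints are within a constant factor of n log n, the bound carries over to all n.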
After finding the time complexity function, we can bound it above and below by other functions. Thus big-O and big-Omega are introduced here again.
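For reference, the usual definitions from week 4, written out (the symbols c and B are the standard constant and breakpoint):

```latex
\[
f \in O(g) \iff \exists c \in \mathbb{R}^{+},\ \exists B \in \mathbb{N},\ \forall n \in \mathbb{N},\ n \ge B \Rightarrow f(n) \le c \cdot g(n)
\]
\[
f \in \Omega(g) \iff \exists c \in \mathbb{R}^{+},\ \exists B \in \mathbb{N},\ \forall n \in \mathbb{N},\ n \ge B \Rightarrow f(n) \ge c \cdot g(n)
\]
```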