# Recent questions tagged time-complexity

Consider the following cases for quicksort to sort an array of $n$ elements $a[0 \dots n-1]$: (i) choosing the pivot element randomly from the given array; (ii) choosing the median element as pivot; (iii) choosing the middle element as pivot. For which of the above cases does quicksort always give $O(n \log n)$ time complexity?
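Only the exact-median pivot guarantees balanced partitions on every input, giving the recurrence $T(n) = 2T(n/2) + O(n) = O(n \log n)$. A minimal sketch (the median is found here by sorting, purely for illustration; a real guarantee would use a linear-time selection algorithm such as median-of-medians):

```python
def quicksort_median_pivot(a):
    # Using the true median as pivot splits the array into two halves
    # of (nearly) equal size, so the recursion depth is O(log n) and
    # each level does O(n) work: O(n log n) in every case.
    if len(a) <= 1:
        return a
    pivot = sorted(a)[len(a) // 2]  # illustrative median selection only
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort_median_pivot(less) + equal + quicksort_median_pivot(greater)

print(quicksort_median_pivot([5, 3, 8, 1, 9, 2, 7]))  # [1, 2, 3, 5, 7, 8, 9]
```

Random and middle-element pivots give O(n log n) only in expectation or on favorable inputs; an adversarial array can still force O(n²).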
For the message below, how many bits does Huffman coding require: abbaabccdabcd
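One way to check the answer numerically is to build the Huffman tree over the symbol frequencies and sum frequency × code length for each symbol. A self-contained sketch using a min-heap:

```python
import heapq
from collections import Counter

def huffman_bits(message):
    # Total encoded length = sum over symbols of (frequency * code depth).
    freqs = Counter(message)
    if len(freqs) == 1:
        return len(message)  # degenerate single-symbol case: 1 bit each
    # Heap entries: (weight, unique tiebreak, {symbol: depth-so-far})
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf in them one level deeper.
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    _, _, depths = heap[0]
    return sum(freqs[s] * depth for s, depth in depths.items())

print(huffman_bits("abbaabccdabcd"))  # 26
```

For this message the frequencies are a:4, b:4, c:3, d:2; the tree assigns every symbol a 2-bit code, so the total is 13 × 2 = 26 bits.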
Let $g(n) = \Omega(n)$, $f(n) = O(n)$, and $h(n) = \Theta(n)$. What is the time complexity of $g(n) \cdot f(n) + h(n)$? How does one solve such questions?
Consider an array A of length n containing numbers between 1 and 10 in arbitrary order. The best sorting algorithm takes $650$ ns when $n = 50$. What time does the algorithm require when $n = 300$?
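Since the values are bounded (1 to 10), counting sort applies and the running time is linear in n, so the time simply scales with the input size:

```python
# Assuming the "best sorting algorithm" here is counting sort, which is
# O(n) when the value range is a fixed constant (1..10).
t50 = 650                      # ns measured for n = 50
t_per_element = t50 / 50       # 13 ns per element
t300 = t_per_element * 300     # linear scaling to n = 300
print(t300)  # 3900.0
```

That is, 300/50 = 6 times the input gives 6 × 650 = 3900 ns.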
What does "the tightest lower bound" mean here? (Please suggest a source for such questions if possible.) https://gateoverflow.in/1026/gate2004-29
1. Let P be a problem. Suppose that the worst-case asymptotic running time complexity of P is in $O(n^2 \lg n)$ and is also in $\Omega(n)$. Now let A be an algorithm that solves P. Which of the following is correct about A: A has worst-case time complexity in $O(\lg n)$; A has ... . The best-case time complexity of the algorithm is $\Theta(n)$. The algorithm executes in time $T(n) = \Omega(n^2)$ for every input.
1. Solve the recurrences: $T(n) = 2T(\sqrt n) + \log n$ and $T(n) = T(\frac{n}{2}) + 2T(\frac{n}{4}) + n$. 2. If $f(n) = n^{1.01}$ and $g(n) = n(\log n)^2$, which one is true: $f(n) = O(g(n))$, $f(n) = \Omega(g(n))$, or $f(n) = \Theta(g(n))$? 3. If $f(n) = n^{\frac{1}{2}}$ and $g(n) = 1$, is it correct that $g(n) = o(f(n))$ (little $o$, not big $O$)? Please solve the above questions.
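For the first recurrence, the standard approach is the substitution $n = 2^m$, which turns the square root into a halving of the exponent:

```latex
T(n) = 2\,T(\sqrt{n}) + \log n, \qquad \text{let } n = 2^m,\ S(m) = T(2^m):\\
S(m) = 2\,T(2^{m/2}) + m = 2\,S(m/2) + m\\
\Rightarrow\ S(m) = \Theta(m \log m) \quad \text{(Master theorem, case 2)}\\
\Rightarrow\ T(n) = \Theta(\log n \cdot \log \log n)
```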
The decreasing asymptotic order of the functions F1: $2^{2^{n+1}}$, F2: $\log^* \log n$, F3: $\log \log^* n$, F4: $(\log n)!$, F5: $n^{\log \log n}$ is: F1, F5, F4, F2, F3 F1, F4, F5, F3, F2 F1, F5, F4, F2, F3 None of these
Consider the following functions: $f(n)=\Omega(n^{3})$, $g(n)=O(n^{2})$, $h(n)=\Theta(n^{2})$. Which one of the following is the correct asymptotic representation of $f(n) + g(n) \cdot h(n)$? $\Omega(n^{3})$ $O(n^{4})$ $\Theta(n^{3})$ $\Omega(n^{4})$
$T(n) = T(k) + T(n-k-1) + \Theta(n)$. Can someone explain how to solve this?
Consider a disk drive rotating at 20,000 rpm with a transfer rate of 80 MB/sec. The disk head takes 4 milliseconds on average to reach the target sector from/to which the data is read/written. Assuming a disk controller overhead of 0.5 milliseconds, what is the time required to read a block of 2 MB from the disk?
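The total access time is seek time + average rotational latency (half a rotation) + controller overhead + transfer time. A quick calculation with the numbers given:

```python
# Average rotational latency is half a rotation.
rpm = 20000
rotation_time_ms = 60_000 / rpm               # 3 ms per full rotation
rotational_latency_ms = rotation_time_ms / 2  # 1.5 ms on average
seek_ms = 4.0                                 # average head positioning time
controller_ms = 0.5                           # controller overhead
transfer_ms = 2 / 80 * 1000                   # 2 MB at 80 MB/s = 25 ms
total_ms = seek_ms + rotational_latency_ms + controller_ms + transfer_ms
print(total_ms)  # 31.0
```

So the read takes 4 + 1.5 + 0.5 + 25 = 31 ms (assuming the 2 MB block is read in one sequential transfer).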
A programmer wants to create a dynamic-array implementation of a stack in which a new array of size n+10 is created every time the current array cannot accommodate more elements. For example, for inserting the first element, an array of size 0+10=10 is created. After inserting 10 elements ... to this new array. What is the time complexity of this stack implementation? O(n) O(log n) O(n log n) O(1)
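Growing by a constant increment (rather than doubling) means the total copying cost over n pushes is quadratic, so the amortized cost per push is O(n), not O(1). A small simulation that counts copied elements (the function name and constant are illustrative):

```python
def total_copy_cost(n, increment=10):
    # Each time the array fills up, a new array of size cap + increment
    # is allocated and all current elements are copied into it.
    cap, size, copies = 0, 0, 0
    for _ in range(n):
        if size == cap:
            copies += size       # copy all existing elements over
            cap += increment
        size += 1
    return copies

# Constant-increment growth copies 0 + 10 + 20 + ... elements in total,
# roughly n^2 / (2 * increment): quadratic overall, O(n) amortized per push.
print(total_copy_cost(1000))  # 49500
```

Compare with doubling growth, where total copying is O(n) and each push is amortized O(1).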
If $n$ is a multiple of 3, then what is the minimum number of multiplications needed to calculate $a^{n}$?
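Exponentiation by squaring gives a useful upper bound to compare against (the exact minimum is the length of the shortest addition chain for n, which squaring does not always achieve). A sketch that counts multiplications:

```python
def power_mult_count(a, n):
    # Computes a**n by repeated squaring and returns (value, number of
    # multiplications used): one per squaring, one more for odd n.
    if n == 1:
        return a, 0
    half, c = power_mult_count(a, n // 2)
    if n % 2 == 0:
        return half * half, c + 1
    return half * half * a, c + 2

print(power_mult_count(2, 12))  # (4096, 4)
```

For example, $a^{12}$ needs 4 multiplications via the chain $a \to a^2 \to a^3 \to a^6 \to a^{12}$, matching what squaring achieves here.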
If $f(n)=\sqrt{n}$ and $g(n)=n^{1+\sin n}$, then: $f(n) = O(g(n))$ $f(n) = \Omega(g(n))$ $f(n) = \Theta(g(n))$ $f(n)$ and $g(n)$ cannot be asymptotically compared
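The key observation is that the exponent $1 + \sin n$ oscillates between 0 and 2 forever, so $g(n)$ is infinitely often far below $\sqrt{n}$ and infinitely often far above it: no asymptotic relation holds. A quick numerical check of the oscillation:

```python
import math

# 1 + sin(n) comes arbitrarily close to both 0 and 2 as n ranges over
# the integers (e.g. sin(11) is near -1, sin(33) is near +1), so
# n^(1+sin n) drops below n^0.001 and rises above n^1.999 infinitely often.
exponents = [1 + math.sin(n) for n in range(1, 100_000)]
print(min(exponents) < 0.001, max(exponents) > 1.999)  # True True
```

Hence neither $O$, $\Omega$, nor $\Theta$ relates $f$ and $g$.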