# Recent questions tagged algorithms

In the knapsack problem, we are given a set of n items, where each item i is specified by a size $s_i$ and a value $v_i$. We are also given a size bound S (the size of our knapsack). The goal is to find the subset of items of maximum total value such that the sum of their sizes is ... place of XXXX in the algo? No statement needed; it is already a complete algorithm. a[S][n]=result a[n][S]=result a[n][n]=result
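A minimal bottom-up sketch (the function name and the small instance in the test are illustrative, not from the question) showing why, with rows indexing items and columns indexing capacities, the final answer lands in a[n][S]:

```python
def knapsack(sizes, values, S):
    n = len(sizes)
    # a[i][w] = best total value using only the first i items with capacity w
    a = [[0] * (S + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(S + 1):
            a[i][w] = a[i - 1][w]                 # skip item i
            if sizes[i - 1] <= w:                 # or take it, if it fits
                a[i][w] = max(a[i][w], values[i - 1] + a[i - 1][w - sizes[i - 1]])
    return a[n][S]  # all n items considered, full capacity S
```

The last-filled cell a[n][S] is the one that accounts for every item and the full size bound, which is why it holds the result.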
The dimensions are A (5 x 10), B (10 x 3), C (3 x 12), D (12 x 3). My answer is 303, the answer given is 405. Which is right?
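One way to settle this is to run the standard matrix-chain-multiplication DP on the dimension vector [5, 10, 3, 12, 3]; a quick sketch:

```python
def matrix_chain_cost(p):
    # p: dimension vector; matrix i is p[i-1] x p[i]
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]   # m[i][j]: min scalar mults for A_i..A_j
    for length in range(2, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = min(m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                          for k in range(i, j))
    return m[1][n]
```

For p = [5, 10, 3, 12, 3] the DP's optimum comes from the split (A·B)·(C·D): 150 + 108 + 5·3·3 = 303, which supports the 303 answer.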
My answer is 14, but ME says 17. Who is right? A free visit to Guruji's Croatian ashram will be awarded if you can help me.
Consider the following pseudocode, which computes the kth power of x:

```
func(x, k){
  IF k = 0 return 1
  ELSE IF k is even return (func(x, k/2)) * (func(x, k/2))
}
```

The time complexity $T(k)$ and space complexity $S(k)$ are
(A) $T(k)=T(k/2)+c$, $S(k)=O(\log k)$
(B) $T(k)=2T(k/2)+c$, $S(k)=O(k)$
(C) $T(k)=2T(k/2)+c$, $S(k)=O(\log k)$
(D) None of these
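As written, the pseudocode makes two recursive calls, giving $T(k)=2T(k/2)+c$; storing the half-result in a variable makes a single call, giving $T(k)=T(k/2)+c$ with $O(\log k)$ stack depth. A runnable sketch of the single-call version (the odd-k case, absent from the question's pseudocode, is filled in here as an assumption):

```python
def power(x, k):
    # Single recursive call: T(k) = T(k/2) + c, stack depth S(k) = O(log k)
    if k == 0:
        return 1
    half = power(x, k // 2)          # compute once, reuse twice
    return half * half if k % 2 == 0 else half * half * x
```

Calling func twice instead, as in the question, recomputes the same value and blows the time up to $\Theta(k)$.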
Say we have an array like this: [2, 4, 1, 19, 6, 18, 20, 5]. Here the $(n/4)$th smallest element is the 2nd smallest element, since the array size is 8; in this array the second smallest element is $2$. Now if we partition around that pivot, we get a recurrence like $T(n)=T(n-1) + O(1) + O(n)$. So isn't this the worst case, where the complexity is $O(n^2)$?
Consider the following instance of the OBST (Optimal Binary Search Tree) problem: $n=4,$ $<a_{1},a_{2},a_{3},a_{4}>=<do,if,int,while>$, $P(1...4)=<3,3,1,1>$, $Q(0...4)=<2,3,1,1,1>$. The cost of the OBST is ___________. Is there any formula to calculate it?
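There is a standard DP recurrence (the one in CLRS): $e[i][j] = \min_{i \le r \le j}\; e[i][r-1] + e[r+1][j] + w[i][j]$, where $w[i][j]$ is the total weight of keys $p_i..p_j$ plus dummies $q_{i-1}..q_j$. A sketch, assuming the given P and Q values are used directly as weights:

```python
def obst_cost(p, q):
    # p[1..n]: key weights (p[0] is padding), q[0..n]: dummy/gap weights
    n = len(p) - 1
    e = [[0] * (n + 1) for _ in range(n + 2)]   # e[i][j]: min cost of keys i..j
    w = [[0] * (n + 1) for _ in range(n + 2)]   # w[i][j]: total weight of subproblem
    for i in range(1, n + 2):
        e[i][i - 1] = q[i - 1]                   # empty subtree: just the dummy
        w[i][i - 1] = q[i - 1]
    for length in range(1, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            e[i][j] = min(e[i][r - 1] + e[r + 1][j] + w[i][j] for r in range(i, j + 1))
    return e[1][n]
```

On this instance (p = 3,3,1,1 and q = 2,3,1,1,1) the DP evaluates to 40.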
What will be the solution for $2^{2^{2^{2}}}$? How do we evaluate it: left to right or right to left?
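Exponentiation is conventionally right-associative, so the tower collapses from the top: $2^{2^{2^{2}}} = 2^{(2^{(2^2)})} = 2^{16} = 65536$. Python's `**` follows the same convention, so it can serve as a quick check:

```python
# Right-associative: 2 ** 2 ** 2 ** 2 is parsed as 2 ** (2 ** (2 ** 2))
tower = 2 ** 2 ** 2 ** 2          # 2 ** 16
left_to_right = ((2 ** 2) ** 2) ** 2   # what left-associative grouping would give
```

Evaluating left to right instead would give only $((2^2)^2)^2 = 256$, which is why the convention matters.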
Consider the following cases for quicksort sorting an array of n elements $a[0...n-1]$: i) choosing the pivot element randomly from the given array; ii) choosing the median element as pivot; iii) choosing the middle element as pivot. For which of the above cases does quicksort always give $O(n \log n)$ time complexity?
For the message below, the number of bits required by Huffman coding is: abbaabccdabcd
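The total encoded length equals the sum of the merged weights over all merges of the Huffman tree, so it can be computed without building codes. A sketch (function name is illustrative):

```python
import heapq
from collections import Counter

def huffman_bits(msg):
    # Total bits = sum, over every merge, of the merged node's weight
    freqs = list(Counter(msg).values())
    heapq.heapify(freqs)
    total = 0
    while len(freqs) > 1:
        a = heapq.heappop(freqs)   # two lightest nodes
        b = heapq.heappop(freqs)
        total += a + b             # every symbol under this node gains one bit
        heapq.heappush(freqs, a + b)
    return total
```

For "abbaabccdabcd" the frequencies are a:4, b:4, c:3, d:2; the merges are 2+3=5, 4+4=8, 5+8=13, totalling 26 bits.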
Consider the following variation on the Interval Scheduling Problem. You have a processor that can operate 24 hours a day, every day. People submit requests to run daily jobs on the processor. Each such job comes with a start time and an end time; if the job is accepted to run on the processor, ... pick the two jobs (9 P.M., 4 A.M.), (1 P.M., 7 P.M.), which can be scheduled without overlapping.
Let $g(n) = \Omega(n)$, $f(n) = O(n)$ and $h(n) = \Theta(n)$; then what is the time complexity of $[g(n) f(n) + h(n)]$? How do we solve such questions?
Consider an array A of length n containing numbers between 1 and 10 in arbitrary order. The best sorting algorithm takes $650$ ns when $n = 50$. Find the time required by the algorithm when n = 300.
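Since the keys come from the fixed range 1..10, counting sort runs in $\Theta(n)$, so (under that assumption) the time should scale linearly: $650 \times 300/50 = 3900$ ns. A sketch of the scaling and of the linear-time sort it rests on:

```python
def extrapolate_linear(t1, n1, n2):
    # Counting sort over a constant key range is Theta(n), so time scales with n
    return t1 * n2 / n1

def counting_sort_1_to_10(a):
    # One pass to count, one pass to rebuild: Theta(n) for keys in 1..10
    counts = [0] * 11
    for x in a:
        counts[x] += 1
    return [k for k in range(1, 11) for _ in range(counts[k])]
```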
Let G = (V,E) be a directed or undirected graph, and suppose that BFS is run on G from a given source vertex s ∈ V. Then upon termination, for each vertex v ∈ V, the value v.d computed by BFS satisfies v.d ≥ δ(s,v) (δ(s,v) is the shortest-path distance from s to v). My question is: in which case is v.d > δ(s,v)? Please explain with an example.
Suppose edge weights are integers in the range 1 to W, where W is a constant. How fast can Kruskal’s algorithm run?
An array is said to be $k$-ordered if $a[i] \leq a[i+k]$ for each $i$ such that $1 \leq i \leq n-k$. For example, the array 1 4 2 6 3 7 5 8 is 2-ordered. In a 2-ordered array of 2N elements, what is the maximum number of positions that an element can be from its position if the array were 1-ordered?
Of the following, which gives the best upper bound for the value of where is a solution of the recurrence , with
Below is an implementation of merge sort.

```
void sort(int[] a, int lo, int hi){
    if(hi <= lo) return;
    int mid = lo + (hi-lo)/2;
    sort(a, lo, mid);
    sort(a, mid+1, hi);
    merge(a, lo, mid, hi);
}
```

Assume that the merge function merges the elements a[lo] to a[mid] and a[mid+1] to a[hi]. ... merge() whenever a[mid] <= a[mid+1]. The number of comparisons needed to merge sort a sorted array is: O(n log n) O(n) O(1) O(log n)
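With the skip optimization, a sorted input costs exactly one comparison per merge call (the a[mid] <= a[mid+1] check), and there are n-1 merge calls, so O(n) total. A counting sketch in Python (Python's `sorted` stands in for the question's merge; on sorted input it never runs, so the count is exact for that case):

```python
def sort(a, lo, hi, count):
    # count[0] accumulates comparisons made by the a[mid] <= a[mid+1] check
    if hi <= lo:
        return
    mid = lo + (hi - lo) // 2
    sort(a, lo, mid, count)
    sort(a, mid + 1, hi, count)
    count[0] += 1
    if a[mid] <= a[mid + 1]:                 # halves already in order: skip merge
        return
    a[lo:hi + 1] = sorted(a[lo:hi + 1])      # stand-in for merge(a, lo, mid, hi)
```

For n = 16 sorted elements the recursion makes 15 merge calls, hence 15 comparisons, matching the n-1 pattern.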
Suppose we are sorting an array of ten integers using some quadratic sorting algorithm. After four iterations of the algorithm's main loop, the array elements are ordered as shown here: 1 2 3 4 5 0 6 7 8 9 Which statement is correct? (Note: Our selection ... C. The algorithm might be insertion sort, but could not be selection sort. D. The algorithm is neither selection sort nor insertion sort.
What does the tightest lower bound mean here? (Please suggest a source for such questions if possible.) https://gateoverflow.in/1026/gate2004-29
We know that the longest common subsequence of two sequences X and Y, of lengths m and n respectively, can be determined in O(mn). But if we only have to determine the length of the longest common subsequence, and not the actual subsequence, can we compute it in O(n+m)? Can anyone please suggest an algorithm with complexity O(n+m)?
1. Let P be a problem. Suppose that the worst-case asymptotic running time complexity of P is in O($n^2$lg n) and is also in Ω(n). Now let A be an algorithm that solves P. Which of the following is correct about A: A has worst-case time complexity in O(lg n) A has ... . The best-case time complexity of the algorithm is θ(n). The algorithm executes in time T(n) = Ω($n^2$) for every input.
1. Solve the recurrences: $T(n) = 2T(\sqrt n) + \log\ n$; $T(n) = T(\frac{n}{2}) + 2T(\frac{n}{4}) + n$. 2. $f(n) = n^{1.01}$ and $g(n) = n(\log\ n)^2$; then which one is true: $f(n) = O(g(n))$, $f(n) = \Omega(g(n))$, $f(n) = \Theta(g(n))$? 3. $f(n) = n^{\frac{1}{2}}$, $g(n) = 1$; then is it correct that $g(n) = o(f(n))$? ($o$ = little o, not big $O$.) Please solve the above questions.
An unsorted array of n elements is given to you and you have to print the largest 10% of them. This can be done in time: O(1) O(n) O(logn) O(nlogn)
To merge two sorted sequences of sizes m and n, min(m,n) comparisons are required in the best case, and the remaining elements of the bigger sequence are copied directly. So the answer should be min(20,24)+min(30,35)+min(44,50)+min(65,94) = 159. Here is the link to the question: https://gateoverflow.in/1997/gate2014-2-38
Let G be a directed graph whose vertex set is the set of numbers from 1 to 100. There is an edge from vertex i to a vertex j iff either j = i+1 or j = 4i. The minimum number of edges in a path in G from vertex 1 to 100 is? I am getting 6 as the minimum number of edges, but the given answer is 9. The path I'm following: 1 → 4 → 5 → 6 → 24 → 25 → 100
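Since the graph is unweighted, BFS from vertex 1 gives the minimum edge count directly; a quick sketch to check (function name is illustrative):

```python
from collections import deque

def min_edges(src, dst, limit=100):
    # BFS over the implicit graph: edges i -> i+1 and i -> 4i, vertices 1..limit
    dist = {src: 0}
    dq = deque([src])
    while dq:
        i = dq.popleft()
        for j in (i + 1, 4 * i):
            if j <= limit and j not in dist:
                dist[j] = dist[i] + 1
                dq.append(j)
    return dist.get(dst)
```

BFS finds 100 at distance 6 (e.g. via 1 → 4 → 5 → 6 → 24 → 25 → 100, every hop being either +1 or ×4), which supports the asker's 6 over the claimed 9.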
The recursive equation of the $0-1$ Knapsack Problem is $V[i,w] = max(v_{i}+V[i-1,w-w_{i}],V[i-1,w])\ for\ 1\leqslant i\leqslant n,\ 0\leqslant w\leqslant W$ Could anyone give me an example of weights, profits and Knapsack size where $(V[i-1,w]) \geqslant (v_{i}+V[i-1,w-w_{i}])$? Source: http://www.es.ele.tue.nl/education/5MC10/Solutions/knapsack.pdf
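A hypothetical instance (not from the linked source) where skipping an item beats taking it: item 1 has $w_1=1$, $v_1=10$; item 2 has $w_2=2$, $v_2=5$; $W=2$. At $i=2$, $w=2$: $V[1,2]=10$ but $v_2+V[1,0]=5$, so $V[i-1,w] \geq v_i+V[i-1,w-w_i]$. A sketch that builds the table:

```python
def knapsack_table(weights, values, W):
    # V[i][w] = best value using the first i items with capacity w
    n = len(weights)
    V = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(W + 1):
            V[i][w] = V[i - 1][w]                 # skip item i
            if weights[i - 1] <= w:               # or take it
                V[i][w] = max(V[i][w], values[i - 1] + V[i - 1][w - weights[i - 1]])
    return V
```

Intuitively, any instance where a light high-value item already fills the knapsack makes the "skip" branch win for the remaining heavier, lower-value items.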
$T(n) = T(\frac{n}{k}) + T(\frac{3n}{4}) + n$ for $n \geq 2$; $T(1) = 1$. Which of the following is false? T(n) is O($n^{\frac{3}{2}}$) when k=3 T(n) is O(n log n) when k=3 T(n) is O(n log n) when k=4 T(n) is O(n log n) when k=5
The decreasing asymptotic order of the functions: F1:$2^{2^{n+1}}$ F2: log*log n F3: log log* n F4: (log n)! F5: $n^{log log n}$ F1,F5,F4,F2,F3 F1,F4,F5,F3,F2 F1,F5,F4,F2,F3 None of these
Consider the following recurrence relation for some unknown recursive algorithm; assume a suitable termination condition. $T(n)=\sqrt{n}\, T(\sqrt{n}) + \sqrt{n}$ What is the tightest time bound? O(log log n) O(n log log n) O(n) O(n log $\log^{2}n$)
In an m×n matrix with sorted rows and sorted columns, what is the time complexity to insert a new element?
In a min-heap of n elements, the Kth smallest element can be found in: O(n) O(nlogn) O(logn) O(1)
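For context on this question: the Kth smallest can be found without touching most of the heap by keeping a small candidate heap of frontier nodes, which takes $O(K \log K)$; for constant K that is O(1), and O(n) is always a safe upper bound. A sketch (function name is illustrative):

```python
import heapq

def kth_smallest_in_minheap(heap, k):
    # heap: list satisfying the min-heap property (heap[i] <= its children)
    # Candidate heap holds (value, index) pairs on the frontier of explored nodes.
    cand = [(heap[0], 0)]
    for _ in range(k - 1):
        val, i = heapq.heappop(cand)          # discard the next smallest
        for child in (2 * i + 1, 2 * i + 2):  # its children become candidates
            if child < len(heap):
                heapq.heappush(cand, (heap[child], child))
    return cand[0][0]
```

The candidate heap never exceeds k+1 entries, which is where the $O(K \log K)$ bound comes from.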
Consider the following functions: $f(n)=\Omega(n^{3})$ $g(n)=O(n^{2})$ $h(n)=\Theta (n^{2})$ Which one of the following is the correct asymptotic representation of f(n) + g(n) * h(n) ? $\Omega (n^{3})$ O($n^{4}$) $\Theta (n^{3})$ $\Omega (n^{4})$
What is the time complexity of finding the maximum profit in the greedy fractional-knapsack algorithm if the number of items is N = $2^{n}$? O(n log n) O($n^{2}$log n) O($2^{n}$) O(n.$2^{n}$)