Guest alberthendri Posted August 10, 2004 The fastest known algorithm to sort an array of random numbers runs in O(n log n) time in the worst case. Has it been proved that no faster algorithm can exist? Is there any place I can find such a proof (hopefully with a tutorial)?
Dave Posted August 10, 2004 As far as I'm aware, there's no proof saying that the quick sort (which I think is the one you're talking about) is the fastest algorithm, although I think it'd be quite hard to think of a much faster one.
e(ho0n3 Posted August 10, 2004 In the worst case, quicksort takes O(n²) time to sort a list of n items. Algorithms that use divide-and-conquer methods for sorting (like quicksort and mergesort) generally have average running times on the order of O(n log n). I could do a proof for you if you desire.
bloodhound Posted August 10, 2004 How exactly does this quicksort algorithm work?
pulkit Posted August 11, 2004 You can very easily prove that any sorting algorithm based on comparisons can't run faster than O(n log n). There are sorting algorithms that are not based on comparisons and typically run faster than O(n log n); however, they use some extra information about the numbers being sorted. For example, you can sort n integers in the range 0..m-1 in O(n+m) time using a bucket sort algorithm. The time complexity analysis for quicksort is an asymptotic one and gives the average-case complexity of O(n log n); the worst case is O(n²).
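Here is a minimal sketch of the kind of non-comparison sort pulkit describes: a counting sort (a simple form of bucket sort) for n integers known to lie in the range 0..m-1, which runs in O(n+m) time. The comparison lower bound it sidesteps comes from the fact that a comparison sort must distinguish all n! possible orderings, and log2(n!) = Θ(n log n). The function and variable names here are illustrative only.

```python
def counting_sort(items, m):
    """Sort a list of integers in the range 0..m-1 in O(n + m) time."""
    counts = [0] * m                    # one bucket per possible value: O(m)
    for x in items:
        counts[x] += 1                  # tally each value: O(n)
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)  # emit each value 'count' times
    return result

print(counting_sort([4, 1, 3, 1, 0, 4], m=5))  # [0, 1, 1, 3, 4, 4]
```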
e(ho0n3 Posted August 11, 2004 How exactly does this quicksort algorithm work? Basically you pick an element x from the list you want to sort, place all the elements greater than or equal to x on one side of the list, and the rest on the other. Then you recursively do the same procedure on both sides until the list is sorted. You can always google to find out more.
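A short sketch of that idea in Python, assuming the pivot is simply the first element (the names are illustrative, not from any particular library):

```python
def quicksort(items):
    """Sort a list by recursively partitioning around a pivot element."""
    if len(items) <= 1:                        # 0 or 1 elements: already sorted
        return items
    pivot, rest = items[0], items[1:]          # naive pivot choice: first element
    smaller = [x for x in rest if x < pivot]   # elements less than the pivot
    larger = [x for x in rest if x >= pivot]   # elements greater than or equal
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([3, 6, 1, 8, 2, 9, 4]))  # [1, 2, 3, 4, 6, 8, 9]
```

If the pivot splits the list badly at every step (for instance, the input is already sorted and you always pick the first element), the recursion depth becomes n and you hit the O(n²) worst case mentioned above.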
pulkit Posted August 12, 2004 To improve your quicksort, you can pick a subset of k elements from the collection to be sorted and then partition the numbers about the median of these k elements. Since the number of elements you pick each time is a constant, it does not affect the asymptotic time complexity.
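A rough sketch of that pivot rule, assuming k = 3 (the common "median of three" heuristic); the sampling strategy and helper name are illustrative only:

```python
import random

def median_of_k_pivot(items, k=3):
    """Pick a pivot as the median of k randomly sampled elements.

    Sampling a constant k elements and taking their median costs O(1)
    for fixed k, so it leaves quicksort's asymptotic running time
    unchanged while making very unbalanced splits less likely.
    """
    sample = random.sample(items, min(k, len(items)))
    return sorted(sample)[len(sample) // 2]

# Example: choose a pivot for one partitioning step.
print(median_of_k_pivot([9, 1, 7, 3, 8, 2, 5]))
```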
Guest TheDon1 Posted September 11, 2004 Why is QuickSort O(N log N)? Can someone explain? I can see why its growth rate has a log N factor because of the recursive partitioning, but why exactly is that multiplied by N? It must surely have something to do with the comparisons made N times throughout each recursive call, but I cannot tell exactly. Thanks for reading, peace!
pulkit Posted September 11, 2004 There is a formal proof wherein you analyse the algorithm asymptotically. It is not a trivial one, hence the complexity does not come out intuitively. http://www.cse.iitd.ernet.in/~suban/CSL102/lecture/node55.html Look at the above link; it's from the course webpage of a first-year course I had taken. The proof is given right at the end of the page.
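For anyone who doesn't want to chase the link, here is a sketch of the standard average-case recurrence that such proofs solve, assuming a uniformly random pivot (this is not a transcription of that page):

$$
T(n) = (n-1) + \frac{1}{n}\sum_{i=0}^{n-1}\bigl(T(i) + T(n-1-i)\bigr)
     = (n-1) + \frac{2}{n}\sum_{i=0}^{n-1} T(i),
$$

where the (n-1) term counts the comparisons done while partitioning and the sum averages over the n equally likely pivot positions. Solving it gives T(n) = O(n log n). Intuitively, there are O(log n) levels of recursion on average and each level does O(n) comparison work in total, which is where the extra factor of N in TheDon1's question comes from.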