Programming assignment: Do the "Empirical study of sorting algorithms"

Description

Programming assignment: Do the "Empirical study of sorting algorithms" study in Chapter 7, Section 8, and provide your own tables like Figure 7.20, running the eight algorithms (notice the duplicate, improved versions: Shell-O, Merge-O, Quick-O, Heap-O, and the Radix-4 and Radix-8 sorts), just like Section 7.8 in the textbook. At a minimum, run your comparison under Windows 7-10, or optionally UNIX (Linux or Solaris), in the classroom, on T-MARC, or at home.

Compile the files for the sort algorithms in Chapter 7, compare the running times, and test the programs from your textbook on: exchange sorts (insertion, selection, bubble), quicksort, quicksort improved, shellsort with powers of 2, shellsort with powers of 3, heapsort, mergesort, mergesort improved, and radixsort/8. I want a short evaluation and calibration of your timing/profiling software, as described in class; ask me if you need any details. I leave it in your hands how deep you want to go with this. In addition, you may use a profiler on each of UNIX and Windows. (A sketch of one possible timing harness follows this description.)

You need to turn in the code, the results of your test runs, and, very important, a report on comparisons based on a running-time table and graphics. Your comparisons will look at running time for at least three different list sizes. Generate random lists of integers (at least 5 sizes) of size 100, 1,000, 10,000, 100,000, and 1,000,000 for input files; you may try going further until the runtime becomes significantly expensive, say more than 15 minutes. You may also want to compare the running time for the same code and input but under different compilers and operating systems (a good job will earn you extra credit). The project asks that you do a job similar to Section 7.8, pp. 259-261: compare the running times of the different algorithms, see how this backs up our theoretical results, and see if you can find the proportionality constant.
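For the timing/calibration part, here is a minimal sketch of a timing harness in C++ (C++14 or later). It is not the textbook's code: std::sort stands in as a placeholder for each Chapter 7 routine, and the helper names (timer_resolution_us, random_list, time_sort), the seed, and the value range are illustrative assumptions, not part of the assignment.

// timing_harness.cpp -- minimal sketch of a timing harness for the sorting study.
// Assumptions (not from the assignment): steady_clock for timing, and
// std::sort as a placeholder for each Chapter 7 sort routine.
#include <algorithm>
#include <chrono>
#include <iostream>
#include <random>
#include <vector>

using Clock = std::chrono::steady_clock;

// Estimate the timer's effective resolution: the smallest nonzero interval
// the clock reports. Useful for the "calibration" part of the report.
double timer_resolution_us() {
    auto t0 = Clock::now();
    auto t1 = Clock::now();
    while (t1 == t0) t1 = Clock::now();   // spin until the clock ticks
    return std::chrono::duration<double, std::micro>(t1 - t0).count();
}

// Build a random integer list of the requested size (one input file's worth).
std::vector<int> random_list(std::size_t n, unsigned seed) {
    std::mt19937 gen(seed);
    std::uniform_int_distribution<int> dist(0, 1000000);
    std::vector<int> v(n);
    for (auto& x : v) x = dist(gen);
    return v;
}

// Time one sorting routine on a fresh copy of the data, in milliseconds.
template <typename Sort>
double time_sort(Sort sort, std::vector<int> data) {   // copy by value on purpose
    auto start = Clock::now();
    sort(data.begin(), data.end());
    auto stop = Clock::now();
    return std::chrono::duration<double, std::milli>(stop - start).count();
}

int main() {
    std::cout << "timer resolution ~ " << timer_resolution_us() << " us\n";
    // The five list sizes named in the assignment.
    for (std::size_t n : {100u, 1000u, 10000u, 100000u, 1000000u}) {
        auto data = random_list(n, 42);
        // std::sort is only a placeholder; substitute each Chapter 7
        // routine (insertion, selection, bubble, Shell, merge, quick,
        // heap, radix) here to fill in one row of the table.
        double ms = time_sort(
            [](auto first, auto last) { std::sort(first, last); }, data);
        std::cout << "n = " << n << "  time = " << ms << " ms\n";
    }
    return 0;
}

Compiling this same file with, for example, g++ -O2 on Linux and with your Windows compiler gives you the cross-compiler/OS comparison mentioned above; each textbook sort dropped into the placeholder fills one row of your Figure 7.20-style table.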


2. Optional - Extra credit - Internet research individual reports: Do a Google search of your choice on a topic related to this class, Data Structures 2 (see the textbook table of contents for keywords, or visit the link indicated in the Lab 1 CSCI230 handout). In the past, for example, I had the topic "137GB Hard Drive limit", related to Chapter 8, to learn about hard drive capacity barriers, the HD CHS (cylinder-head-sector) geometry, and more. This problem was addressed by a number of articles; as they were posted around Spring 2003, some may be unavailable. One that I recommended was "Hard Drive Size Limitations and Barriers" at www.dewassoc.com/kbase/hard_drives/hard_drive_size_barriers.htm, or the white paper posted by Maxtor at www.maxtor.com; see also Wikipedia or https://www.extremetech.com/computing/58672-future-storage-the-view-from-2001. Find some interesting papers on this or another topic that also comes up in our textbook, Chapter 8.2. Explain in a few paragraphs (no more than 2 pages) what the historical size-barrier limitations were, why they appeared, and what is involved, i.e., the hardware (motherboard), BIOS, operating system, anything else. Are we likely to have a similar problem in the next few years, and if not, why?
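For orientation (this arithmetic is general background knowledge, not from the handout): the 137 GB figure comes from the 28-bit logical block addressing of the original ATA interface combined with 512-byte sectors:

2^28 sectors x 512 bytes/sector = 137,438,953,472 bytes, i.e., about 137 GB (= 128 GiB).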
