|Institution:||University of Cincinnati|
|Department:||Engineering : Computer Science|
|Keywords:||Computer Science; Heuristics; FPGA; CPU Time Measurement; Experimental Design|
|Full text PDF:||http://rave.ohiolink.edu/etdc/view?acc_num=ucin1006804309|
Chairperson of the Supervisory Committee: Professor Carla Purdy, Department of Electrical and Computer Engineering and Computer Science

Many of the problems arising in electronic design automation are NP-hard and require heuristic algorithms, which can only be analyzed and compared through experiments. Unfortunately, it is often unclear to experimenters whether enough data has been gathered, which statistics to run, and how to visualize results. The purpose of this paper is to demonstrate a general experimental design procedure for evaluating heuristics, along with a new evaluation tool developed by B. Billups and this author to help with data analysis and visualization. To demonstrate the procedure and tool, sorting algorithms are first implemented and compared as a short example. FPGA routing is then selected as a target NP-hard problem, and three heuristic algorithms, PathFinder, Frontier, and VPR430, are statistically compared using experimental data collected under a careful experimental design. As part of this work we also discuss in detail how to measure the processor time consumed by an algorithm. Results show that VPR430 and Frontier are significantly faster than PathFinder, but at the cost of some solution quality: PathFinder produces the best solutions but takes much longer than the other two algorithms. These results are consistent with those in the literature, but they provide a more detailed analysis of the three algorithms and show the benefit of more thorough experimentation.
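The abstract mentions measuring the processor time consumed by an algorithm. A minimal sketch of one common approach, assuming nothing about the thesis's actual method: use per-process CPU time (here via Python's `time.process_time`) rather than wall-clock time, repeat the measurement over several trials, and summarize with basic statistics. The function and trial count below are illustrative, not taken from the thesis.

```python
import statistics
import time

def measure_cpu_time(func, *args, trials=10):
    """Run func repeatedly and return per-trial CPU times in seconds.

    Uses process CPU time rather than wall-clock time so that other
    processes running on the machine do not inflate the measurement.
    """
    times = []
    for _ in range(trials):
        start = time.process_time()
        func(*args)
        times.append(time.process_time() - start)
    return times

# Illustrative example: time a library sort, echoing the paper's use of
# sorting algorithms as a small demonstration case.
data = list(range(10000, 0, -1))
samples = measure_cpu_time(sorted, data, trials=5)
mean = statistics.mean(samples)
spread = statistics.stdev(samples)
```

Repeated trials matter because a single timing is subject to cache effects and scheduler noise; the spread across trials gives a first indication of whether enough data has been gathered.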