Date of Award


Document Type

Honors Thesis


Engineering & Computer Science

First Advisor

Rodney Summerscales


Abstract

Machine learning models are opaque and expensive to develop. Neural architecture search (NAS) algorithms automate this process by learning to construct high-performing ML networks, reducing both the cost of and the bias introduced by human experts. In this emerging field, most research has focused on optimizing some novel combination of NAS's three components (search space, search strategy, and performance estimation strategy). Although this practice regularly achieves state-of-the-art results, it sacrifices substantial computing time and resources for slight gains in accuracy, and it obstructs performance comparison across papers. To address this issue, we use the modular NASLib library to test the efficiency of each component across a selected subset of combinations. Each NAS algorithm produces an ML image-recognition model, evaluated on the CIFAR-10 dataset, and the models are then compared for efficiency, a ratio between compute time and accuracy.
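The comparison described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the thesis's actual code: the function name `efficiency`, the run names, and the accuracy/compute figures are invented, and efficiency is taken here as accuracy per unit of compute time (the exact ratio used in the thesis may differ).

```python
# Hypothetical sketch: rank NAS runs by efficiency, taken here as
# accuracy per GPU-hour of search. All names and numbers are
# illustrative, not results from the thesis.

def efficiency(accuracy: float, compute_hours: float) -> float:
    """Accuracy gained per GPU-hour of search; higher is better."""
    return accuracy / compute_hours

# Invented results for three NAS runs on CIFAR-10.
runs = {
    "run_a": {"accuracy": 0.94, "compute_hours": 12.0},
    "run_b": {"accuracy": 0.95, "compute_hours": 48.0},
    "run_c": {"accuracy": 0.93, "compute_hours": 6.0},
}

# Sort run names from most to least efficient.
ranked = sorted(runs, key=lambda r: efficiency(**runs[r]), reverse=True)
print(ranked)  # → ['run_c', 'run_a', 'run_b']
```

Note the trade-off the metric captures: `run_b` has the best accuracy but ranks last because its small accuracy gain costs four times the compute of `run_a`.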

Subject Area

Machine learning; Computer algorithm; Neural architecture search
