Engineering & Computer Science
Machine learning models are complex and expensive to develop. Neural architecture search (NAS) algorithms automate this process by learning to construct high-performing networks, reducing both the bias and the cost of relying on human experts. In this emerging field, most research has focused on optimizing a novel combination of NAS's three components: the search space, the search strategy, and the performance estimation strategy. Although this practice regularly achieves state-of-the-art results, it trades substantial compute time and resources for slight gains in accuracy, and it obstructs performance comparisons across papers. To address this issue, we use the modular NASLib library to measure the efficiency of each module across a selected subset of combinations. Each NAS algorithm produces an image recognition model, evaluated on the CIFAR-10 dataset, and the resulting models are compared on efficiency, a ratio between compute time and accuracy.
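The efficiency metric the abstract describes can be sketched as a simple ratio. The following is a minimal illustration, not code from the thesis: the function name, the algorithm names, and all numbers are hypothetical placeholders, not reported results.

```python
# Hypothetical sketch of the efficiency comparison described in the
# abstract: accuracy achieved per unit of search compute time.
# All values below are made-up placeholders, not thesis results.

def efficiency(accuracy: float, compute_hours: float) -> float:
    """Ratio of final test accuracy to GPU-hours spent searching."""
    return accuracy / compute_hours

# Placeholder (algorithm, CIFAR-10 accuracy, GPU-hours) tuples.
runs = [
    ("random_search", 0.90, 2.0),
    ("regularized_evolution", 0.93, 8.0),
    ("darts", 0.94, 4.0),
]

# Rank NAS runs by efficiency rather than by raw accuracy alone.
ranked = sorted(runs, key=lambda r: efficiency(r[1], r[2]), reverse=True)
for name, acc, hours in ranked:
    print(f"{name}: {efficiency(acc, hours):.3f} accuracy/GPU-hour")
```

Under such a metric, a slightly less accurate but much cheaper search can rank above a state-of-the-art one, which is the trade-off the thesis sets out to quantify.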
Dulcich, Joshua, "Exploring the Efficiency of Neural Architecture Search (NAS) Modules" (2022). Honors Theses. 263.
Machine learning; Computer algorithm; Neural architecture search
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.