A Synthetic Brain Benchmark Suite
CortexSuite is a brain-inspired benchmark suite containing a comprehensive array of machine learning, natural language processing, and computer vision algorithms, along with a real-world dataset for each algorithm. It is used by computer architects, chip designers, and anyone who wants clean implementations of these algorithms.
| Benchmark | Category |
|---|---|
| Spectral Clustering | Machine Learning |
| Latent Dirichlet Allocation | Machine Learning |
| Principal Component Analysis | Feature Extraction |
| Singular Value Decomposition | Feature Extraction |
| Motion Estimation | Computer Vision |
| Super Resolution Enhancement | Computer Vision |
| Convolutional Neural Network | Neural Networks |
| Sensor Fusion | Internet of Things |
| Restricted Boltzmann Machine | Neural Networks |
| Speech Recognition | Language Processing |
| Disparity Map | Computer Vision |
| Feature Tracking | Computer Vision |
| Image Segmentation | Computer Vision |
| Robot Localization | Computer Vision |
| Image Stitch | Computer Vision |
| Texture Synthesis | Computer Vision |
These days, many traditional end-user applications are said to "run fast enough" on existing machines, so the search continues for novel applications that can leverage the new capabilities of our evolving hardware. Foremost among these potential applications are those clustered around information processing capabilities that humans have today but that computers still lack.
The fact that brains can perform these computations serves as an existence proof that these applications are realizable. At the same time, we often discover that the human nervous system, with its 80 billion neurons, on some metrics, is more powerful and energy-efficient than today's machines. Both of these aspects make this class of applications a desirable target for an architectural benchmark suite, because there is evidence that these applications are both useful and computationally challenging. CortexSuite seeks to capture this workload.
We classify and identify benchmarks within CortexSuite by analogy to human neural processing function, using the major lobes of the cerebral cortex as a model for organizing and classifying data processing algorithms. To be clear, our goal is not to emulate the brain at the level of the neuron, but rather to collect synthetic, man-made algorithms that serve similar functions and have met with success in the real world.
We consulted six world-class machine learning and computer vision researchers, who collectively hold 83,091 citations across their distinct subareas, and asked them to identify newly emerging, computationally intensive algorithms or applications that are likely to have a large impact over the next ten years. Each benchmark is coupled with datasets that reflect practical use of the algorithms, and is coded in "clean C" so as to make it accessible, analyzable, and usable for parallel and approximate compiler and architecture researchers alike.
CortexSuite debuted at IISWC on October 26, 2014, in Raleigh, North Carolina. The published paper can be found here:
CortexSuite: A Synthetic Brain Benchmark Suite (pdf)(bib entry)
Shelby Thomas, Chetan Ghohkale, Enrico Tanuwidjaja, Tony Chong,
David Lau, Saturnino Garcia, and Michael Bedford Taylor.
IEEE International Symposium on Workload Characterization (IISWC), Oct 2014.
The extended version of the paper can be found here: CortexSuite: A Synthetic Brain Benchmark Suite (Extended)
CortexSuite was developed at UC San Diego under the guidance of Michael B. Taylor and Saturnino Garcia with a diverse research staff of PhD and master's students.
This work was partially supported by NSF Awards 0846152, 1018850, and 1228992, and by C-FAR, part of STARnet, a Semiconductor Research Corporation program.