Apr 4, 2021
Automated neural architecture search (NAS) methods have been demonstrated as a powerful tool to facilitate neural architecture design. However, the broad applicability of NAS has been restrained by the difficulty of designing task-specific search spaces and the necessity of implementing every NAS component from scratch when switching to another search space. In this work, we propose ModularNAS, a framework that implements the essential components of NAS in a modularized and unified manner. It enables automatic search space generation for customized use cases while reusing predefined search strategies, allowing customized NAS use cases with little extra work. We conduct extensive experiments to verify the improved model performance on various tasks by reusing supported NAS components over customized search spaces. We have also shown that, targeting existing architectures, ModularNAS can find superior ones in terms of accuracy and deployment efficiency, such as latency and FLOPS. The source code of our framework can be found at https://bit.ly/3gHXG.
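The modular design described in the abstract — generating a search space for a custom use case and reusing a predefined search strategy over it — can be illustrated with a minimal sketch. All names here (`generate_search_space`, `random_search`, the candidate operations, and the toy objective) are illustrative assumptions, not the actual ModularNAS API:

```python
import random

# Candidate operations per layer (illustrative, not ModularNAS's own set).
CANDIDATE_OPS = ["conv3x3", "conv5x5", "skip", "maxpool"]

def generate_search_space(num_layers):
    """Automatically generate a search space: each layer independently
    chooses one of the candidate operations."""
    return [list(CANDIDATE_OPS) for _ in range(num_layers)]

def random_search(space, evaluate, budget=20, seed=0):
    """A predefined, reusable search strategy: sample architectures
    uniformly from the space and keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = [rng.choice(layer_ops) for layer_ops in space]
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

def toy_evaluate(arch):
    """Toy proxy objective; in a deployment-aware search this would
    combine accuracy with latency/FLOPS penalties."""
    return sum(op == "skip" for op in arch)

space = generate_search_space(num_layers=4)
arch, score = random_search(space, toy_evaluate)
print(arch, score)
```

Because the strategy only depends on the search-space interface (a list of per-layer choices), it can be reused unchanged when the space is regenerated for a different task, which is the decoupling the framework advocates.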
The Conference on Machine Learning and Systems targets research at the intersection of machine learning and systems. The conference aims to elicit new connections amongst these fields, including identifying best practices and design principles for learning systems, as well as developing novel learning methods and theory tailored to practical machine learning workflows.