Working with Neural Architecture Search, Part 2 (Machine Learning)

Monodeep Mukherjee
2 min read · Jan 17, 2023
1. Development of a Neural Network-Based Mathematical Operation Protocol for Embedded Hexadecimal Digits Using Neural Architecture Search (NAS) (arXiv)

Authors: Victor Robila, Kexin Pei, Junfeng Yang

Abstract: It is beneficial to develop an efficient machine-learning-based method for addition using embedded hexadecimal digits. Through a comparison between a human-developed machine learning model and models sampled through Neural Architecture Search (NAS), we determine an efficient approach to this problem, with a final testing loss of 0.2937 for the human-developed model.
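To make the task concrete, here is a minimal sketch, assuming a PyTorch setup: embed each hexadecimal digit (0 through F) and train a small network to predict the sum of two digits. The architecture, sizes, and training loop below are illustrative assumptions, not the authors' human-developed model or any NAS-sampled one.

```python
# Minimal sketch (illustrative, not the paper's code): learn single-digit
# hexadecimal addition from learned digit embeddings.
import torch
import torch.nn as nn

class HexAdder(nn.Module):
    def __init__(self, embed_dim=8, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(16, embed_dim)  # one vector per hex digit 0..F
        self.mlp = nn.Sequential(
            nn.Linear(2 * embed_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 31),  # possible sums: 0..30
        )

    def forward(self, a, b):
        x = torch.cat([self.embed(a), self.embed(b)], dim=-1)
        return self.mlp(x)

# Enumerate all 16 x 16 digit pairs as the dataset.
a, b = torch.meshgrid(torch.arange(16), torch.arange(16), indexing="ij")
a, b = a.flatten(), b.flatten()
target = a + b

model = HexAdder()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    opt.zero_grad()
    loss = loss_fn(model(a, b), target)
    loss.backward()
    opt.step()

print("final loss:", loss.item())
print("accuracy:", (model(a, b).argmax(-1) == target).float().mean().item())
```

A NAS procedure would search over choices such as the embedding width, hidden size, and depth rather than fixing them by hand, which is the comparison the paper makes against its human-developed baseline.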

2. NAS-LID: Efficient Neural Architecture Search with Local Intrinsic Dimension (arXiv)

Authors: Xin He, Jiangchao Yao, Yuxin Wang, Zhenheng Tang, Ka Chu Cheung, Simon See, Bo Han, Xiaowen Chu

Abstract: One-shot neural architecture search (NAS) substantially improves the search efficiency by training one supernet to estimate the performance of every possible child architecture (i.e., subnet). However, the inconsistency of characteristics among subnets incurs serious interference in the optimization, resulting in poor performance ranking correlation of subnets. Subsequent explorations decompose supernet weights via a particular criterion, e.g., gradient matching, to reduce the interference; yet they suffer from huge computational cost and low space separability. In this work, we propose a lightweight and effective local intrinsic dimension (LID)-based method, NAS-LID. NAS-LID evaluates the geometrical properties of architectures by calculating the low-cost LID features layer by layer, and the similarity characterized by LID enjoys better separability compared with gradients, which thus effectively reduces the interference among subnets. Extensive experiments on NASBench-201 indicate that NAS-LID achieves superior performance with better efficiency. Specifically, compared to the gradient-driven method, NAS-LID can save up to 86% of GPU memory overhead when searching on NASBench-201. We also demonstrate the effectiveness of NAS-LID on ProxylessNAS and OFA spaces. Source code: https://github.com/marsggbo/NAS-LID.
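The key ingredient, a per-layer LID feature, is cheap to sketch. Below is a minimal illustration assuming the classical Levina-Bickel maximum-likelihood LID estimator over k nearest neighbors; the paper's exact estimator and how it uses LID to partition the supernet live in the linked repository. The layer shapes, the value of k, and the random activations here are placeholder assumptions.

```python
# Minimal sketch (illustrative, not the official NAS-LID code): estimate each
# layer's local intrinsic dimension (LID) from activations, then compare two
# subnets by the similarity of their per-layer LID profiles.
import torch
import torch.nn.functional as F

def lid_mle(activations: torch.Tensor, k: int = 20) -> float:
    """Levina-Bickel MLE of LID for activations of shape (n_samples, dim)."""
    d = torch.cdist(activations, activations)  # pairwise Euclidean distances
    d.fill_diagonal_(float("inf"))             # exclude self-distances
    knn, _ = d.topk(k, largest=False)          # k nearest distances, ascending
    # Per-sample estimate: -1 / mean(log(r_i / r_k)) over i = 1..k-1.
    ratios = torch.log(knn[:, :-1] / knn[:, -1:].clamp_min(1e-12))
    return (-1.0 / ratios.mean(dim=1)).mean().item()

def lid_profile(layer_activations):
    """Per-layer LID feature vector for one subnet."""
    return torch.tensor([lid_mle(a) for a in layer_activations])

# Toy example: two "subnets" summarized by random activations at three layers.
torch.manual_seed(0)
subnet_a = [torch.randn(256, dim) for dim in (64, 128, 256)]
subnet_b = [torch.randn(256, dim) for dim in (64, 128, 256)]

sim = F.cosine_similarity(lid_profile(subnet_a), lid_profile(subnet_b), dim=0)
print("LID similarity between subnets:", sim.item())
```

Because each layer is summarized by a single LID scalar, comparing profiles like this is far lighter than comparing full gradient tensors, which is consistent with the GPU-memory savings the abstract reports.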


Monodeep Mukherjee

Universe Enthusiast. Writes about Computer Science, AI, Physics, Neuroscience, Technology, and Front-End and Back-End Development.