Taiwan AI Labs: AutoML for Genomic AI

Efficient Neural Architecture Search (ENAS)

ezGeno implements a lightweight version of ENAS, named ezNAS. Briefly, ezNAS differs from ENAS in two ways: its sampling strategy, which debiases the search process, and its design of residual connections, which is greatly simplified. [1]
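To make the idea concrete, below is a minimal sketch of how a NAS-style sampler might draw candidate architectures: each searchable layer gets a convolution choice plus a single binary flag for a residual (skip) connection. This is only an illustration under assumptions; the operation names in CANDIDATE_OPS, the uniform sampling, and the sample_architecture helper are hypothetical and do not reproduce ezNAS's actual controller or search space.

```python
import random

# Hypothetical candidate operations for each searchable layer;
# the real ezGeno/ezNAS search space is defined by the package, not here.
CANDIDATE_OPS = ["conv_k3", "conv_k7", "conv_k11", "conv_k15"]


def sample_architecture(num_layers: int, rng: random.Random) -> dict:
    """Sample one candidate architecture.

    Each layer gets a convolution choice drawn uniformly, and a binary
    flag decides whether a residual (skip) connection adds the layer's
    input to its output. Uniform sampling is one simple way to keep every
    option equally likely early on, i.e. to avoid biasing the search
    toward operations it happened to favor first.
    """
    ops = [rng.choice(CANDIDATE_OPS) for _ in range(num_layers)]
    skips = [rng.random() < 0.5 for _ in range(num_layers)]
    return {"ops": ops, "skips": skips}


if __name__ == "__main__":
    rng = random.Random(0)
    for _ in range(3):
        print(sample_architecture(num_layers=6, rng=rng))
```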

Combinations of multiple 1D features

The search space for a specific task: predicting enhancer activity (AcEnhancer). In this task we search for a two-branch model, where each branch handles a different type of input data and consists of six layers. The NAS algorithm must decide which convolution layer to use at each position and how information passes through the residual connections in both branches. [1]
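The sketch below shows one way such a two-branch model could be assembled from sampled genotypes (the dict format produced by the sampler above). It is a PyTorch illustration under assumptions only: the class names, kernel-size choices, channel counts, and input shapes are hypothetical and are not ezGeno's actual API or the AcEnhancer architecture.

```python
import torch
import torch.nn as nn

# Hypothetical kernel-size options for the searched convolution layers.
KERNEL_CHOICES = {"conv_k3": 3, "conv_k7": 7, "conv_k11": 11, "conv_k15": 15}


class SearchedBranch(nn.Module):
    """One branch built from a sampled genotype: a stack of 1D convolutions
    whose kernel sizes were chosen by the search, with an optional residual
    connection that adds each layer's input to its output."""

    def __init__(self, in_channels: int, channels: int, genotype: dict):
        super().__init__()
        self.layers = nn.ModuleList()
        self.skips = []
        c = in_channels
        for op_name, use_skip in zip(genotype["ops"], genotype["skips"]):
            k = KERNEL_CHOICES[op_name]
            self.layers.append(
                nn.Sequential(
                    nn.Conv1d(c, channels, kernel_size=k, padding=k // 2),
                    nn.BatchNorm1d(channels),
                    nn.ReLU(),
                )
            )
            # A residual add only makes sense when channel counts match.
            self.skips.append(use_skip and c == channels)
            c = channels

    def forward(self, x):
        for layer, use_skip in zip(self.layers, self.skips):
            out = layer(x)
            x = out + x if use_skip else out
        return x


class TwoBranchModel(nn.Module):
    """Two independent branches (e.g., a sequence branch and a signal
    branch) whose pooled features are concatenated for the prediction."""

    def __init__(self, genotypes, in_channels=(4, 1), channels=64):
        super().__init__()
        self.branch_a = SearchedBranch(in_channels[0], channels, genotypes[0])
        self.branch_b = SearchedBranch(in_channels[1], channels, genotypes[1])
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.head = nn.Linear(2 * channels, 1)

    def forward(self, seq, signal):
        a = self.pool(self.branch_a(seq)).flatten(1)
        b = self.pool(self.branch_b(signal)).flatten(1)
        return self.head(torch.cat([a, b], dim=1))


if __name__ == "__main__":
    geno = {"ops": ["conv_k7"] * 6,
            "skips": [False, True, True, True, True, True]}
    model = TwoBranchModel(genotypes=(geno, geno))
    seq = torch.randn(2, 4, 200)     # e.g., one-hot DNA sequence branch
    signal = torch.randn(2, 1, 200)  # e.g., accessibility-signal branch
    print(model(seq, signal).shape)  # torch.Size([2, 1])
```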

Comparison with AutoKeras

The performance of ezGeno on TF binding tasks. (a) Performance of ezGeno on 10 difficult TFs, compared with AutoKeras and a one-layer DeepBind model. (b) Running time of ezGeno compared with AutoKeras. [1]

Reference:

