[Overview] Neural Architecture Search (NAS), the automated design of neural network architectures, was first proposed by Google researchers back in 2016 (Zoph and Le; accepted at ICLR'17), but only this year has it truly taken off, appearing again and again at top conferences. Marius Lindauer has compiled 246 of the latest NAS papers; the full list, worth bookmarking, appears below (after a short illustrative sketch of the core idea).
https://www.automl.org/automl/literature-on-neural-architecture-search/
Follow the 专知 (Zhuanzhi) WeChat account (tap the blue "专知" above to follow),
then reply "LNAS" to the account to receive a download link for the complete NAS paper list.
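Before the list itself, a quick illustration. Most of the papers below search over a space of candidate architectures and score each candidate by (proxy) training; the simplest baseline many of them compare against is plain random search. The sketch below is a minimal, hypothetical example of that baseline only: the search space, model builder, and short proxy-training evaluator are illustrative assumptions, not code from any paper in the list. Methods such as ENAS, DARTS, or regularized evolution replace the random sampler with reinforcement learning, a gradient-based relaxation, or evolution.

```python
# Minimal sketch of the simplest NAS baseline: random search over a tiny,
# made-up architecture space. Illustrative only -- not from any listed paper.
import random
import torch
import torch.nn as nn

SEARCH_SPACE = {                       # hypothetical search space
    "n_layers": [1, 2, 3],
    "hidden_dim": [32, 64, 128],
    "activation": [nn.ReLU, nn.Tanh],
}

def sample_architecture():
    """Draw one architecture description uniformly at random."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def build_model(arch, in_dim=16, out_dim=2):
    """Instantiate the sampled description as an MLP."""
    layers, dim = [], in_dim
    for _ in range(arch["n_layers"]):
        layers += [nn.Linear(dim, arch["hidden_dim"]), arch["activation"]()]
        dim = arch["hidden_dim"]
    layers.append(nn.Linear(dim, out_dim))
    return nn.Sequential(*layers)

def evaluate(model, x, y, steps=100):
    """Cheap proxy evaluation: short training run, return final loss."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

torch.manual_seed(0)
x, y = torch.randn(256, 16), torch.randint(0, 2, (256,))  # synthetic task
best_arch, best_loss = None, float("inf")
for _ in range(10):                    # score 10 random candidates
    arch = sample_architecture()
    loss = evaluate(build_model(arch), x, y)
    if loss < best_loss:
        best_arch, best_loss = arch, loss
print("best architecture:", best_arch, "proxy loss:", round(best_loss, 4))
```

Swapping `sample_architecture` for a learned controller and `evaluate` for a real train/validation pipeline recovers the skeleton of most methods in the list below.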
· [STICKY] Best Practices for Scientific Research on Neural Architecture Search (Lindauer and Hutter. 2019)
https://arxiv.org/abs/1909.02453
· Bayesian Optimization of Neural Architectures for Human Activity Recognition (Osmani and Hamidi. 2019; accepted at Human Activity Sensing)
https://link.springer.com/chapter/10.1007/978-3-030-13001-5_12
· Compute-Efficient Neural Network Architecture Optimization by a Genetic Algorithm (Litzinger and Schiffmann. 2019; accepted at ICANN’19)
https://link.springer.com/chapter/10.1007/978-3-030-30484-3_32
· Automated deep learning design for medical image classification by health-care professionals with no coding experience: a feasibility study (Faes et al. 2019; accepted at The Lancet Digital Health)
https://www.sciencedirect.com/science/article/pii/S2589750019301086
· A greedy constructive algorithm for the optimization of neural network architectures (Pasini et al. 2019)
https://arxiv.org/abs/1909.03306
· Differentiable Mask Pruning for Neural Networks (Ramakrishnan et al. 2019)
https://arxiv.org/abs/1909.04567
· Neural Architecture Search in Embedding Space (Liu. 2019)
https://arxiv.org/abs/1909.03615
· Auto-GNN: Neural Architecture Search of Graph Neural Networks (Zhou et al. 2019)
https://arxiv.org/abs/1909.03184
· Efficient Neural Architecture Transformation Search in Channel-Level for Object Detection (Peng et al. 2019)
https://arxiv.org/abs/1909.02293
· Training compact neural networks via auxiliary overparameterization
https://arxiv.org/abs/1909.02214
· MANAS: Multi-Agent Neural Architecture Search (Carlucci et al. 2019)
https://arxiv.org/abs/1909.01051
· Neural Architecture Search for Joint Optimization of Predictive Power and Biological Knowledge (Zhang et al. 2019)
https://arxiv.org/abs/1909.00337
· Scalable Reinforcement-Learning-Based Neural Architecture Search for Cancer Deep Learning Research (Balaprakash et al. 2019)
https://arxiv.org/abs/1909.00311
· Automatic Neural Network Search Method for Open Set Recognition (Sun et al. 2019; accepted at ICIP’19)
https://ieeexplore.ieee.org/abstract/document/8803605
· HM-NAS: Efficient Neural Architecture Search via Hierarchical Masking (Yan et al. 2019; accepted at ICCV’19 Neural Architects Workshop)
https://arxiv.org/abs/1909.00122
· Once for All: Train One Network and Specialize it for Efficient Deployment (Cai et al. 2019)
https://arxiv.org/abs/1908.09791
· Refactoring Neural Networks for Verification (Shriver et al. 2019)
https://arxiv.org/abs/1908.08026
· CNASV: A Convolutional Neural Architecture Search-Train Prototype for Computer Vision Task (Zhou and Yang. 2019; accepted at CollaborateCom’19)
· Automatic Design of Deep Networks with Neural Blocks (Zhong et al. 2019; accepted at Cognitive Computation)
https://link.springer.com/article/10.1007/s12559-019-09677-5
· Differentiable Learning-to-Group Channels via Groupable Convolutional Neural Networks (Zhang et al. 2019)
https://arxiv.org/abs/1908.05867
· SCARLET-NAS: Bridging the Gap Between Scalability and Fairness in Neural Architecture Search (Chu et al. 2019)
https://arxiv.org/abs/1908.06022
· A Novel Encoding Scheme for Complex Neural Architecture Search (Ahmad et al. 2019; accepted at ITC-CSCC)
https://ieeexplore.ieee.org/document/8793329
· A Graph-Based Encoding for Evolutionary Convolutional Neural Network Architecture Design (Irwin-Harris et al. 2019; accepted at CEC’19)
https://ieeexplore.ieee.org/document/8790093
· A Novel Framework for Neural Architecture Search in the Hill Climbing Domain (Verma et al. 2019; accepted at AIKE’19)
https://ieeexplore.ieee.org/abstract/document/8791709
· Automated Neural Network Construction with Similarity Sensitive Evolutionary Algorithms (Tian et al. 2019)
http://rvc.eng.miami.edu/Paper/2019/IRI19_EA.pdf
· AutoGAN: Neural Architecture Search for Generative Adversarial Networks (Gong et al. 2019; accepted at ICCV’19)
https://arxiv.org/abs/1908.03835
· Refining the Structure of Neural Networks Using Matrix Conditioning (Yousefzadeh and O’Leary. 2019)
https://arxiv.org/abs/1908.02400
· SqueezeNAS: Fast neural architecture search for faster semantic segmentation (Shaw et al. 2019)
https://arxiv.org/abs/1908.01748
· MoGA: Searching Beyond MobileNetV3 (Chu et al. 2019)
https://arxiv.org/abs/1908.01314
· Particle Swarm Optimisation for Evolving Deep Neural Networks for Image Classification by Evolving and Stacking Transferable Blocks (Wang et al. 2019)
https://arxiv.org/abs/1907.12659
· MemNet: Memory-Efficiency Guided Neural Architecture Search with Augment-Trim learning (Liu et al. 2019)
https://arxiv.org/abs/1907.09569
· Efficient Novelty-Driven Neural Architecture Search (Zhang et al. 2019)
https://arxiv.org/abs/1907.09109
· PC-DARTS: Partial Channel Connections for Memory-Efficient Differentiable Architecture Search (Xu et al. 2019)
https://arxiv.org/abs/1907.05737
· Hardware/Software Co-Exploration of Neural Architectures (Jiang et al. 2019)
https://arxiv.org/abs/1907.04650
· EPNAS: Efficient Progressive Neural Architecture Search (Zhou et al. 2019)
https://arxiv.org/abs/1907.04648
· Video Action Recognition via Neural Architecture Searching (Peng et al. 2019)
https://arxiv.org/abs/1907.04632
· Reinforcement Learning for Neural Architecture Search: A Review (Jaafra et al. 2019; accepted at Image and Vision Computing)
https://www.sciencedirect.com/science/article/pii/S026288561930088
· Architecture Search for Image Inpainting (Li and King. 2019; accepted at International Symposium on Neural Networks)
https://link.springer.com/chapter/10.1007/978-3-030-22796-8_12
· Neural Network Architecture Search with Differentiable Cartesian Genetic Programming for Regression (Märtens and Izzo. 2019)
https://arxiv.org/abs/1907.01939
· FairNAS: Rethinking Evaluation Fairness of Weight Sharing Neural Architecture Search (Chu et al. 2019)
https://arxiv.org/abs/1907.01845
· HyperNOMAD: Hyperparameter optimization of deep neural networks using mesh adaptive direct search (Lakhmiri et al. 2019)
https://arxiv.org/pdf/1907.01698.pdf
· Evolving Robust Neural Architectures to Defend from Adversarial Attacks (Vargas and Kotyan. 2019)
https://arxiv.org/abs/1906.11667
· Surrogate-Assisted Evolutionary Deep Learning Using an End-to-End Random Forest-based Performance Predictor (Sun et al. 2019; accepted by IEEE Transactions on Evolutionary Computation)
https://ieeexplore.ieee.org/document/8744404
· Adaptive Genomic Evolution of Neural Network Topologies (AGENT) for State-to-Action Mapping in Autonomous Agents (Behjat et al. 2019; accepted and presented at ICRA 2019)
https://arxiv.org/abs/1903.07107
· Densely Connected Search Space for More Flexible Neural Architecture Search (Fang et al. 2019)
https://arxiv.org/abs/1906.09607
· SwiftNet: Using Graph Propagation as Meta-knowledge to Search Highly Representative Neural Architectures (Cheng et al. 2019)
https://arxiv.org/abs/1906.08305
· Transfer NAS: Knowledge Transfer between Search Spaces with Transformer Agents (Borsos et al. 2019)
https://arxiv.org/abs/1906.08102
· XNAS: Neural Architecture Search with Expert Advice (Nayman et al. 2019; accepted at NeurIPS’19)
https://arxiv.org/abs/1906.08031
· A Study of the Learning Progress in Neural Architecture Search Techniques (Singh et al. 2019)
https://arxiv.org/abs/1906.07590
· Hardware aware Neural Network Architectures (using FBNet) (Srinivas et al. 2019)
https://arxiv.org/abs/1906.07214
· Sample-Efficient Neural Architecture Search by Learning Action Space (Wang et al. 2019)
https://arxiv.org/abs/1906.06832
· Automatic Modulation Recognition Using Neural Architecture Search (Wei et al. 2019; accepted at High Performance Big Data and Intelligent Systems)
https://ieeexplore.ieee.org/abstract/document/8735458
· Continual and Multi-Task Architecture Search (Pasunuru and Bansal. 2019)
https://arxiv.org/abs/1906.05226
· AutoGrow: Automatic Layer Growing in Deep Convolutional Networks (Wen et al. 2019)
https://arxiv.org/abs/1906.02909
· One-Shot Neural Architecture Search via Compressive Sensing (Cho et al. 2019)
https://arxiv.org/abs/1906.02869
· V-NAS: Neural Architecture Search for Volumetric Medical Image Segmentation (Zhu et al. 2019)
https://arxiv.org/abs/1906.02817
· StyleNAS: An Empirical Study of Neural Architecture Search to Uncover Surprisingly Fast End-to-End Universal Style Transfer Networks (An et al. 2019)
https://arxiv.org/abs/1906.02470
· Efficient Forward Architecture Search (Hu et al. 2019; accepted at NeurIPS’19)
https://arxiv.org/abs/1905.13360
· Differentiable Neural Architecture Search via Proximal Iterations (Yao et al. 2019)
https://arxiv.org/abs/1905.13577
· Dynamic Distribution Pruning for Efficient Network Architecture Search (Zheng et al. 2019)
https://arxiv.org/abs/1905.13543
· Particle swarm optimization of deep neural networks architectures for image classification (Fernandes Junior and Yen. 2019; accepted at Swarm and Evolutionary Computation)
https://www.sciencedirect.com/science/article/abs/pii/S2210650218309246
· On Network Design Spaces for Visual Recognition (Radosavovic et al. 2019)
https://arxiv.org/abs/1905.13214
· AssembleNet: Searching for Multi-Stream Neural Connectivity in Video Architectures (Ryoo et al. 2019)
https://arxiv.org/abs/1905.13209
· EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (Tan and Le. 2019; accepted at ICML’19)
http://proceedings.mlr.press/v97/tan19a/tan19a.pdf
· Structure Learning for Neural Module Networks (Pahuja et al. 2019)
https://arxiv.org/abs/1905.11532
· SpArSe: Sparse Architecture Search for CNNs on Resource-Constrained Microcontrollers (Fedorov et al. 2019; accepted at NeurIPS’19)
https://arxiv.org/abs/1905.12107
· Network Pruning via Transformable Architecture Search (Dong and Yang. 2019; accepted at NeurIPS’19)
https://arxiv.org/abs/1905.09717
· DEEP-BO for Hyperparameter Optimization of Deep Networks (Cho et al. 2019)
https://arxiv.org/abs/1905.09680
· Constrained Design of Deep Iris Networks (Nguyen et al. 2019)
https://arxiv.org/abs/1905.09481
· Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search (Akimoto et al. 2019; accepted at ICML’19)
https://arxiv.org/abs/1905.08537
· Multinomial Distribution Learning for Effective Neural Architecture Search (Zheng et al. 2019)
https://arxiv.org/abs/1905.07529
· EENA: Efficient Evolution of Neural Architecture (Zhu et al. 2019)
https://arxiv.org/abs/1905.07320
· DeepSwarm: Optimising Convolutional Neural Networks using Swarm Intelligence (Byla and Pang. 2019)
https://arxiv.org/abs/1905.07350
· AutoDispNet: Improving Disparity Estimation with AutoML (Saikia et al. 2019)
https://arxiv.org/abs/1905.07443
· Online Hyper-parameter Learning for Auto-Augmentation Strategy (Lin et al. 2019)
https://arxiv.org/abs/1905.07373
· Regularized Evolutionary Algorithm for Dynamic Neural Topology Search (Saltori et al. 2019)
https://arxiv.org/abs/1905.06252
· Deep Neural Architecture Search with Deep Graph Bayesian Optimization (Ma et al. 2019)
https://arxiv.org/abs/1905.06159
· Automatic Model Selection for Neural Networks (Laredo et al. 2019)
https://arxiv.org/abs/1905.06010
· Tabular Benchmarks for Joint Architecture and Hyperparameter Optimization (Klein and Hutter. 2019)
https://arxiv.org/abs/1905.04970
· BayesNAS: A Bayesian Approach for Neural Architecture Search (Zhou et al. 2019; accepted at ICML’19)
https://arxiv.org/abs/1905.04919
· Single-Path NAS: Device-Aware Efficient ConvNet Design (Stamoulis et al. 2019)
https://arxiv.org/abs/1905.04159
· Automatic Design of Artificial Neural Networks for Gamma-Ray Detection (Assuncao et al. 2019)
https://arxiv.org/abs/1905.03532
· Neural Architecture Refinement: A Practical Way for Avoiding Overfitting in NAS (Jiang et al. 2019)
https://arxiv.org/abs/1905.02341
· Fast and Reliable Architecture Selection for Convolutional Neural Networks (Hahn et al. 2019)
https://arxiv.org/abs/1905.01924
· Differentiable Architecture Search with Ensemble Gumbel-Softmax (Chang et al. 2019)
https://arxiv.org/abs/1905.01786
· Searching for A Robust Neural Architecture in Four GPU Hours (Dong and Yang. 2019; accepted at CVPR’19)
https://xuanyidong.com/publication/cvpr-2019-gradient-based-diff-sampler/
· Evolving unsupervised deep neural networks for learning meaningful representations (Sun et al. 2019; accepted by IEEE Transactions on Evolutionary Computation)
https://arxiv.org/abs/1712.05043
· Evolving Deep Convolutional Neural Networks for Image Classification (Sun et al. 2019; accepted by IEEE Transactions on Evolutionary Computation)
https://arxiv.org/abs/1710.10741
· AdaResU-Net: Multiobjective Adaptive Convolutional Neural Network for Medical Image Segmentation (Baldeon-Calisto and Lai-Yuen. 2019; accepted at Neurocomputing)
https://www.sciencedirect.com/science/article/pii/S0925231219304679
· Automatic Design of Convolutional Neural Network for Hyperspectral Image Classification (Chen et al. 2019; accepted at IEEE Transactions on Geoscience and Remote Sensing)
https://ieeexplore.ieee.org/abstract/document/8703410
· Progressive Differentiable Architecture Search: Bridging the Depth Gap between Search and Evaluation (Chen et al. 2019)
https://arxiv.org/abs/1904.12760
· Design Automation for Efficient Deep Learning Computing (Han et al. 2019)
https://arxiv.org/abs/1904.10616
· CascadeML: An Automatic Neural Network Architecture Evolution and Training Algorithm for Multi-label Classification (Pakrashi and Mac Namee. 2019)
https://arxiv.org/abs/1904.10551
· GraphNAS: Graph Neural Architecture Search with Reinforcement Learning (Gao et al. 2019)
https://arxiv.org/abs/1904.09981
· Neural Architecture Search for Deep Face Recognition (Zhu. 2019)
https://arxiv.org/abs/1904.09523
· Efficient Neural Architecture Search on Low-Dimensional Data for OCT Image Segmentation (Gessert and Schlaefer. 2019)
https://openreview.net/forum?id=Syg3FDjntN
· NAS-Unet: Neural Architecture Search for Medical Image Segmentation (Weng et al. 2019; accepted at IEEE Access)
https://ieeexplore.ieee.org/document/8681706
· Fast DENSER: Efficient Deep NeuroEvolution (Assunção et al. 2019; accepted at EuroGP’19)
https://link.springer.com/chapter/10.1007/978-3-030-16670-0_13
· NAS-FPN: Learning Scalable Feature Pyramid Architecture for Object Detection (Ghiasi et al. 2019; accepted at CVPR’19)
https://arxiv.org/abs/1904.07392
· Automated Search for Configurations of Deep Neural Network Architectures (Ghamizi et al. 2019; accepted at SPLC’19)
https://arxiv.org/abs/1904.04612
· WeNet: Weighted Networks for Recurrent Network Architecture Search (Huang and Xiang. 2019)
https://arxiv.org/abs/1904.03819
· Resource Constrained Neural Network Architecture Search (Xiong et al. 2019)
https://arxiv.org/abs/1904.03786
· Size/Accuracy Trade-Off in Convolutional Neural Networks: An Evolutionary Approach (Cetto et al. 2019; accepted at INNSBDDL)
https://link.springer.com/chapter/10.1007/978-3-030-16841-4_3
· ASAP: Architecture Search, Anneal and Prune (Noy et al. 2019)
https://arxiv.org/abs/1904.04123
· Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4 Hours (Stamoulis et al. 2019)
https://arxiv.org/abs/1904.02877
· Architecture Search of Dynamic Cells for Semantic Video Segmentation (Nekrasov et al. 2019)
https://arxiv.org/abs/1904.02371
· Template-Based Automatic Search of Compact Semantic Segmentation Architectures (Nekrasov et al. 2019)
https://arxiv.org/abs/1904.02365
· Exploring Randomly Wired Neural Networks for Image Recognition (Xie et al. 2019)
https://arxiv.org/abs/1904.01569
· Understanding Neural Architecture Search Techniques (Adam and Lorraine. 2019)
https://arxiv.org/abs/1904.00438
· Automatic Convolutional Neural Architecture Search for Image Classification Under Different Scenes (Weng et al. 2019; accepted at IEEE Access)
https://ieeexplore.ieee.org/document/8676019
· Single Path One-Shot Neural Architecture Search with Uniform Sampling (Guo et al. 2019)
https://arxiv.org/abs/1904.00420
· Network Slimming by Slimmable Networks: Towards One-Shot Architecture Search for Channel Numbers (Yu and Huang. 2019)
https://arxiv.org/abs/1903.11728
· sharpDARTS: Faster and More Accurate Differentiable Architecture Search (Hundt et al. 2019)
https://arxiv.org/abs/1903.09900
· DetNAS: Neural Architecture Search on Object Detection (Chen et al. 2019)
https://arxiv.org/abs/1903.10979
· Evolution of Deep Convolutional Neural Networks Using Cartesian Genetic Programming (Suganuma et al. 2019; accepted at Evolutionary Computation)
https://www.mitpressjournals.org/doi/abs/10.1162/evco_a_00253
· Deep Evolutionary Networks with Expedited Genetic Algorithm for Medical Image Denoising (Liu et al. 2019; accepted at Medical Image Analysis)
https://www.sciencedirect.com/science/article/abs/pii/S1361841518307734
· Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly (Kandasamy et al. 2019)
https://arxiv.org/abs/1903.06694
· AttoNets: Compact and Efficient Deep Neural Networks for the Edge via Human-Machine Collaborative Design (Wong et al. 2019)
https://arxiv.org/abs/1903.07209
· Improving Neural Architecture Search Image Classifiers via Ensemble Learning (Macko et al. 2019)
https://arxiv.org/abs/1903.06236
· Software-Defined Design Space Exploration for an Efficient AI Accelerator Architecture (Yu et al. 2019)
https://arxiv.org/abs/1903.07676
· MFAS: Multimodal Fusion Architecture Search (Pérez-Rúa et al. 2019; accepted at CVPR’19)
https://hal.archives-ouvertes.fr/hal-02068293/document
· A Hybrid GA-PSO Method for Evolving Architecture and Short Connections of Deep Convolutional Neural Networks (Wang et al. 2019)
https://arxiv.org/abs/1903.03893
· Partial Order Pruning: for Best Speed/Accuracy Trade-off in Neural Architecture Search (Li et al. 2019)
https://arxiv.org/abs/1903.03777
· Inductive Transfer for Neural Architecture Optimization (Wistuba and Pedapati. 2019)
https://arxiv.org/abs/1903.03536
· Evolutionary Cell Aided Design for Neural Network (Colangelo et al. 2019)
https://arxiv.org/abs/1903.02130
· Automated Architecture-Modeling for Convolutional Neural Networks (Duong. 2019)
https://btw.informatik.uni-rostock.de/download/workshopband/D1-1.pdf
· Learning Implicitly Recurrent CNNs Through Parameter Sharing (Savarese and Maire. 2019; accepted at ICLR’19)
https://arxiv.org/abs/1902.09701
· Evaluating the Search Phase of Neural Architecture Search (Sciuto et al. 2019)
https://arxiv.org/abs/1902.08142
· Random Search and Reproducibility for Neural Architecture Search (Li and Talwalkar. 2019)
https://arxiv.org/abs/1902.07638
· Evolutionary Neural AutoML for Deep Learning (Liang et al. 2019)
https://arxiv.org/abs/1902.06827
· Fast Task-Aware Architecture Inference (Kokiopoulou et al. 2019)
https://arxiv.org/abs/1902.05781
· Probabilistic Neural Architecture Search (Casale et al. 2019)
https://arxiv.org/abs/1902.05116
· Investigating Recurrent Neural Network Memory Structures using Neuro-Evolution (Ororbia et al. 2019)
https://arxiv.org/abs/1902.02390
· Accuracy vs. Efficiency: Achieving Both through FPGA-Implementation Aware Neural Architecture Search (Jiang et al. 2019)
https://arxiv.org/abs/1901.11211
· The Evolved Transformer (So et al. 2019)
https://arxiv.org/abs/1901.11117
· Designing neural networks through neuroevolution (Stanley et al. 2019; accepted at Nature Machine Intelligence)
https://www.nature.com/articles/s42256-018-0006-z
· NeuNetS: An Automated Synthesis Engine for Neural Network Design (Sood et al. 2019)
https://arxiv.org/abs/1901.06261
· Fast, Accurate and Lightweight Super-Resolution with Neural Architecture Search (Chu et al. 2019)
https://arxiv.org/abs/1901.07261
· EAT-NAS: Elastic Architecture Transfer for Accelerating Large-scale Neural Architecture Search (Fang et al. 2019)
https://arxiv.org/abs/1901.05884
· Bayesian Learning of Neural Network Architectures (Dikov et al. 2019)
https://arxiv.org/abs/1901.04436
· Auto-DeepLab: Hierarchical Neural Architecture Search for Semantic Image Segmentation (Liu et al. 2019; accepted at CVPR’19)
https://arxiv.org/abs/1901.02985
· The Art of Getting Deep Neural Networks in Shape (Mammadli et al. 2019; accepted at TACO Journal)
https://dl.acm.org/citation.cfm?id=3291053
· Multi-Objective Reinforced Evolution in Mobile Neural Architecture Search (Chu et al. 2019)
https://arxiv.org/abs/1901.01074
· A particle swarm optimization-based flexible convolutional auto-encoder for image classification (Sun et al. 2018; published by IEEE Transactions on Neural Networks and Learning Systems)
https://arxiv.org/abs/1712.05042
· SNAS: Stochastic Neural Architecture Search (Xie et al. 2018; accepted at ICLR’19)
https://arxiv.org/abs/1812.09926
· Graph Hypernetworks for Neural Architecture Search (Zhang et al. 2018; accepted at ICLR’19)
https://arxiv.org/abs/1810.05749
· Efficient Multi-Objective Neural Architecture Search via Lamarckian Evolution (Elsken et al. 2018; accepted at ICLR’19)
https://arxiv.org/abs/1804.09081
· Macro Neural Architecture Search Revisited (Hu et al. 2018; accepted at Meta-Learn NeurIPS workshop’18)
http://metalearning.ml/2018/papers/metalearn2018_paper16.pdf
· AMLA: an AutoML frAmework for Neural Network Design (Kamath et al. 2018; accepted at ICML AutoML workshop)
http://pkamath.com/publications/papers/amla_automl18.pdf
· ChamNet: Towards Efficient Network Design through Platform-Aware Model Adaptation (Dai et al. 2018)
https://arxiv.org/abs/1812.08934
· Neural Architecture Search Over a Graph Search Space (de Laroussilhe et al. 2018)
https://arxiv.org/abs/1812.10666
· A Review of Meta-Reinforcement Learning for Deep Neural Networks Architecture Search (Jaafra et al. 2018)
https://arxiv.org/abs/1812.07995
· Evolutionary Neural Architecture Search for Image Restoration (van Wyk and Bosman. 2018)
https://arxiv.org/abs/1812.05866
· IRLAS: Inverse Reinforcement Learning for Architecture Search (Guo et al. 2018; accepted at CVPR’19)
https://arxiv.org/abs/1812.05285
· FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search (Wu et al. 2018; accepted at CVPR’19)
https://arxiv.org/abs/1812.03443
· ShuffleNASNets: Efficient CNN models through modified Efficient Neural Architecture Search (Laube et al. 2018)
https://arxiv.org/abs/1812.02975
· ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware (Cai et al. 2018; accepted at ICLR’19)
https://arxiv.org/abs/1812.00332
· Mixed Precision Quantization of ConvNets via Differentiable Neural Architecture Search (Wu et al. 2018)
https://arxiv.org/abs/1812.00090
· TEA-DNN: the Quest for Time-Energy-Accuracy Co-optimized Deep Neural Networks (Cai et al. 2018)
https://arxiv.org/abs/1811.12065
· Evolving Space-Time Neural Architectures for Videos (Piergiovanni et al. 2018)
https://arxiv.org/abs/1811.10636
· InstaNAS: Instance-aware Neural Architecture Search (Cheng et al. 2018)
https://arxiv.org/abs/1811.10201
· Evolutionary-Neural Hybrid Agents for Architecture Search (Maziarz et al. 2018; accepted at ICML’19 workshop on AutoML)
https://arxiv.org/abs/1811.09828
· Joint Neural Architecture Search and Quantization (Chen et al. 2018)
https://arxiv.org/abs/1811.09426
· Transfer Learning with Neural AutoML (Wong et al. 2018; accepted at NeurIPS’18)
http://papers.nips.cc/paper/8056-transfer-learning-with-neural-automl.pdf
· Evolving Image Classification Architectures with Enhanced Particle Swarm Optimisation (Fielding and Zhang. 2018)
https://ieeexplore.ieee.org/document/8533601
· Deep Active Learning with a Neural Architecture Search (Geifman and El-Yaniv. 2018; accepted at NeurIPS’19)
https://arxiv.org/abs/1811.07579
· Stochastic Adaptive Neural Architecture Search for Keyword Spotting (Véniat et al. 2018)
https://arxiv.org/abs/1811.06753
· NSGA-NET: A Multi-Objective Genetic Algorithm for Neural Architecture Search (Lu et al. 2018)
https://arxiv.org/abs/1810.03522
· You only search once: Single Shot Neural Architecture Search via Direct Sparse Optimization (Zhang et al. 2018)
https://arxiv.org/abs/1811.01567
· Automatically Evolving CNN Architectures Based on Blocks (Sun et al. 2018; accepted by IEEE Transactions on Neural Networks and Learning Systems)
https://arxiv.org/abs/1810.11875
· The CoSTAR Block Stacking Dataset: Learning with Workspace Constraints (Hundt et al. 2018; accepted at IROS’19)
https://arxiv.org/abs/1810.11714
· Fast Neural Architecture Search of Compact Semantic Segmentation Models via Auxiliary Cells (Nekrasov et al. 2018; accepted at CVPR’19)
https://arxiv.org/abs/1810.10804
· Automatic Configuration of Deep Neural Networks with Parallel Efficient Global Optimization (van Stein et al. 2018)
https://arxiv.org/abs/1810.05526
· Gradient Based Evolution to Optimize the Structure of Convolutional Neural Networks (Mitschke et al. 2018)
https://ieeexplore.ieee.org/document/8451394
· Searching Toward Pareto-Optimal Device-Aware Neural Architectures (Cheng et al. 2018)
https://arxiv.org/abs/1808.09830
· Neural Architecture Optimization (Luo et al. 2018; accepted at NeurIPS’18)
https://arxiv.org/abs/1808.07233
· Exploring Shared Structures and Hierarchies for Multiple NLP Tasks (Chen et al. 2018)
https://arxiv.org/abs/1808.07658
· Neural Architecture Search: A Survey (Elsken et al. 2018)
https://arxiv.org/abs/1808.05377
· BlockQNN: Efficient Block-wise Neural Network Architecture Generation (Zhong et al. 2018)
https://arxiv.org/abs/1808.05584
· Automatically Designing CNN Architectures Using Genetic Algorithm for Image Classification (Sun et al. 2018)
https://arxiv.org/abs/1808.03818
· Reinforced Evolutionary Neural Architecture Search (Chen et al. 2018; accepted at CVPR’19)
https://arxiv.org/abs/1808.00193
· Teacher Guided Architecture Search (Bashivan et al. 2018)
https://arxiv.org/abs/1808.01405
· Efficient Progressive Neural Architecture Search (Perez-Rua et al. 2018)
https://arxiv.org/abs/1808.00391
· MnasNet: Platform-Aware Neural Architecture Search for Mobile (Tan et al. 2018; accepted at CVPR’19)
https://arxiv.org/abs/1807.11626
· Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search (Zela et al. 2018)
https://arxiv.org/abs/1807.06906
· Automatically Designing CNN Architectures for Medical Image Segmentation (Mortazi and Bagci. 2018)
https://arxiv.org/abs/1807.07663
· MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning (Hsu et al. 2018)
https://arxiv.org/abs/1806.10332
· Path-Level Network Transformation for Efficient Architecture Search (Cai et al. 2018; accepted at ICML’18)
https://arxiv.org/abs/1806.02639
· Lamarckian Evolution of Convolutional Neural Networks (Prellberg and Kramer. 2018)
https://arxiv.org/abs/1806.08099
· Deep Learning Architecture Search by Neuro-Cell-based Evolution with Function-Preserving Mutations (Wistuba. 2018)
http://www.ecmlpkdd2018.org/wp-content/uploads/2018/09/108.pdf
· DARTS: Differentiable Architecture Search (Liu et al. 2018; accepted at ICLR’19)
https://arxiv.org/abs/1806.09055
· Constructing Deep Neural Networks by Bayesian Network Structure Learning (Rohekar et al. 2018)
https://arxiv.org/abs/1806.09141
· Resource-Efficient Neural Architect (Zhou et al. 2018)
https://arxiv.org/abs/1806.07912
· Efficient Neural Architecture Search with Network Morphism (Jin et al. 2018)
https://arxiv.org/abs/1806.10282
· TAPAS: Train-less Accuracy Predictor for Architecture Search (Istrate et al. 2018)
https://arxiv.org/abs/1806.00250
· AlphaX: eXploring Neural Architectures with Deep Neural Networks and Monte Carlo Tree Search (Wang et al. 2018)
https://arxiv.org/abs/1805.07440
https://arxiv.org/abs/1903.11059
· Multi-objective Architecture Search for CNNs (Elsken et al. 2018)
https://arxiv.org/abs/1804.09081
· GNAS: A Greedy Neural Architecture Search Method for Multi-Attribute Learning (Huang et al. 2018)
https://arxiv.org/abs/1804.06964
· Evolutionary Architecture Search For Deep Multitask Networks (Liang et al. 2018)
https://arxiv.org/abs/1803.03745
· From Nodes to Networks: Evolving Recurrent Neural Networks (Rawal et al. 2018)
https://arxiv.org/abs/1803.04439
· Neural Architecture Construction using EnvelopeNets (Kamath et al. 2018)
https://arxiv.org/abs/1803.06744
· Transfer Automatic Machine Learning (Wong et al. 2018)
https://arxiv.org/abs/1803.02780
· Neural Architecture Search with Bayesian Optimisation and Optimal Transport (Kandasamy et al. 2018)
https://arxiv.org/abs/1802.07191
· Efficient Neural Architecture Search via Parameter Sharing (Pham et al. 2018; accepted at ICML’18)
https://arxiv.org/abs/1802.03268
· Regularized Evolution for Image Classifier Architecture Search (Real et al. 2018)
https://arxiv.org/abs/1802.01548
· Effective Building Block Design for Deep Convolutional Neural Networks using Search (Dutta et al. 2018)
https://arxiv.org/abs/1801.08577
· Combination of Hyperband and Bayesian Optimization for Hyperparameter Optimization in Deep Learning (Wang et al. 2018)
https://arxiv.org/abs/1801.01596
· Memetic Evolution of Deep Neural Networks (Lorenzo and Nalepa. 2018)
https://dl.acm.org/citation.cfm?id=3205631
· Understanding and Simplifying One-Shot Architecture Search (Bender et al. 2018; accepted at ICML’18)
http://proceedings.mlr.press/v80/bender18a/bender18a.pdf
· Differentiable Neural Network Architecture Search (Shin et al. 2018; accepted at ICLR’18 workshop)
https://openreview.net/pdf?id=BJ-MRKkwG
· PPP-Net: Platform-aware progressive search for pareto-optimal neural architectures (Dong et al. 2018; accepted at ICLR’18 workshop)
https://openreview.net/pdf?id=B1NT3TAIM
· Speeding up the Hyperparameter Optimization of Deep Convolutional Neural Networks (Hinz et al. 2018)
https://www.worldscientific.com/doi/abs/10.1142/S1469026818500086
· GitGraph – From Computational Subgraphs to Smaller Architecture Search Spaces (Bennani-Smires et al. 2018)
https://openreview.net/pdf?id=rkiO1_1Pz
· N2N Learning: Network to Network Compression via Policy Gradient Reinforcement Learning (Ashok et al. 2017; accepted at ICLR’18)
https://arxiv.org/abs/1709.06030
· Genetic CNN (Xie and Yuille. 2017; accepted at ICCV’17)
https://arxiv.org/abs/1703.01513
· MorphNet: Fast & Simple Resource-Constrained Structure Learning of Deep Networks (Gordon et al. 2017)
https://arxiv.org/abs/1711.06798
· MaskConnect: Connectivity Learning by Gradient Descent (Ahmed and Torresani. 2017; accepted at ECCV’18)
https://arxiv.org/abs/1709.09582
· A Flexible Approach to Automated RNN Architecture Generation (Schrimpf et al. 2017)
https://arxiv.org/abs/1712.07316
· DeepArchitect: Automatically Designing and Training Deep Architectures (Negrinho and Gordon. 2017)
https://arxiv.org/abs/1704.08792
· A Genetic Programming Approach to Designing Convolutional Neural Network Architectures (Suganuma et al. 2017; accepted at GECCO’17)
https://arxiv.org/abs/1704.00764
· Practical Block-wise Neural Network Architecture Generation (Zhong et al. 2017; accepted at CVPR’18)
https://arxiv.org/abs/1708.05552
· Accelerating Neural Architecture Search using Performance Prediction (Baker et al. 2017; accepted at NeurIPS workshop on Meta-Learning 2017)
https://arxiv.org/abs/1705.10823
· Large-Scale Evolution of Image Classifiers (Real et al. 2017; accepted at ICML’17)
https://arxiv.org/abs/1703.01041
· Hierarchical Representations for Efficient Architecture Search (Liu et al. 2017; accepted at ICLR’18)
https://arxiv.org/abs/1711.00436
· Neural Optimizer Search with Reinforcement Learning (Bello et al. 2017)
https://arxiv.org/abs/1709.07417
· Progressive Neural Architecture Search (Liu et al. 2017; accepted at ECCV’18)
https://arxiv.org/abs/1712.00559
· Learning Transferable Architectures for Scalable Image Recognition (Zoph et al. 2017; accepted at CVPR’18)
https://arxiv.org/abs/1707.07012
· Simple And Efficient Architecture Search for Convolutional Neural Networks (Elsken et al. 2017; accepted at NeurIPS workshop on Meta-Learning’17)
https://arxiv.org/abs/1711.04528
· Bayesian Optimization Combined with Incremental Evaluation for Neural Network Architecture Optimization (Wistuba. 2017)
https://www.semanticscholar.org/paper/Bayesian-Optimization-Combined-with-Successive-for-Wistuba/ddb182533c91f0941f088e1e298c52a111253554
· Finding Competitive Network Architectures Within a Day Using UCT (Wistuba. 2017)
https://arxiv.org/abs/1712.07420
· Hyperparameter Optimization: A Spectral Approach (Hazan et al. 2017)
https://arxiv.org/abs/1706.00764
· SMASH: One-Shot Model Architecture Search through HyperNetworks (Brock et al. 2017; accepted at NeurIPS workshop on Meta-Learning’17)
https://arxiv.org/abs/1708.05344
· Efficient Architecture Search by Network Transformation (Cai et al. 2017; accepted at AAAI’18)
https://arxiv.org/abs/1707.04873
· Modularized Morphing of Neural Networks (Wei et al. 2017)
https://arxiv.org/abs/1701.03281
· Towards Automatically-Tuned Neural Networks (Mendoza et al. 2016; accepted at ICML AutoML workshop)
http://proceedings.mlr.press/v64/mendoza_towards_2016.html
· Neural Networks Designing Neural Networks: Multi-Objective Hyper-Parameter Optimization (Smithson et al. 2016)
https://arxiv.org/abs/1611.02120
· AdaNet: Adaptive Structural Learning of Artificial Neural Networks (Cortes et al. 2016)
https://arxiv.org/abs/1607.01097
· Network Morphism (Wei et al. 2016)
https://arxiv.org/abs/1603.01670
· Convolutional Neural Fabrics (Saxena and Verbeek. 2016; accepted at NeurIPS’16)
https://arxiv.org/abs/1606.02492
· CMA-ES for Hyperparameter Optimization of Deep Neural Networks (Loshchilov and Hutter. 2016)
https://arxiv.org/abs/1604.07269
· Designing Neural Network Architectures using Reinforcement Learning (Baker et al. 2016; accepted at ICLR’17)
https://arxiv.org/abs/1611.02167
· Neural Architecture Search with Reinforcement Learning (Zoph and Le. 2016; accepted at ICLR’17)
https://arxiv.org/abs/1611.01578
· Learning curve prediction with Bayesian Neural Networks (Klein et al. 2017; accepted at ICLR’17)
http://ml.informatik.uni-freiburg.de/papers/17-ICLR-LCNet.pdf
· Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization (Li et al. 2016)
https://arxiv.org/abs/1603.06560
· Net2Net: Accelerating Learning via Knowledge Transfer (Chen et al. 2015; accepted at ICLR’16)
https://arxiv.org/abs/1511.05641
· Optimizing deep learning hyper-parameters through an evolutionary algorithm (Young et al. 2015)
https://dl.acm.org/citation.cfm?id=2834896
· Practical Bayesian Optimization of Machine Learning Algorithms (Snoek et al. 2012; accepted at NeurIPS’12)
https://papers.nips.cc/paper/4522-practical-bayesian-optimization-of-machine-learning-algorithms.pdf
· A Hypercube-based Encoding for Evolving large-scale Neural Networks (Stanley et al. 2009)
https://ieeexplore.ieee.org/document/6792316/
· Neuroevolution: From Architectures to Learning (Floreano et al. 2008; accepted at Evolutionary Intelligence’08)
https://link.springer.com/article/10.1007/s12065-007-0002-4
· Evolving Neural Networks through Augmenting Topologies (Stanley and Miikkulainen. 2002; accepted at Evolutionary Computation’02)
http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf
· Evolving Artificial Neural Networks (Yao. 1999; published in Proceedings of the IEEE)
https://ieeexplore.ieee.org/document/784219/
· An Evolutionary Algorithm that Constructs Recurrent Neural Networks (Angeline et al. 1994)
https://ieeexplore.ieee.org/document/265960/
· Designing Neural Networks Using Genetic Algorithms with Graph Generation System (Kitano. 1990)
http://www.complex-systems.com/abstracts/v04_i04_a06/
· Designing Neural Networks using Genetic Algorithms (Miller et al. 1989; accepted at ICGA’89)
https://dl.acm.org/citation.cfm?id=94034
· The Cascade-Correlation Learning Architecture (Fahlman and Lebiere. 1989; accepted at NeurIPS’89)
https://papers.nips.cc/paper/207-the-cascade-correlation-learning-architecture
· Self Organizing Neural Networks for the Identification Problem (Tenorio and Lee. 1988; accepted at NeurIPS’88)
https://papers.nips.cc/paper/149-self-organizing-neural-networks-for-the-identification-problem
-END-