Current journal: arXiv - CS - Neural and Evolutionary Computing
  • Parametrization of Neural Networks with Connected Abelian Lie Groups as Data Manifold
    arXiv.cs.NE Pub Date : 2020-04-06
    Luciano Melodia; Richard Lenz

    Neural nets have been used across a wide range of scientific disciplines. Nevertheless, their parameterization is largely unexplored. Dense nets are the coordinate transformations of a manifold from which the data is sampled. After processing through a layer, the representation of the original manifold may change. This is crucial for the preservation of its topological structure and should therefore

    Updated: 2020-04-08
  • Evolving Normalization-Activation Layers
    arXiv.cs.NE Pub Date : 2020-04-06
    Hanxiao Liu; Andrew Brock; Karen Simonyan; Quoc V. Le

    Normalization layers and activation functions are critical components in deep neural networks that frequently co-locate with each other. Instead of designing them separately, we unify them into a single computation graph, and evolve its structure starting from low-level primitives. Our layer search algorithm leads to the discovery of EvoNorms, a set of new normalization-activation layers that go beyond

    Updated: 2020-04-08
  • The multi-objective optimisation of breakwaters using evolutionary approach
    arXiv.cs.NE Pub Date : 2020-04-06
    Nikolay O. Nikitin; Iana S. Polonskaia; Anna V. Kalyuzhnaya; Alexander V. Boukhanovsky

    In engineering practice, it is often necessary to increase the effectiveness of existing protective constructions for ports and coasts (i.e., breakwaters) by extending their configuration, because existing configurations don't provide the appropriate environmental conditions. That extension task can be considered as an optimisation problem. In the paper, the multi-objective evolutionary approach for

    Updated: 2020-04-08
  • Directional approach to gradual cover: the continuous case
    arXiv.cs.NE Pub Date : 2020-04-06
    Tammy Drezner; Zvi Drezner; Pawel Kalczynski

    The objective of cover location models is to cover demand by facilities within a given distance. The gradual (or partial) cover replaces the abrupt drop from full cover to no cover by defining a gradual decline in cover. In this paper we use a recently proposed rule for calculating the joint cover of a demand point by several facilities, termed "directional gradual cover". Contrary to all gradual cover

    Updated: 2020-04-08
  • Real-time Classification from Short Event-Camera Streams using Input-filtering Neural ODEs
    arXiv.cs.NE Pub Date : 2020-04-07
    Giorgio Giannone; Asha Anoosheh; Alessio Quaglino; Pierluca D'Oro; Marco Gallieri; Jonathan Masci

    Event-based cameras are novel, efficient sensors inspired by the human vision system, generating an asynchronous, pixel-wise stream of data. Learning from such data is generally performed through heavy preprocessing and event integration into images. This requires buffering of possibly long sequences and can limit the response time of the inference system. In this work, we instead propose to directly

    Updated: 2020-04-08
  • Specific Single- and Multi-Objective Evolutionary Algorithms for the Chance-Constrained Knapsack Problem
    arXiv.cs.NE Pub Date : 2020-04-07
    Yue Xie; Aneta Neumann; Frank Neumann

    The chance-constrained knapsack problem is a variant of the classical knapsack problem where each item has a weight distribution instead of a deterministic weight. The objective is to maximize the total profit of the selected items under the condition that the weight of the selected items exceeds the given weight bound only with a small probability of at most $\alpha$. In this paper, we consider problem-specific

    Updated: 2020-04-08 (see the sketch below)
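
    The chance constraint in the entry above is often made operational through a surrogate. The sketch below assumes independent, normally distributed item weights and checks P(total weight > bound) <= alpha via the Gaussian quantile; the distributional assumption and the helper names are illustrative, not necessarily the formulation used by the authors.

        # Minimal sketch of a chance-constraint check for the knapsack problem,
        # assuming independent normally distributed item weights (an illustrative
        # surrogate, not necessarily the paper's formulation).
        from statistics import NormalDist

        def chance_feasible(selected, means, variances, bound, alpha):
            """True if P(total weight > bound) <= alpha under a Gaussian model."""
            mu = sum(means[i] for i in selected)
            var = sum(variances[i] for i in selected)
            z = NormalDist().inv_cdf(1.0 - alpha)    # (1 - alpha)-quantile of N(0, 1)
            return mu + z * var ** 0.5 <= bound

        # toy instance: three selected items, weight bound 10, violation probability at most 5%
        profits, means, variances = [10, 7, 5, 3], [4.0, 3.0, 2.5, 2.0], [1.0, 0.5, 0.2, 0.1]
        sol = {0, 1, 2}
        print(chance_feasible(sol, means, variances, 10.0, 0.05), sum(profits[i] for i in sol))
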
  • How Do You Act? An Empirical Study to Understand Behavior of Deep Reinforcement Learning Agents
    arXiv.cs.NE Pub Date : 2020-04-07
    Richard Meyes; Moritz Schneider; Tobias Meisen

    The demand for more transparency of decision-making processes of deep reinforcement learning agents is greater than ever, due to their increased use in safety critical and ethically challenging domains such as autonomous driving. In this empirical study, we address this lack of transparency following an idea that is inspired by research in the field of neuroscience. We characterize the learned representations

    Updated: 2020-04-08
  • Self-Adjusting Evolutionary Algorithms for Multimodal Optimization
    arXiv.cs.NE Pub Date : 2020-04-07
    Amirhossein Rajabi; Carsten Witt

    Recent theoretical research has shown that self-adjusting and self-adaptive mechanisms can provably outperform static settings in evolutionary algorithms for binary search spaces. However, the vast majority of these studies focuses on unimodal functions which do not require the algorithm to flip several bits simultaneously to make progress. In fact, existing self-adjusting algorithms are not designed

    Updated: 2020-04-08
  • Binary Neural Networks: A Survey
    arXiv.cs.NE Pub Date : 2020-03-31
    Haotong Qin; Ruihao Gong; Xianglong Liu; Xiao Bai; Jingkuan Song; Nicu Sebe

    The binary neural network, which greatly reduces storage and computation, serves as a promising technique for deploying deep models on resource-limited devices. However, the binarization inevitably causes severe information loss, and even worse, its discontinuity brings difficulty to the optimization of the deep network. To address these issues, a variety of algorithms have been proposed, and achieved

    Updated: 2020-04-08 (see the sketch below)
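
    Much of the work the survey above covers builds on sign-based binarization trained with a straight-through estimator (STE). The PyTorch sketch below shows that canonical building block only; it is a generic illustration, not a specific method from the survey.

        # Canonical sign binarization with a straight-through estimator (STE);
        # a generic illustration of the basic mechanism, not a specific surveyed method.
        import torch

        class BinarizeSTE(torch.autograd.Function):
            @staticmethod
            def forward(ctx, x):
                ctx.save_for_backward(x)
                return torch.sign(x)                 # forward: hard values in {-1, 0, +1}

            @staticmethod
            def backward(ctx, grad_output):
                (x,) = ctx.saved_tensors
                # STE: pass the gradient through where |x| <= 1, zero elsewhere
                return grad_output * (x.abs() <= 1).to(grad_output.dtype)

        x = torch.randn(4, requires_grad=True)
        BinarizeSTE.apply(x).sum().backward()
        print(x.grad)
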
  • DiagNet: towards a generic, Internet-scale root cause analysis solution
    arXiv.cs.NE Pub Date : 2020-04-07
    Loïck Bonniot (WIDE); Christoph Neumann (WIDE); François Taïani (WIDE)

    Diagnosing problems in Internet-scale services remains particularly difficult and costly for both content providers and ISPs. Because the Internet is decentralized, the cause of such problems might lie anywhere between an end-user's device and the service datacenters. Further, the set of possible problems and causes is not known in advance, making it impossible in practice to train a classifier with

    Updated: 2020-04-08
  • Beer Organoleptic Optimisation: Utilising Swarm Intelligence and Evolutionary Computation Methods
    arXiv.cs.NE Pub Date : 2020-04-07
    Mohammad Majid al-Rifaie; Marc Cavazza

    Customisation in food properties is a challenging task involving optimisation of the production process with the demand to support computational creativity which is geared towards ensuring the presence of alternatives. This paper addresses the personalisation of beer properties in the specific case of craft beers where the production process is more flexible. We investigate the problem by using three

    Updated: 2020-04-08
  • Attribution in Scale and Space
    arXiv.cs.NE Pub Date : 2020-04-03
    Shawn Xu; Subashini Venugopalan; Mukund Sundararajan

    We study the attribution problem [28] for deep networks applied to perception tasks. For vision tasks, attribution techniques attribute the prediction of a network to the pixels of the input image. We propose a new technique called \emph{Blur Integrated Gradients}. This technique has several advantages over other methods. First, it can tell at what scale a network recognizes an object. It produces

    Updated: 2020-04-08 (see the sketch below)
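
    As background for the entry above: standard Integrated Gradients accumulates input gradients along a straight-line path from a baseline to the input, and Blur Integrated Gradients, roughly speaking, replaces that path with progressively blurred versions of the input. The sketch below approximates only the standard baseline method with a Riemann sum; `model` is assumed to map a batch of shape (1, ...) to class scores.

        # Riemann-sum approximation of standard Integrated Gradients, the baseline
        # that Blur Integrated Gradients modifies; illustrative sketch only.
        import torch

        def integrated_gradients(model, x, baseline, target, steps=50):
            """IG_i ~ (x_i - x'_i) * mean_k dF/dx_i along the straight-line path."""
            total = torch.zeros_like(x)
            for k in range(1, steps + 1):
                point = (baseline + (k / steps) * (x - baseline)).detach().requires_grad_(True)
                score = model(point)[0, target]          # score of the target class
                total += torch.autograd.grad(score, point)[0]
            return (x - baseline) * total / steps
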
  • Neural Analogical Matching
    arXiv.cs.NE Pub Date : 2020-04-07
    Maxwell Crouse; Constantine Nakos; Ibrahim Abdelaziz; Kenneth Forbus

    Analogy is core to human cognition. It allows us to solve problems based on prior experience, it governs the way we conceptualize new information, and it even influences our visual perception. The importance of analogy to humans has made it an active area of research in the broader field of artificial intelligence, resulting in data-efficient models that learn and reason in human-like ways. While analogy

    Updated: 2020-04-08
  • Evolutionary Multi-Objective Optimization Driven by Generative Adversarial Networks
    arXiv.cs.NE Pub Date : 2019-07-10
    Cheng He; Shihua Huang; Ran Cheng; Kay Chen Tan; Yaochu Jin

    Recently, more and more works have proposed to drive evolutionary algorithms using machine learning models. Usually, the performance of such model-based evolutionary algorithms is highly dependent on the training qualities of the adopted models. Since it usually requires a certain amount of data (i.e., the candidate solutions generated by the algorithms) for model training, the performance deteriorates

    Updated: 2020-04-08
  • Symbiosis Promotes Fitness Improvements in the Game of Life
    arXiv.cs.NE Pub Date : 2019-08-19
    Peter D. Turney

    We present a computational simulation of evolving entities that includes symbiosis with shifting levels of selection. Evolution by natural selection shifts from the level of the original entities to the level of the new symbiotic entity. In the simulation, the fitness of an entity is measured by a series of one-on-one competitions in the Immigration Game, a two-player variation of Conway's Game of

    Updated: 2020-04-08
  • Application of Genetic Algorithm for More Efficient Multi-Layer Thickness Optimization in Solar Cells
    arXiv.cs.NE Pub Date : 2019-09-14
    Premkumar Vincent; Gwenaelle Cunha Sergio; Jaewon Jang; In Man Kang; Jaehoon Park; Hyeok Kim; Minho Lee; Jin-Hyuk Bae

    Thin-film solar cells are predominantly designed as a stacked structure. Optimizing the layer thicknesses in this stack structure is crucial to extract the best efficiency of the solar cell. The commonplace method used in optimization simulations, such as for optimizing the optical spacer layers' thicknesses, is the parameter sweep. Our simulation study shows that the implementation of a meta-heuristic

    Updated: 2020-04-08
  • Predicting the outputs of finite networks trained with noisy gradients
    arXiv.cs.NE Pub Date : 2020-04-02
    Gadi Naveh; Oded Ben-David; Haim Sompolinsky; Zohar Ringel

    A recent line of studies has focused on the infinite width limit of deep neural networks (DNNs) where, under a certain deterministic training protocol, the DNN outputs are described by a Gaussian Process (GP) whose kernel is the Neural Tangent Kernel (NTK). However, finite-width DNNs differ from GPs quantitatively and for CNNs the difference may be qualitative. Here we present a DNN training protocol involving

    Updated: 2020-04-06
  • Under the Hood of Neural Networks: Characterizing Learned Representations by Functional Neuron Populations and Network Ablations
    arXiv.cs.NE Pub Date : 2020-04-02
    Richard Meyes; Constantin Waubert de Puiseau; Andres Posada-Moreno; Tobias Meisen

    The need for more transparency of the decision-making processes in artificial neural networks steadily increases driven by their applications in safety critical and ethically challenging domains such as autonomous driving or medical diagnostics. We address today's lack of transparency of neural networks and shed light on the roles of single neurons and groups of neurons within the network fulfilling

    Updated: 2020-04-06
  • Does Comma Selection Help To Cope With Local Optima?
    arXiv.cs.NE Pub Date : 2020-04-02
    Benjamin Doerr

    One hope of using non-elitism in evolutionary computation is that it aids leaving local optima. We perform a rigorous runtime analysis of a basic non-elitist evolutionary algorithm (EA), the $(\mu,\lambda)$ EA, on the most basic benchmark function with a local optimum, the jump function. We prove that for all reasonable values of the parameters and the problem, the expected runtime of the $(\mu,\lambda)$

    Updated: 2020-04-06 (see the sketch below)
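
    For reference, the jump benchmark mentioned in the entry above is usually defined, for jump size k on bit strings of length n, as Jump_k(x) = k + |x|_1 when |x|_1 <= n - k or |x|_1 = n, and n - |x|_1 otherwise. A minimal sketch of that function with standard bit-flip mutation; the parameter values are illustrative only.

        # The standard jump benchmark from runtime analysis plus standard bit-flip
        # mutation; parameter values are illustrative only.
        import random

        def jump(x, k):
            n, ones = len(x), sum(x)
            if ones <= n - k or ones == n:
                return k + ones          # climbs towards the local optimum
            return n - ones              # the gap of k bits before the global optimum

        def bit_flip(x, rate):
            return [bit ^ (random.random() < rate) for bit in x]

        n, k = 20, 3
        x = [random.randint(0, 1) for _ in range(n)]
        print(jump(x, k), jump(bit_flip(x, 1.0 / n), k))
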
  • Neural Architecture Generator Optimization
    arXiv.cs.NE Pub Date : 2020-04-03
    Binxin Ru; Pedro Esperanca; Fabio Carlucci

    Neural Architecture Search (NAS) was first proposed to achieve state-of-the-art performance through the discovery of new architecture patterns, without human intervention. An over-reliance on expert knowledge in the search space design has however led to increased performance (local optima) without significant architectural breakthroughs, thus preventing truly novel solutions from being reached. In

    Updated: 2020-04-06
  • Generating Similarity Map in COVID-19 Transmission Dynamics with Topological Autoencoder
    arXiv.cs.NE Pub Date : 2020-04-03
    Pitoyo Hartono

    At the end of 2019 the world saw the initial outbreak of COVID-19, a disease caused by the SARS-CoV2 virus in China. The World Health Organization (WHO) declared this disease a pandemic on March 22 2020. As the disease spreads globally, it becomes difficult to track the transmission dynamics of this disease in all countries, as they may differ in geographical, demographic and strategic aspects

    Updated: 2020-04-06
  • Benchmarking Deep Spiking Neural Networks on Neuromorphic Hardware
    arXiv.cs.NE Pub Date : 2020-04-03
    Christoph Ostrau; Jonas Homburg; Christian Klarhorst; Michael Thies; Ulrich Rückert

    With more and more event-based neuromorphic hardware systems being developed at universities and in industry, there is a growing need for assessing their performance with domain specific measures. In this work, we use the methodology of converting pre-trained non-spiking to spiking neural networks to evaluate the performance loss and measure the energy-per-inference for three neuromorphic hardware

    Updated: 2020-04-06
  • The data-driven physical-based equations discovery using evolutionary approach
    arXiv.cs.NE Pub Date : 2020-04-03
    Alexander Hvatov; Mikhail Maslyaev

    Modern machine learning methods allow one to obtain data-driven models in various ways. However, the more complex the model is, the harder it is to interpret. In the paper, we describe an algorithm for discovering mathematical equations from given observational data. The algorithm combines genetic programming with sparse regression. This algorithm allows obtaining different forms

    Updated: 2020-04-06
  • Fractional Deep Neural Network via Constrained Optimization
    arXiv.cs.NE Pub Date : 2020-04-01
    Harbir Antil; Ratna Khatri; Rainald Löhner; Deepanshu Verma

    This paper introduces a novel algorithmic framework for a deep neural network (DNN), which in a mathematically rigorous manner, allows us to incorporate history (or memory) into the network -- it ensures all layers are connected to one another. This DNN, called Fractional-DNN, can be viewed as a time-discretization of a fractional in time nonlinear ordinary differential equation (ODE). The learning

    Updated: 2020-04-03
  • Device-aware inference operations in SONOS nonvolatile memory arrays
    arXiv.cs.NE Pub Date : 2020-04-02
    Christopher H. Bennett; T. Patrick Xiao; Ryan Dellana; Vineet Agrawal; Ben Feinberg; Venkatraman Prabhakar; Krishnaswamy Ramkumar; Long Hinh; Swatilekha Saha; Vijay Raghavan; Ramesh Chettuvetty; Sapan Agarwal; Matthew J. Marinella

    Non-volatile memory arrays can deploy pre-trained neural network models for edge inference. However, these systems are affected by device-level noise and retention issues. Here, we examine the damage caused by these effects, introduce a mitigation strategy, and demonstrate its use in a fabricated array of SONOS (Silicon-Oxide-Nitride-Oxide-Silicon) devices. On MNIST, fashion-MNIST, and CIFAR-10 tasks, our

    Updated: 2020-04-03
  • Projected Neural Network for a Class of Sparse Regression with Cardinality Penalty
    arXiv.cs.NE Pub Date : 2020-04-02
    Wenjing Li; Wei Bian

    In this paper, we consider a class of sparse regression problems, whose objective function is the summation of a convex loss function and a cardinality penalty. By constructing a smoothing function for the cardinality function, we propose a projected neural network and design a correction method for solving this problem. The solution of the proposed neural network is unique, globally existent, bounded

    Updated: 2020-04-03 (see the sketch below)
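
    The objective described in the entry above has the generic form min_x f(x) + lambda * ||x||_0, where ||x||_0 counts nonzero entries. The sketch below evaluates such an objective for a least-squares loss, together with a capped-l1 smoothing of the cardinality term; the capped-l1 choice is a common generic smoothing, not necessarily the one constructed in the paper.

        # Cardinality-penalized least squares and a capped-l1 smoothing of the l0 term;
        # the smoothing shown here is a generic choice, not necessarily the paper's.
        import numpy as np

        def objective(x, A, b, lam):
            return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.count_nonzero(x)

        def smoothed_objective(x, A, b, lam, mu=1e-2):
            card = np.sum(np.minimum(np.abs(x) / mu, 1.0))   # -> ||x||_0 as mu -> 0
            return 0.5 * np.sum((A @ x - b) ** 2) + lam * card

        rng = np.random.default_rng(0)
        A, b = rng.standard_normal((10, 5)), rng.standard_normal(10)
        x = np.array([1.0, 0.0, 0.0, -0.5, 0.0])
        print(objective(x, A, b, 0.1), smoothed_objective(x, A, b, 0.1))
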
  • Neuronal Sequence Models for Bayesian Online Inference
    arXiv.cs.NE Pub Date : 2020-04-02
    Sascha Frölich; Dimitrije Marković; Stefan J. Kiebel

    Sequential neuronal activity underlies a wide range of processes in the brain. Neuroscientific evidence for neuronal sequences has been reported in domains as diverse as perception, motor control, speech, spatial navigation and memory. Consequently, different dynamical principles have been proposed as possible sequence-generating mechanisms. Combining experimental findings with computational concepts

    Updated: 2020-04-03
  • GraphChallenge.org Sparse Deep Neural Network Performance
    arXiv.cs.NE Pub Date : 2020-03-25
    Jeremy Kepner; Simon Alford; Vijay Gadepally; Michael Jones; Lauren Milechin; Albert Reuther; Ryan Robinett; Sid Samsi

    The MIT/IEEE/Amazon GraphChallenge.org encourages community approaches to developing new solutions for analyzing graphs and sparse data. Sparse AI analytics present unique scalability difficulties. The Sparse Deep Neural Network (DNN) Challenge draws upon prior challenges from machine learning, high performance computing, and visual analytics to create a challenge that is reflective of emerging sparse

    Updated: 2020-04-03
  • Deep Molecular Programming: A Natural Implementation of Binary-Weight ReLU Neural Networks
    arXiv.cs.NE Pub Date : 2020-03-30
    Marko Vasic; Cameron Chalk; Sarfraz Khurshid; David Soloveichik

    Embedding computation in molecular contexts incompatible with traditional electronics is expected to have wide ranging impact in synthetic biology, medicine, nanofabrication and other fields. A key remaining challenge lies in developing programming paradigms for molecular computation that are well-aligned with the underlying chemical hardware and do not attempt to shoehorn ill-fitting electronics paradigms

    Updated: 2020-04-01
  • The Operating System of the Neuromorphic BrainScaleS-1 System
    arXiv.cs.NE Pub Date : 2020-03-30
    Eric Müller; Sebastian Schmitt; Christian Mauch; Sebastian Billaudelle; Andreas Grübl; Maurice Güttler; Dan Husmann; Joscha Ilmberger; Sebastian Jeltsch; Jakob Kaiser; Johann Klähn; Mitja Kleider; Christoph Koke; José Montes; Paul Müller; Johannes Partzsch; Felix Passenberg; Hartmut Schmidt; Bernhard Vogginger; Jonas Weidner; Christian Mayr; Johannes Schemmel

    BrainScaleS-1 is a wafer-scale mixed-signal accelerated neuromorphic system targeted for research in the fields of computational neuroscience and beyond-von-Neumann computing. The BrainScaleS Operating System (BrainScaleS OS) is a software stack giving users the possibility to emulate networks described in the high-level network description language PyNN with minimal knowledge of the system. At the

    Updated: 2020-04-01
  • Extending BrainScaleS OS for BrainScaleS-2
    arXiv.cs.NE Pub Date : 2020-03-30
    Eric Müller; Christian Mauch; Philipp Spilger; Oliver Julien Breitwieser; Johann Klähn; David Stöckel; Timo Wunderlich; Johannes Schemmel

    BrainScaleS-2 is a mixed-signal accelerated neuromorphic system targeted for research in the fields of computational neuroscience and beyond-von-Neumann computing. To augment its flexibility, the analog neural network core is accompanied by an embedded SIMD microprocessor. The BrainScaleS Operating System (BrainScaleS OS) is a software stack designed for the user-friendly operation of the BrainScaleS

    Updated: 2020-04-01
  • Initial Design Strategies and their Effects on Sequential Model-Based Optimization
    arXiv.cs.NE Pub Date : 2020-03-30
    Jakob Bossek; Carola Doerr; Pascal Kerschke

    Sequential model-based optimization (SMBO) approaches are algorithms for solving problems that require computationally or otherwise expensive function evaluations. The key design principle of SMBO is a substitution of the true objective function by a surrogate, which is used to propose the point(s) to be evaluated next. SMBO algorithms are intrinsically modular, leaving the user with many important

    Updated: 2020-04-01
  • Genetic Algorithmic Parameter Optimisation of a Recurrent Spiking Neural Network Model
    arXiv.cs.NE Pub Date : 2020-03-30
    Ifeatu Ezenwe; Alok Joshi; KongFatt Wong-Lin

    Neural networks are complex algorithms that loosely model the behaviour of the human brain. They play a significant role in computational neuroscience and artificial intelligence. The next generation of neural network models is based on the spike timing activity of neurons: spiking neural networks (SNNs). However, model parameters in SNNs are difficult to search and optimise. Previous studies using

    Updated: 2020-04-01
  • MUXConv: Information Multiplexing in Convolutional Neural Networks
    arXiv.cs.NE Pub Date : 2020-03-31
    Zhichao Lu; Kalyanmoy Deb; Vishnu Naresh Boddeti

    Convolutional neural networks have witnessed remarkable improvements in computational efficiency in recent years. A key driving force has been the idea of trading-off model expressivity and efficiency through a combination of $1\times 1$ and depth-wise separable convolutions in lieu of a standard convolutional layer. The price of the efficiency, however, is the sub-optimal flow of information across

    Updated: 2020-04-01 (see the sketch below)
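
    The efficiency trade-off mentioned in the entry above rests on the standard factorization of a convolution into a depth-wise 3x3 convolution followed by a point-wise 1x1 convolution. The PyTorch sketch below shows only that commonly used building block; it is not the MUXConv operator itself.

        # Standard depthwise-separable block (depth-wise 3x3 + point-wise 1x1);
        # the common building block referred to above, not the MUXConv layer itself.
        import torch
        import torch.nn as nn

        def depthwise_separable(in_ch, out_ch):
            return nn.Sequential(
                nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch, bias=False),
                nn.BatchNorm2d(in_ch),
                nn.ReLU(inplace=True),
                nn.Conv2d(in_ch, out_ch, 1, bias=False),   # 1x1 point-wise channel mixing
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )

        block = depthwise_separable(32, 64)
        print(block(torch.randn(1, 32, 56, 56)).shape)     # torch.Size([1, 64, 56, 56])
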
  • Balanced One-shot Neural Architecture Optimization
    arXiv.cs.NE Pub Date : 2019-09-24
    Renqian Luo; Tao Qin; Enhong Chen

    The ability to rank candidate architectures is the key to the performance of neural architecture search (NAS). One-shot NAS is proposed to reduce the expense but shows inferior performance against conventional NAS and is not adequately stable. We investigate this and find that the ranking correlation between architectures under one-shot training and the ones under stand-alone full training is

    Updated: 2020-04-01
  • On robot compliance. A cerebellar control approach
    arXiv.cs.NE Pub Date : 2020-03-02
    Ignacio Abadia; Francisco Naveros; Jesus A. Garrido; Eduardo Ros; Niceto R. Luque

    The work presented here is a novel biological approach for the compliant control of a robotic arm in real time (RT). We integrate a spiking cerebellar network at the core of a feedback control loop performing torque-driven control. The spiking cerebellar controller provides torque commands allowing for accurate and coordinated arm movements. To compute these output motor commands, the spiking cerebellar

    Updated: 2020-04-01
  • VOR Adaptation on a Humanoid iCub Robot Using a Spiking Cerebellar Model
    arXiv.cs.NE Pub Date : 2020-03-03
    Francisco Naveros; Niceto R. Luque; Eduardo Ros; Angelo Arleo

    We embed a spiking cerebellar model within an adaptive real-time (RT) control loop that is able to operate a real robotic body (iCub) when performing different vestibulo-ocular reflex (VOR) tasks. The spiking neural network computation, including event- and time-driven neural dynamics, neural activity, and spike-timing dependent plasticity (STDP) mechanisms, leads to a nondeterministic computation

    Updated: 2020-04-01
  • Distributed Embodied Evolution in Networks of Agents
    arXiv.cs.NE Pub Date : 2020-03-28
    Anil Yaman; Giovanni Iacca

    In most network problems, the optimum behaviors of agents in the network are not known before deployment. In addition, agents might be required to adapt, i.e. change their behavior based on environmental conditions. In these scenarios, offline optimization is usually costly and inefficient, while online methods might be more suitable. In this work we propose a distributed embodied evolutionary

    Updated: 2020-03-31
  • NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search
    arXiv.cs.NE Pub Date : 2020-03-28
    Chen Wei; Chuang Niu; Yiping Tang; Jimin Liang

    Neural architecture search (NAS) is a promising method for automatically finding excellent architectures. Commonly used search strategies such as evolutionary algorithms, Bayesian optimization, and predictor-based methods employ a predictor to rank sampled architectures. In this paper, we propose two predictor-based algorithms, NPUBO and NPENAS, for neural architecture search. Firstly, we propose NPUBO, which

    Updated: 2020-03-31
  • Data-Driven Neuromorphic DRAM-based CNN and RNN Accelerators
    arXiv.cs.NE Pub Date : 2020-03-29
    Tobi Delbruck; Shih-Chii Liu

    The energy consumed by running large deep neural networks (DNNs) on hardware accelerators is dominated by the need for lots of fast memory to store both states and weights. This large required memory is currently only economically viable through DRAM. Although DRAM is high-throughput and low-cost memory (costing 20X less than SRAM), its long random access latency is bad for the unpredictable access

    Updated: 2020-03-31
  • Learning Latent Causal Structures with a Redundant Input Neural Network
    arXiv.cs.NE Pub Date : 2020-03-29
    Jonathan D. Young; Bryan Andrews; Gregory F. Cooper; Xinghua Lu

    Most causal discovery algorithms find causal structure among a set of observed variables. Learning the causal structure among latent variables remains an important open problem, particularly when using high-dimensional data. In this paper, we address a problem for which it is known that inputs cause outputs, and these causal relationships are encoded by a causal network among a set of an unknown number

    Updated: 2020-03-31
  • Environmental Adaptation of Robot Morphology and Control through Real-world Evolution
    arXiv.cs.NE Pub Date : 2020-03-30
    Tønnes F. Nygaard; Charles P. Martin; David Howard; Jim Torresen; Kyrre Glette

    Robots operating in the real world will experience a range of different environments and tasks. It is essential for the robot to have the ability to adapt to its surroundings to work efficiently in changing conditions. Evolutionary robotics aims to solve this by optimizing both the control and body (morphology) of a robot, allowing adaptation to internal, as well as external factors. Most work in this

    Updated: 2020-03-31
  • The Hessian Estimation Evolution Strategy
    arXiv.cs.NE Pub Date : 2020-03-30
    Tobias Glasmachers; Oswin Krause

    We present a novel black box optimization algorithm called Hessian Estimation Evolution Strategy. The algorithm updates the covariance matrix of its sampling distribution by directly estimating the curvature of the objective function. This algorithm design is targeted at twice continuously differentiable problems. For this, we extend the cumulative step-size adaptation algorithm of the CMA-ES to mirrored

    Updated: 2020-03-31 (see the sketch below)
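
    The curvature estimation in the entry above can be read through the standard second-difference identity f(m + d) + f(m - d) - 2 f(m) ~ d^T H d for a mirrored pair of samples around the mean m. The sketch below only illustrates that identity on a toy quadratic; it is not the covariance update of the proposed algorithm.

        # Second-difference curvature estimate from a mirrored sample pair,
        # f(m + d) + f(m - d) - 2 f(m) ~ d^T H d, shown on a toy quadratic.
        import numpy as np

        H = np.array([[3.0, 0.5],
                      [0.5, 1.0]])                 # toy Hessian

        def f(x):
            return 0.5 * x @ H @ x

        m = np.array([1.0, -2.0])                  # current mean
        d = 1e-3 * np.array([0.6, -0.8])           # mirrored perturbation +d / -d
        estimate = f(m + d) + f(m - d) - 2.0 * f(m)
        print(estimate, d @ H @ d)                 # both equal the same curvature value here
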
  • Empirical Comparison of Graph Embeddings for Trust-Based Collaborative Filtering
    arXiv.cs.NE Pub Date : 2020-03-30
    Tomislav Duricic; Hussain Hussain; Emanuel Lacic; Dominik Kowald; Denis Helic; Elisabeth Lex

    In this work, we study the utility of graph embeddings to generate latent user representations for trust-based collaborative filtering. In a cold-start setting, on three publicly available datasets, we evaluate approaches from four method families: (i) factorization-based, (ii) random walk-based, (iii) deep learning-based, and (iv) the Large-scale Information Network Embedding (LINE) approach. We find

    Updated: 2020-03-31
  • Critical Limits in a Bump Attractor Network of Spiking Neurons
    arXiv.cs.NE Pub Date : 2020-03-30
    Alberto Arturo Vergani; Christian Robert Huyck

    A bump attractor network is a model that implements a competitive neuronal process emerging from a spike pattern related to an input source. Since the bump network could behave in many ways, this paper explores some critical limits of the parameter space using various positive and negative weights and an increasing size of the input spike sources. The neuromorphic simulation of the bump attractor network

    Updated: 2020-03-31
  • Predicting Elastic Properties of Materials from Electronic Charge Density Using 3D Deep Convolutional Neural Networks
    arXiv.cs.NE Pub Date : 2020-03-17
    Yong Zhao; Kunpeng Yuan; Yinqiao Liu; Steph-Yves Loius; Ming Hu; Jianjun Hu

    Materials representation plays a key role in machine learning based prediction of materials properties and new materials discovery. Currently both graph and 3D voxel representation methods are based on the heterogeneous elements of the crystal structures. Here, we propose to use electronic charge density (ECD) as a generic unified 3D descriptor for materials property prediction with the advantage of

    Updated: 2020-03-31
  • SHX: Search History Driven Crossover for Real-Coded Genetic Algorithm
    arXiv.cs.NE Pub Date : 2020-03-30
    Takumi Nakane; Xuequan Lu; Chao Zhang

    In evolutionary algorithms, genetic operators iteratively generate new offspring which constitute a potentially valuable set of search history. To boost the performance of crossover in the real-coded genetic algorithm (RCGA), in this paper we propose to exploit the search history cached so far in an online style during the iteration. Specifically, survivor individuals over the past few generations are collected

    Updated: 2020-03-31
  • Re-purposing Heterogeneous Generative Ensembles with Evolutionary Computation
    arXiv.cs.NE Pub Date : 2020-03-30
    Jamal Toutouh; Erik Hemberg; Una-May O'Reilly

    Generative Adversarial Networks (GANs) are popular tools for generative modeling. The dynamics of their adversarial learning give rise to convergence pathologies during training such as mode and discriminator collapse. In machine learning, ensembles of predictors demonstrate better results than a single predictor for many tasks. In this study, we apply two evolutionary algorithms (EAs) to create ensembles

    Updated: 2020-03-31
  • What it Thinks is Important is Important: Robustness Transfers through Input Gradients
    arXiv.cs.NE Pub Date : 2019-12-11
    Alvin Chan; Yi Tay; Yew-Soon Ong

    Adversarial perturbations are imperceptible changes to input pixels that can change the prediction of deep learning models. Learned weights of models robust to such perturbations are previously found to be transferable across different tasks but this applies only if the model architecture for the source and target tasks is the same. Input gradients characterize how small changes at each input pixel

    Updated: 2020-03-31 (see the sketch below)
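
    The input gradients referred to in the entry above are simply the derivatives of the training loss with respect to the input pixels. The PyTorch sketch below computes them for a generic classifier; `model`, `images` and `labels` are illustrative placeholders.

        # Computing input gradients (d loss / d input) for a generic classifier;
        # `model`, `images` and `labels` are illustrative placeholders.
        import torch
        import torch.nn.functional as F

        def input_gradients(model, images, labels):
            images = images.clone().detach().requires_grad_(True)
            loss = F.cross_entropy(model(images), labels)
            loss.backward()
            return images.grad                     # same shape as the input batch
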
  • Boolean learning under noise-perturbations in hardware neural networks
    arXiv.cs.NE Pub Date : 2020-03-27
    Louis Andreoli; Xavier Porte; Stéphane Chrétien; Maxime Jacquot; Laurent Larger; Daniel Brunner

    A high efficiency hardware integration of neural networks benefits from realizing nonlinearity, network connectivity and learning fully in a physical substrate. Multiple systems have recently implemented some or all of these operations, yet the focus was placed on addressing technological challenges. Fundamental questions regarding learning in hardware neural networks remain largely unexplored. Noise

    Updated: 2020-03-30
  • Rolling Horizon Evolutionary Algorithms for General Video Game Playing
    arXiv.cs.NE Pub Date : 2020-03-27
    Raluca D. Gaina; Sam Devlin; Simon M. Lucas; Diego Perez-Liebana

    Game-playing Evolutionary Algorithms, specifically Rolling Horizon Evolutionary Algorithms, have recently managed to beat the state of the art in performance across many games. However, the best results per game are highly dependent on the specific configuration of modifications and hybrids introduced over several works, each described as parameters in the algorithm. However, the search for the best

    Updated: 2020-03-30
  • Can We Use Split Learning on 1D CNN Models for Privacy Preserving Training?
    arXiv.cs.NE Pub Date : 2020-03-16
    Sharif Abuadbba; Kyuyeon Kim; Minki Kim; Chandra Thapa; Seyit A. Camtepe; Yansong Gao; Hyoungshick Kim; Surya Nepal

    A new collaborative learning, called split learning, was recently introduced, aiming to protect user data privacy without revealing raw input data to a server. It collaboratively runs a deep neural network model where the model is split into two parts, one for the client and the other for the server. Therefore, the server has no direct access to raw data processed at the client. Until now, the split

    Updated: 2020-03-30 (see the sketch below)
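
    Split learning, as summarized in the entry above, divides one network at a cut layer: the client runs the early layers on raw data and only the resulting activations cross to the server. A minimal single-process PyTorch sketch of that data flow for a 1D CNN; the layer shapes and the cut point are illustrative, not the configurations studied in the paper.

        # Single-process sketch of the split-learning data flow for a 1D CNN:
        # the client runs the layers before the cut, only activations reach the server.
        import torch
        import torch.nn as nn

        client_part = nn.Sequential(               # runs on the client, sees raw signals
            nn.Conv1d(1, 8, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        server_part = nn.Sequential(               # runs on the server, never sees raw data
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(8, 2),
        )

        raw = torch.randn(4, 1, 128)               # four 1D signals of length 128
        smashed = client_part(raw)                 # activations sent over the network
        print(server_part(smashed).shape)          # torch.Size([4, 2])
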
  • Learning representations in Bayesian Confidence Propagation neural networks
    arXiv.cs.NE Pub Date : 2020-03-27
    Naresh Balaji Ravichandran; Anders Lansner; Pawel Herman

    Unsupervised learning of hierarchical representations has been one of the most vibrant research directions in deep learning during recent years. In this work we study biologically inspired unsupervised strategies in neural networks based on local Hebbian learning. We propose new mechanisms to extend the Bayesian Confidence Propagating Neural Network (BCPNN) architecture, and demonstrate their capability

    Updated: 2020-03-30
  • Evolutionary Bin Packing for Memory-Efficient Dataflow Inference Acceleration on FPGA
    arXiv.cs.NE Pub Date : 2020-03-24
    Mairin Kroes; Lucian Petrica; Sorin Cotofana; Michaela Blott

    Convolutional neural network (CNN) dataflow inference accelerators implemented in Field Programmable Gate Arrays (FPGAs) have demonstrated increased energy efficiency and lower latency compared to CNN execution on CPUs or GPUs. However, the complex shapes of CNN parameter memories do not typically map well to FPGA on-chip memories (OCM), which results in poor OCM utilization and ultimately limits the

    Updated: 2020-03-30
  • Bayesian Hierarchical Multi-Objective Optimization for Vehicle Parking Route Discovery
    arXiv.cs.NE Pub Date : 2020-03-27
    Romit S Beed; Sunita Sarkar; Arindam Roy

    Discovering an optimal route to the most feasible parking lot has been a matter of concern for any driver; the problem is aggravated further during peak hours of the day and at congested places, leading to considerable wastage of time and fuel. This paper proposes a Bayesian hierarchical technique for obtaining the optimal route to a parking lot. The route selection is based on conflicting objectives and

    Updated: 2020-03-30
  • Automatically designing CNN architectures using genetic algorithm for image classification
    arXiv.cs.NE Pub Date : 2018-08-11
    Yanan Sun; Bing Xue; Mengjie Zhang; Gary G. Yen

    Convolutional Neural Networks (CNNs) have gained a remarkable success on many image classification tasks in recent years. However, the performance of CNNs highly relies upon their architectures. For most state-of-the-art CNNs, their architectures are often manually-designed with expertise in both CNNs and the investigated problems. Therefore, it is difficult for users, who have no extended expertise

    Updated: 2020-03-30
  • Evolving Plasticity for Autonomous Learning under Changing Environmental Conditions
    arXiv.cs.NE Pub Date : 2019-04-02
    Anil Yaman; Giovanni Iacca; Decebal Constantin Mocanu; Matt Coler; George Fletcher; Mykola Pechenizkiy

    A fundamental aspect of learning in biological neural networks (BNNs) is the plasticity property, which allows them to modify their configurations during their lifetime. Hebbian learning is a biologically plausible mechanism for modeling the plasticity property based on the local activation of neurons. In this work, we employ genetic algorithms to evolve local learning rules, from a Hebbian perspective

    Updated: 2020-03-30 (see the sketch below)
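
    Local Hebbian plasticity of the kind discussed in the entry above is often written as a parameterized rule delta_w = eta * (A*pre*post + B*pre + C*post + D), whose coefficients are what an evolutionary search can tune. The sketch below applies one such generic rule; the coefficient values are arbitrary placeholders, not evolved rules from the paper.

        # A generic parameterized Hebbian update; the coefficients are arbitrary
        # placeholders, not evolved values from the paper.
        import numpy as np

        def hebbian_update(w, pre, post, eta=0.01, A=1.0, B=0.0, C=0.0, D=0.0):
            # outer product couples each presynaptic activity with each postsynaptic one
            return w + eta * (A * np.outer(post, pre) + B * pre + C * post[:, None] + D)

        w = np.zeros((3, 4))                       # 4 presynaptic -> 3 postsynaptic weights
        pre = np.array([1.0, 0.0, 1.0, 0.5])
        post = np.array([0.2, 0.0, 1.0])
        print(hebbian_update(w, pre, post))
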
  • Utilizing Differential Evolution into optimizing targeted cancer treatments
    arXiv.cs.NE Pub Date : 2020-03-21
    Michail-Antisthenis Tsompanas; Larry Bull; Andrew Adamatzky; Igor Balaz

    Working towards the development of an evolvable cancer treatment simulator, Differential Evolution was investigated, motivated by the high efficiency of variations of this technique on real-valued problems. A basic DE algorithm, namely "DE/rand/1", was used to optimize the simulated design of a targeted drug delivery system for tumor treatment on the PhysiCell simulator. The suggested

    Updated: 2020-03-28 (see the sketch below)
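
    The DE/rand/1 scheme named in the entry above builds a mutant v = x_r1 + F * (x_r2 - x_r3) from three distinct population members and recombines it with the target vector by binomial crossover. A minimal sketch on a toy sphere function; F, CR and the population size are illustrative defaults, not the settings used for the PhysiCell experiments.

        # Minimal DE/rand/1 with binomial crossover on a toy sphere function;
        # F, CR and the population size are illustrative defaults only.
        import random

        def sphere(x):
            return sum(v * v for v in x)

        def de_rand_1(fitness, dim=5, pop_size=20, F=0.8, CR=0.9, gens=200, bounds=(-5, 5)):
            pop = [[random.uniform(*bounds) for _ in range(dim)] for _ in range(pop_size)]
            for _ in range(gens):
                for i in range(pop_size):
                    r1, r2, r3 = random.sample([j for j in range(pop_size) if j != i], 3)
                    mutant = [pop[r1][d] + F * (pop[r2][d] - pop[r3][d]) for d in range(dim)]
                    j_rand = random.randrange(dim)     # guarantee at least one mutant gene
                    trial = [mutant[d] if (random.random() < CR or d == j_rand) else pop[i][d]
                             for d in range(dim)]
                    if fitness(trial) <= fitness(pop[i]):   # greedy one-to-one selection
                        pop[i] = trial
            return min(pop, key=fitness)

        print(sphere(de_rand_1(sphere)))
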
  • Sampled Training and Node Inheritance for Fast Evolutionary Neural Architecture Search
    arXiv.cs.NE Pub Date : 2020-03-07
    Haoyu Zhang; Yaochu Jin; Ran Cheng; Kuangrong Hao

    The performance of a deep neural network is heavily dependent on its architecture and various neural architecture search strategies have been developed for automated network architecture design. Recently, evolutionary neural architecture search (ENAS) has received increasing attention due to the attractive global optimization capability of evolutionary algorithms. However, ENAS suffers from extremely

    Updated: 2020-03-28
  • Novelty search employed into the development of cancer treatment simulations
    arXiv.cs.NE Pub Date : 2020-03-21
    Michail-Antisthenis Tsompanas; Larry Bull; Andrew Adamatzky; Igor Balaz

    Conventional optimization methodologies may be hindered when the automated search is stuck in local optima because of a deceptive objective function landscape. Consequently, open-ended search methodologies, such as novelty search, have been proposed to tackle this issue. Overlooking the objective, while putting pressure on discovering novel solutions, may lead to better solutions in practical problems

    Updated: 2020-03-28
Contents have been reproduced by permission of the publishers.