Peter Bentley
University College London, UK

Title: Building a Nature-Inspired Computer

Abstract:

Since before the birth of computers we have strived to make intelligent machines that share some of the properties of our own brains. We have tried to make devices that quickly solve problems that we find time consuming, that adapt to our needs, and that learn and derive new information. In more recent years we have tried to add new capabilities to our devices: self-adaptation, fault tolerance, self-repair, even self-programming or self-building. In pursuing these challenging goals we push the boundaries of computer and software architectures. We invent new parallel processing approaches or we exploit hardware in new ways.

For the last decade Peter Bentley and his group have made their own journey in this area. To overcome the observed incompatibilities between conventional architectures and biological processes, Bentley created the Systemic Computer – a computing paradigm and architecture designed to process information in a way more similar to natural systems. The computer uses a systemic world-view. Instead of the traditional centralized view of computation, here all computation is distributed. There is no separation of data and code/functionality into memory, ALU, and I/O. Everything in systemic computation is composed of systems, which may not be destroyed but may transform each other through their interactions, akin to collision-based computing. Two systems interact in the context of a third system, which defines the result of their interaction. All interactions may be separated and embedded within scopes, which are also systems, enabling embedded hierarchies. Systemic computation makes the following assertions (see the sketch after the list):

  • Everything is a system.
  • Systems can be transformed but never destroyed or created from nothing.
  • Systems may comprise or share other nested systems.
  • Systems interact, and interaction between systems may cause transformations of those systems, where the nature of that transformation is determined by a contextual system.
  • All systems can potentially act as context and affect the interactions of other systems, and all systems can potentially interact in some context.
  • The transformation of systems is constrained by the scope of systems, and systems may have partial membership within the scope of a system.
  • Computation is transformation.
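
To make these assertions concrete, here is a minimal Python sketch of one flavour of systemic interaction: two systems are transformed in the context of a third, inside a scope that is itself a system. The names (System, step, summing_context) and the particular transformation are illustrative assumptions for exposition, not part of Bentley's implementations.

    import random

    class System:
        """Everything is a system: data, contexts, and scopes alike."""
        def __init__(self, name, state=None, transform=None, members=None):
            self.name = name              # label for readability only
            self.state = state            # arbitrary payload carried by the system
            self.transform = transform    # behaviour when acting as a context
            self.members = members if members is not None else []  # nested scope

    def step(scope):
        """One systemic interaction: within a scope, pick a contextual system
        and two other systems at random; the context transforms the pair.
        Systems are changed, never created or destroyed."""
        contexts = [s for s in scope.members if s.transform is not None]
        if not contexts or len(scope.members) < 3:
            return
        ctx = random.choice(contexts)
        a, b = random.sample([s for s in scope.members if s is not ctx], 2)
        ctx.transform(a, b)

    # A hypothetical context: the pair's states are summed into one system,
    # a transformation reminiscent of sorting pebbles into a single pile.
    def summing_context(a, b):
        a.state, b.state = (a.state or 0) + (b.state or 0), 0

    world = System("world", members=[      # the scope, itself a system
        System("s1", state=3),
        System("s2", state=5),
        System("ctx", transform=summing_context),
    ])

    for _ in range(10):                    # stochastic, decentralized execution
        step(world)
    print([(s.name, s.state) for s in world.members if s.transform is None])

Note that the transformation conserves the pair's total state: systems are transformed but never destroyed or created from nothing, and the context system alone determines the result of the interaction.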

Computation has always meant transformation, whether it is the transformation of the positions of beads on an abacus or of electrons in a CPU. But this simple definition also allows us to call the sorting of pebbles on a beach, the transcription of proteins, or the growth of dendrites in the brain valid forms of computation. Such a definition is important, for it provides a common language for biology and computer science, enabling both to be understood in terms of computation.

The systemic computer is designed to enable many features of natural computation and to provide an effective platform for biological modeling and bio-inspired algorithms. Several different implementations of the systemic computer have been created, each with its own advantages and disadvantages. Simulators on conventional computers have enabled the demonstration of bio-inspired algorithms, fault tolerance, self-repair, and the modeling of biological processes. To improve speed, an FPGA-based hardware implementation was created and shown to be several orders of magnitude faster. A GPU-based implementation was also created in order to combine flexibility, scalability, and speed.

Through this work, many important lessons have been learned. In addition to the advances in bio-inspired computing, it is increasingly possible to see parallels between systemic computing and other techniques and architectures under development. High performance graph-based computing or novel hardware based on memristors or neural modeling may provide excellent new substrates for systemic-style computation in the future.
