A physical neural network is a type of artificial neural network in which an electrically adjustable material is used to emulate the function of a neural synapse or a higher-order (dendritic) neuron model.[1] "Physical" neural network emphasizes the reliance on physical hardware to emulate neurons, as opposed to software-based approaches. More generally, the term applies to other artificial neural networks in which a memristor or other electrically adjustable resistance material is used to emulate a neural synapse.[2][3]
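The basic idea can be illustrated with a minimal sketch: an adjustable resistance plays the role of a synaptic weight, so each input contributes a current proportional to its conductance, and the neuron thresholds the summed current. The code below is an illustrative assumption (the function name, values, and step nonlinearity are not taken from the cited sources).

```python
# Minimal sketch: a "physical" synapse modelled as an adjustable conductance G,
# contributing a current I = G * V (Ohm's law) to a summing node.
# All names and numbers here are illustrative assumptions.

def neuron_output(input_voltages, conductances, threshold=1.0):
    """Sum the currents I_k = G_k * V_k from each synapse and apply a
    simple step nonlinearity at the summing node."""
    total_current = sum(g * v for g, v in zip(conductances, input_voltages))
    return 1 if total_current >= threshold else 0

# Two inputs weighted by two electrically adjustable conductances.
print(neuron_output([0.8, 0.3], [1.2, 0.5]))  # -> 1, since 0.96 + 0.15 >= 1.0
```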
In the 1960s, Bernard Widrow and Ted Hoff developed ADALINE (Adaptive Linear Neuron), which used electrochemical cells called memistors (memory resistors) to emulate the synapses of an artificial neuron.[4] The memistors were implemented as three-terminal devices operating on the reversible electroplating of copper, such that the resistance between two of the terminals is controlled by the integral of the current applied via the third terminal. The ADALINE circuitry was briefly commercialized by the Memistor Corporation in the 1960s, enabling some applications in pattern recognition. However, since the memistors were not fabricated using integrated-circuit fabrication techniques, the technology was not scalable and was eventually abandoned as solid-state electronics became mature.[5]
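As a rough illustration of how a memistor-like weight could be trained, the following sketch models the conductance as proportional to the time-integral of the control current and applies the least-mean-squares (LMS) rule used with ADALINE. The class, proportionality constant, learning rate, and training data are assumptions for illustration, not a reconstruction of the original circuitry.

```python
# Illustrative sketch: a memistor-like weight whose conductance tracks the
# integral of the current applied at its control terminal, used as an
# ADALINE weight trained with the LMS rule. Constants are assumed.

class MemistorWeight:
    def __init__(self, k=1.0):
        self.charge = 0.0   # integral of control current (arbitrary units)
        self.k = k          # conductance per unit of accumulated charge (assumed)

    def adjust(self, control_current, dt=1.0):
        self.charge += control_current * dt   # plating/deplating shifts the resistance

    @property
    def conductance(self):
        return self.k * self.charge

def adaline_train(samples, targets, lr=0.1, epochs=20):
    """LMS training of a single ADALINE unit with memistor-like weights."""
    weights = [MemistorWeight() for _ in samples[0]]
    bias = MemistorWeight()
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            y = sum(w.conductance * xi for w, xi in zip(weights, x)) + bias.conductance
            err = t - y
            for w, xi in zip(weights, x):
                w.adjust(lr * err * xi)       # LMS correction applied as a control current
            bias.adjust(lr * err)
    return weights, bias

# Learn a bipolar AND function as a toy example.
weights, bias = adaline_train([[0, 0], [0, 1], [1, 0], [1, 1]], [-1, -1, -1, 1])
print([round(w.conductance, 2) for w in weights], round(bias.conductance, 2))
```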
In 1989, Carver Mead published his book Analog VLSI and Neural Systems,[6] which spun off perhaps the most common variant of analog neural networks. The physical realization is implemented in analog VLSI, often as field-effect transistors operated in weak inversion. Such devices can be modelled as translinear circuits, a technique described by Barrie Gilbert in several papers around the mid-1970s, and in particular in his Translinear Circuits from 1981.[7][8] With this method, circuits can be analyzed as a set of well-defined functions in steady state, and such circuits can be assembled into complex networks.
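The central analytical tool here is the translinear principle: because such devices obey an exponential current-voltage law, Kirchhoff's voltage law around a closed loop of junctions becomes a product relation between currents. A compact statement, assuming matched devices and equal numbers of clockwise- and counterclockwise-facing junctions, is:

```latex
% Exponential device law and the resulting translinear loop equation
% (matched devices, equal numbers of CW and CCW junctions assumed).
I_k = I_S \exp\!\left(\frac{V_{BE,k}}{V_T}\right), \qquad
\sum_{k\in\mathrm{CW}} V_{BE,k} = \sum_{k\in\mathrm{CCW}} V_{BE,k}
\;\Longrightarrow\;
\prod_{k\in\mathrm{CW}} I_k \;=\; \prod_{k\in\mathrm{CCW}} I_k .
```

Product relations of this kind are what allow multiplications, normalizations, and other well-defined steady-state functions to be composed into larger analog networks.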
Alex Nugent describes a physical neural network as one or more nonlinear neuron-like nodes used to sum signals, together with nanoconnections formed from nanoparticles, nanowires, or nanotubes that determine the signal strength input to the nodes.[9] Alignment or self-assembly of the nanoconnections is determined by the history of the applied electric field, performing a function analogous to neural synapses. Numerous applications[10] for such physical neural networks are possible. For example, a temporal summation device[11] can be composed of one or more nanoconnections having an input and an output, wherein an input signal provided to the input causes one or more of the nanoconnections to experience an increase in connection strength over time. Another example of a physical neural network is taught by U.S. Patent No. 7,039,619,[12] entitled "Utilized nanotechnology apparatus using a neural network, a solution and a connection gap," which issued to Alex Nugent by the U.S. Patent & Trademark Office on May 2, 2006.[13]
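The temporal summation described above can be sketched with simple assumed dynamics: each input pulse increments the connection strength, and the strength relaxes toward zero between pulses, so closely spaced inputs accumulate while sparse inputs do not. The rates below are illustrative and not taken from the patents.

```python
# Sketch under assumed dynamics: a nanoconnection whose strength grows with
# each input pulse and decays otherwise, giving temporal summation.

def temporal_summation(pulse_train, gain=0.2, decay=0.05):
    """Return the connection strength over time for a binary pulse train."""
    strength, history = 0.0, []
    for pulse in pulse_train:
        strength += gain * pulse          # each pulse strengthens the connection
        strength *= (1.0 - decay)         # slow relaxation between pulses
        history.append(round(strength, 3))
    return history

# Closely spaced pulses accumulate; widely spaced pulses barely add up.
print(temporal_summation([1, 1, 1, 1, 0, 0, 0, 0]))
print(temporal_summation([1, 0, 0, 0, 1, 0, 0, 0]))
```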
A further application of physical neural networks is shown in U.S. Patent No. 7,412,428, entitled "Application of Hebbian and anti-Hebbian learning to nanotechnology-based physical neural networks," which issued on August 12, 2008.[14]
Nugent and Molter have shown that universal computing and general-purpose machine learning are possible from operations available through simple memristive circuits implementing the AHaH (Anti-Hebbian and Hebbian) plasticity rule.[15] More recently, it has been argued that complex networks of purely memristive circuits can also serve as neural networks.[16][17]
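For orientation, the generic form of Hebbian and anti-Hebbian weight updates referenced in this work can be sketched as follows; this is a textbook-style illustration with an assumed learning rate, not the exact rule from the patent or the AHaH papers.

```python
# Generic sketch: a Hebbian update strengthens a weight when pre- and
# post-synaptic activity agree; an anti-Hebbian update does the opposite.
# Learning rate and activity encoding are assumptions.

def hebbian_update(w, pre, post, lr=0.01):
    return w + lr * pre * post   # correlated activity -> stronger weight

def anti_hebbian_update(w, pre, post, lr=0.01):
    return w - lr * pre * post   # correlated activity -> weaker weight

w = 0.5
w = hebbian_update(w, pre=1.0, post=1.0)         # 0.51
w = anti_hebbian_update(w, pre=1.0, post=-1.0)   # 0.52
print(round(w, 3))
```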
In 2002, Stanford Ovshinsky described an analog neural computing medium in which phase-change material has the ability to cumulatively respond to multiple input signals.[18] An electrical alteration of the resistance of the phase-change material is used to control the weighting of the input signals.
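A cumulative response of this kind can be sketched as a conductance that rises a little with every partial-crystallization pulse until it saturates; the parameters below are assumptions for illustration, not values from Ovshinsky's description.

```python
# Illustrative sketch: a phase-change weight whose conductance accumulates
# with repeated programming pulses and saturates near g_max. Values assumed.

def apply_pulses(n_pulses, g_min=0.05, g_max=1.0, step=0.1):
    """Cumulative conductance after n partial-crystallization pulses."""
    g = g_min
    for _ in range(n_pulses):
        g = min(g_max, g + step * (g_max - g))   # diminishing increments toward g_max
    return g

for n in (1, 5, 20):
    print(n, round(apply_pulses(n), 3))   # more pulses -> higher cumulative weight
```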
Greg Snider of HP Labs describes a system of cortical computing with memristive nanodevices.[19] The memristors (memory resistors) are implemented by thin-film materials in which the resistance is electrically tuned via the transport of ions or oxygen vacancies within the film. DARPA's SyNAPSE project has funded IBM Research and HP Labs, in collaboration with the Boston University Department of Cognitive and Neural Systems (CNS), to develop neuromorphic architectures that may be based on memristive systems.[20]
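A commonly used mathematical picture of such thin-film devices is the linear ion-drift memristor model associated with the HP Labs device, in which a doped-region fraction x drifts in proportion to the current and sets the memristance. The sketch below uses this model with illustrative parameter values, not the published device constants.

```python
import math

# Sketch of the linear ion-drift memristor model (doped-region fraction x in [0, 1]):
#   dx/dt = mu_v * r_on / d**2 * i(t),   M = r_on * x + r_off * (1 - x)
# Parameter values are illustrative assumptions.

def simulate_memristor(current_fn, t_end=1.0, dt=1e-4,
                       r_on=100.0, r_off=16e3, d=10e-9, mu_v=1e-14, x0=0.1):
    """Integrate the state x with forward Euler and return the final memristance."""
    x, t = x0, 0.0
    while t < t_end:
        x += mu_v * r_on / d**2 * current_fn(t) * dt   # state drifts with charge
        x = min(1.0, max(0.0, x))                      # state stays within the film
        t += dt
    return r_on * x + r_off * (1.0 - x)

# A sinusoidal drive moves the state back and forth; a DC drive pushes it to a rail.
print(round(simulate_memristor(lambda t: 1e-4 * math.sin(2 * math.pi * t)), 1))
print(round(simulate_memristor(lambda t: 1e-4), 1))
```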
In 2022, researchers reported the development of nanoscale brain-inspired artificial synapses that use protons (H+) as the ion, for 'analog deep learning'.[21][22]