
Some Notes on Perceptron Learning

BUDINICH, MARCO
1993
  • journal article

Journal
JOURNAL OF PHYSICS. A, MATHEMATICAL AND THEORETICAL
Abstract
We extend the geometrical approach to the Perceptron and show that, given n examples, learning is of maximal difficulty when the number of inputs d is such that n = 5d. We then present a new Perceptron algorithm that takes advantage of the peculiarities of the cost function. In our tests it is more than two times faster than the standard algorithm. More importantly, it has no fixed parameters, such as the usual learning constant η, but adapts them to the cost function. We show that there exists an optimal choice for β, the steepness of the transfer function. We also present a brief systematic study of the parameters η and β of the standard Perceptron algorithm.
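The abstract refers to the standard continuous-Perceptron algorithm with learning constant η and transfer-function steepness β. As a minimal sketch of that baseline (not the paper's new adaptive algorithm): the quadratic cost, sigmoid transfer function, toy data, and all function names below are illustrative assumptions; only the roles of η and β come from the abstract.

```python
import numpy as np

def sigmoid(x, beta):
    # Transfer function with steepness beta (the abstract's β).
    return 1.0 / (1.0 + np.exp(-beta * x))

def train_perceptron(X, y, eta=0.1, beta=1.0, epochs=100):
    """Gradient descent on a single continuous neuron.

    eta  -- fixed learning constant (the abstract's η)
    beta -- steepness of the transfer function (the abstract's β)
    The quadratic cost and initialization are illustrative choices.
    """
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])
    for _ in range(epochs):
        out = sigmoid(X @ w, beta)
        err = out - y
        # d/dw of the quadratic cost: chain rule through the sigmoid,
        # whose derivative w.r.t. the net input is beta * s * (1 - s).
        grad = X.T @ (err * out * (1.0 - out) * beta)
        w -= eta * grad
    return w

# Toy usage: learn logical AND, with a constant bias input in column 3.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)
w = train_perceptron(X, y, eta=0.5, beta=2.0, epochs=2000)
pred = (sigmoid(X @ w, beta=2.0) > 0.5).astype(float)
```

With η and β fixed by hand like this, their values must be tuned per problem, which is the limitation the abstract says the new algorithm removes by adapting the parameters to the cost function.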
Archive
http://hdl.handle.net/11368/2558394
Rights
metadata only access
Subjects
  • Perceptron learning

  • continuous neuron

  • back-propagation

Views
1
Acquisition date
Apr 19, 2024