Sinusoidal and monotonic transfer functions: Implications for VC dimension

Gaynier, R.J. and Downs, T. (1995) Sinusoidal and monotonic transfer functions: Implications for VC dimension. Neural Networks, 8(6): 901-904. doi:10.1016/0893-6080(95)00040-7


Author Gaynier R.J.
Downs T.
Title Sinusoidal and monotonic transfer functions: Implications for VC dimension
Journal name Neural Networks
ISSN 0893-6080
Publication date 1995
Sub-type Article (original research)
DOI 10.1016/0893-6080(95)00040-7
Volume 8
Issue 6
Start page 901
End page 904
Total pages 4
Subject 1702 Cognitive Sciences
2805 Cognitive Neuroscience
2800 Neuroscience
Abstract It is sometimes stated that neural networks that employ units with nonmonotonic transfer functions are more difficult to train than networks that use monotonic transfer functions, because the former can be expected to have more local minima. That this is often true stems from the fact that networks using monotonic transfer functions tend to have a smaller VC (Vapnik-Chervonenkis) dimension than networks using nonmonotonic transfer functions. But the VC dimension of a network is not influenced solely by the nature of the transfer function. We give an example of a network with an infinite VC dimension and demonstrate that it is equivalent to a network which contains only monotonic transfer functions. Thus we show that monotonicity alone is not a sufficient criterion to avoid large VC dimension.
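The specific network used in the paper is not reproduced in this record. As background for the abstract's claim, the sketch below illustrates the standard textbook fact that even a single unit computing sign(sin(wx)) has infinite VC dimension: the points x_i = 2^-i can be given any labelling by choosing w = pi * (1 + sum_i (1 - y_i) * 2^i). The helper names (shatter_coefficient, unit_output) are illustrative and are not taken from the paper.

```python
import math
import random


def shatter_coefficient(labels):
    """Integer N such that w = pi * N realises the labelling y_1..y_m
    of the points x_i = 2**(-i): N = 1 + sum_i (1 - y_i) * 2**i."""
    return 1 + sum((1 - y) << i for i, y in enumerate(labels, start=1))


def unit_output(n, j):
    """Label produced by the unit x -> sign(sin(pi * n * x)) at x_j = 2**(-j).
    The argument is reduced mod 2*pi by exact integer arithmetic, so the
    check stays numerically sound even when n is very large."""
    reduced = (n % (1 << (j + 1))) / float(1 << j)  # n * 2**(-j) mod 2, exactly
    return 1 if math.sin(math.pi * reduced) > 0 else 0


if __name__ == "__main__":
    m = 40  # any m works: the class {x -> sign(sin(wx))} has infinite VC dimension
    labels = [random.randint(0, 1) for _ in range(m)]
    n = shatter_coefficient(labels)
    recovered = [unit_output(n, j) for j in range(1, m + 1)]
    assert recovered == labels, "shattering failed"
    print(f"{m} arbitrary labels realised on x_i = 2^-i by sign(sin(w x)) with w = pi * {n}")
```

The reduction of n * 2^-j modulo 2 is done on integers before multiplying by pi; a naive floating-point evaluation of sin(w * x_j) would lose the sign for larger m because w grows exponentially with the number of points.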
Keyword Generalisation
Monotonic
Sinusoidal
Transfer function
VC dimension

Document type: Journal Article
Sub-type: Article (original research)
Collection: Scopus Import