Order Out of Chaos: Order in Neural Networks
This chapter examines order effects in neural networks, a widely used modeling approach. It considers two networks in detail: the Adaptive Resonance Theory (ART) architecture and Jeff Elman's recurrent networks. The ART model shows that differences in presentation order can produce roughly a 25% difference in recognition rates. Elman's recurrent network shows that, with the wrong order, a task may not be learnable at all. The chapter also discusses why these effects arise, which is important for understanding the impact and claims of computational models.
Keywords: order effects, neural networks, Adaptive Resonance Theory, recurrent networks, Jeff Elman