  • 1
    Electronic Resource
    Oxford, UK : Blackwell Publishing Ltd
    Computational Intelligence 8 (1992), p. 0
    ISSN: 1467-8640
    Source: Blackwell Publishing Journal Backfiles 1879-2005
    Topics: Computer Science
    Notes: Much research in machine learning has been focused on the problem of symbol-level learning (SLL), or learning to improve the performance of a program given examples of its behavior on typical inputs. A common approach to symbol-level learning is to use some sort of mechanism for saving and later reusing the solution paths used to solve previous search problems. Examples of such mechanisms are macro-operator learning, explanation-based learning, and chunking. However, experimental evidence that these mechanisms actually improve performance is inconclusive. This paper presents a formal framework for analysis of symbol-level learning programs, and then uses this framework to investigate a series of solution-path caching mechanisms which provably improve performance. The analysis of these mechanisms is illuminating in many respects; in particular, in order to obtain positive results, it is necessary to use a novel representation for a set of solution paths, and also to apply certain unusual optimizations to a set of solution paths. Several of the predictions made by the model have been confirmed by recently published experiments.
    Type of Medium: Electronic Resource
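
The Notes field above describes solution-path caching: saving the path found for one search problem and replaying it when the same problem recurs. Below is a minimal sketch of that idea, assuming a hypothetical SolutionPathCache wrapper around a toy breadth-first solver; none of these names come from the paper, and the paper's own (novel) representation of solution-path sets is not reproduced here.

```python
from collections import deque
from typing import Callable, Dict, Hashable, List, Optional, Tuple

State = Hashable               # a search state, assumed hashable
Path = List[str]               # a solution path: a sequence of operator names
Solver = Callable[[State, State], Optional[Path]]


class SolutionPathCache:
    """Memoize solution paths: replay a stored path instead of re-searching."""

    def __init__(self, solver: Solver):
        self.solver = solver                       # base search procedure
        self.paths: Dict[Tuple[State, State], Path] = {}

    def solve(self, start: State, goal: State) -> Optional[Path]:
        key = (start, goal)
        if key in self.paths:                      # cache hit: replay the path
            return self.paths[key]
        path = self.solver(start, goal)            # cache miss: search as usual
        if path is not None:
            self.paths[key] = path                 # save the path for reuse
        return path


def bfs_solver(graph: Dict[State, List[Tuple[str, State]]]) -> Solver:
    """Toy breadth-first solver over an explicit (operator, successor) graph."""
    def solve(start: State, goal: State) -> Optional[Path]:
        frontier = deque([(start, [])])
        seen = {start}
        while frontier:
            state, path = frontier.popleft()
            if state == goal:
                return path
            for op, nxt in graph.get(state, []):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, path + [op]))
        return None
    return solve


# Usage: the second identical query is answered from the cache, not by search.
graph = {"A": [("step1", "B")], "B": [("step2", "C")]}
cached = SolutionPathCache(bfs_solver(graph))
print(cached.solve("A", "C"))   # searches: ['step1', 'step2']
print(cached.solve("A", "C"))   # replayed from the cache
```

The second identical query returns the stored path without searching; the paper's analysis concerns when such reuse provably improves, rather than degrades, performance.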