I had always thought of intelligence as being about learning, so this concept struck me. Note the mention of improbable events and of keeping models correct over time, both recurring concerns in such studies. Technical.
'Transfer Learning' Jump-Starts New AI Projects, in InfoWorld by James Kobielus
Machine learning, once implemented, tends to be specific to the data and requirements of the task at hand. Transfer learning is the act of abstracting and reusing those smarts.
"Abstracting and reusing knowledge gleaned from a machine-learning application in other, newer apps, or "transfer learning," supplements the other learning methods that constitute the backbone of most data science practices. Among the technique's practical uses is accelerating modeling productivity: when prior work can be reused without extensive revision, time to insight shrinks. Another application of transfer learning helps data scientists produce machine-learning models that exploit relevant training data from prior modeling projects.
This technique is particularly appropriate for projects whose prior training data can easily become obsolete, a frequent problem in dynamic problem domains. A third area of data science in which transfer learning could yield benefits is risk mitigation: it can help data scientists leverage subsets of training data and feature models from related domains when the underlying conditions of the modeled phenomenon have changed radically.
This can help researchers mitigate the risk in machine-learning-driven predictions in any problem domain vulnerable to extremely improbable events. Transfer learning is also critical to data scientists' efforts to create "master learning algorithms" that automatically acquire and apply fresh contextual knowledge via deep neural networks and other forms of artificial intelligence. ... "
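To make the reuse idea in the excerpt concrete, here is a minimal transfer-learning sketch in Python, assuming PyTorch and torchvision are available. It reuses an ImageNet-pretrained ResNet as a frozen feature extractor and trains only a new classification head for a hypothetical 10-class target task; the model choice, class count, and hyperparameters are my own illustrative assumptions, not anything specified in the article.

import torch
import torch.nn as nn
from torchvision import models

NUM_TARGET_CLASSES = 10  # hypothetical new task, far smaller than ImageNet

# 1. Load a model trained on a prior, data-rich task (ImageNet).
#    (Older torchvision versions use pretrained=True instead of weights=...)
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# 2. Freeze the transferred feature extractor so the prior knowledge is kept.
for param in backbone.parameters():
    param.requires_grad = False

# 3. Replace the task-specific head; only this part is trained from scratch.
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_TARGET_CLASSES)

# 4. Fine-tune just the new head on the (typically small) target dataset.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One optimization step on a batch from the new task's data."""
    optimizer.zero_grad()
    loss = loss_fn(backbone(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

Because the backbone is frozen, the new task borrows the representations learned on the prior project and needs only a modest amount of its own labeled data, which is the productivity and risk-mitigation point the article is making.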