Google's Summarization Performance: Pegasus
Monday, December 23, 2019

A kind of AI. Summarization is a useful and powerful concept. But consider that summarization also exists in a context. Its output is useful only in a particular context, one that depends on the requester as well, and that context can change over time, with location, with the requester's current goals, and with the metadata involved in its construction. Still a very useful step forward.
Google Brain’s AI achieves state-of-the-art text summarization performance
Kyle Wiggers in Venturebeat
Summarizing text is a task at which machine learning algorithms are improving, as evidenced by a recent paper published by Microsoft. That's good news: automatic summarization systems promise to cut down on the amount of message reading done by enterprise workers, which one survey estimates at 2.6 hours each day.
Not to be outdone, a Google Brain and Imperial College London team built a system called PEGASUS (Pre-training with Extracted Gap-sentences for Abstractive Summarization Sequence-to-sequence) that leverages Google's Transformer architecture combined with pre-training objectives tailored for abstractive text generation. They say it achieves state-of-the-art results on 12 summarization tasks spanning news, science, stories, instructions, emails, patents, and legislative bills, and that it shows "surprising" performance on low-resource summarization, surpassing previous top results on six data sets with only 1,000 examples. ... "
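For a concrete sense of the gap-sentence idea in the acronym, here is a minimal sketch in Python. It is not the paper's implementation: the sentence splitter is naive, the selection heuristic (longest sentences first) is an illustrative stand-in for the importance scoring the authors use, and the [MASK1] token name is an assumption.

import re

MASK = "[MASK1]"

def split_sentences(text: str) -> list[str]:
    # Naive splitter: break on sentence-ending punctuation plus whitespace.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def gap_sentence_pair(text: str, gap_ratio: float = 0.3) -> tuple[str, str]:
    """Build one (input, target) pre-training pair.

    A fraction of sentences is removed from the document and replaced
    with a mask token; the model's target is to generate the removed
    sentences, which resembles producing an abstractive summary.
    """
    sentences = split_sentences(text)
    n_gaps = max(1, int(len(sentences) * gap_ratio))
    # Toy heuristic: treat the longest sentences as the most "important".
    selected = set(sorted(range(len(sentences)),
                          key=lambda i: len(sentences[i]),
                          reverse=True)[:n_gaps])
    source = " ".join(MASK if i in selected else s
                      for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(selected))
    return source, target

doc = ("PEGASUS is a summarization model. It was built by Google Brain "
       "and Imperial College London. The pre-training objective removes "
       "whole sentences. The model learns to generate them back.")
src, tgt = gap_sentence_pair(doc)
print("INPUT: ", src)
print("TARGET:", tgt)

In the paper itself the gap sentences are scored by how well they summarize the rest of the document rather than by length, which is what tailors the pre-training objective to downstream summarization.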
Labels: Context, Google, Metadata, Pegasus, Summarization