The article about this course is dated, but it is worth a look for the direction of this work. I had always seen this as the most immediately useful aspect of the semantic web:
" ... Linked Data describes a method of publishing structured data, so that it can be interlinked and become more useful. It builds upon standard Web technologies, such as HTTP and URIs - but rather than using them to serve web pages for human readers, it extends them to share information in a way that can be read automatically by computers. This enables data from different sources to be connected and queried.... "
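The mechanism in the quote can be sketched as a toy triple store: items are identified by URIs, statements are (subject, predicate, object) triples, and because independent sources reuse the same URIs, their data can be merged and queried together. This is only an illustration with made-up URIs and data; real Linked Data is published as RDF and queried with SPARQL.

```python
# Triples published by one hypothetical source (a person directory):
source_a = {
    ("http://example.org/person/alice", "http://xmlns.com/foaf/0.1/name", "Alice"),
    ("http://example.org/person/alice", "http://xmlns.com/foaf/0.1/based_near",
     "http://example.org/place/london"),
}

# Triples published by a second, independent source (a gazetteer):
source_b = {
    ("http://example.org/place/london",
     "http://www.w3.org/2000/01/rdf-schema#label", "London"),
    ("http://example.org/place/london",
     "http://example.org/ont/population", 8000000),
}

# Both sources use the same URI for London, so merging is just set union:
graph = source_a | source_b

def query(graph, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [
        (ts, tp, to)
        for (ts, tp, to) in graph
        if (s is None or ts == s)
        and (p is None or tp == p)
        and (o is None or to == o)
    ]

# A join across the two sources: where is Alice based, and what is
# that place called?
(_, _, place) = query(graph, s="http://example.org/person/alice",
                      p="http://xmlns.com/foaf/0.1/based_near")[0]
labels = query(graph, s=place,
               p="http://www.w3.org/2000/01/rdf-schema#label")
print(labels[0][2])  # -> London
```

The point of the exercise is the join: neither source alone knows both Alice's location and that location's label, but shared URIs let a consumer connect them automatically, which is exactly the "connected and queried" property the quote describes.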
See also the links in the comments to the course article, which provide more background and details about the course. I am sure there was follow-up; if you have any information about it, please pass it along.
Tuesday, January 04, 2011
1 comment:
Over 2010 the adoption of linked data best practices ramped up dramatically, particularly in the wake of the Data.gov (US) and Data.gov.uk (UK) government data transparency initiatives. And with the advent of Infochimps, MSFT Azure DataMarket, Talis Kasabi and others, widespread transactional consumption of data began to emerge as a viable business model.
In 2011 a convergence will take place: high-quality linked data from a wide variety of sources will become available and produce revenue through keyed APIs. Web app developers will augment or replace current open data sources with authoritative sources that they pay to reuse; they will generate revenue through advertising or by charging their users.
A primary driver for producers to get data online will be deep analytics in addition to, and in some cases instead of, direct revenue generation. The linked data model coupled with analytics will give providers deep insight into how their data is being consumed.