The process of conducting scientific research is currently separate from the process of publishing, but it doesn't have to be that way. 'As a scientist', commented Bourne, 'I want an interaction with a publisher that begins at the start of the process, not towards the end.' He is looking for data interoperability, integrated rich media that improves comprehension, and semantic linking of data that can in itself lead to new knowledge discovery.
The theme was developed by Cameron Neylon of the Science and Technology Facilities Council, a biophysicist who researches and writes on the interface of web technology with science. He bluntly summarized the rapid changes of the last few years: 'I am the last generation to remember "the library" as the place you go to retrieve information, the last to think of journals as printed entities, and the last to physically search (for a resource)'. When information dissemination was, in effect, a one-to-many process, a gatekeeper was required to manage the content flow. Now, however, information comes from all directions. And, Neylon argues, this is a good thing: Ranganathan's third law, 'Every book its reader', is technically feasible for the first time.
Neylon disagrees with Clay Shirky that the challenge we face when dealing with the information firehose is one of 'filter failure'. As he sees it, the problem is a discovery deficit. Information needs to be enabled, not blocked, and librarians and publishers need to sell (or provide) services, not content. The key is to enable discovery: 'this is the biggest opportunity we've ever had to take information and do something useful with it'.
Further coverage of the remainder of UKSG 2011 will appear on www.infotoday.eu