Monday 30 November 2009

HyperCard is Dead. Long Live HyperCard!

I cut my professional teeth on HyperCard, writing VideoDisk XCMDs to allow a HyperCard stack to control a video presentation, back in 1986. Although I was a UNIX system programmer (cut me and I bleed regexp), it was Apple's HyperCard which best let me manipulate data for users.

And now it's back in the form of TileStack, a kind of re-imagination of HyperCard for a Web 2.0 environment. There have been other contenders (e.g. Runtime Revolution) but they didn't have proper integration with the Web. Now I can write stacks (in a HyperTalk-like language) that use AJAX Web Services - XML, JSON, the lot. I'm as happy as Larry!

The following embedded stack uses an idle handler to periodically make a Flickr API call and then set the icon of button n to media of item 1 of the items of JSONdata. I'd forgotten how simple this stuff was - come back Bill Atkinson, the Web needs you!
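For readers who don't speak HyperTalk, the "media of item 1 of the JSONdata" step can be sketched in Python. The sample response and the photo-URL pattern below are my own illustrative assumptions about what a flickr.photos.search call returns (format=json&nojsoncallback=1), not code taken from the stack:

```python
import json

# Hypothetical sample of a flickr.photos.search JSON response;
# the field values here are made up for illustration.
SAMPLE = json.loads("""
{"photos": {"photo": [
  {"id": "123", "secret": "abc", "server": "45", "farm": 6, "title": "cat"}
]}}
""")

def first_photo_url(data):
    """Build the image URL for item 1 of the photo list - the Python
    equivalent of the stack's 'media of item 1' step."""
    p = data["photos"]["photo"][0]
    return "http://farm{farm}.static.flickr.com/{server}/{id}_{secret}.jpg".format(**p)

print(first_photo_url(SAMPLE))
```

An idle handler in the stack would simply repeat this fetch-and-parse cycle on a timer and point the button's icon at the resulting URL.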

Monday 23 November 2009

Evaluating Expertise Promotion

I thought I'd look further at the issue of effectively communicating research impact and expertise. The University of Southampton Communications team issued a press release on the subject of "Brain-Computer Interfacing" earlier this term, presumably because they believe that we as an institution are good at it and that we have something to promote.

I thought I'd take a look at how effective our communication on the subject is, and as you can probably guess, this equates to asking how high up the Google rankings we feature compared with other universities. This is a pretty good measure of the effectiveness of our research expertise promotion, because anyone who wants to find an expert on a topic is going to start by looking on Google. I knocked together some scripts to look at how our institution fares in the competition for Google eyeballs (basic web analytics).

The screendump on the left shows the results of a Google query for "Brain-Computer Interfacing". All the results from universities are coloured red, all those from publishers green, those from news sources, magazines and blogs blue, and unclassified resources grey. You can quickly see that there are a couple of university-contributed results right at the top, and then an increasing number further down. Those results with a silver background are from Southampton (yay!).
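The classification step boils down to mapping each result's hostname to a category. Here is a minimal Python sketch; the domain lists are illustrative stand-ins, not the ones my script actually uses:

```python
# Illustrative domain lists - a real classifier would need far more entries.
UNIVERSITY_SUFFIXES = (".ac.uk", ".edu")
PUBLISHER_HOSTS = {"www.sciencedirect.com", "link.springer.com"}
NEWS_HOSTS = {"news.bbc.co.uk", "www.guardian.co.uk"}

def classify(hostname):
    """Map a result's hostname to the colour category used in the screendump."""
    if hostname.endswith(UNIVERSITY_SUFFIXES) or ".uni-" in hostname:
        return "university"      # red (silver background if it's ours)
    if hostname in PUBLISHER_HOSTS:
        return "publisher"       # green
    if hostname in NEWS_HOSTS:
        return "news"            # blue
    return "unclassified"        # grey

print(classify("eprints.soton.ac.uk"))
```

The ".uni-" test is a crude catch for German university hostnames, which don't share a common suffix the way .ac.uk and .edu domains do.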

Clearly, Google looks for resources which are about "Brain-Computer Interfacing" and then ranks them according to their "impact" or "importance". Exactly how it does that (PageRank or Black Magic) isn't really my concern here; however it happens, Google controls the order in which these results are presented, and the effect is that if you appear near the top you are more likely to get visited. The script that I use to generate these annotated pages actually gets 500 results, but most people get 10 results per page and don't bother to ask for more than a single page.
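Collecting 500 results just means walking the result set a page at a time. A simplified Python sketch of the URL construction, assuming Google's standard q/start/num query parameters (a real scraper would also need to set a User-Agent and throttle its requests):

```python
from urllib.parse import urlencode

def result_page_url(query, start, per_page=100):
    """URL for one page of Google results; 'start' offsets into the
    result set, so five requests of 100 cover 500 results."""
    params = {"q": query, "start": start, "num": per_page}
    return "http://www.google.com/search?" + urlencode(params)

# One URL per page of 100 results, covering positions 0-499.
urls = [result_page_url('"Brain-Computer Interfacing"', s) for s in range(0, 500, 100)]
```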

To better compare institutions' effectiveness at promoting their research expertise, I distilled this page to a spreadsheet. Once it's in a data form, then the sky's the limit and I can visualise it in different ways (such as this map of global expertise in the area).
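The distilling step is straightforward with Python's csv module: one row per classified result. The two records below are illustrative stand-ins for the scraped data:

```python
import csv

# Illustrative stand-ins for the real scraped and classified results.
results = [
    {"rank": 1, "institution": "Oxford", "country": "UK", "server": "www.ox.ac.uk"},
    {"rank": 2, "institution": "Groningen", "country": "Netherlands", "server": "www.rug.nl"},
]

# Write one spreadsheet row per result, with a header row.
with open("expertise.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["rank", "institution", "country", "server"])
    writer.writeheader()
    writer.writerows(results)
```

Once the results are in CSV form they can be loaded into any spreadsheet or mapping tool.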

The spreadsheet (reproduced below as a table of institutions) shows me that Southampton comes out rather well in the area - we are the third institution named, after Oxford and Groningen, and slightly further down the list come a couple of papers from our Institutional Repository. This seems to be a good result - we can claim to be doing alright on this topic. But what about all the other areas in which we think we have some expertise? There are hundreds, even thousands, of keywords that we need to analyse to see the effectiveness of our communications overall. I think it's time to scale up my scripts to get a bigger picture!

Institution | Country | Server | Title
Groningen (RuG) | Netherlands | ? | A Brain-Computer Interfacing Project at RuG
Southampton | UK | ? | Southampton BCI Research Programme / Brain Computer Interfacing and Assistive Technologies Team
Lausanne | Switzerland | infoscience.epfl.ch | Anticipation Based Brain-Computer Interfacing (aBCI) - Infoscience
U Twente | Netherlands | eprints.eemcs.utwente.nl | EEMCS EPrints Service - 11091 Brain-Computer Interfacing for ...
Carnegie Mellon | US | www-2.cs.cmu.edu | Classifying Single Trial EEG: Towards Brain Computer Interfacing
Rhode Island | US | www.ele.uri.edu | Brain - Computer Interfacing Mason P. Wilson IV URI department of ...
Colorado | US | www.cs.colostate.edu | Temporal and Spatial Complexity measures for EEG-based Brain ...
UC San | ? | ? | "Gerwin Schalk, Ph.D."
Washington | US | www.cs.washington.edu | Dynamic Bayesian Networks for Brain-Computer Interfaces
Uni Saarland | Germany | psydok.sulb.uni-saarland.de | PsyDok - Brain Computer Interfaces for Communication in Paralysis ...
U Freiburg | Germany | www.bmi.uni-freiburg.de | BMII: Ferran Galan
Kansas City | US | www.csee.umkc.edu | CIBIT Laboratory
Brown | US | www.cs.brown.edu | Michael J. Black: Neural Prosthesis Research Projects
U fur Medizinische Psychologie und Verhaltensneurobiologie | ? | ? | Speakers
North Florida | US | www.unf.edu | UNF Webpage "Presence: Research Encompassing Sensory Enhancement, Neuroscience ..."
Washington | US | www.cs.washington.edu | Pradeep Shenoy
Carnegie Mellon | US | www.cs.cmu.edu | Automated EEG feature selection for brain computer interfaces ...
Tufts | US | www.cs.tufts.edu | COMP 250-BCI Syllabus
New Jersey | US | embc2006.njit.edu | Blind Source Separation in single-channel EEG analysis: An ...
Bielefeld | Germany | ni.www.techfak.uni-bielefeld.de | Publications / Neuroinformatics Group
Washington | US | ese.wustl.edu | EIT - Sample Web Page Template
Colorado | US | www.cs.colostate.edu | EEG Pattern Analysis Group @ Columbia University
Georgia State | US | www.cis.gsu.edu | EEG-based communication: a pattern recognition approach ...
TU Graz | Austria | bci.tugraz.at | Publications - Laboratory of Brain-Computer Interfaces
Northeastern | US | nuweb1.neu.edu | News
UC San | ? | ? | "IN RECENT years, brain-computer interface (BCI) systems"
Carnegie Mellon | US | www.cs.cmu.edu | Linear and nonlinear methods for brain-computer interfaces ...
New Jersey | US | embc2006.njit.edu | On-line Differentiation Of Neuroelectric Activities: Algorithms ...
Bielefeld | Germany | bieson.ub.uni-bielefeld.de | BieSOn - P300-based brain-computer interfacing
TU Graz | Austria | hci.tugraz.at | Publication list of Alois Schloegl
Uni Saarland | Germany | psydok.sulb.uni-saarland.de | Brain Computer Interfaces for Communication in Paralysis - Eingang ...
Brown | US | www.cs.brown.edu | Michael J. Black: Neural Prosthesis Research Projects
Colorado | US | www.math.colostate.edu | Curriculum Vitae: Michael Kirby (Professor) Co-Director Pattern ...
TU | ? | ? | "Seminar Computational Intelligence E, SS 2007"
North Carolina | US | catalog.lib.ncsu.edu | NCSU Libraries - Toward brain-computer interfacing / edited by ...

PS It did occur to me after I had published an earlier draft of this post that I should also have checked the results for "Brain-Computer Interface". It turns out we come further down the league table for this variation on the phrase (7th institutional position rather than 3rd), but this phrase is much less commonly used. As long as potential funders, students and media researchers know which phrase to use, we should be alright. Otherwise, we will have to become a bit more canny about our use of synonyms. (I'm not sure whether it's significant that an eBay sponsored link appears only on "Brain-Computer Interface"!)

Monday 16 November 2009

Life is a Conference (Oh Chum)

Since EPrints has now celebrated its 10th birthday** I have been chewing over where this decade of repository activity is leading us. Bigger repositories? More repositories? Faster repositories? Better repositories? Well, yes to all the above, but collecting, curating and sharing data/documents seems to be only part of the picture.

At the same time, I have become a director of the Web Science Doctoral Training Centre at the University of Southampton. Its five year mission (no, really, it's there in the EPSRC grant letter) is to build up a cohort of interdisciplinary scientists who can understand the impact of the Web on our society - its economic activity, political exchange, social interactions, scientific knowledge transfer - and predict the future benefits and downsides of different kinds of Web technology.

For the last few days I have been trying to pull some of these pieces together: the Web, the Social Web, the Data Web, repositories, open access and open science. In recent years, the community has built a Web infrastructure for e-research that handles research outputs, research data, research process and workflows. This infrastructure has many desirable properties - it is dynamic and persistent and supports managed curation and auditable provenance.

One thing I believe is missing from the picture at the moment is research people, research careers and research meetings - researchers engaged in human-oriented research activities, rather than research artefacts and research experiments. Research, after all, isn't just about individual scientists turning dials on a piece of laboratory equipment, but about many individuals debating and evaluating their ideas in scientific discourse and scientific debate. Part of that discourse and debate happens through journal publications, but much of it happens in conferences and workshops, through face-to-face interactions. The proceedings of these meetings become part of the literature, and so part of the personal, dynamic, face-to-face engagement is captured for posterity; but the questions and answers, the ad hoc discussions, birds-of-a-feather sessions and arguments over dinner - all the normal human interactions that generate inspiration as well as larger-scale knowledge transfer - have not been captured.

Except that they are starting to be exposed beyond the boundaries of the conference meetings by microblogging services. The low barrier to communication afforded by a Twitter client on a smart phone means that ideas, controversies and emerging consensuses are broadcast beyond the immediately present delegates in a meeting. These communications are not edited, published and catalogued for posterity (and are only searchable for a short time), but they do (potentially) increase the efficiency of the meeting.

A decade ago, the only way to facilitate social networking in the research world was by face-to-face meetings: flying hundreds or thousands of people half-way across the world for a week in order to be able to talk to each other (or perhaps even to listen to each other). This is still the way much research business is conducted, despite our being more aware of the environmental consequences of our conferences. There must be a better way.

Twitter, blogs, web, phone, email, papers, workshops, meetings, projects, texts (SMSes) are all ways of mediating engagement between knowledge generating people. In point of fact, conferences are not very efficient engagement mechanisms - most sessions are full of people doing email. Virtual conferences (whether held in Second Life or a rather more prosaic video/audio conferencing environment) also have shortcomings in fostering participation and engagement.

We need to redesign our social interactions to make them more pro-human, pro-diary, pro-budget and pro-environment. We need to use technology not to ape our large-scale face-to-face meetings (using enormous video walls of dozens and hundreds of virtual delegates), but to support us as we try to achieve scientific debate and argument with loose synchronisation across a dynamic community of individuals spread over a number of years.

It's ironic that a university is supposed to be a community of researchers, but none of us know what our neighbours do until we accidentally meet them at a conference on the other side of the world. This is no longer acceptable as the importance of interdisciplinary research increases! Let's instead use technology to improve the social transfer of our knowledge capital with our international research community and our institutional research community too!

I believe that's where our infrastructure needs to grow - supporting our research engagement as well as managing our research artefacts.

** Technically, it is 10 years since Stevan proposed EPrints at the first OAI meeting in Santa Fe at the end of October 1999. We will have a more tangible anniversary in June 2010, celebrating 10 years since the first release of the software at the second OAI meeting.

Friday 13 November 2009

"Getting" Twitter

Recently I sat in on a demonstration of Twitter to a University research group that included our PVC for research. Because of his presence I was quite self-conscious about justifying the Web tools I normally take for granted, and although the demo itself was fine, it didn't seem to answer the question "is this really useful or just some gratuitous teenager technology?" I have always claimed that Twitter is a fantastic tool for keeping up to date with the spread of ideas and debate in the community - lots of micro-comments keep me in the loop about which speakers have raised what issues at which conferences, even when I can't travel and engage directly. However, I have been worried recently that the Twitter output I see has been less technical/academic/professional and more personal/informal/gossipy. So I thought I would do a quick investigation to see if there is any evidence to support my positive experience of Twitter. I chose to look at Twitter activity surrounding CETIS 2009, as several of my Twitter contacts had mentioned it in the run-up to the conference.

The CETIS conference is run by the JISC Centre For Educational Technology and Interoperability Standards, and attracts many people from the e-learning community. It took place at Aston on 10th and 11th November 2009, with 146 delegates attending according to the open list on the conference website.

Over the period in which the conference was mentioned (from the afternoon of Nov 5th up till midnight on Nov 12th), 566 tweets were sent by 89 separate contributors. The large majority of these (440, 78%) were sent during the conference sessions, with 255 (45%) on the first day and 185 (33%) on the second day. Outside the conference hours, 32 (5%) were sent in the break between the two days of the conference, 57 (10%) were sent before the start of the conference and 37 (7%) were sent after the end of the last session.
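The tallying amounts to bucketing each tweet's timestamp against the conference schedule. In this Python sketch the session hours (9am to 6pm) and the example timestamps are invented assumptions, not real CETIS data:

```python
from collections import Counter
from datetime import datetime

# Assumed session hours for the two conference days (10-11 Nov 2009).
SESSIONS = [(datetime(2009, 11, 10, 9), datetime(2009, 11, 10, 18)),
            (datetime(2009, 11, 11, 9), datetime(2009, 11, 11, 18))]

def bucket(ts):
    """Assign a tweet timestamp to one of the periods reported above."""
    for day, (start, end) in enumerate(SESSIONS, 1):
        if start <= ts <= end:
            return "day%d" % day
    if ts < SESSIONS[0][0]:
        return "before"
    if ts > SESSIONS[-1][1]:
        return "after"
    return "between"

# Five invented example timestamps, one per period.
tweets = [datetime(2009, 11, 5, 15), datetime(2009, 11, 10, 11),
          datetime(2009, 11, 10, 20), datetime(2009, 11, 11, 14),
          datetime(2009, 11, 12, 9)]
print(Counter(bucket(t) for t in tweets))
```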

How many of these tweets are merely "backchat" or "electronic gossip", and how many of them are broadcasting helpful information? I used the Twitter API to download all the tweets and individually categorised each one as "informational" or not. An informational tweet contains some information about the conference that is useful to an external viewer (a non-delegate such as myself) - it may contain a quote from a speaker, a URL to a relevant resource, or a brief microsummary of an issue raised. By contrast, a non-informational tweet may be a complaint about the wireless network, a comment about the quality of the food or a message of thanks to the organisers. This categorisation requires some judgement on my part, but the criteria are reasonably straightforward and repeatable.

The distribution of tweets over time can be seen in the following figure (click to see a bigger version), which also shows how the number of "informational" tweets (red) compares to the total number of tweets (blue). In total, 324 tweets (57%) were in the informational category.

During the conference sessions, informational tweets account for most of the Twitter activity (307, 70%). In other words, the effort expended in twittering during conference sessions is not wasteful effort that distracts from engaging with the conference agenda; it is mainly valuable to an outside observer - which I would claim extends the impact and influence of the conference beyond the cohort of local delegates. Of course, this works best if the tweets can refer (and link) to a rich set of online resources to direct observers to.
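Once each tweet carries a during-sessions flag and an informational label, the percentages fall out directly. This sketch rebuilds the quoted figures from synthetic labels with the same counts as the real data:

```python
def summarise(tweets):
    """tweets: list of (sent_during_sessions, informational) pairs."""
    total = len(tweets)
    info = sum(1 for _, i in tweets if i)
    in_session = [(d, i) for d, i in tweets if d]
    info_in_session = sum(1 for _, i in in_session if i)
    return {"info_pct": round(100 * info / total),
            "session_info_pct": round(100 * info_in_session / len(in_session))}

# 566 tweets in total: 440 in-session (307 informational),
# 126 outside sessions (17 informational), 324 informational overall.
tweets = ([(True, True)] * 307 + [(True, False)] * 133 +
          [(False, True)] * 17 + [(False, False)] * 109)
print(summarise(tweets))
```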

Back to my obsession with showing that Twitter isn't just an electronic stream of gossip - the figure on the left shows how people break down into different Twitter categories: those who only twitter useful information (or did on this occasion), those who never twitter useful information (not useful to me anyway), and those who mainly or partly twitter useful information (those whose information rating was more or less than 50%).

But those who stick strictly to the facts don't provide the biggest chunk of information. The figure on the left shows the contribution of the various groups of twitterers to the total information content of the tweets: most of the useful Twitter information is provided by people who mix "information" and "comment".
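The per-contributor breakdown can be sketched as follows: compute each contributor's information rating, assign them to a group, and sum the informational tweets each group supplies. The sample tweets here are invented:

```python
from collections import defaultdict

def group_contributors(tweets):
    """tweets: list of (user, informational) pairs.
    Returns informational tweets contributed per rating group."""
    per_user = defaultdict(lambda: [0, 0])   # user -> [informational, total]
    for user, informational in tweets:
        per_user[user][1] += 1
        if informational:
            per_user[user][0] += 1
    groups = defaultdict(int)   # group -> informational tweets contributed
    for info, total in per_user.values():
        rating = info / total
        if rating == 1:
            group = "only"
        elif rating > 0.5:
            group = "mainly"
        elif rating > 0:
            group = "partly"
        else:
            group = "never"
        groups[group] += info
    return dict(groups)

# Invented sample: one contributor per category.
sample = [("a", True), ("a", True),                 # 'only' (2/2)
          ("b", True), ("b", True), ("b", False),   # 'mainly' (2/3)
          ("c", True), ("c", False), ("c", False),  # 'partly' (1/3)
          ("d", False)]                             # 'never' (0/1)
print(group_contributors(sample))
```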

So perhaps I shouldn't get too worried by the criticism that Twitter is full of people telling us what they had for breakfast and what happened on their trip to work. Perhaps it is precisely those kinds of people who are most likely to let us all know what key themes are emerging from that high-profile conference that we couldn't attend.