
Eduserv Symposium 2008

I came to attend this symposium out of the blue, having seen an email late one Wednesday afternoon saying our assistant director was too ill to go. After a quick look at the programme, I realised it was a follow-up to an event I'd seen on video a while back, where an entire conference on Second Life had been trashed by a talk arguing it was all pretty much useless hype. So if this year's presentations were going to be in that vein, it sounded like a fun time.


This being a Web 2.0 conference, plenty of it was in use, including a live chat backchannel ( http://www.eduserv.org.uk/foundation/symposium/2008/livechat , powered by the CoverItLive streaming software: http://www.coveritlive.com/ ), a Ning-based social networking site centred on the conference (which, as expected, didn't achieve critical mass but was a nice feature all the same), and of course lots more.


Eduserv's Andy Powell started the day talking about these "disruptive technologies" we know so well. Looking across the room, it seemed a-bleep with mobile phones, laptops and all kinds of hybrid gadgets, twittering and SL-ing and all kinds of SN/Web 2.0-ing as he spoke.


"Please turn your phones off as it interferes with the equipment in the room, unless you're twittering or blogging from it"


This was the digerati of UK HE in the room (from which a colleague had, minutes before, noted the conspicuous absence of any HEA top brass), and it was a bit negative to hear all these references to the "disruption" caused by the uptake of Web 2.0 in HE, and all this focus on how to "control" it. But it later surfaced that I wasn't the only one who thought a more positive terminology (like "emerging technologies") would be more conducive to adoption on campus, or even just to an understanding of the real strengths and limitations of these tools. Another good reason to have a chat backchannel: slightly controversial thoughts like these get put forward there easily, whereas I guess people are a bit more shy of raising them live in Q&A.


Larry Johnson:


Larry presented using Second Life as an embellished PowerPoint, with his avatar walking through a virtual exhibition of photos of his grandparents and of various turn-of-the-century inventions, followed by lists of all the technological revolutions that generation had to deal with. He compared that with the current IT situation, from the beginning of the personal computer and the Internet to now, and noted that in comparative terms we haven't even got from the Gutenberg press to Martin Luther: any real revolution to come out of this is still ahead of us. Another difference between that generation and this one is that the focus has shifted away from using technology to free up time - we have no such illusions today. My lack of a pen at that point limits my recollection now, but there were some areas that the Horizon Report had identified as the main areas of growth and change for the education community:


  • The arrival of grassroots video as a teaching tool, and increased pressure on HE institutions to deliver video storage/distribution/collaboration.
  • Collaboration webs - using tools like Google Docs or other simple online tools requiring just a modern computer and web browser.
  • Mash-ups - old news, but now getting more mainstream with the increasing availability of data.
  • Social OS - the next step in social networking: a focus on the individual rather than on content in all aspects of software.


In my opinion these blue-sky predictions don't tend to take into account the more global state of the world today - the economic downturn and its effects, for example - so Dr Johnson's talk seemed a bit limited in that respect. When cornered (by me) later over coffee, he seemed dismissive of the effects of global warming and possible legislation changes on data centre energy usage, as well as of changes due to price increases and of how the digital divide would affect the future he envisaged. The Horizon Report can be found at http://www.nmc.org/horizon


Bobbie Johnson: The Guardian and Web 2.0
http://www.slideshare.net/tag/efsym2008


This was the most useless talk of the symposium. I think the inclusion of two large media organisations was a mistake, and we could have done with half that presence replaced by someone from another business sector, by a student, or by some other piece of the picture. Here are my notes anyway:


The Guardian was founded as the Manchester Guardian in 1821. The paper's format and structure didn't change until the early 50s with the addition of photography. At all times the core values of social justice, freedom of thought and religion and social reform have been at the forefront of the decisions they have made as an organisation. Johnson spoke at length on the history of this newspaper on that basis, and the various owners and trusts that formed through the years.


The website appeared in 1996. Very embarrassing. In 2007 the director told his staff at the All Hands meeting: "We are now a digital operation which makes printed stuff on the side." So radical change is very recent.


He then showed us a front-page scan from a couple of years ago. Very few things on it came from Web 2.0 specifically (although you could say that all the user-generated content was in some way reflective of the new notion of the web as a two-way consumption/production medium).


Then he showed a very nice blog aggregate page (in his words a "superblog"): http://commentisfree.guardian.co.uk/index.html - probably one to emulate when building a university-wide blogging service, although I suspect it's very well edited, so there's more effort involved than just getting people to write good blogs.
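As an aside, the basic mechanics of such an aggregate page aren't hard: pull in each blog's feed, merge the entries and sort them newest-first. The sketch below is only a minimal illustration of that idea, assuming the third-party feedparser library and some made-up staff blog URLs; an edited "superblog" like Comment is free would obviously need a human selection layer on top.

# Minimal sketch of a "superblog" aggregator: pull several blog feeds,
# merge their entries and list them newest-first. Assumes the third-party
# feedparser library; the feed URLs are hypothetical placeholders.
import time
import feedparser

FEEDS = [
    "http://blogs.example.ac.uk/staff/alice/feed",
    "http://blogs.example.ac.uk/staff/bob/feed",
]

def aggregate(feed_urls, limit=20):
    entries = []
    for url in feed_urls:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            # Not every feed supplies a parsed date; skip entries that lack one.
            published = entry.get("published_parsed") or entry.get("updated_parsed")
            if published:
                entries.append((published, entry.get("title", ""), entry.get("link", "")))
    # Newest entries first, trimmed to a front-page-sized list.
    entries.sort(reverse=True)
    return entries[:limit]

if __name__ == "__main__":
    for published, title, link in aggregate(FEEDS):
        print(time.strftime("%Y-%m-%d", published), title, link)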


The Guardian site has gone from Closed/Subscription based to free access, and as a company they have gone from content provider to content platform.


I closed my notes with a poem:


Did photography create surrealism in art?

The digerati thumb their phones

a blue glare reflects on their faces

Information hiding ignorance

Geoffrey Bilder: Sausages, coffee, chickens and the web: Establishing new trust metrics for scholarly communication


A very interesting and clued-up talk on trust issues and the web. Personally I would have framed these as filtering issues, but it makes sense either way: the web is awash with information and none of it is rated, so you can waste huge amounts of time surfing it and never be sure of the quality of what you read, whereas traditional media has built-in filtering, thanks to the physical and commercial limits that stop it from simply publishing everything the way the web does.


Bilder's talk examined, amongst other things, the reason why the tilde (~) is untrustworthy - (spoiler alert!) - because it denotes a URL for a home directory, i.e. not official information but content on a personal home page. To a regular non-techie this isn't obvious, though, and the same is true for the various Web 2.0 enabled sites: it's hard to assess trust. The path followed by any new technology depends on all these issues, and trust is crucial to its adoption. It usually goes like this:


  1. A techno-information power base invents a new technology (e.g. the blogging community circa 1996)
  2. Publicity/hype follows
  3. The masses take up the technology
  4. Breakdown: the technology doesn't live up to the hype (e.g. people discover most blogs are abandoned within a few weeks)
  5. Filtering systems are created (e.g. Technorati)


In this way Bilder drew a clear connection between the trust exuded by traditional publishing media and its implicit filtering system ("wow - they're going to publish my book" = "it passed the filter").



He then talked about the first filtering systems put together on early weblogs: the slashdot.org karma points system, built to cope with the incredibly high volume of comments the site was dealing with daily, which was dragging down its overall value. High karma (awarded for good behaviour on the site) made you a temporary comment moderator, and in turn your moderations would be moderated by other high-karma users, drastically improving the quality of comments if you opted to raise your filter level.
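To make the mechanics concrete, here is a toy sketch of that kind of threshold-based filtering - not Slashdot's actual code, just an illustration of the idea that users with enough karma can nudge comment scores up or down, and readers only see the comments that clear their chosen filter level.

# Toy illustration of karma-based comment filtering (not Slashdot's real
# system): users with enough karma adjust comment scores, and each reader
# only sees comments at or above their chosen threshold.

MODERATOR_KARMA = 50   # hypothetical karma needed before you can moderate

class Comment:
    def __init__(self, author, text, score=1):
        self.author = author
        self.text = text
        self.score = score

def moderate(comment, moderator_karma, delta):
    """Apply a +1/-1 moderation only if the moderator has earned enough karma."""
    if moderator_karma >= MODERATOR_KARMA and delta in (-1, +1):
        comment.score += delta

def visible(comments, threshold):
    """Return only the comments that clear the reader's filter level."""
    return [c for c in comments if c.score >= threshold]

comments = [Comment("anon", "FIRST POST!!!", score=0),
            Comment("researcher", "Here's a relevant paper...", score=1)]
moderate(comments[1], moderator_karma=120, delta=+1)   # insightful: score rises
moderate(comments[0], moderator_karma=120, delta=-1)   # noise: score drops
for c in visible(comments, threshold=1):
    print(c.author, c.score, c.text)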


Other early systems of peer-based filtering were eBay's focus on user trust and ratings and Google's PageRank algorithm. These trust metrics were key to the success of these sites.
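PageRank itself is simple enough to sketch: each page's score is redistributed along its outgoing links with a damping factor, and iterating this converges on a trust-like ranking. Below is a bare-bones power iteration over a toy link graph - an illustration of the principle, not Google's production system.

# Bare-bones PageRank by power iteration over a toy link graph.
# graph maps each page to the pages it links out to.
def pagerank(graph, damping=0.85, iterations=50):
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

toy_graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
for page, score in sorted(pagerank(toy_graph).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))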


Chatting later to Debra, she agreed that self-filtering systems are probably the way forward. The slightly depressing outcome of Bilder's talk was the idea that, just as traditional media has in a way been supplanted by the web, and just as medieval scribes were made redundant by the Gutenberg press, quality-controlled online resource collections like Intute are now endangered too, because they apply a "centralised" filtering/trust system that an automated, Web 2.0 enabled peer review system might do just as well.


The questions and online comments were very interesting, and it was a shame there was no time to answer or discuss them at length. One insight from here was that people increasingly see their personal profile (as used on SN sites) as something personal - something that should be owned and held by the individual and released/sold only to trusted parties of interest to them. Bilder agreed that this is probably the way things will go in future.



And then we went for lunch. Many a picture was flickrd of the curiously purple tray of summer desserts.
http://www.flickr.com/photos/dmje/2475205817/ (more photos at
http://www.flickr.com/photos/tags/efsym2008/ - and the efsym2008 tag worked quite well as a way to tag across slideshare, flickr, delicious etc)


Also during lunch I bumped into Torsten Reimer of the now semi-defunct AHRC Methods Network. He sadly told me of the serious lack of funds that this kind of initiative suffers from. They have a little money for small projects, but not enough for anything bigger to grow out of these, or for any radical strategic changes, so the MN is not viable at the moment.


BBC:
This was similar to the Guardian talk in its irrelevance for me, but of the two I'd have kept this one: it was wittier and offered a lot more insight into the future. The speaker showed us the evolution of BBC content up to its inclusion today on other websites - on the Sun's and the Guardian's sites - and the communities formed around programmes the BBC had produced, but that were taking place outside the BBC's websites.


So does it matter to the Beeb that their competitors are taking the content that 25% of their income is spent on (the online side) and building communities out of it? This is the "globalisation" problem of Web 2.0, and a hard decision for the Beeb, but they currently allow it - possibly because their core principle is that they are a brand: their charter doesn't specify that they have to make programmes on TV, just that they have to entertain, educate and inform.


Chris Adie:


First of all, the document circulated prior to the Symposium ( http://www.vp.is.ed.ac.uk/content/1/c4/12/45/GuidelinesForUsingExternalWeb2.0Services-20070823.pdf
) is a great first step towards regulations/guidelines/policies that help an academic institution deal with the issues that come up with the increasing adoption of Web 2 technologies.


In ID's case, the problem (for me) is the possibility of us hosting a university-wide blogging service. A service like this would require us first to revise our guidelines in many ways, even if the decision is to let people just use external services (we are still liable, and there are still risks, even in that case).


Another problem with external services is the credit crunch: what happens when your service goes bust, closes, shifts in focus, loses critical mass, starts charging or switches to paid registration?


From the chat: here are the BBC's guidelines on SN/Web 2 use: http://www.bbc.co.uk/guidelines/editorialguidelines/advice/personalweb/index.shtml


Also in the chat, the point was made that some of the social networking sites might be more resilient than public services - the ill-fated AHDS, for example - and what will upcoming UK elections mean for any online services we may be using now?


I found some of what he said a bit unbalanced, along the lines of that chat comment: he said, for example, that information might be more at risk of unauthorised use, unscheduled maintenance and so on - but these are also risks within an institution if its internal policies or technical systems aren't up to scratch - and if the government can lose huge amounts of public data, I am sure Higher Ed can catch up.


I'm also a bit concerned with the paper's implicit position on intellectual property rights. It is true that not all information should be given away immediately, and that a lot of grant money depends on ideas being kept safely under wraps, even in academia, but a university legal department should be up to speed on the GPL and CC licences, and be able to advise on what is personal and what is owned by the institution depending on who you are, the nature of the work/data and in what capacity you work for it. Universities should facilitate any other sharing by embracing the speedy transfer of knowledge that Web 2.0 tools (such as Twitter or Facebook) allow.


Apart from these doubts, though, this is the first clear and broad paper attempting to put together academic guidelines on the risks and implications of using SN and Web 2.0 technologies, and he is aware it's just a draft and needs input from others.


Afterwards I asked Chris how we can feed back to him about his paper. He said he's in the process of making it into a wiki, but that at present comments are open, and we can feed back that way.


David Harrison: A Modern Work Environment at Cardiff U: http://www.cardiff.ac.uk/insrv/futures/mwe/index.html


http://diharrison.wordpress.com/2008/05/10/reflections-upon-efsym2008/


Dr Harrison startled us all with a very advanced web manager's view on how to run all the IT services within Cardiff University whilst still leaving space for SN/Web 2 technologies to be adopted strongly and used by their staff.


The presentation had lots of diagrams which I can't really explain well in written form, but here goes: the core (read "boring") services like calendars, request trackers, sick forms and finance software sit at the centre of the picture; around them sit the managed research and learning environments, and around these the VLE/VRE. Anything else around this circle includes Twitter and friends. Somehow this made much more sense with his slides, though, so I should stop there...


My main note was that he had Cardiff's VC supporting the project all the way through, attending all the meetings and pushing things forward. We can't count on the same support at Bristol Uni, with Eric Thomas being much less available and not known to be particularly tech-friendly.


He also said that innovation - real discovery - isn't particularly widespread in universities. The kind of innovation they see more of, and need, is where existing innovation is brought into the university, or across faculties and departments. This is a brilliant potential benefit of Web 2.0: facilitating communication between people who wouldn't normally talk to each other, and giving them ways to disseminate and value it.


More discussion of this at
http://blog.newport.ac.uk/blogs/michael/archive/2008/05/09/32921.aspx - another staff member involved in their MWE blog that mentions this presentation (I'm afraid I only scanned through this first time I looked... It's mostly on the media presentations).


Grainne's presentation was the only one that really went into how Web 2.0 actually affects pedagogy within academia. It was also interesting because I joined ILRT after she had left, and this was my first chance to see her after hearing so much about her. Fortunately she's already put it online: http://e4innovation.com/?p=198 - so I can skip talking about it, since this post has gone on far too long now!

Comments

David Harrison said…
Michael Webb is from University of Newport, and is not involved in MWE although he was at the event.
ale said…
Thanks for that correction David! Hopefully he won't mind too much...

Ale
