Attending the PLAN network last week, the biggest surprise for me — given that PLAN is an arts organisation — was how many of the speakers focused on macro issues of policy, regulation and infrastructure.
This emphasis led me to search out Jonathan Grudin’s prescient paper from fifteen years ago, The Computer Reaches Out: the Historical Continuity of Interface Design (download as 1.1 MB PDF file). In it, Grudin charted how the focus of the ‘interface’ in computing extended — over the period from the 1950s to the early 90s — from the hardware, to software, through the screen, to groups and organisations. He argued that, with this shift, the duration of the events studied to design the interface increased from microseconds to days, and the methods used to study them changed from ad hoc techniques, through lab experiment, to ethnographic observation. (For a concise overview of these trends, see Figure 1 and Table 1 in Grudin’s paper.) With the era of ubiquitous computing (‘ubicomp’), one could argue that the computer has reached out once more: the interface is at the level of society and the public domain; the events studied develop over months and years; the methodology is historical and political analysis. Here are some examples from the PLAN event.
Julian Priest reviewed how the electromagnetic communications spectrum is often seen, metaphorically, as a land mass to be partitioned and auctioned off. Historically, a significant proportion of the spectrum has been reserved for government (for the military and emergency services), while the rest has been made available, increasingly on commercial terms, as a place to build a business. The 3G mobile auctions are an example.
But, Julian argued, the spectrum is a social construct. Referring to Ofcom’s current Spectrum Framework Review, he explained how the regulator is looking to revise its approach to the spectrum as technical possibilities develop. Julian argued for a ‘spectrum commons’ wherein no-one has exclusive ownership of certain parts of the spectrum, and anyone can do anything they like in those parts within prescribed limits. This creates a low-entry-cost space where innovation can flourish.
As Julian put it, “The tinkerers are moving faster than the regulators”. Referring to his very thorough report The State of Wireless London, he cited the 20,000 wi-fi networks that have developed in Greater London (60% of which were recently found to be ‘wide open’ in terms of security) as an example.
Taking a lead from Julian Priest, and from Jo Walsh (see below), Pete Gomes and Saul Albert talked about wireless networks being “the new urban infrastructure”. Their Arts-Council-sponsored Wireless London project aims to “use cultural projects as a catalyst for embedding wireless technology in urban areas” and develop “a long-term cultural strategy for the spread of civic technology in London”. It’s always struck me as slightly odd that arts funding goes into communications infrastructure — I can understand it going into ‘architecture’, but not into ‘roads’. (I have a vested interest: as a shareholder in a very small broadband service provider, there was a time when we seemed to be in a bizarre race to get our wireless aerials up in central Sheffield before other arts projects in the area did, because the first to do it could claim priority over the airwaves.)
Jo Walsh explained how many community networks don’t have ready, affordable access to the copyrighted mapping data from Ordnance Survey, and that creating geographic data for the public domain could address this — as in the LondonFreeMap project. Her vision is of an “open guide to London”, wherein maps are annotated with civic and historical information (this is built partly on semantic web ideas, which I’ve so far failed to get my head round). The evnt site (see also the associated blog) extends this idea into a calendar-sharing service based on tags — the equivalent of del.icio.us for events.
Duncan Campbell is well known for his critiques of technologies used for surveillance, and devoted his talk mainly to the threats to liberty from ubiquitous computing, as an antidote to what he saw as the ‘celebratory’ tone of the introduction to the event.
Talking mainly about the example of GSM location technologies, Duncan argued that (when he was Home Secretary) David Blunkett was conned by commercial cowboys about the potential applications of these technologies, which, Duncan asserted, do not work. (For background, see Spy Blog’s coverage of Blunkett’s claims on this subject.) He described empirical tests he had carried out comparing his location as given by GSM services with his actual location (as determined by eyesight or GPS services), which demonstrated fairly wild inaccuracies. You can try the mapAmobile service for free to test this for yourself.
Duncan explained that — if I followed him correctly — GSM location technologies work by identifying which ‘cell’ of your service provider’s network you are connected to (mobile phone networks have to know when you move between cells so they can route calls correctly), but the way the networks work means that a phone may connect to different cells while staying still. Using the ‘Sitefinder’ mobile phone base station database I discovered that, in an area 800m square centred on where I live, there are 37 base stations — while in the equivalent area around my old house in Sheffield, there are none.
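If I’ve understood the mechanism correctly, it can be sketched in a few lines of code. This is a minimal, hypothetical illustration (not any real operator’s API, and the cell IDs and coordinates are invented): the network reports the position of the base station a phone is attached to as the phone’s position, so the error is roughly the size of the cell, and a stationary phone that re-attaches to a neighbouring cell appears to jump.

```python
# Hypothetical database of base stations: cell ID -> (latitude, longitude).
# In reality this is held by the network operator; the values here are made up.
CELL_DATABASE = {
    "cell-A": (53.3811, -1.4701),
    "cell-B": (53.3768, -1.4623),
    "cell-C": (53.3850, -1.4555),
}

def locate_phone(connected_cell_id):
    """Return the position of the cell the phone is currently attached to.

    This is the phone's *estimated* position: the error is on the order of
    the cell radius, which can be hundreds of metres in dense urban areas
    and many kilometres in rural ones.
    """
    return CELL_DATABASE[connected_cell_id]

# A stationary phone can re-attach to a different cell as signal conditions
# change, so its reported position moves even though the phone has not:
position_before = locate_phone("cell-A")
position_after = locate_phone("cell-B")
print(position_before, position_after)
```

This also suggests why density matters: with 37 base stations in an 800m square the cells are small and the estimate comparatively tight, while with none nearby the phone attaches to a distant cell and the estimate can be wildly off.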
There were some more straightforward presentations on art at the event, but I didn’t find any of the ones I saw very noteworthy.