Archives for category: Technology Strategy Board

On Tuesday 21st I braved the apparently dire storms and broken trains to trek up to The Studio, just round the corner from Birmingham New Street station for the fourth in a series of events hosted by the KTN Company on behalf of the Arts Council and Innovate UK. Like its predecessors (in Bristol, Manchester and Sheffield), the session brought together artists, arts administrators and creative technologists to help brainstorm what kinds of projects the two organisations might be able to spearhead to bring art and technology together in a way that’s creatively, strategically and economically meaningful.

The session was hosted by Graham Hitchen of Directional Thinking and expertly facilitated by Jason DaPonte of The Swarm (and a former colleague of mine from the BBC back when we were all trying to figure out exactly what it meant to be doing “internet stuff” for the corporation). The shape of the day was pretty straightforward: an open discussion in the morning helped to identify recurring themes and to cluster them; some attendees presented work of their own in this area to bring it all to life; and the afternoon saw us break into groups to pick off some of the emerging themes in which we were interested.

Now I had gone along not in my usual KTN role, but purely out of personal interest. Readers of this blog, or of my own more personal rantings on DGMFS, will know that the coming together of art and tech is both a personal interest of mine and a cornerstone of our business practice. Anyway, as a result I got “stuck in” as an attendee and wasn’t taking such rigorous notes as is my wont. So what follows is more a set of recollections about the day and the themes which emerged during lengthy, and often rambunctious, discussions.

Digital art is tending towards the immersive. Now I think there’s a real spectrum here: there’s what I think of as “traditional” immersion (being deafened and smacked around in a mosh pit comes to mind, but then so does sitting in the best seats, bang in front of the proscenium arch, for the performance of a great play or opera); there’s technologically enhanced immersion or Augmented Reality; then there’s screen-based virtual reality; and finally fully immersive virtual environments of the kind the Oculus Rift may allow.

I think these are important distinctions to draw, but it’s certainly going to be fascinating to watch new art forms develop right along the immersion spectrum.

Digital art often plays with the senses and perception. Now again, I would suggest that all art in some way changes perception (arguably it’s the whole point), but the advent of digital is driving more direct interventions with the senses, from sensory deprivation, through binaural sound, to the stimulation of traditionally overlooked senses in the arts – especially smell and touch.

There’s a strong sense of play here. This closely relates to the observation about senses above, and of course to gamification more generally. I lost count of the number of times Minecraft came up in conversation (there were, I suspect, a lot of parents in the room). I sounded a slight note of caution here. I love play – it’s a crucial part of a healthy life, and of healthy art. But there are many darker sides to humanity that art has always sought to explore or even celebrate (consider a 40,000-year arc from the Chauvet cave paintings to Norwegian Black Metal). It would be a great shame – and hideously short-term thinking – if digital art got overly hung up on “fun”. (For what it’s worth, I suspect this is just a “phase”, as my mother might say.)

Production is being democratised. I don’t personally like the “D” word in this context, but I get what it means, and I think it’s hugely important. Specifically, ever-cheaper tools and the wide availability of accessible publishing platforms are allowing the re-emergence of serious amateur art making (often referred to – hideously – as the rise of the prosumer). The emergence of cheap computing, too (the Raspberry Pi, the Arduino etc), is accelerating this. But how does the traditional art world respond? And how will artistic practice be altered? For me, it’s a return to normal after the blip that was art in the age of mass media. But that doesn’t mean it’s not a paradigm shift.

Multi-disciplinary practice is becoming the norm. As The Unseen’s Lauren Bowker pointed out about fashion and technology in the London Fashion Week panel I hosted last month, multi-disciplinary working is not so much preferable as essential. And this is certainly the case in art. I believe we’re going to see more and more group-created multi-disciplinary art (consider the work of 2010 Turner nominees The Otolith Group). Does this mean the disappearance of auteurist art? And how will that play out? Again, this is potentially a seismic shift in how art is produced.

Shared spaces are necessary. Something that closely relates to the rise of collaborative art practice is the need for shared workspaces – spaces where radical experimentation and the sharing of ideas and practice arise naturally. There was considerable discussion about this, not least as Birmingham lacks a space like, say, Bristol’s Pervasive Media Studio or Liverpool’s FACT. (And in case this all sounds a bit Newspeak, remember that there’s a hundred-year-old precedent for a shared, multi-disciplinary space where artists and technologists can work together: the Bauhaus.)

“Art for art’s sake”. There was quite a lot of discussion about the need to think about art on its own terms, and I heartily concur. I recall a speech Sir Nicholas Kenyon gave on launching a Proms season in the early noughties. He discussed how in the Thatcher years art had been considered part of the tourist or (another hideous term) “heritage” sectors; under New Labour art had all been about regeneration. But these positions are merely two sides of the same coin, and both see art as a means to socio-economic ends. He suggested that we need to change the discourse about art, and frankly, a decade on, that still stands. Indeed, if anything, the advent of digital has only intensified the debate, with new digital art once again being discussed constantly in terms of collateral benefits, rather than considered in its own right.

Business models must evolve. It’s not that business models for the arts don’t need continual re-examination; far from it. I was struck that the day’s discussion was almost exclusively about the role of digital in the making of art. Yet from my experience of working with some major players in the UK art scene (commercial and public sector alike), the principal interest in digital media is in how it can create new business models and, frankly, drive income: more people through the door, more bums on seats, higher profile, online content distributed to new audiences and so on.

I’m not saying that changes to artistic practice wrought by digital technology aren’t fascinating in and of themselves (or I wouldn’t have spent 1400 words talking about them). But the new technologies offer huge business opportunities to artists and arts organisations of all statures. How those opportunities are grasped, of course, is at the heart of the disruption challenges facing not just this arena but almost every corner of contemporary life.

The series of events concluded with a session in London yesterday at Cecil Sharp House. (One wonders how the folk song collector and archivist Sharp would have responded to digital technology. Interestingly, archiving – another personal obsession – was entirely absent from the discussions on Tuesday.) I’m very much looking forward to hearing how that session pans out and how, more importantly, this little “tour” influences where ACE and Innovate UK go next.


The Barbican’s Digital Revolution has become one of London’s must-see exhibitions of the summer; with queues up to 100 people long waiting to get in at pre-allotted times, its popularity is the kind generally experienced by the blockbuster art retrospectives put on by the larger London galleries. But in every way this is a very different show: Digital Revolution is about the impact made on creative practice generally – and the arts specifically – by the rise of the PC, the internet and, in its wake, the world wide web.

Digital Revolution Trailer from Barbican Centre on Vimeo.


Now just to be clear from the outset, I’m not an art critic, nor any other kind of critic for that matter, so the handful of observations here are those of someone who’s simply lived through this particular “revolution” as a consumer, as a citizen and as a professional participant. (Google “digital revolution reviews” to get a sense of what the professional critical establishment made of it. Actually the spread of opinion – summarised neatly by The Week – is almost comically predictable, with WiReD calling the show “spectacular” and the Telegraph dismissing it as “gimmicky”!)

The first thing to say is this is not the Barbican’s first foray into this arena. Two previous exhibitions in the Centre’s main gallery, Serious Games (1997) and Game On (2002), looked in curatorial depth at games technology and its place in the wider culture (and Culture). But more than this, of all the main London arts centres, a pretty strong case could be made for the Barbican being the most technologically adventurous across its three principal areas of output: theatre, music and the visual arts. And as I write, only last weekend the centre was effectively taken over by the Wikimania conference. As a stolidly modernist institution, the Barbican feels as though it gets digital.

To get down to specifics then… The heart of the show takes place in the Curve gallery and begins with “Digital Archaeology”, a look at the early days of digital technology in areas such as music, video games and film making. Later it looks at, among other things, how collaborative practice has been enhanced by digital, at how musicians and film makers have worked with digital technology to go way beyond the expected pop promo, at the pivotal role digital post production plays in contemporary film making, and at how developers are using code to create art (more of this in a moment).

The curatorial approach in the Curve has been to combine, essentially, a museum-style history lesson with an art show. Forty-year-old pieces of electronic kit, classics from the world of video games and snippets of technologically groundbreaking film making vie for space with artwork using, or else inspired by, digital tech. For some this might seem an odd fit, but for me it really worked, setting the work of important (although very different) artists such as Chris Milk, Björk and James Bridle in the context of a revolution that very largely played out in the entertainment arena.

The exhibition continues throughout the centre, with an area dedicated to “lo-fi” or “indie” video games (no coincidence that these terms mirror those used in music), and finishing with the installation that graces most of the show’s advertising, Umbrellium’s playful Assemblance, a darkened space in which laser light appears to respond “personally” to the movements of the audience and which somehow encourages collaboration and non-verbal communication between strangers.


So the themes that emerged for me, then?

Well the most immediately obvious one – in the Curve at any rate – is that the curators have gone maximal. Put simply, the minute you enter the show you’re overwhelmed, with countless visual exhibits glowing or else lit up in the dark to a soundtrack of Peter Gabriel and Afrika Bambaataa. And so it continues, with the sound and light bleeding from one exhibit to the next.

Now “Immersive” or “interactive” might be more polite terms to use than “overwhelming”, and Digital Revolution is both of those things of course; but “overwhelming” feels to me more accurate, and certainly less modish. And to be clear, I mean this as a positive; I think it’s a commonplace that the life we all now lead in the developed world (and, increasingly, in the developing world, too) is defined by technological overwhelm. In the Curve, this feeling is captured brilliantly, and in this sense the show itself is a meta-exhibit.

Then there’s the flow between entertainment and the arts to which I alluded earlier. It’s often asserted that high art is decades ahead of entertainment in terms of technical practice, and there’s some truth in this. But it’s a more complex picture than that. Digital Revolution makes it clear that there’s been a constant two-way traffic between the two arenas. Indeed, one of digital technology’s chief impacts has perhaps been to blur the boundaries between the two.

And yes, it’s a cliché, but I was struck too by the sheer speed with which this has all happened – in less time than I’ve been alive – and its sheer magnitude. If you want a graphic illustration of Moore’s Law, look no further than a couple of early exhibits in the show, a sampling synthesizer – the Fairlight – and a Linn drum machine. Now both have their retro charm – not least to a music geek like me – but my God, the Fairlight in particular is huge. Moreover, it’s a dedicated piece of kit that largely did one thing (albeit to great effect); its functionality might be no more than a plugin in a contemporary Digital Audio Workstation.
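That scale of change is easy to put rough numbers on. Here’s a quick back-of-the-envelope sketch – my own arithmetic, not anything from the show, and the 1979 debut date for the Fairlight CMI and 2014 exhibition year are my assumptions:

```python
# Moore's Law as rough arithmetic: transistor counts doubling every ~2 years.
def moores_law_factor(years, doubling_period=2.0):
    """Growth factor predicted over `years` at the given doubling period."""
    return 2 ** (years / doubling_period)

# From the Fairlight CMI's 1979 debut to this 2014 exhibition is 35 years:
print(f"~{moores_law_factor(2014 - 1979):,.0f}x")  # → ~185,364x
```

Even if the doubling period is off by a few months, the conclusion holds: roughly five orders of magnitude separate that room-filling sampler from the plugin on a laptop.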

Or look at the FX-driven classic films, extracts from which run throughout the “digital archaeology” part of the show. Terminator 2, The Abyss and Jurassic Park are all arguably fine bits of storytelling, but alongside the two films highlighted later – Inception and Gravity – they look no less hokey than Ray Harryhausen’s or Gerry Anderson’s work, and certainly less charming. (Interestingly, by comparison, Peter Gabriel’s “Shock the Monkey” sounded, to these ears, easily as contemporary as the piece that takes centre stage later on. Simon Reynolds might have something to say about that.)

As it happens, Jurassic Park and T2 both tackle themes that I personally felt could have been explored a little more in the show: the downsides of technology. In this regard, James Bridle’s piece Dronestagram is a standout. Bridle’s work in general is intriguing; on the one hand he’s an advocate of “The New Aesthetic”; on the other, his work frequently forces the audience to consider the political implications of technology, from Wikipedia revisions about the Iraq War to the role of drones in modern warfare (and beyond). But the curators seem largely to have steered clear of the negative aspects of this particular revolution. Even the overwhelm that I talked about earlier comes at a price – often a very significant one – but that’s not tackled here.

But this caveat seems churlish, especially when considering what for me was the most inspiring exhibit in the show, albeit one outside the main exhibition spaces. Not Impossible Labs have developed BrainWriter, which uses a combination of brainwave recognition and eye-tracking to enable the paralysed graffiti artist TemptOne to communicate and perhaps even make art once again. It’s a moving story in itself, but beyond that points to one of the next frontiers in digital technology: the direct control of objects in physical space by mind alone.


But my single favourite piece was part of the DevArt space, which brings together pieces commissioned jointly by Google and the Barbican from developers creating art from code. Zach Lieberman’s Play the World feels somewhat overlooked in the coverage I’ve read, yet for me it’s the best – and most evocative – piece in the show. It consists of a single piano keyboard surrounded by a circle of speakers; a note played on the keyboard triggers software that finds that very note being played at that moment on a radio station somewhere in the world (the station and country are displayed on LED screens below the speaker from which the sound emerges).

Now of course the results aren’t as musically perfect as those created, ironically, on the Fairlight, sitting in a vitrine near the exhibition’s entrance. Indeed, from a certain perspective what comes out is cacophonous. But the way that Lieberman has used code to summon up a sense of our place in a world that is at once culturally heterogeneous and yet irreversibly connected represents, for me, the very best kind of digital art: interactive and immersive, yes, but more than that, profoundly thought-provoking and, well, human.
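For what it’s worth, the matching step at the heart of the piece can be sketched in a few lines. This is my own toy reading of the idea, not Lieberman’s actual code: the continuous pitch detection over live radio streams is faked here with a static snapshot, and the station names are placeholders of my own choosing:

```python
# Toy sketch of Play the World's matching step: given a note played on the
# keyboard, find the radio stations whose streams contain that pitch right now.
# Real pitch analysis over live audio is replaced by a canned snapshot.

CURRENT_PITCHES = {  # hypothetical (station, country) -> notes detected in its stream
    ("Radio Reloj", "Cuba"): {"C4", "E4", "G4"},
    ("NHK-FM", "Japan"): {"A4", "D5"},
    ("RTE lyric fm", "Ireland"): {"E4", "B3"},
}

def stations_playing(note):
    """Return the (station, country) pairs currently sounding `note`."""
    return [key for key, notes in CURRENT_PITCHES.items() if note in notes]

print(stations_playing("E4"))  # → [('Radio Reloj', 'Cuba'), ('RTE lyric fm', 'Ireland')]
```

A real implementation would run pitch analysis continuously over many streams and route the matched audio to the appropriate speaker; the lookup above is only the final, and simplest, step.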



[Disclaimer: the Barbican is a long-standing client of Turner Hopkins.]

Last week, on July 23rd, I went along to the spectacular Level39 offices in Canary Wharf for the “Connected Cities Assembly” – an evening of networking and brainstorming hosted by the TSB’s IC Tomorrow team. Now to some extent this event was outside my usual purview; regular readers will know that my professional bag is the meeting point of the content industries and digital tech. Nonetheless, I’m profoundly interested in all matters urban, as a city centre dweller myself, and someone convinced by eco-heretic Stewart Brand’s argument that cities are the last best hope for the planet. So it was great to be in the company of people thinking profoundly about issues of urbanism – and how digital technology plays into them.

The evening was a three-acter: scene setting around four key themes; facilitated round table discussion around each of those; and informal networking, of course. The scene setting stuff was fascinating; Rick Holland from the TSB discussed Connected Buildings; Richard Miller, the TSB’s Head of Sustainability talked about The Connected Environment; Beatrice Rogers, the Knowledge Transfer Network’s Head of Design introduced the Connected Communities theme; and the TSB’s Head of Transport Stephen Hart talked about Connected Services.

I won’t go through each of these themes in detail, but here are some of the observations and anecdotes that caught my ear:

  • London is the 15th most water-stressed city in the developed world.
  • 50,000 people a year die in the UK because of poor air quality. (I confess that both these points were quite shocking to me.)
  • There is no “one answer” to the problems that increasing urbanisation brings – rather we need to find solutions that address problems laterally – and smart use of data is key here.
  • And with regard to data, we are moving to a point where we “sense everything”.
  • The question is: when we do sense everything, what do we do with the data that arises as a consequence?
  • Specifically, standards and regulations around data are pretty appalling across the board (a common theme here when we’ve been thinking about data standards in the content industries).
  • With regard to design, “the technology drive often forgets the user”.
  • A smart approach to tech is about more than “the new app of the day”.
  • Transport systems the world over are “at breaking point”; how can tech entrepreneurs help address this? (“What would happen if every traffic light in a city had its own IP address?”)

Following these introductory remarks, attendees were asked to circulate every 15 minutes around the four “themed” tables for something of an informal brainstorm addressing the issues raised so far. The pace across the following hour was pretty furious and the energy in the room was obviously high.

The facilitators then brought us all back together to summarise the main points of the discussions they’d been hosting. I will go through these, briefly, theme by theme.

Connected Services

Stephen pointed out that the tenor of the discussion had ultimately been more about describing the problems in detail than coming up with specific solutions. Issues included:

  • Private vs open data
  • Coordination between different teams and bodies
  • Data visualisation
  • Standards

Connected Environment

Some of the themes that emerged here clearly echoed the previous set:

  • Standards
  • Open platforms
  • Data sharing
  • “It’s too much about the tech; not enough about the citizen”

Connected Communities

Beatrice’s summary touched on:

  • How can we utilise “communities of interest”?
  • Doing more with less.
  • Changing behaviour – especially among the disengaged
  • There is no one size fits all solution.
  • Digital exclusion remains a very real problem.
  • Trust is vital – but how is it “captured”?
  • Just why is word of mouth so important?
  • And, of course: the criticality of good design.

Connected Buildings

Finally Rick summarised his group’s key themes:

  • Route to market is crucial…
  • Are property developers the right people to work with? They’re crucial for revolutionary approaches – but things can take a very long time…
  • If, on the other hand, you’re thinking about retro-fitting quick wins then you need to be thinking about who runs existing buildings.
  • Find what motivates people.
  • That is… how can we take data about energy and turn it into something meaningful and motivating for ordinary people?
  • Oh, and again: design is critical.

Listening back, it struck me that Stephen’s observation about his theme applied across the board: that what had emerged was more an in-depth discussion about the problems of urbanism than a hard set of solutions. But frankly this is among the thorniest of areas we face, not just as creative technologists – but as a species! The solutions will only emerge from a hard-headed mapping of the problem space such as tonight’s. Personally, as something of an “outsider” to the sphere, I found the evening completely fascinating and look forward to seeing what the group does next.


After our competition briefing in London on May 6th, we hit the road as ever, taking a stripped-down version of the briefing to Manchester and then on to Bristol. We were, again, talking everyone through two current Creative Industries funding competitions: Innovation in location-based services and Enhancing the value of interactions with digital content.

We did Manchester on the morning of May 8th, in the splendid Museum of Science and Industry. As in London, the TSB’s Lech Rzedzicki and Tom Fiddian walked us through the EVIDC and LBS competitions in detail, before taking questions from the audience, along with Matt Brown. As ever, the questions ranged from the straightforward and technical (Is this single stage? Yes. Can a Catapult be a consortium partner? Yes – but not funded… etc) to the more challenging (Can you give a percentage of successful applications? Not really; it varies from call to call).

We had a very informative case study from Dennis Kehoe, CEO of Cloud computing specialists AIMES Grids Services, who’ve received several rounds of TSB funding. Dennis’ key points included:

  • You do have to invest time and effort in this process
  • Collaboration – share the risk of innovation
  • But get collaboration agreements in place
  • Don’t do it for the money!
  • Don’t reverse engineer the call
  • Timescales are rigorously applied
  • Don’t overstretch your cashflow
  • Feedback is key
  • You do get better at this stuff!
  • Allow something like 15% for project management

Dennis then joined the KTN’s Rachel Jones, Lech and Tom for a panel discussion which I chaired. Dennis had used the wonderful line “We are all Pavlovian Dogs”*, so I kicked off with a question on the disturbing ubiquity of advertising and branding around online content. The conversation, spurred on by the audience, went on to cover the user-centred design approach, the difficulty of combining Agile methods of project management with TSB “process”, and ways in which users might start to take back control of their data.

The next day, Friday, we finished off in Bristol with a morning session at the Watershed**. The pre-break format was much the same, and was followed by a case study from Paul Appleby, formerly of the BBC and now CEO of VID Communications. Paul’s tips included:

  • Read the application very carefully
  • Establish the potential value of the bid
  • Use the language of the application
  • Get clarity on roles, especially on “who has nagging rights”
  • Avoid “Spreadsheet Romanticism”*
  • Establish lead roles: PM, tech lead, UX lead
  • Get a central document store together
  • Get routine meetings in the diary
  • Stay in scope!
  • Keep your Monitoring Officer informed of potential changes of direction
  • Define and monitor deliverables
  • Take your time on the application
  • The consortium needs history

We finished the morning, of course, with a panel discussion of the usual suspects, plus Jo Reid of local company Calvium and Andy Proctor, Lead Technologist for Satellite Navigation at the TSB – all ably chaired by Rachel. Interestingly, of the three discussions we had, this was the one most caught up in TSB process, but all in all it was a lively session and great to get fresh insight from Jo and Andy.

So, all done for now! As ever, thanks to the KTN’s Anita Onwuegbuzie for making the whole thing run so smoothly; and to Rachel, Tom, Lech and Matt for good company along the way.

* I’m seriously thinking that “We Are All Pavlovian Dogs” by Spreadsheet Romantics has to be my next music project, a kind of early 80s agit-pop pastiche. Or maybe not.

** The Watershed is also home to the Pervasive Media Studio, which I’d not previously seen. Thanks to Mark Leaver who showed me round during a break in the day’s proceedings.


Tuesday was a busy (and long) one, then. We kicked off in Kings Place in the morning with the TSB briefing then started all over again in the same room that afternoon with the launch of the TSB’s Digital Strategy for the coming period.

The afternoon was introduced and generally hosted by Frank Boyd, one of the directors of the newly formed Knowledge Transfer Network, who briefly talked us through the KTN’s work before handing over to the TSB’s Head of Digital, Nick Appleyard, who presented the organisation’s current and upcoming position on the digital industries. Some headlines, then:

  • The stakeholders in the digital space the TSB seeks to link up are the development or tech community and the businesses on which digital is making an impact. The latter group often struggles to understand digital, while the former, steeped in it as they are, struggle to find a route to market.
  • The TSB achieves this “joining up” with its IC Tomorrow team and via the KTN.
  • The principal elements of the idea cycle are conception, IP, business models and technology. But all too often the missing piece is the user.
  • So the TSB really seeks to fund projects that are user- and market-led.
  • The UK is a great place to build digital technology businesses because of users’ expectations and behaviour: we’re Europe’s leaders in online and mobile penetration and the world’s leaders in terms of online transactions.
  • The TSB has launched the Connected Digital Economy Catapult in order to provide a platform and tools to help tech SMEs reach new markets.
  • Collaboration is hugely important to the TSB, and current partners include Nesta, ESRC, Creative Skillset, UKTI, and on and on…

Nick handed over to KTN’s CEO Chris Warkup, whose opening remark struck a chord in the room: that the UK is great at innovation but not so good at the exploitation of new ideas. He then talked us through how the new KTN was going to work. Here are some headlines:

  • The new company merges 14 previous KTNs.
  • Knowledge Transfer isn’t something that can simply be conducted virtually; rather, it’s a “contact sport”.
  • Often businesses have lots of information and knowledge to hand, but can lack wisdom.
  • The KTN seeks to drive cross-sector collaboration, join up businesses, technologists and funders, build multi-disciplinary teams and help the TSB in the scoping of competitions.
  • He ended with a line of Matt Ridley’s, which I’ll paraphrase: “The future’s most limiting resource won’t be water or oil, but good brains.” (Matt, of course, is, in his own terms, a rational optimist; being rather more of a glass-half-empty guy when it comes to civilisational development, I’m more with Jared “Collapse” Diamond on the lack-of-water front, but still, I take his point.)

Chris was followed by the first of two talks from companies who’d received TSB funding. Jeff Clifford and Graham Jack represented Double Negative, the largest visual effects company in Europe, and often in the world. They discussed the huge changes in their field over the last decade, with the requirement for CG increasing dramatically – a real headache both in terms of logistics and the use of artists’ time. They went on to illustrate their work with a showreel of scenes from the upcoming Thor 2, setting pre-VFX shots alongside finished ones; the two were, of course, barely recognisable as the same shots.

They went on to say how innovation in workflow was absolutely essential for them, and that’s where they’d focussed their TSB-funded work. SIM, a project run in collaboration with FilmLight and Surrey University, sought to address these issues, and a project about to start is ASAP: A Scalable Architecture of Production.

Next up was Pilgrim Beart, founder and chief architect of Hyper/Cat, a TSB-funded project looking to create interoperability standards to join the different verticals all working in the Internet of Things space. He opened with the fascinating observation that as the number of connected devices on the planet grew vastly to outnumber the number of people, devices were going to have to start “looking after themselves”. And what was getting in the way of the IoT actually happening? Interoperability between verticals.

So Hyper/Cat sought to crack the problem, or at least examine how it might be cracked down the line, with a multi-party demonstrator. The outcome had plainly been successful, and Pilgrim outlined the chief drivers of its success:

  • Learning by doing
  • Strong relationships
  • An early market
  • The development of a process for coming up with a spec
  • Global potential – the UK can truly lead here.

He finished with the observation that in 10 years the very term Internet of Things would be long gone; we’d simply be talking about “The Internet”.
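For the curious, the core of the Hyper/Cat approach is disarmingly simple: a uniform JSON catalogue in which every resource carries rel/val metadata pairs, so that devices from different verticals can be discovered in the same way. The sketch below is my own illustrative reading of that catalogue shape, with a made-up endpoint URL, not an official example from the project:

```python
import json

# Illustrative sketch of a HyperCat-style catalogue: plain JSON, where the
# catalogue and each item carry rel/val metadata pairs. Endpoint is made up.

catalogue = {
    "catalogue-metadata": [
        {"rel": "urn:X-hypercat:rels:isContentType",
         "val": "application/vnd.hypercat.catalogue+json"},
        {"rel": "urn:X-hypercat:rels:hasDescription:en", "val": "City sensors"},
    ],
    "items": [
        {"href": "http://example.org/sensors/temp-1",  # hypothetical endpoint
         "item-metadata": [
             {"rel": "urn:X-hypercat:rels:hasDescription:en", "val": "Temperature"},
         ]},
    ],
}

def descriptions(cat):
    """English-language descriptions of every item in a catalogue."""
    return [m["val"]
            for item in cat["items"]
            for m in item["item-metadata"]
            if m["rel"] == "urn:X-hypercat:rels:hasDescription:en"]

print(descriptions(json.loads(json.dumps(catalogue))))  # → ['Temperature']
```

The point of the rel/val indirection is that, say, a transport app and an energy app can crawl the same catalogue without sharing any schema beyond a handful of well-known rels.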

The day ended with a panel discussion on the challenges facing digital innovators, hosted by Jon Kingsbury, currently at Nesta but about to come over to the KTN as Head of Digital Economy. Jon was joined by Alessandro Guazzi of Sentimoto, who are developing smart wearables for older people; Emer Coleman of TransportAPI, who make apps based on public data made available through APIs; and Databarta’s Jane Lucy, a specialist in the use of digital media to deliver campaigning. The panel took questions from the floor and debated a range of issues, including:

  • The desperate lack of tech talent, especially devs and most especially Ruby devs.
  • The importance of engagement with Europe and the Horizon 2020 programme.
  • The difficulty of actually doing x-disciplinary collaboration.
  • The value of mentoring.
  • The importance of “social”.

Jon put a final question to the panel: what could the TSB do to improve the situation? Three answers came through loud and clear: skills development; help finding the right collaborators; and a push for open data standards.

It was a nice, sparky ending to a generally lively and hugely informative afternoon.