Journal

The Bridge: pushing the boundaries

We work solely in the arts, culture and heritage sectors. Our clients include a number of theatres. Last year we were approached by the team behind The Bridge Theatre, the new theatre being set up by Nicholas Hytner and Nick Starr. This was very exciting (it still is).

When Andrew (General Manager) at The Bridge first approached us about the project, it was obvious they already had very clear ideas about where they wanted to spend their time on the website (indeed, I’ve just dug out his original email to us, which said “having both time and the luxury of being able to start from scratch has meant that we’ve done a lot of thinking!” – and they definitely had).

The problem to solve

Individual productions at The Bridge may run for around 12 weeks, which means potentially up to 100 performances of each production. Against that background, the team at The Bridge had identified that they wanted to make it easy for people to narrow down this long list of performance dates/times and find specific performances with availability matching their criteria, whether that was around price, number of tickets, access needs, date or the type of performance (e.g. matinee).

The challenge was less about presenting a particularly large or diverse programme, or representing lots of content related to outreach, fundraising or labyrinthine membership schemes, and more about how to make it quick and easy for users to get straight to a performance that had the tickets they wanted.

Andrew Leveson, General Manager – The Bridge Theatre:

One of the most exciting aspects of shaping The Bridge has been the ability to step back and take a fresh look at the audience experience as a whole – from the relationship between stage and auditorium to the ease of getting a drink at the interval and the number of ladies’ loos. The website was always critical to this – it’s most likely the first point of contact our customers will have with us and a vital opportunity to establish our brand. How fantastic would it be, we thought, to aim for an easy, intuitive and (as much as possible) customer-joyful purchase experience? Of course, there are always challenges and time and budget have meant we still have a long list of thoughts, instincts and ideas which we can develop as we launch, build an audience and see how customers respond to using the site – but we’re thrilled with where Substrakt’s resourcefulness and the great collaboration with Spektrix has led us.

If you look at the analytics of any ticket-selling site you will frequently see users bounce between lists of performance dates and select-your-own-seat screens in a vain attempt to find the tickets that match their criteria (whether that’s around number of seats, specific seats in the house or seats at a particular price) – if anyone reading this tried to book tickets for Hamilton recently (or indeed, any popular show) you’ll know how frustrating this can be.

We wanted to reduce this as much as possible, so we put the user in a position where they are always presented with relevant options and enough information to keep moving forward through the purchase pathway as seamlessly as possible (rather than, for example, having to check the seat maps of multiple performances before finding the seats they want).

Throughout the project, the team at The Bridge have been keen to make the most of their fairly unique position of starting a new theatre company from scratch. Their approach was free of assumptions, which means we have had the freedom to achieve something unencumbered by the need to slot into pre-existing processes. They have also been constantly looking, with an enjoyably ruthless eye, at best practice across the board, not just within the arts sector. This has led to more-or-less doing away with a traditional ‘what’s on’ feature and replacing it with what, for want of a better name, we have been referring to as the ‘performance picker’.

‘The performance picker’

The approach we have taken to enabling the user to narrow down the extensive list of 300+ performances is inspired by e-commerce sites such as ASOS and Virgin America, and applies their approach to a theatre ticketing context – i.e. sites in which the user is either confronted with long product lists and numerous ways of filtering them down, or asked to make a choice before they are shown any results.

Through audience research, user testing, sector research and the development of personae, we have identified that audience user needs typically begin from either date, price or programme (or a combination thereof).

We have aimed to make moving through the process as clear and unambiguous as possible. By stripping back the options available to the user at each stage and asking them to make their choices (related to those user needs outlined above) in what we hope is a logical order, we have simplified what can be quite an overwhelming number of options. We have also aimed to strike a balance between offering enough filter options for users to be really specific about their requirements and not demanding that they make 100 decisions before they’re shown any results.
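
To make that concrete, here is a minimal sketch of the filtering idea behind the picker. The performance objects and field names are invented for illustration; they are not the actual data structure behind The Bridge’s site.

// Each filter is optional; the picker only narrows by the criteria the user
// has actually chosen (date, price, number of tickets, matinee, etc).
const performances = [
  { date: '2017-11-02', isMatinee: false, lowestPrice: 15, maxContiguousSeats: 4 },
  { date: '2017-11-04', isMatinee: true, lowestPrice: 25, maxContiguousSeats: 2 },
  // ...roughly 300 of these across a 12-week run
];

function pickPerformances(list, { from, to, maxPrice, seatsNeeded, matineeOnly } = {}) {
  return list.filter((p) =>
    (!from || p.date >= from) &&
    (!to || p.date <= to) &&
    (!maxPrice || p.lowestPrice <= maxPrice) &&
    (!seatsNeeded || p.maxContiguousSeats >= seatsNeeded) &&
    (!matineeOnly || p.isMatinee)
  );
}

// e.g. "two seats together, under £30, at a matinee"
console.log(pickPerformances(performances, { maxPrice: 30, seatsNeeded: 2, matineeOnly: true }));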

Integrating with Spektrix

None of this would have been possible without a flexible and semantic API with which to integrate. Fortunately, v3 of Spektrix’s API is just that. As with all the sites we build, we have looked to reduce administrative effort wherever possible. In practice this means that data managed within Spektrix automatically updates the site (changing event dates, times, prices, access performances, previews, etc.).

When building the picker we have taken the approach of mixing frequent API calls (around price, availability, start times, etc.) with less frequent calls (concerned with venue, seat positioning, contiguity of seats, etc.) to build an easily queryable data structure in the website’s database. This approach means we can ensure the front-end performance of the picker is as swift as possible (without tying Spektrix up with tens of realtime API calls every time someone uses the picker) whilst ensuring the data being displayed to the user is always as up to date as possible.
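
As a rough illustration of that hybrid approach, the sketch below shows a pair of sync jobs writing into a local store. The endpoint paths, field names and db helper are assumptions made for the sake of the example, not the actual Spektrix v3 API or our production code.

// Frequently-changing data (availability, prices) is polled often; slow-changing
// data (seating plans, seat positions) is refreshed rarely. The picker then reads
// from the local database, so no realtime calls hit the ticketing system while a
// user is filtering.
const fetch = require('node-fetch');
const db = require('./db'); // assumed local database wrapper with an upsert() helper

const API_BASE = 'https://example-ticketing.test/api/v3'; // hypothetical base URL

async function syncAvailability() {
  // Runs every few minutes: small, fast payloads
  const res = await fetch(`${API_BASE}/instances?fields=start,availability,priceBands`);
  const instances = await res.json();
  for (const instance of instances) {
    await db.upsert('performances', instance.id, {
      start: instance.start,
      availability: instance.availability,
      priceBands: instance.priceBands,
    });
  }
}

async function syncVenueData() {
  // Runs nightly: large, rarely-changing payloads (seat maps, contiguity of seats)
  const res = await fetch(`${API_BASE}/seating-plans`);
  const plans = await res.json();
  for (const plan of plans) {
    await db.upsert('seating_plans', plan.id, plan);
  }
}

setInterval(syncAvailability, 5 * 60 * 1000);
setInterval(syncVenueData, 24 * 60 * 60 * 1000);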

Whilst v3 is still in beta we’ve been fortunate to be able to forge a strong collaborative relationship with the team at Spektrix from the outset. We have also enjoyed the benefit of input from Michael Nabarro, Spektrix’s CEO:

This has been a great opportunity for us all to put our heads together to figure out how the combination of an inspiring vision, a great web design team and a powerful backend system can all come together to deliver a really exciting online audience experience.

What’s next?

There has been a lot of fairly complex work to get to this point. The things we have been trying to achieve and the way in which we’ve been working with Spektrix are both totally new – we don’t think anyone has attempted any of this in this way before. But we’re really proud of where we’ve got to and are excited about the possibility of developing the system further. Both Substrakt and the team at The Bridge see the initial go-live as v1 of this approach, which we will be looking to develop and refine as people use it and we gain insights into user behaviour that we can use to inform future iterations.

Pauline Fallowell, The Bridge’s Head of Sales and Audience Insight commented:

We know technology evolves constantly, as does the way people use it.  Therefore, we don’t want a site that is out of date as soon as it goes live, we want something that keeps developing in response to the people that use it, and is always looking to improve their experience.  Working on this project we are constantly learning and being inspired by opportunities that can enhance our site, however, we try to keep grounded by first and foremost making it be great at doing the most important thing it needs to do – sell tickets.

More and more of the organisations we work with are taking this longer-term, more considered and more strategic approach to their web projects. Rather than seeing the go-live date as the end of the project, we all now see it as the chance to begin tweaking the site based on real user feedback. Ultimately this informed, iterative approach means that your website is constantly evolving and responding to your users’ needs, which means it will be both more effective and ‘last’ longer. We hope we are seeing the end of the ‘launch it, then forget it for 3-5 years, then start from scratch’ approach which dominated the sector for so many years.

This has been a brilliant project to work on. We work solely in the arts sector precisely because of projects like this: interesting, ambitious work, with fantastic people. If you are interested in talking to us about a similar project, or simply want to find out more about our approach to this one, then drop us an email at team@substrakt.com.

My talk at Ticketing Professionals - APIs

I gave a talk at this year’s Ticketing Professionals conference – which advertises itself as “The Place Where Professionals Talk Ticketing”. It was rather vaguely titled ‘The API – what next for our 3 favourite letters?’, which gave me a pleasingly large target to aim at. This post is a write-up of that talk; it is not exactly the same as the talk (this approach is inspired, in part, by this GDS post).

This post is pretty long, so here’s the tl;dr version:

  • APIs are how software applications talk to each other; they are pretty essential to how most things on the internet work.
  • In relation to ticketing systems, APIs have historically been used either extensively (which is expensive and difficult) or not at all (which has resulted in a sub-par user experience).
  • We (Substrakt) have seen success by adopting a ‘hybrid’ approach which utilises both API integrations and ticketing web products to deliver an improved user experience.
  • You should expect your ticketing system to be a part of your digital makeup; it will never be the only part, and in order to reap the benefits of joined-up thinking, some API work will be necessary.
  • Ticketing APIs are imperfect because they are frequently inconsistent or undocumented, which makes them difficult to utilise efficiently.
  • Use the data in your ticketing system as the hub of your understanding of your audience(s), but look to augment and enhance that data with insights from other touchpoints, and look at how the data in your ticketing system can be used to drive other parts of your business – some API work may be needed to enable this.
  • Cultural web experiences should be better, they can be better, let’s make them better.


This talk (/post) is going to look at APIs, what they are, how they’re used and how – I think – cultural organisations could be making better use of them (and the challenges that this may present).

As user/customer expectations around their online experience continue to grow, and as you all learn more and more about your audiences and what those expectations are, I believe that understanding the potential of APIs will allow you to keep up with your audiences’ needs and develop appropriate, sophisticated and meaningful web experiences. The days of developing entirely bespoke systems are over; however, all modern software has (or should have – we’ll come onto that later) powerful and usable APIs that allow you to utilise, extend and enhance the data and functionality within it for your specific needs.

HOWEVER the systems that many organisations use, particularly within the cultural sector, don’t really seem to treat their APIs in the way that modern software companies should, nor do many organisations seem particularly aware of, or interested in, the potential uses of APIs. And This Is A Problem.

OK, that all sounds great (or worrying), but let’s start at the beginning…

API. WTF. LOL. (aka What is an API)

You may all already know what an API is, but for those who don’t: API stands for Application Programming Interface.

WTF, as Phil in Modern Family points out, stands for Why The Face. You’re welcome.

APIs are how software applications talk to each other. I’m not going to go into too much detail about the ins and outs of how they work, but they basically let software applications expose data and services in a structured way to other software applications.

There are loads of quite unhelpful comparisons that try to explain what an API is and how it works. I’ve seen them compared to menus, dictionaries and rulebooks. This is, in part, because there are lots of different ways of delivering and consuming an API. I figured I’d add another one to the mix:

Think of an API as both a phone system and a set of rules on how to use the phone system. You want to call Jane to find out if she’s free on Saturday; you need to know what Jane’s phone number is, how to dial that number into the phone and what to say when Jane answers. This is a rather tortured comparison, but APIs work in a similar way: if one application wants to get data or functionality into, or out of, another application this will typically be achieved via the relevant API.
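
If you prefer something slightly more concrete than phone calls, here is a toy example of what asking an API a question looks like in code. The URL and response shape are entirely made up for illustration.

// Ask another application a structured question, get a structured answer back
fetch('https://example-venue.test/api/events/42')
  .then((response) => response.json())
  .then((event) => {
    // A good API returns data in a documented, predictable structure
    console.log(event.title);    // e.g. "Young Marx"
    console.log(event.startsAt); // e.g. "2017-10-26T19:30:00"
  });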

This article tries to ‘explain it to you like you’re 5’ (and does a fairly terrible job of achieving that aim) but may be helpful in getting your head around some of the different levels of API (public, private, etc). It uses a full-on restaurant analogy.

It’s worth mentioning NOW (cue a rush to the exits) that everything I am going to cover from here on concerns the use of APIs to deliver data/functionality via website(s). I am not going to be talking about the use of APIs to deliver functionality such as the agency mode in Spektrix or the stuff that Ingresso does or whatever (although they are all achieved via the same process, an API integration, and an API is like a phone system, or a menu, or a dictionary…ok? cool).

APIs in practice

OK, so, an API is how software applications talk to each other. But what does that mean? And why should you all care?

Most of the modern internet is powered by APIs.

APIs power every app on your phone, every program on your computer.

They are “a big deal”.

If you have ever bought anything online, read an email, used social media, watched Netflix – you have benefited from an API.

In this section I could talk about almost any modern digital service, and it would be an example of (often many) APIs at work.

But simply waving my arms to encompass ‘all of the internet’ probably isn’t that helpful. So I’m going to look at a specific example:

  • Twitter

A quick search on the app store tells me that there are lots of (more than 140) Twitter apps, and every single one of them makes use of Twitter’s API (and a bunch of others, but we’re just looking at Twitter here).

Let’s consider a basic Twitter app. At a minimum it’ll want to allow you to log in to your account, read tweets, compose tweets, send tweets and reply to tweets. It’ll probably also enable you to search Twitter (in a variety of ways) and do things with DMs, images, video, emojis and whatnot. But we’re just considering the absolute basic functionality.

This app only works because of the Twitter API. The Twitter API has a documented and defined way for your app to ask for the data and services it needs to do the things you want it to do. It also has documented and defined ways for your app to instruct Twitter to perform certain actions (e.g. post a tweet, or whatever). A good API operates in a predictable, documented and consistent way.

OK, I’ve probably laboured this point enough. Are we all vaguely clear on what an API is, and roughly how they might be useful and used? Yes? Excellent.

Arts context

All well and good but WHY SHOULD WE CARE, ASH. WHY SHOULD WE CARE.

Good question.

You should care because, much like many digital things in the arts, the solutions that are available probably aren’t entirely appropriate for your particular needs.

I mean, sure, they’ll do 80% of what you need but in having to be all things to all (cultural) people it is inevitable that there will be features that don’t quite meet your requirements.

It is also the reality that the makeup of almost any organisation’s digital estate will involve multiple systems. The end user doesn’t know or care about this, and presenting them with anything other than a seamless experience is going to result in a loss of effectiveness and, plainly, people getting annoyed. Building those seamless ‘joins’ will, inevitably, require – you guessed it – APIs (more on exactly where those joins appear shortly).

Sidenote: I’ll be focusing mostly on ‘commercial’ applications of APIs. There are numerous other things they can be used for, but as this is a ticketing conference I thought I’d focus on ticketing and stuff. It seems museums have been having this conversation for longer than the performing arts, and I think that’s mainly because collections are big catalogues that are relatively easy to make API-able – The Rijksmuseum have taken this approach, which has allowed them, and other people, to build products and experiences ‘on top’ of their APIs. The V&A also have an API which they’re apparently in the process of upgrading (I suspect as part of the wider web project).

In the recent past, ticketing websites’ usage of APIs was usually either non-existent or very involved. In the absence of any particularly sophisticated web products, organisations would look to a ‘fully integrated’ approach in which an entirely bespoke purchase pathway would be built drawing on the ticketing system’s API.

This was a complex, costly, high-maintenance and relatively ‘brittle’ approach. It did allow the organisations who followed it to be specific about how they built the user experience, but it also meant they had to do boring things like build their own payment processing gateways, which is *difficult* and *boring*.

It also, in my observation, didn’t necessarily result in a much better user experience, even though it frequently cost an order of magnitude more than the off-the-shelf product. But some people just seem to like spending lots of money on things. Maybe it’s ‘a brand thing’.

This, in my experience, is pretty representative of most people’s approach to their online user experience. You’ve got a website. You can sell tickets. You’re taking money. It’s fine, everything is fine.

Actually EVERYTHING IS PROBABLY NOT FINE

It is the reality that the makeup of almost any organisation’s digital estate will – or SHOULD – involve multiple systems; no one system can do everything, and anyone who claims that isn’t the case is a liar. However, the end user doesn’t know or care about this, and presenting them with anything other than a seamless experience is going to result in a loss of effectiveness and, plainly, people getting annoyed.

Whether that’s around seat maps, integration between ticketing and other e-commerce, booking on their phone, personalisation of content, being able to upsell at various points in the user journey (not just within the purchase pathway) or whatever, you need to be able to build the ‘joins’ and, inevitably, this will require – you guessed it – APIs.

But didn’t I just say that bespoke API-based development was expensive and dumb? I did. And I’m not naive: I know the UK cultural sector is time-, expertise- and cash-poor. You want solutions that are straightforward to implement, deliver and maintain. A first step will be a hybrid approach: let the various systems each do what they’re good at. Your ticketing system will (probably) be good at letting someone pick a ticket and taking their money. So let it do that, and lean on something, or someTHINGS, else to deliver the rest of the user experience.

Let’s reconsider the Twitter app I outlined earlier: it is not attempting to replicate Twitter, it’s not trying to build a social network, it’s trying to make it easier for the user to consume and engage with a social platform through the app.

Similarly I am not saying you want to try and build your own way of taking a customer’s payment but you might want to build your own way of them navigating your event listings, or performance dates, or managing their account, or sharing your event or whatever.

So how might you utilise your ticketing system’s API when you’ve finished reading this post?

  • Reduce/remove duplication of data-entry
  • Doing ‘stuff’ with performance-level attributes
  • Sitewide cross-/up-selling

I hate the phrase low-hanging fruit but THESE ARE ALL LOW-HANGING FRUIT.

Are you having to duplicate event setup, setting things up in your ticketing system and then again in your website? Are you having to update the same thing in two separate places (like when a start time or your pricing changes)? If so, this is a terrible use of your time.

Do you have some performances which are signed, or audio described, or previews, or press nights – and would you like that to have some effect on how your website displays those particular performances?

Do you have specific cross or upsells set up within your ticketing system? Are you trying to achieve these elsewhere on your site?

All of these can be achieved by accessing your ticketing system’s API and using that data to manipulate the UX on your website.
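
To make that a little more concrete, here is a hedged sketch of the performance-level attributes idea. The endpoint and attribute names are invented, and every ticketing system exposes this data differently, but the principle is the same: read attributes out of the API and let them drive the UI.

async function renderPerformanceList(container) {
  // Assumed: a site-side endpoint that proxies/caches the ticketing system's API
  const res = await fetch('/api/performances?event=42');
  const performances = await res.json();

  container.innerHTML = performances
    .map((p) => {
      const badges = [];
      if (p.attributes.includes('signed')) badges.push('<span class="badge">BSL signed</span>');
      if (p.attributes.includes('audio-described')) badges.push('<span class="badge">Audio described</span>');
      if (p.attributes.includes('preview')) badges.push('<span class="badge">Preview</span>');

      return `<li>${p.startTime} ${badges.join(' ')}</li>`;
    })
    .join('');
}

renderPerformanceList(document.querySelector('#performance-list'));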

Return to your constituencies and prepare for government, or ask your web agency to help you achieve all of these. They are all relatively straight-forward, we have achieved them with Tessitura, Spektrix and eSRO (TopTix).

You will notice that all of these relate to relatively simple reading-basic-data-out-of-a-system-and-affecting-the-ui-of-a-website territory, and you’re right, this is simple stuff.

What’s next?

OK, I’ve got about halfway through my talk (/post) and I’ve not really talked about what’s next, what’s exciting. What innovative-change-making-nonsense you could consider.

I’d say, broadly, the web products that are on offer are good enough for most organisations to consider. They offer enough flexibility and customisability (is this a word? probably not) that it probably isn’t a good use of anyone’s time and money to go ‘fully custom’ these days. They allow you to sell a ticket. Trouble is, they’re not brilliant at much beyond this. But maybe that’s fine; in fact, it probably is. We should maybe tell them to stop trying.

Having a single customer record is absolutely the best way to approach managing your customer data, but as a result of this your CRM needs to be actually able to M the C R (manage the customer relationship). Buying a ticket will not be your audience’s only touchpoint with your organisation and any insights you are looking to gain into your customer data are only going to be enhanced by building a more rounded picture of what’s going on. The best way to do this? Probably via an API. (I mean, in reality it’ll probably be with another solution that integrates with the CRM’s API but the point stands).

Your ticketing system should also play nicely with any other business tools you may be using, which could include: dashboards, full-spectrum/multi-channel analytics, 3rd party segmentation tools and 3rd party ticketing platforms.

That all sounds very nice and neat and whatnot, but it’s always helpful if there are real examples to point to. To this end it’s certainly worth looking at Salesforce, which, amongst other things, is a giant, enterprise-level CRM system. Their API (or rather, APIs) enables an entire ecosystem of apps and custom integrations to be created that utilise and extend the data and functionality available natively within Salesforce. This allows organisations to use Salesforce as the data engine that drives numerous other parts of their business. Ticketing systems try to achieve this too, but often by trying to develop and offer functionality natively within the system rather than recognising that there’s probably something out there that does it better and enabling that other system (whatever it may be) to integrate in a straightforward manner.

But what else might be worth considering? People have been talking about digital memberships for years (I seem to remember the Barbican either launched one or merely talked about launching one in a brief I saw from about 10 years ago). Kevin from Tessitura mentioned the work the Detroit Symphony does around engaging with new audiences and monetising their digital content. APIs are essential in building an engaging online platform through which people can consume whatever it is you’re going to sell them (content, experiences, whatever) and you can get the money and data and shove it back into your ticketing system. Does this mean your ticketing system should offer the ability to paywall content? ABSOLUTELY NOT, THAT IS MENTAL.

The future of the web is in massively scalable and extensible systems talking to each other to deliver a consistent user experience across all devices. You are already seeing low-level automation with tools like IFTTT and Zapier, which integrate with an ever-growing range of systems to automate tasks; there’s no technical reason why this shouldn’t be possible with ticketing systems. The sheer volume of day-to-day work that could and should be automated is, if you think about it, staggering. How much more effective, strategic and measured could you be if you could automate all of those fiddly things that you have to do every day, week or month? Whilst you’ll never be able to automate strategic thinking, I’d urge you to consider how some level of automation may help you on an operational level. Automation saves me from becoming totally overwhelmed and going mad; I suggest you try it.

What else? The next way that users will want to interact with your digital presence is probably via chatbots. Inevitably, ticketing systems are going to be slow off the mark in offering this functionality natively, so I imagine ambitious organisations will want to experiment with some of the open chatbot libraries that are already available to help with the natural language processing (thanks Google and Amazon!) alongside an integration with their ticketing/CRM system’s API.

This is usually the point at which someone will say “what about an app?” and I will go “yes, like an app BUT DEFINITELY NOT AN APP LIKE YOU’RE IMAGINING IT”.

Why?

Silicon Valley analyst Andrew Chen attests that the average app loses 77 percent of its users in the three days after they install it. After a month, 90 percent of users eventually stop using the app, and by the 90-day mark, only 5 percent of users continue using a given app.

HOWEVER I do believe there is going to be a growth in ‘customer service-focused apps’ which will allow things like in-venue patron management – e.g. your development team can see at a glance which patrons are in attendance, alongside some sense of their customer history (I believe the Royal Danish Theatre have embarked on something along these lines; I saw a bit of a talk at last year’s Ticketing Professionals conference that showcased it). This will also give front-of-house staff the chance to offer a more personalised service, could enable you to give patrons with access needs a more tailored service, and could allow the deployment of more appropriate/reactive/sensitive in-venue marketing (via screens and whatnot).

Basically if you want to make your theatre like the shopping mall in Minority Report (and why wouldn’t you!?), that will happen via an API (or APIs) – obviously you don’t want to do that, I’m being hyperbolic to make a point.

We hear a lot at these conferences that your business needs to be driven forwards by reacting to your audiences’ needs/expectations/etc – the smartest, most measurable, most connected way of doing this is going to be to integrate disparate systems (through, you guessed it, an API) so that they deliver both a joined-up, meaningful customer experience and an actionable, useful level of insight.

What are the issues?

So, if the future of the web is in massively scalable and extensible systems talking to each other to deliver a consistent user experience across all platforms and devices, are ticketing systems ready for this? Right now? Probably not.

I can sort of understand why this might be the case; at the moment there probably aren’t loads of customers clamouring for this.

Are web developers part of the conversation about how the APIs of the future will be built? They may be, but based on a small straw poll I ran last week, these conversations don’t involve anyone I am aware of or in touch with. And I basically only hang out with web developers. Because I’m cool.

In relation to the actual APIs that are available in the sector: one word sums up the difficulties. Ok, actually, two words:

  • inconsistent
  • undocumented

A good API behaves, as I said earlier, in a consistent, structured and documented way. The vast majority of the APIs available in the arts sector do some of these things, but rarely do they do all of them.

Some of this seems to be down to the way that the systems in question are set up – rarely are two ticketing systems ever implemented in the same way, and this lack of consistency feeds through to the API. But beyond these explicable symptoms, it seems there is a total lack of interest in, and understanding of, how anyone might ever use an API. More and more (all?) ticketing systems are selling themselves on the ‘single customer record’ thing, and if you want to be the master record for customer data then you’re going to have to play nicely with other systems.

Ticketing systems are software products; most other software products have an extensively documented public API that will usually – by now – be a nice RESTful offering for us to play with.

Most ticketing systems do not. Granted there are moves in that direction but these will mostly be incomplete until next year…or the year after.

Most ticketing system APIs are constructed in a slightly unhelpful way: there are essentially a bunch of endpoints that allow you to go into the system, as it is configured for the organisation, and request data that conforms to those structures. No thought seems to have been given to the structure of the API itself, and what makes sense internally for an organisation will rarely make any sense to anyone else. An example:

To get the venue name in a system we work with regularly, we have to make three API calls (3!). One to get the production, which gives us the seating plan ID. Using the seating plan ID, we make a call to the plan endpoint. This returns the venue ID, and then finally we can call the venue endpoint to get the name of the venue.

This will almost certainly have been constructed this way to deliver native functionality such as reporting, or for some other totally valid reason, but you can see that it is not an efficient way to expose this data to any external developers wanting to make use of it. This is not the worst example in the world, but it gives you a sense of the spaghetti we’re frequently trying to untangle. Oh, and most of the calls I mention there were undocumented.
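
For the curious, that chain looks roughly like this in code. The endpoint paths and field names are invented – the real system differs – but the shape of the problem is the same.

async function getVenueName(productionId) {
  // Call 1: the production record, which only gives us a seating plan ID
  const production = await fetch(`/api/productions/${productionId}`).then((r) => r.json());

  // Call 2: the seating plan, which only gives us a venue ID
  const plan = await fetch(`/api/plans/${production.seatingPlanId}`).then((r) => r.json());

  // Call 3: the venue itself, which finally contains the name
  const venue = await fetch(`/api/venues/${plan.venueId}`).then((r) => r.json());

  return venue.name;
}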

As I said earlier, ticketing systems are usually good at enabling ticket purchasing, they are usually very very bad at the other things you might want to offer online.

Fortunately there seems to be a move away from offering every service under the sun, and a recognition that there are other specialist products out there that offer a better <insert service here> experience than a multi-functional ticketing/CRM system ever could – this is a good thing. Ticketing (I assume) probably drives most of your revenue at the moment, but with an ongoing squeeze on funding, organisations are having to ever-increase their revenue from other sources, whether that’s through donations, memberships, shops, commercial hire or whatever – and the systems that achieve these growing revenue streams need to be able to integrate with your ‘main’ CRM system (cos otherwise you get yucky fragmentation that makes everyone sad). An example of the potential issues: lots of e-commerce systems will also demand that they are the CRM. How do you solve this? Through an API integration. Will that be possible with every ticketing system’s API? Truth be told, I couldn’t tell you.

A conclusion

As is perhaps becoming clear, this topic is almost impossibly large to cover in one go (without this turning into the longest post/talk ever). So, what are my main conclusions (for today at least)?

  • Ticketing system providers need to get up to speed with what they offer when it comes to the web. Too often their approach will begin and end with their web product which, whilst perhaps understandable from a business point of view, dangerously underestimates or misunderstands just how far the ticketing ecommerce experience is from the ‘best-in-class’ experiences out there. By offering proper APIs, system providers make it possible for developers to build appropriate solutions for organisations.
  • I would encourage ticketing system suppliers to work WITH developers and agencies to identify where their strengths and weaknesses are and do more of the stuff they’re good at and less of the stuff they are frankly terrible at. The fact that it is 2017 and most of the major systems’ web products still haven’t really figured out how to deliver an optimal mobile experience is frankly ridiculous.
  • Your ticketing/CRM system is an important system, however I believe that it will not be the only important system. The broader trend is towards a proliferation of focused, specialist services (and microservices), rather than monoliths that try to do everything. In order to be able to make the most of this you need to be aware of the potential around integration and have at least a broad idea of how it might happen – which will be via an API.
  • Think about where you can improve the user experience, and understand where you can’t. Some things are just really really difficult. Seat maps are pretty hard. Taking payments is pretty hard. Multi-faceted search is quite tricky. We’ve heard a lot over the past two days about measuring things, gaining insights, and then actioning those insights. But the harsh reality is there will just be some things that you can’t change: your system provider is going to shrug and say not enough people are asking for that feature, or it’s coming in some mystical future release with a delivery date of the 40th of nevember (this joke works better written down). Work out where that’s a battle that you just need to give up on, and where there are opportunities for you to forge your own path. I’d love to see more partnership working between cultural organisations and agencies, or even better still between cultural organisations, ticketing systems and agencies, or even better still between groups of cultural organisations, ticketing systems and agencies. You get the point. We are lucky enough to be working on a couple of projects at the moment which are taking this approach and these are, undoubtedly, where we are doing our best work. The ticketing system freely admits there are areas in which they are lacking, and we are working with them and the client to deliver a vastly improved user experience. Sorry, I know that’s a bit vague but one of them is going live next month and involves Nick Starr so watch out for that.

So what do I hope you take away from this talk? I’d encourage you to think about how you can use the data in your ticketing system to drive other areas of your business. Don’t be constrained by the limitations or functionality inherent in your system; just think about what use you can put your data to. Then see if there are systems that can help achieve that end goal, and work backwards to see if the two systems can be knitted together – via their respective APIs – rather than trying to go out and totally reinvent the wheel. Ain’t nobody got time for that.

Secondly, I would ask you to encourage the ticketing system providers to get a move on with modernising their APIs. And documenting them. That I can receive 3 different sets of documentation from a supplier (incomplete in different ways) for apparently the same version of an API is stupid. Stop it.

Thirdly, I would encourage you to take a long, hard look at the web products available to you from your supplier, compare them to other ecommerce experiences and work out where the flaws are. It may be that there are gaps that some sensible custom work can fill. It’s almost certain that there are. I think the immediate future will be in these hybrid systems that use native functionality for the heavy lifting and deliver something bespoke to enhance the customer experience.

Fourthly, talk to your colleagues; lots of people struggle with the same issues and there may be benefits in a joined-up approach to solving some of them. A problem shared and all that…

Finally – and this may not relate to ticketing system APIs in any way – give automating some of your workflow a go; it’s a godsend.

Thanks!

One month in...

Had you told me 3 months ago that I would be working for a digital agency, I would have laughed you all the way to the nearest cat cafe. Although, if you told me 6 years ago that I would spend the next 5 and a bit years working for a ticketing company, I would have laughed in your face then too.

I have only ever chosen to work at places where I feel a strong passion for the people and what they are striving to do. This was certainly true at Spektrix and one month in, I can say this is also true at Substrakt.

So why had I never thought to work for a digital agency before, and why Substrakt? If I’m honest, my own experiences of web projects and time spent supporting clients navigating often toxic agency relationships resulted in a wariness of digital agencies. Overly technical explanations, a level of “digital arrogance” and, in the male-dominated world of tech, all too often a faint whiff of misogyny… (I won’t tell you how many times a developer asked to be transferred to “one of your technical guys” when they had failed to follow my instructions for filtering iframes).

Working with agencies was therefore never a favourite pastime… Mistrust of web agencies seems a little endemic in the arts, in my experience; it’s certainly something that featured high on the list of concerns for clients attending our first Digital Works sessions.

So why Substrakt? I certainly wasn’t job hunting at agencies, or seriously job hunting. The opportunity to work at Substrakt was a bit of a curveball and if I’m totally honest, I took a punt. In helping them spread the word of their new Head of Client Services role, I found myself drawing a number of parallels with the approach to client relationships at Spektrix. Both have a clear aim to do it differently. For Substrakt it is to work in a sector on projects they enjoy, to take away the shroud of mystery when it comes to pricing and to deliver work in agile sprints – involving the client at regular intervals in open and honest discussion, in plain English.

A move to Substrakt provided the perfect trade-off: bringing a wealth of previous experience – of the sector itself, of ticketing technology and of developing meaningful, collaborative client relationships – whilst learning more about the hitherto unknown (to me) world of digital and web development.

My career to date (minus facepainter, bouncy castle designer and cat sitter) has focused on data-driven marketing, audience development, segmentation and email strategy. In the broader sense of arts marketing; digital and web projects are something I have less experience of. However, I’m still occupying that space between the client and the technical team (henceforth affectionately known as The Nerds) –  translating PHP, Rails or C# into artspeak and vice versa. Whilst also still getting to keep a toe dipped into Spektrix developments via a number of shared clients. Triple win.

This first month has been spent meeting and getting to know our existing clients and their various projects, feeding into pitches and new projects, and sharing my knowledge of the sector with the rest of the team. Clients thus far have commented on how much they enjoy the direct relationships they build with the developers working on their projects – I’m not there to replace that, but I am there to ensure that Substrakt remains empathetic to the sector that they have chosen to work with: championing the voice of the client and providing a critical understanding of the challenges and idiosyncrasies that make the sector such an exciting one to support.

Substrakt’s firm belief that the working partnership should be just that, a partnership, is right up my street. It’s an opportunity to get to know the organisation inside and out, pulling in the right people at the right time to produce something we’re both really proud of. It’s exactly how I like to work. Rather than seeing website projects as a standalone piece of work we see them as a constantly evolving, developing piece that can respond and adapt not only to internal changes but to the rapidly changing pace of technology.

This is certainly an approach that will benefit the sector, and really what it’s crying out for. A lot has been written of late about the sector falling behind in the adoption of new technology. Given the mixture of attitudes, lack of funds and lack of digital skills, specifically at senior or board level, it is unsurprising that this is the result. However, working in partnership with digital agencies and experts, there’s no reason that digital expertise can’t become more widespread across the sector. It’s certainly happening in pockets, so I hope I can go some way in helping Substrakt, and our clients, to make it our norm.

OKRs at Substrakt

Having run a digital agency for over 10 years, the one thing I do know about is the importance of a good team (especially when you can’t design or develop a website yourself!). It is essential that we all stay passionate, motivated and on top of our game at Substrakt, otherwise quality suffers, the enjoyment disappears and then, well, what would be the point!?

In addition to delivering great projects for great organisations, we felt we could use something more to help with morale and professional development. We’ve tried several things over the years, but have found OKRs to be the most effective to date. This post gives a brief run down of what OKRs are and how we are using them at Substrakt.

What are OKRs?

OKR stands for Objectives and Key Results. They are a way of setting ambitious objectives and working towards them by breaking them down into achievable and measurable elements (Key Results). When I was first learning about OKRs this video from Google Ventures helped outline the benefits of this approach.

How do we use OKRs at Substrakt?

We decided to use OKRs at two levels, individual and team. You can also include company ones, but we felt the team ones would cover any broader company ambitions. The whole team present and discuss potential OKRs at the start of every quarter (as part of our regular strategy days), and together we refine and agree our goals for the forthcoming quarter. We’ve settled on roughly 2-3 Objectives for both individuals and teams, and each Objective tends to have about 2-4 Key Results. During the strategy day we also review the previous OKRs, scoring each other on the Key Results from 0-1, with 1 being a complete smash out of the park and 0 being a non-starter. The idea is that they are very ambitious, so if there are plenty of 1s knocking about then we’re not thinking big enough. If we’re averaging around 0.6-0.7 then we’re doing pretty well.

In an ideal world the individual OKRs tie in nicely / work towards the team ones, aligning with the company vision, which is why it’s essential to discuss and agree as a group with transparency over all OKRs.

An example of a Substrakt team OKR for this quarter is ‘More Open Source!’
That is a fairly broad Objective, so the key results are more specific and measurable:
– 5 new open source projects (we included our specific project names too)
– Maintaining existing (X number of forks/likes on Y projects)

One example of an individual Objective was ‘Learn Go’, this involved the following Key Results:
– Implement on a project
– Go or no (deciding if it was going to be useful for the wider team and our work at Substrakt)
– Giving a show-and-tell session to the team
– Write a journal post

For a few years we have reserved Tuesday afternoons for internally-focussed development. This is now mostly dominated by working towards our OKRs. We quickly meet (in a standup format) before starting so everyone can share where their focus will be for that afternoon.

To manage OKR progress, we use a simple Google spreadsheet that includes the team members, associated objectives and key results and a column for the score. We have new sheets to cover each quarter.

It feels great to have an effective framework that works well for our team. Tuesday afternoons are now much more focussed, with everyone having a clear route to achieving ambitious goals. We have seen positive results across the board, including the introduction of new languages, improved processes, the introduction of events and the development of minimum viable products.


Becoming a JavaScript Ninja

Used on 72% of all websites, it’s pretty safe to assume that most web developers will run into JavaScript from time to time. Whether it appears as jQuery, React, Vue.js or AngularJS, there’s no avoiding it.

Yet, somehow, I did.

I was a professional LAMP developer for seven years before joining Substrakt last August. Previously, I had worked solely on one product for my entire career and, as a result, I never ran into JavaScript.

I know…

Shame, Shame, Shame

Armed with a copy of John Resig’s and Bear Bibeault’s ‘Secrets of the JavaScript Ninja’ and a CodeSchool account, I decided to use my Substrakt OKR time to learn native JavaScript in three months.

Substrakt Objective and Key Results (OKR) time is half a day each week where we get to up-skill or work on personal projects. We set and review two to three measurable and ambitious goals every quarter at a team strategy day. 

(Keep posted, as a journal post on how we use OKRs is currently in the pipeline.)

On my journey from absolute-novice to borderline-JavaScript-Ninja, I have pretty much immersed myself in the language, reading every article and blog that I have come across.

It’s been pretty intense – there’s a lot of content out there!

So, here is my roundup of the five things that I have found to be instrumental to making the most out of JavaScript and some of the better resources that I have come across along the way.

1. Browser Developer Tools

The harder it is to debug, the harder it is to develop.

With all browsers interpreting client-side languages in slightly different ways, making use of the built-in developer tools that many browsers have to offer is essential.

Chapter 2 in Secrets of the JavaScript Ninja offers some great advice on using Console to debug. If you don’t have a copy of that book yet (which you should, by the way) then these are some of the better resources that I have come across:

In addition to understanding how to get the most out of browser developer tools, it is also really useful to know what console commands are available to you. This article by Dwayne Charrington has a great section on console commands.

2. Scope

Generally speaking, most developers will do their best to ensure that the global namespace is clutter-free. However, in JavaScript it’s actually quite easy to flood the global namespace because, unless you explicitly define your variables with the var keyword, they will be globally scoped to the window object.

Implicitly defining variables without the var keyword is a common and totally uncool mistake made by many novice JavaScript developers.

See the example below:

var varOne = 'Global Scope';

function scopeNinja() {
  var varTwo = 'Local Scope';
  varThree   = 'Global Scope, Again...';

  console.log(varOne); // Global Scope
  console.log(varTwo); // Local Scope
}

scopeNinja();

console.log(varOne);   // Global Scope
console.log(varThree); // Global Scope, Again...
console.log(varTwo);   // Uncaught Reference Error

For different reasons, varOne and varThree are both available globally, whereas varTwo is only available from within the scopeNinja function due to the use of the var keyword. If you test the code yourself, you will notice that varThree is available from outside the function, simply due to its implicit declaration.

This is an extremely rudimentary explanation of local and global scope and the use of the var keyword. For a more thorough tutorial on scope, check out either of these write-ups:

A strong understanding of scope is key to being an efficient and safe JavaScript developer. If you know how to make good use of scope, you can protect your applications from vulnerabilities, which is vital when performing delicate tasks.

For example, if you were writing a JavaScript game that saved a score into a database, it would be foolhardy to have your score property scoped to the window object. This would mean that people could modify the score via the console and potentially compromise your database, or simply ruin your score set with fake data.
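
Here is a small sketch of my own (not from the book) showing how a closure can keep that score out of the global scope:

var game = (function () {
  var score = 0; // private – never attached to the window object

  return {
    addPoints: function (points) {
      score += points;
    },
    submitScore: function () {
      // send the value to the server from inside the closure
      console.log('Submitting score:', score);
    }
  };
})();

game.addPoints(10);
game.submitScore();        // Submitting score: 10
console.log(window.score); // undefined – the raw value can't be fiddled with from the console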

For those who are interested in application security through scope, this article on protecting properties and methods by using JavaScript’s object model by Douglas Crockford is extremely useful.

3. Hoisting

JavaScript is an exceptionally flexible language – you can declare a variable almost anywhere! However, without a basic understanding of hoisting, you can quickly turn simple declarations into annoying little bugs (and there are plenty of “bugs” in JavaScript without the need to add your own).

In a nutshell, hoisting is the act of being able to forward reference declared functions and variables.

“Forward referencing”, or as it’s sometimes known, “forward calling” is the act of calling upon declarations in your code that have yet to be defined.

If you do forward reference a declared function or variable, the JavaScript interpreter will recognise this and hoist all required declarations to the top of the current scope, ensuring that every line of code can be executed when it’s required.
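
Here is a quick illustration of hoisting in action (my own example, not one from the book):

greet('Ninja'); // works: function declarations are hoisted in their entirety

function greet(name) {
  console.log('Hello, ' + name);
}

console.log(hoistedVar); // undefined: the declaration is hoisted, the assignment is not
var hoistedVar = 'I was hoisted';
console.log(hoistedVar); // I was hoisted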

Understanding the very logical principles that govern hoisting is essential to making efficient and robust JavaScript applications. I found level three of the JavaScript Road Trip Part 3 by Code School to be the best resource for learning about hoisting.

For those who don’t have a Code School account here are some other great links for learning everything there is to know about hoisting:

4. Functions are first-class objects

As a JavaScript developer, “functions are first-class objects”, “functions are first-class citizens” and “first-class functions” are phrases that you are likely to hear from time-to-time.

As you may already know, a JavaScript object is an unordered collection of key-value pairs. If it’s not a primitive such as a boolean, number, string, undefined or null, then it is an object.

In JavaScript, functions are of the object type, so when people say “functions are first-class objects” they are simply saying that you can do with a function what you can do with an object.

This means they can be assigned key-value pairs, have constructors, contain other functions (methods) and so on – and, crucially, they can be passed around like any other value.
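
Here is a short example of my own showing what that looks like in practice:

var shout = function (message) {
  console.log(message.toUpperCase() + '!');
};

// Functions can carry properties, just like any other object
shout.description = 'Logs a message in upper case';

// ...and can be passed as arguments to other functions
function repeat(times, fn) {
  for (var i = 0; i < times; i++) {
    fn('hello ninja');
  }
}

repeat(2, shout);               // HELLO NINJA! (twice)
console.log(shout.description); // Logs a message in upper case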

Check out any of these helpful but rather chunky articles for a good understanding of object oriented JavaScript and how to make the most of functions as first-class objects.

5. Prototypes

If, like me, you are/were a complete novice then you may have already used prototypes without realising it. Admittedly, I hadn’t even heard of prototypes until I reached the end of my JavaScript Road Trip on Code School but wow, are they useful!

A prototype in JavaScript is an object that other objects will inherit properties from. By default, every object has a prototype and since prototypes are also objects, they have prototypes too!

A great example of a commonly used prototype is one that you may already be familiar with:

var myString = 'Hello World';
console.log(myString.length); // 11

That’s right – length is a property every string gets via its String wrapper object, and methods like toUpperCase and toFixed come from String.prototype and Number.prototype respectively.

Not only can you call on these built-in properties and prototype methods, but you are also able to add methods to your own prototypes or modify existing ones.
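
For example (a sketch of my own – and bear in mind that extending built-in prototypes is generally frowned upon in production code, it just shows the mechanism nicely):

String.prototype.shoutify = function () {
  return this.toUpperCase() + '!!!';
};

var greeting = 'hello world';
console.log(greeting.shoutify()); // HELLO WORLD!!!

// The same mechanism powers inheritance between your own objects
function Ninja(name) {
  this.name = name;
}

Ninja.prototype.introduce = function () {
  return 'I am ' + this.name + ', a JavaScript ninja';
};

console.log(new Ninja('John Resig').introduce()); // I am John Resig, a JavaScript ninja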

As you can imagine, this is extremely useful and a solid understanding of JavaScript prototypes will really begin to pay dividends as you delve more into the object oriented side of JavaScript.

CodeSchool does a wonderful job of explaining prototypes on Level 5 of their third road trip. Alternatively, I found these two articles particularly useful:

Conclusion

Whether you like it or not JavaScript is here to stay.

Understandably, as Jose Aguinaga quite hilariously points out, the thought of learning JavaScript can be a daunting prospect with all of the different frameworks that are available.

My advice: Try to resist the temptation of learning a framework first.

Keep it simple and learn the fundamentals of native JavaScript.

The skills that you acquire from learning the native language will be a lot more transferable than those picked up when learning a framework. As a result, any frameworks that you do decide to master later down the line will be that extra bit easier to learn, as you will be a far more robust developer for your troubles.

Finally, if you would like to talk about any of these points in more depth or would simply like to talk dev, hit me up on Twitter.

Design Drops #2

Here are a few useful bits of design related inspiration and toolage I’ve found over the past few weeks.

reMarkable: e-Ink tablet & pen

This caught my eye because I swapped sketch books for an iPad Pro and Apple Pencil a year ago. This promises to be a lightweight, longer-lasting alternative, albeit with a black and white screen and a more realistic paper feel. It looks nice, but with a $700 asking price it’s not going to tempt me away from the iPad. However, it is available at 47% off on pre-order. Tempting, but I’ve been hurt by these kinds of pre-orders before (looking at you, Wacom Inkling).

Yassin’s Falafel House – Square

Square pioneered receiving payments via your mobile device, making it possible for anyone to set up a shop anywhere. Focussed on a Syrian refugee’s struggle to start life in a new country, this is as poignant a customer testimony as you will find. It’s really nicely laid out too.

The underestimated power of colour in mobile app design

Smashing Magazine discusses the fundamentals of colour in app design (but applicable in many other cases).

The Type Snob and how to turn into one

This article talks about appropriate typography, combinations and optimising type for purpose.


Blue Light Filter Glasses

Apparently blue light from computer screens is really bad for your eyes. I started reading up on glasses you can get to filter the blue light – like a physical version of f.lux or Apple’s Night Shift. These stop you from suffering eye strain from prolonged computer use, as well as stopping the blue light from keeping your brain awake at night. They’ve been pretty good so far; I’d certainly recommend them. Mine are from Blueberry, however the Felix Gray ones are super nice (and not available in the UK, boo).

Enormous Collection of Star Wars Cross Sections

Some kind fellow posted dozens of scans of pages from the incredible cross sections books much to this geek’s delight. They are awesome.

What makes 'a good brief' - an agency perspective

I get asked for ‘examples of good briefs’ a lot, so I thought it might be useful if we did a quick rundown of what we, as an agency, think goes into a good/useful/successful brief/RFP document. I’ve been on both ends of this process, having led digital procurement at arts organisations, universities and charities, advised on the development of briefs (predominantly in the arts sector) and responded to briefs/RFPs as a freelancer and now at Substrakt.

Rationale:

Why are you getting a new site (or commissioning the project, whatever it may be)? Is there something wrong with the current site and, if so, what? There’s nothing wrong with ‘we want to refresh things’ but inevitably there’s usually a bit more to it than that, and the sooner we can get a sense of those underlying drivers, the better: has the relationship with your current agency broken down? Is it the result of a new marketing or artistic director starting and wanting to make their mark? Is it someone’s vanity project?

Ambition:

What are you looking to achieve with the new site? Sell more tickets? Improve your ecommerce offering? Provide a better platform for your education activities? Give greater insight into your collection? Bring all of your micro-sites together into one site? It’s always helpful if you can articulate what the project is trying to achieve.

Audience:

Who is this project aimed at? And ‘everyone’ isn’t an especially useful answer. You know who your audience are, so tell us! Do you cater mostly to a younger audience, the majority of whom visit your site on a mobile? That’s the stuff we want to know.

Timelines:

When does it need to go live? If there is a specific reason that the site needs to go live on a particular day then let us know and we’ll work to that (unless it’s mad, in which case we’d tell you). Otherwise, it’s always helpful if this is an area where you can be flexible.

Most of the timelines communicated to us are on the ambitious side, usually as a result of the organisation’s eagerness to get the new site live ‘as soon as possible’. Artificially shortened timelines aren’t great news for anyone: depending on the nature of the project, things will take a certain amount of time, and it’s useful if you are open to being guided by the people responding to your brief about what is realistic. This is usually where prioritisation is required; breaking things down into must-have/nice-to-have/ideal-world can quickly help clarify what’s important.

It is also worth considering what has to be in for go-live and what can follow after that initial deadline.

Technical things:

Your website will not exist in isolation. It’s helpful (actually: essential) if you let us know what other systems we will be dealing with (ticketing, CRM, ecommerce, email, etc.) and how you’d like us to deal with them. If there are things that have to happen in terms of integration then please detail them in the brief.

It is also helpful if you are able to provide read-only access to your analytics account so that any hosting recommendations can be made according to the particular requirements of your actual traffic. Just including numbers in your brief doesn’t help with this as we need to know the shape as well as the size of the traffic to the site.

Budget:

Something that was discussed at Digital Works #1 was the reluctance of arts organisations to put a specific budget in their briefs. The perception was that if you put a number then inevitably all of the briefs came in at exactly that number.

Having been on both sides of this I can appreciate the arguments. However, without at least some indication of the budget it’s impossible for anyone responding to let you know how much of your ambition is actually achievable. If you’re worried about everything coming in at exactly the same price then put a range in, but please, please, please put something. It’s also important to consider everything that is required to deliver a website: the initial build, hosting, assets (fonts, images, etc.), support beyond the initial launch and so on.

Brand:

Sometimes we will be involved in projects that run alongside a rebrand, in which case any existing brand guidelines may be out of date or not applicable. But you will likely have some sense of brand, tone, look and feel, and organisational identity. The more of this you can communicate to us, the better. It’s also helpful if you let us know which agency will be working on the branding/rebrand.

UX/UI opinions:

It’s very helpful if you can point to some examples of what you consider to be particularly good (or bad) UX and UI – and a brief explanation about why you think it’s good (or bad). This ensures we’re all starting from, if not the same page, then at least the same book in terms of look and feel.

Length:

We find the best briefs are about 10-15 pages long (this may seem totally arbitrary but, for whatever reason, it’s usually the case). We use our Discovery sprint to really get to know the ins and outs of your organisation and how each department functions; for the purposes of the brief, it’s much more useful to have a concise snapshot of each department’s requirements for the website. Brevity may be the soul of wit, but a concise brief also gives us a quick read on your organisation’s mission and where the project fits within it.

Beyond the brief:

The most successful working relationships, in our experience, are formed when the procurement process involves something more than simply a brief or RFP being issued and then followed by a pitch after which an agency is appointed. Andy (Managing Director) has likened this to getting married after one date, which is a pretty good analogy in my opinion.

You should expect your brief to be the start of a conversation. By all means be specific about how you’re going to deal with queries (North American organisations seem better at this than Europeans), and circulate any answers to everyone involved in the process so that no one is at a disadvantage, but please be open to this discursive element of things.

A final thought:

A website is a complicated, technical project. There’s no getting away from this: simply treating it as a design project is going to result in something that – most probably – won’t meet the needs of your users (regardless of how innovative or beautiful it looks).

If you feel you simply don’t have the requisite technical expertise in-house to be able to make a judgement on who can and can’t meet your requirements then there are a number of consultants who operate exclusively in the arts sector and can sit in on your procurement process to advise you.

Equally it’s worth seeking out the opinion of other suppliers with whom you already have a relationship (e.g. your ticketing system supplier) as it’s likely they will have come across agencies and will be able to offer their opinion on who to approach (and avoid).

Digital Works #1

I go to my fair share of conferences and have noticed that all of the best and most useful things are said in the coffee breaks, in the bar or…basically anywhere outside of the main, organised programme. Alongside this I’ve long lamented the, often ridiculous, range of digital things that people in arts organisations are expected to be experts on and the lack of honest discussion around what success looks like and how people got there.

So with that in mind we (Substrakt) decided to try and remedy this by organising an event that aimed to create a space where these valuable, peer-to-peer discussions could happen, examining the broad range of things that arts folk are expected to make happen on a daily basis. This included procurement, outsourcing, analytics, user testing, content creation and distribution, connecting digital and physical experiences and ended with a look at chat bots, automation and AI.

Rather than writing a blow-by-blow account of the day I’ve tried to highlight the most useful/interesting elements of each discussion below.

Throughout the day we tried to return to 3 key questions:

What does success look like?

What’s the value?

What are the obstacles?

Session 1: Digital as a business function – procurement

I often speak to friends and colleagues in the arts sector who struggle with the difficulties of digital procurement. There seems to be very little in the way of shared knowledge on the subject, and we, as an agency on the other side of the process, frequently see the same missteps being made.

Clarity & Honesty: Broadly people working in arts orgs said they are looking for more honesty and clarity from agencies – an admission or frankness from agencies that perhaps not all of a brief may be achievable for the budget available, or things may have to take longer than anticipated. People also said they wanted it to be explained with much more certainty exactly what the money was being spent on. Communication in general was also identified as an area for improvement on all sides. There was also an admission that arts organisations need to be more honest with themselves about what their priorities are and be able to communicate that more clearly through briefs.

Internal politics: Often big digital projects can highlight, or exacerbate, internal tensions in organisations. From an agency’s point of view it is preferable if they aren’t being used as the tool to solve those tensions and if buy-in from all parts of the organisation is already secured by the time an agency is appointed. A couple of people who have gone through this process a few times said this can often be achieved by doing a lot of the initial thinking before the procurement process begins – involving as many colleagues as you can, as early as is appropriate when writing a brief. People had also seen success in adopting a phased approach to delivery so that not everyone was expecting everything to be there on the initial go-live.

Mix it up, when appropriate: It seemed everyone agreed it was preferable to build long term relationships rather than changing suppliers on a project-by-project basis. However there is also the danger that this can breed complacency – on both sides. It may be that you have a ‘core’ ongoing relationship and you use those people to advise/input on other ‘project-specific’ relationships you may look to develop. This ensures you are still getting quality, ongoing support and input from a group of people who know you well and can advise you from an informed point of view but you are also benefiting from a regular injection of new ideas and fresh perspectives.

Session 2: Digital as a business function – when to outsource

Approach: There seemed to be broadly three different approaches: 1) outsource all specialised skills (for digital this might include development, content production, strategy, etc.) and call on specialist external freelancers/agencies as and when they’re needed; 2) outsource ‘production’ (i.e. “the doing”) but retain some in-house expertise to act as a central point to manage the outsourcing and provide a degree of in-house capacity; 3) build specialist in-house capacity, with little or no outsourcing. It was felt that option 3 was really only feasible for larger organisations or those with significant financial resources to call on. There also seemed to be a feeling that digital is evolving, developing and fragmenting at such a rapid speed that it is difficult to build in-house capacity that matches or exceeds the expertise available from external suppliers.

Video is an outlier: An area where a number of organisations had seen success with bringing capacity in-house was around content production, specifically video. The main driver for bringing this totally in-house was usually budget. It was also felt that bringing this specific skill in-house gave you easily understandable and usable options to experiment and try things (without being restrained by cost).

Capacity: It was widely acknowledged that the more complex (and therefore interesting/fun) projects are often the ones that get outsourced, usually because in-house staff don’t have the capacity to take them on. Discussions raised the prospect of outsourcing some of the day-to-day (‘boring’) things instead, to free up room to tackle (or upskill so you can tackle) these more complex projects. The benefits of this are obvious – staff develop, expertise grows in-house, and so on – and although this approach may take longer initially, it is perhaps a more strategic way to address the issue.

Rationale: Almost unanimously, the deciding factor on whether or not to outsource seemed to be cost – if it made sense to keep something in-house then people would, otherwise they’d look externally. However, there was also an acknowledgment of the other trade-offs: outsourcing exposes you to expertise, contacts, knowledge and ideas that you may not otherwise be able to access, whereas building in-house teams gives you the capacity to experiment and try things out without being as constrained by cost, and lets you develop your team’s skills very specifically around your organisation’s needs.

Session 3: Analytics, data and agility – analytics and user testing

This session was led by One Further’s Chris Unitt, who is always worth listening to on this subject.

Google Analytics is awful: A widespread admission that GA was felt to be baffling, overwhelming and unnecessarily complex. Chris recommended using dashboards and custom reports so you don’t have to delve into the murky depths too often.

Use a range of tools: Consider a mix of tools; usabilityhub.com, optimalworkshop.com, in-house user testing, heatmapping (hotjar, crazy egg) are all worth considering to build a well-rounded range of insights. Having a range of outputs (visual, text and numerical) may also help you make a stronger case internally to people who perhaps don’t respond well to ‘just stats’.

Context is key: Always be aware of the context in which the insights/numbers exist e.g. if you have an extraordinary event one week, any meaningful comparison with that week (or using that week in any benchmarking) is going to be impossible and unhelpful.

Analytics should be organisation-wide: Currently it’s often the case that analytics, data and insights are seen as ‘marketing things’. There’s a lot of value that other parts of an organisation can enjoy by better understanding their ‘digital audience’ but some people felt that historically the people who run these departments aren’t always open to being persuaded by data-based arguments.

Don’t try to measure everything: Although you can measure everything that doesn’t mean you should (as Jeff Goldblum once said…), work out what you want or need to know and identify the most useful way to report on that.

Be prepared to act on your findings: A couple of attendees had really useful examples of where they had changed fundamental parts of the user experience (in one case, entirely getting rid of the homepage) in response to insights drawn from their analytics setup. Inevitably, though, this requires a working culture in which the value of data, and of data-based decision-making, is recognised.

Session 4: The digital user experience – Content creation and distribution

Content or audience first?: Arts organisations create a lot of content – how much of it is created strategically or for a specific purpose, and how much just to meet an internal expectation? To interrogate this, Ammba’s Robbie Beak examined the two broad approaches you can take to content production: audience-first (i.e. you identify what your audience wants and make it for them) or content-first (i.e. you work out who might like your content and go and find them). Inevitably the ‘right’ answer falls somewhere between the two, but it was interesting to see people confronting these two ways of thinking about content creation in a black-and-white way. On the whole the discussions agreed that when the content is a means to an end (e.g. a trailer for a show, articles about a director, interviews with performers) it makes sense to produce things your audience wants to see. However, if you are an organisation for whom the content is the point, you shouldn’t be afraid of following a vision and then seeking an audience – although in the specific examples and case studies discussed, this second approach usually resulted in a smaller (if often more engaged) audience.

Channels: Everyone agreed it was probably best to pursue an approach of ‘fishing where the fish are’ rather than trying to reinvent the wheel. However, it was stressed that specific channels have specific expectations and demands when it comes to content; what works well on one will not necessarily work well on all. Several examples were also given where organisations had seen success by limiting the number of channels they were active on. Arts organisation resources being what they are (limited and stretched), there has to be some honesty that no one (or at least very few people) can realistically be active across the full range of channels and remain effective.

Session 5: The digital user experience – Connecting physical and digital experiences

What’s the value?: Broadly, ‘useful’ enhancements to the physical experience through digital were welcomed. However, that has not been most people’s experience so far; “gimmicky”, “faffy”, “pointless” and “annoying” were just some of the words used to describe efforts in this direction. The value to the user has to be clear and immediate – people are impatient, and if a digital layer has been developed without the user at the heart of it, it isn’t going to be successful. The Cooper Hewitt ‘pens’ (with which you can interact with touchtables and ‘tag’ objects during your visit, then access your list – after your visit – with additional info about everything you tagged) were identified as a good example: easy to use, of value to the user and unobtrusive.

Contextual enhancement: Following on from the ‘useful’ point raised above, discussions seemed to settle on location/time-based ‘enhancements’ as the most obvious ways to add useful value to a physical experience. There was less discussion about interpreting a physical experience into a digital one for a digital-only audience, but this is definitely an area I’d be keen to explore in the future. I think there is an interesting discussion to be had around things like live broadcast and the experience of the audience on the receiving end of the broadcast – but these are thoughts for another day!

Social norms: When the issue of ‘phones in theatres’ was discussed almost everyone was against it, although there was recognition that this may simply be because the idea of experiencing a live performance with the aid of, or through, a device was still new (and therefore “a bit yucky”). In 10, 20 or 30 years it’s likely the digital will be so embedded in people’s day-to-day lives and lived experiences that to not be able to use a device during a performance may be the oddity. It’ll be interesting to see how attitudes towards this develop in the coming years.

Technology: A ‘Google Glass-like’ experience was proposed as the most seamless way to implement digital enhancements to a physical experience. Although no one was saying Google Glass was a particular success, that type of ‘straight to vision’ augmented reality could meaningfully enhance museum visits, introductions to new artforms and so on. Obviously there are accessibility considerations with tech like that.

Differentiation: There was a discussion around how much arts organisations should try to differentiate themselves when it came to UX. Of course every arts organisation has some unique aspect, but should that translate into user experiences and interfaces that require additional effort or learning on the part of the user just for the sake of being different? Probably not. There are established expectations, norms and best practice when it comes to user experience, and arts organisations need to be aware that their digital products sit within that landscape – users don’t think “I’m using an arts website/app/whatever so I’m happy to spend five minutes working out why it doesn’t resemble anything I’ve ever seen or used before”. They’re not happy to do so, and they won’t.

Session 6: Future focus – bots, AI and automation in an arts context

We wanted to end the day with a bit of a look at something we think may begin to impact the arts sector in the not-so-distant future. Namely the growing prevalence of chat bots (in various forms). This discussion was led by one of our senior developers, Sam Knight. Sam kicked things off by looking at the rapid adoption of touchscreens (to the point where it’s now an expected part of most screen interactions) and asked whether or not chat bots would quickly reach the same point and, if so, how organisations should be responding to this challenge. Sam has written an introductory post about the subject you can read here.

Cynicism: The overall feeling was one of cynicism – would this really become something that customers come to expect (e.g. asking Siri to order you some tickets to the opera)? Certainly recent research (“80% of UK businesses want chat bots by 2020”) indicates this is an area where other sectors expect to make rapid advances, so it is not unreasonable to think the arts sector will experience a knock-on effect.

Complexity: An issue we’re well aware of is the basic complexity of the ticket-purchasing process in some scenarios: the plethora of performances, seats, prices, concessions, etc. means there are a lot of combinations a system needs to understand (e.g. if the user asks for tickets, are they implying they want best available unless they specify otherwise?). At the moment there are lots of libraries that can help you build chatbots, but they aim to solve relatively straightforward queries (Sam’s post goes into this in more detail).

Usage: The most obvious use that was discussed was one of augmenting customer service. However there was also some interesting discussion around the delivery of an artistic experience e.g. chatbot as narrator.

Final thoughts

I’m really pleased with how the day went, and the feedback we’ve had indicates that attendees felt the same. There were some fantastic conversations about a, frankly, ridiculous range of topics. We will be planning more of these; future events will be more focused (on fewer things per event), but I wanted to kick things off by looking at the full range of things (from procurement through analytics and content to AI) that digital bods in the arts have to consider.

Anyways, if you’d like to come to future Digital Works events then drop us a line (team@substrakt.com) or watch this space for news on the next one.

Thanks to everyone who came and got so wholeheartedly involved in the discussions, it was great to be able to hear from such a diverse range of organisations and backgrounds. Special thanks to Lucy and Reuben and all the guys at Hackney Empire for being such brilliant hosts.

Let us know what you’d like to talk about at the next digital works via this short survey.

Until next time!

Thinking about chatbots

Chatbots dominated new tech last year but what are they and how can an organisation start to think about making use of them?

Anatomy of a Chatbot

Modern chatbots enable natural language interactions, by voice or text, in the home – allowing users to retrieve information or perform transactions that would otherwise be handled through apps and web APIs.

A typical conversation will attempt to convert a sentence into an intent and various entities:

“I want 2 tickets to see Rogue One tomorrow at 8pm”
Intent: Purchase a ticket
Entities: Rogue One (film title), 2 (quantity), tomorrow at 20:00 (time)

A chatbot will aim to retrieve all the required information in a conversational manner before making the transaction.

“I want 2 tickets to see Rogue One tomorrow”
Intent: Purchase a ticket (requires a film title, quantity and a time)
Entities: Rogue One (film title), 2 (quantity), tomorrow at ? (time)

As you can see, the time entity is incomplete.

-> “What time would you like to see it?”
“8pm”
Entities: Rogue One (film title), 2 (quantity), tomorrow at 20:00 (time)
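
To make that slot-filling idea concrete, here is a minimal sketch in plain JavaScript. It assumes some parser has already turned the sentence into an intent plus entities – the structure, names and prompts below are illustrative assumptions, not a real natural language library:

// Each intent declares the slots (entities) it needs before the
// transaction can go ahead.
const requiredSlots = {
  purchaseTicket: ['title', 'quantity', 'time'],
};

// Hypothetical parser output for "I want 2 tickets to see Rogue One tomorrow"
const parsed = {
  intent: 'purchaseTicket',
  entities: { title: 'Rogue One', quantity: 2, time: null },
};

// Questions to ask when a slot is still missing.
const prompts = {
  title: 'Which film would you like to see?',
  quantity: 'How many tickets would you like?',
  time: 'What time would you like to see it?',
};

function nextPrompt({ intent, entities }) {
  const missing = requiredSlots[intent].filter((slot) => !entities[slot]);
  // null means every slot is filled and the transaction can proceed.
  return missing.length ? prompts[missing[0]] : null;
}

console.log(nextPrompt(parsed)); // "What time would you like to see it?"

The chatbot keeps asking for whichever slot is missing, merging each answer back into the entities, until nextPrompt returns nothing and the booking can be completed.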

Future of Chatbots

Although we’re still in the early days of this technology with many teething problems, there is a large investment being made by most of the major technology companies. Google (Google Assistant), Apple (Siri), Amazon (Alexa), Facebook (Messenger) and Microsoft (Cortana) all have their own “assistant” and they are hoping to become the industry standard.

Not only are they creating their own assistants, they are also creating platforms in the hope that developers will build for their assistant and it will become the de facto standard.

Although it’s unclear which platform (if any) will become the industry leader, the above tools mean that developers won’t need to create their own natural language processors to build their own chatbot, reducing the cost for organisations to add this new platform.

When touchscreens started to feel intuitive they quickly became a necessary part of mobile phones, and there is now a generation of children who are surprised when their TVs don’t respond to touch. Natural language interaction has the potential to be just as intuitive to the next generation and to become an expected medium.

With lower development costs, assistants in every personal device and a growing expectation, it will only be the quality of responses and the range of tasks available that determine the rate of adoption.

Considerations when making a chatbot

Firstly, it’s important to ask whether chatbots are right for your organisation. At the moment your target audience may not have the devices, or may not yet be comfortable with this method of interaction – but will that change? Are your audience early adopters?

Incorporating chatbots into your organisation’s digital offering requires the same attention to detail as any other interaction your organisation offers, including websites, social media, newsletters, leaflets and call centres.

Features and user experience lie at the heart of all of these, and should do so with chatbots too.

Features to offer

If you already have a digital presence, some of the main functions a chatbot needs to support should be easy to determine.

A few quick questions to ask:

  • Main goals: What are the primary user flows/calls to action?
  • Search: If you have internal search, what do users search for?
  • Phone: When you receive calls, what are the main questions?

Quality interactions

In UX circles there is a mantra of “Don’t make me think”. In natural language this could be applied as “Don’t make me talk like a robot.”

Most tools to create chatbots will allow you to “train” the system based on example sentences. This training is the cornerstone of machine learning and will help the system learn how to map sentences into intents and entities. There are countless regional differences in sentence structure so the accuracy of the chatbot will depend on how well trained your system is.
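
As a rough illustration, the training data most of these tools expect boils down to example sentences labelled with the intent they should resolve to (the sentences and intent names below are made up, and each tool has its own format and interface for supplying them):

// Illustrative training examples: utterances labelled with intents.
// Variations in phrasing and regional sentence structure are exactly
// what these examples are there to capture.
const trainingExamples = [
  { text: 'I want 2 tickets to see Rogue One tomorrow at 8pm', intent: 'purchaseTicket' },
  { text: 'Can I book a couple of seats for tomorrow evening?', intent: 'purchaseTicket' },
  { text: 'What time does the box office open?', intent: 'openingHours' },
  { text: 'Are you open on bank holidays?', intent: 'openingHours' },
];

The more varied (and representative of how your audience actually talks) these examples are, the better the mapping from sentence to intent will be.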

Branding

Branding is also an essential part of any interaction and the same thought needs to go into a chatbot. These will be the same questions that get asked when deciding on content for your website.

  • Is it formal or informal?
  • Chatty or matter of fact?
  • Are sentences understandable to the average reading age of your audience?

Conclusion

Development platforms will improve quickly, and with more devices being sold with built-in assistants, offering services on these platforms could soon become an expectation.

Now is a perfect time to start creating a roadmap for what services you want to offer and how to create quality interactions for this specific platform.

How to: Debug GitLab CI Builds Locally

Your tests all pass locally. One test fails in your GitLab CI build. Sound familiar?

Of course it does.

This is painful work to resolve. You have to make single changes to your GitLab CI build file and push them to your origin one at a time until your builds pass or you give up and just live with the fact that only one test fails. Or you delete the test because you have no respect for the law.

But no more! You can now run individual test jobs using the gitlab-ci-multi-runner binary on your local machine exactly as GitLab CI runs them and be 100% sure that your tests will pass without having to commit tiny changes hundreds of times.

First of all, you need Docker installed on your machine. This should only take a few minutes.
Second, you need to install the gitlab-ci-multi-runner locally too. This is exactly the same application that runs the tests on the GitLab CI instance.

Then run the following command:

$ gitlab-ci-multi-runner exec docker {test_name}

That’s it. In my example Ruby project, my tests are run in a job called ‘test’, so replace {test_name} with test and it’ll run that on its own.
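
For context, a stripped-down .gitlab-ci.yml for that sort of project might look something like the snippet below – the image and script lines are assumptions for illustration, and whatever your own job is called is what you pass to the exec command:

# Hypothetical .gitlab-ci.yml: the 'test' job name is what gets passed
# to `gitlab-ci-multi-runner exec docker test`.
test:
  image: ruby:2.4
  script:
    - bundle install
    - bundle exec rake test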


I’ve now got total confidence that these tests will pass during the GitLab CI run and it’s only taken me 10 minutes to fix rather than several hours.

Substrakt Health

We have always been interested in working in the health sector, and over the years we have delivered several projects that whetted our appetite. Earlier this year came our big opportunity to give the sector the attention we’d hoped for…

The Prime Minister’s Challenge Fund was created to support primary care, improving service and accessibility at GP surgeries and facilities. South Doc Services (a non-profit GP co-operative) led a successful bid to the Challenge Fund with a federation of 26 South Birmingham GP practices.

Substrakt, having worked with South Doc Services on previous website projects, were a natural partner to support the digital elements of the proposed Challenge Fund project, which included online appointment booking, viewing medication, ordering repeat prescriptions, personalised lifestyle guides and more. Over the last 10 months we have been delivering the project, creating an ambitious patient-facing primary care product for MyHealthcare. The app is currently in closed beta testing and is due for public release early next year.

We are keen to keep this work, and our positioning within the sector, as focussed as possible, so we decided to launch Substrakt Health. We will carry the Substrakt ethos and values into this sector, ensuring we are proud of everything we achieve.

Please keep an eye on the Substrakt Health website over the coming months if you’d like to keep abreast of our work in this area. We will be publishing our progress, providing insight into the challenges, approaches and results of our work as well as our future plans.
