We won the moral argument but did we lose the business case for UX? | February 11, 2016
When we first started Clearleft 10 years ago, the bulk of my effort was focussed on explaining to clients what user experience design was, the extra value it offered, and why design needed to be more than just moving boxes around the screen. I’m pleased to say that it’s been a long time since I’ve had to explain the need for UX to our clients. These days clients come to us with a remarkable understanding of best practice, and a long list of requirements that contain everything from research, strategy, prototyping and testing, through to responsive design, mobile development and the creation of a modular component library. I think it’s safe to say that the quality of the average digital project has soared over the past 10 years, but so has the effort involved.
This isn’t unusual and happens across all kinds of industries as they develop and become more professional. You only have to look at the advances in health care over the last 50 years to see the dramatic rise in quality. Back in my childhood, the most advanced diagnostic tool was probably the X-ray. These days a whole battery of tests are available, from ECGs to MRIs and beyond. The bar has been raised considerably, but in the process, so has the average cost of patient care.
Over the past few years I’ve seen client expectations rise considerably, but digital budgets have remained largely unchanged. We’ve done an amazing job of convincing digital teams that they need proper research, cross-platform support, and modular style guides, but somehow this isn’t filtering back to the finance departments. Instead, design teams are now expected to deliver all this additional work on a similar budget.
I believe one of the reasons for this apparent lag is tempo. Despite the current received wisdom of continuous deployment, most traditional organisations still bundle all their product and service improvements into a single big redesign that happens once every 4 or 5 years. Most traditional organisations’ understanding of what a digital product should cost is already half a decade out of date. Add to this the fact that it takes most large organisations a good 18 months to commission a new digital product or service, launch it, then tell whether it’s been a success, and you have all the hallmarks of a terrible feedback loop and a slow pace of learning.
I think another problem is the lack of experienced digital practitioners in managerial positions with budget setting authority. It’s relatively common for digital budgets to be set by one area of the company, completely independently from those setting the scope. Project scope often becomes a sort of fantasy football wish list of requirements, completely untethered from the practical realities of budget.
I couldn’t begin to tell you the number of projects we’ve passed on the last couple of years because their budgets were completely out of whack with what they wanted to achieve; or the number of clients who have asked for our help when their previous project failed, only to discover that the reason was probably due to their previous agency agreeing to deliver more than the budget would actually allow. These organisations end up spending twice as much as they could have done, because they wanted to spend half as much as was necessary—the classic definition of a false economy.
Fortunately, having made this mistake once, you’re unlikely to make it again. Speed of learning is hugely important. In fact I think the organisations that will fare best in the face of digital transformation are those who can up their tempo, fail faster than their competitors, learn from their mistakes, and ensure they don’t happen again. Basically the standard Silicon Valley credo.
It is possible to avoid some of these mistakes if you hire strategically. I’ve seen a fairly recent trend of hiring in-house digital managers from the agency world. You end up hiring people who will have delivered dozens of projects over the past 5 years, rather than just one or two. These people also tend to be fairly savvy buyers, knowing which agencies have a good reputation, and which are little more than body shops.
As for us practitioners, I think we’ve done a great job of convincing our peers of the value of good UX design and digital best practices. We now need to put the same effort into getting that message across to the people commissioning digital services and setting budgets, to ensure we can actually deliver on the claims we’ve made.
The Industrialisation of Design (or why Silicon Valley no longer hires UX designers) | February 3, 2016
Despite having their roots in Silicon Valley, UX designers are a rare breed inside traditional tech companies like Google, Facebook and Twitter. In some cases they are so rare that other designers claim UX design doesn’t even exist. As a result I thought it would be interesting to explore where this attitude has come from, to see if it can hint at where our industry is heading.
In my (largely anecdotal) experience, Silicon Valley startups are focussed on hiring product designers at the moment. If you haven’t come across the product designer term before, you can think of them as next generation web designers; talented generalists with an affinity towards mobile and a desire to create great digital experiences for all.
While hiring product designers is all the rage at the moment, that hasn’t always been the case. Many early stage start-ups were originally conceived by individuals who considered themselves user experience designers. Many of these individuals have subsequently moved into design leadership roles at companies like Amazon, Adobe and IBM.
UX design is undoubtedly a specialism, focussing on the strategic and conceptual aspects of design, rather than the more tangible elements of UI. In that regard it has close similarities with service design, but is typically scoped around digital experiences. As practitioners traditionally came to UX design later in their careers, either through Information Architecture and Human Computer Interaction, or UI design and front-end development, there are naturally fewer experienced UX Designers than other disciplines.
This lack of supply, combined with increased demand, started to cause problems. Thankfully, a rising awareness around the general concept of user experience (as opposed to the practice of user experience design) saw more and more UI designers explore this space. Designers started to gain an increased sensitivity towards the needs of users, the demands of different platforms, and an understanding of basic interaction design practices like wireframes and prototypes. A new hybrid began to emerge in the form of the product designer; somebody who understood the fundamentals of UX Design, but retained their focus on tangible UI design.
The viability of the Silicon Valley product designer was made possible by several interesting trends. First off, tech companies started to hire dedicated design researchers; a role that UX designers would often have filled themselves. They also started to hire dedicated product managers, relieving designers of the need to engage in deep product strategy. This has led many experienced UX designers to follow careers in research and product management, while others have moved towards IoT and service design.
At the same time, the rise of design systems has reduced the reliance on traditional craft skills. Rather than having to create interfaces from scratch, they can now be assembled from their component parts. This has allowed product designers to spend more time exploring newer fields of interaction design like animated prototypes. You could argue that thanks to design systems, product designers have become the new interaction designers.
This is further helped by companies with a vibrant developer culture and a focus on continual release. Rather than having to spend months researching and strategising, you can now come up with a hunch, knock up a quick design, launch it on a small subset of users and gain immediate feedback.
As a result of these infrastructure changes, tech companies no longer need people with deep UX expertise at the coalface. Instead these skills are now centred around management and research activities, allowing the companies to grow much faster than they otherwise would.
However this approach is not without growing pains, as I learnt when chatting to a design team director at one of the big tech companies recently. There was definitely a sense that while the new breed of product designers were great at moving fast and delivering considerable change, they lacked some of the craft skills you’d expect from a designer. Instead, design languages, prototyping tools, research teams and multi-variant testing were maybe acting as crutches, hiding potential weaknesses. There was also a concern that product designers were so focussed on the immediate concerns of the UI, they were struggling to zoom out, see the big picture and think more strategically.
All these concerns aside, it’s easy to see why, inside the tech industry bubble, UX design may no longer be recognised as a distinct thing.
Digital Education is Broken | January 31, 2016
Ever since I started blogging in the early noughties, the emails came in. At first in dribs and drabs, one every few months. However by the end of the decade they were arriving at a rate of one or two a week. Emails from disgruntled students who had spent up to £9k a year on tuition fees, and even more on living expenses, to find themselves languishing on a course that was woefully out of date.
Their emails were filled with tales of lecturers from engineering, graphic design or HCI departments, co-opted to teach courses they didn’t understand because, well, it’s all just computers really? Tales of 19 year olds effectively becoming teaching assistants on the courses they were paying for, because they knew more than their lecturers. Students walking out halfway through their courses, because they were learning more from their evening gigs than they ever could at school.
It was in this context that Clearleft started our general internship program way back in 2008, to provide the growing ranks of self-taught designers and developers with the knowledge and experience they needed to succeed in the workplace.
Now don’t get me wrong. I’m not one of those libertarian Silicon Valley types who believe the role of education is to churn out dutiful employees. Far from it. Instead I want my tax funded education system to produce well rounded members of society; individuals who are interested in following their passions and who have been taught the tools to learn and think. Sadly digitally focussed courses, in the UK at least, are failing on even these most basic standards.
As I walk the halls of the end of year degree shows, I’m amazed and saddened in equal measure. The work coming out of digitally focussed courses with “User Experience”, “Interaction Design” and “HCI” in their titles is shockingly poor. The best courses represent the fetishes of their course directors; more art than design in most instances. The worst courses have the whiff of Kai’s Power Tools about them.
You’d be excused for thinking the institutions themselves were broken, were it not for the amazing digital work coming from other courses like Product Design, Motion Design, Graphic Design and even Architecture; work that shows a deep understanding of creative problem solving and an appreciation of the medium. So why are digital courses so bad?
I sit down for lunch with a lecturer friend of mine. He bemoans the state of digital design education, as he attempts to scratch a living on the higher education equivalent of a zero hour contract, working far more hours than he was paid for. Fighting for quality inside an organisation that doesn’t really care; that has too many other stakeholders born in a different era to worry about this “digital thing”.
The students are keen to learn, but how much can you really teach in 6 hours of lectures a week, by somebody who has never designed a commercial website in their life, or at least not in the last 6 years? Is it any wonder that the graduates from a 10-week General Assembly course leave with more face time (and a better portfolio) than those from an 18-month Masters?
And so we continue to do what we can. Answering emails from disgruntled students, speaking on courses, offering student tickets, hosting CodeBar events, and running our internships.
And my lecturer friends do what they can. Running the best course possible within a broken system; hoping (and fearing) digital transformation will eventually disrupt their sector, like other sectors before it.
However there’s only so much any one individual can do on their own, which is why I’m pleased there are events like The Interaction Design Education Summit. I hope that through events like this (and others) we can put pressure on the institutions, improve the quality of courses, and help bring digital education out of the dark ages, in order to give students the learning experience they truly deserve.
Why do agency account managers exist? | January 26, 2016
This morning Alison Austin asked the question…
Why do agency account managers exist? #seriousquestion— Alison Austin (@alicenwondrlnd) January 26, 2016
It’s a valid question and one I’ve often wondered myself. As a company we’ve always been resistant to hiring dedicated account managers, having seen the worst excesses of our industry. I remember chatting to an account manager from a large digital agency, during a BBC supplier evening a few years back. She bragged at how she only got the job because she went to the same private school as one of their major clients and had a lifetime membership to The Ivy. It seemed her job largely involved getting clients drunk.
I suppose this is what account management was like back in the days of Mad Men. You would win a big “account” made up of multiple smaller projects, then do everything you could to keep the client sweet. This is somewhat understandable when I remember another conversation I had a few years back, with the marketing manager from a large fizzy drink brand. He explained that their agency selection process involved picking 3 agencies from the NMA top 100 list each year, hiring one, and firing one of their incumbents. In this environment of fear, is it any wonder that agencies would do everything in their power to curry favour?
Fortunately I’ve only experienced this attitude once in my professional life. It was in the early days of our company and we’d just had a really positive meeting with a prospective client, so we invited them to lunch. From the booze fest that followed, it was clear these folks were used to being entertained; as they explained how they judged their agencies on the quality of restaurants they got taken to.
In some ways I could understand the attitude. I got the sense that they weren’t especially well paid (or indeed respected) by their company, so agency entertaining was one of the few perks of the job. However I looked back on the episode thinking that if we had to win work based on our ability to entertain clients rather than our ability to deliver, we would have failed as an agency.
While this attitude may still exist in some corners of our industry, it’s not one I recognise anymore. I like to believe that the majority of projects in the digital sector are awarded based on skill, experience, quality and price. So if the Mad Men age is over, what do modern account managers do?
For very large accounts spanning multiple projects, the account manager acts as a constant presence on the project, ensuring the needs of the client are met. They’ll have a good understanding of the big picture challenges the client is facing, and be able to share those insights with the individual teams. They will also be there to help solve problems and smooth over any bumps in the road; essentially acting as the client champion within the organisation.
From the agency’s perspective, they are also there as a consultant; helping to develop the client as a longer term prospect. This means working with the client to find new opportunities to solve their problems, possibly in areas the client didn’t know the agency had experience in.
In smaller agencies, this role is often filled by the founder, project managers and project leads. In larger companies it’s centralised amongst a small number of account executives. It’s an important role, but not without its challenges.
Speaking with friends at agencies with a strong account management ethic, common gripes often come up. The main one being less experienced account managers promising clients new features with little understanding of what’s entailed. This is especially problematic on fixed price, fixed scope projects where margins are tight.
I tend to hear more concerns around account management from clients, who often feel that account managers are either too overtly sales driven (constantly trying to get them to spend more money) or acting as blockers between them and the people working on their projects.
Too often, these problems are caused by a misalignment between the client’s needs and the way account managers are being judged and remunerated. Either that or it’s a reflection of poor agency practices and an attempt to keep clients at arm’s length, possibly to hide an ever changing team of junior practitioners and freelancers.
As such, while I understand the benefits of larger agencies hiring a small number of very experienced account managers with a solid understanding of the industry, a large number of junior account managers always feels like a bit of a warning sign to me. However, as somebody who has never really experienced account management first hand (good or bad), I’d love to know what you think.
Divergent/convergent thinking is a fundamental part of the design process, and something most experienced practitioners are familiar with. Essentially the design process is broken down into two phases; a phase where you open up the problem space and explore as many different directions as possible; and a phase where you start analysing all the possible solutions you’ve come up with, in order to settle on the perfect answer.
It’s easiest to see this approach play out in the world of branding; the designer filling their notebook with pages and pages of graphic experiments, before selecting a handful that meet the brief in different and interesting ways. Rest assured that all good designers work this way, from physical product designers cycling through dozens of concept drawings, through to interface designers exploring countless different UI variations.
If you’ve been involved in a well executed brainstorming session, you’ll understand the benefits of this approach; allowing you to explore a large number of ideas, without the dampening effect of analysis.
You may have also experienced a badly run “brainstorming” session where ideas are debated and discarded as soon as they are created. This approach not only slows the process down, severely reducing the volume of ideas that are generated, it also discounts potentially novel ideas before they’ve had a chance to breathe.
This process always reminds me of classic crime dramas where the detectives pin all of the clues up on a wall in search of patterns. The mediocre detective will jump to the most obvious conclusion first, spending the rest of their time trying to prove their hunch right (and often arresting the wrong person in the process). Meanwhile our hero spends their time assembling clues, exploring the problem space, and analysing all the possible angles, before coming to the less obvious, but ultimately correct conclusion.
So as a designer, how do you decide how much time to spend exploring the problem space and generating ideas, versus homing in on the end solution? And what are the risks involved in spending too much or too little time on either activity?
In my experience, novice designers tend to jump to the convergent phase far too quickly. This is partly because they’ve been mis-sold the idea that design is driven by that elusive spark of creativity, rather than a deeper process of problem solving. Creative ideas are viewed as rare and precious things in need of immediate nurture.
Early in your career, all your ideas seem fresh and novel, so you’re eager to get stuck into the execution, especially as your craft skills are more developed than your ideation skills. Essentially you end up running from an area you don’t feel comfortable with, to one you better understand. I’ve seen plenty of novice designers abandon potentially interesting ideas in favour of more fully fleshed-out but obvious ones. These ideas may not seem obvious to the designer in question, but more experienced designers will have seen the same tropes time and again.
Good design educators work hard to prevent their students from jumping to the most obvious conclusion, running exercises like “100 designs in a day”. As the name suggests, the students are encouraged to come up with 100 versions of a common design problem, like designing a new chair. The first fifty or sixty designs are usually easy to come by and are typically discarded for being too obvious—variations of designs they’ve seen many times before. It’s the next twenty or thirty designs that get really interesting, where the designer has to really think about the problem and come up with something truly novel.
The “100 designs in a day” exercise is a type of “design game” that acts as a “forcing function”; essentially a way of forcing you to think divergently. The best designers tend to have an arsenal of similar activities to draw upon when needed.
I’m always nervous when I come across designers who appear to be driven by “creativity” rather than process. Eventually this unbounded creativity will dry up, and they’ll be reduced to aping the styles of other designers, unable to explain their designs other than “it felt right”. Instead, like my old maths teacher, I like to see the workings out; to understand how the designer got to the current solution, and make sure they could replicate the process again and again.
If novice designers spend too little time exploring the possibility space, experienced designers often spend too long; trying to explore every nook and cranny and gather every piece of evidence possible before starting down the route to a solution. This is evidenced by the quote often attributed to Einstein that many senior designers love to reiterate: “If I had only one hour to save the world, I would spend fifty-five minutes defining the problem, and only five minutes finding the solution.”
While it’s true that any nontrivial problem requires a good amount of divergent thinking, spending too much time exploring the problem can form a mental trap akin to analysis paralysis, making it difficult to come up with a solution that solves all the problems you’ve uncovered. This is one of the reasons why large organisations often benefit from enlisting the help of external consultants who can bring a fresh perspective unencumbered by years of exploration and analysis. But these external agents may only have a 6-month grace period before they get indoctrinated into the organisation and start getting similarly overwhelmed.
Architect Eliel Saarinen put it best: “Always design a thing by considering it in its next larger context - a chair in a room, a room in a house, a house in an environment, an environment in a city plan.” Novice designers regularly jump straight to the chair, ignoring the room it’s in, while very senior designers get so obsessed with the room, the house and the city plan, they ignore the impending seating needs. The logic often seems to be “how can I possibly design a chair, when the city infrastructure to deliver the chair is broken!”
From my experience working with students, interns and junior designers, novices often spend less than twenty percent of their time on divergent activities, and end up obsessing over the convergent process. This works for relatively simple projects, but fails for anything remotely complicated. By contrast, many senior designers will spend up to eighty percent of their effort on divergent thinking, leaving their production team to do most of the converging. Although the ultimate figure depends on the problem you’re solving, in general I think the balance needs to be closer to 60/40 in favour of divergent thinking.
If the idea that designers start their careers focussed on convergent thinking and become more divergent over time holds true, this may help explain why many designers seem to reach a creative peak around 8 years into their careers. At this point they have got out of the habit of rushing to the most obvious solution, and are spending a good deal of time understanding the problem and exploring a variety of leads. They still have enough focus on delivery to reserve enough time for convergence, thereby avoiding the divergence trap.
Design like a Michelin Star Chef | January 19, 2016
The England of my youth was a desert for good food. The difference between a “good” restaurant and an average one lay mostly in the surroundings; that and the use of slightly more expensive ingredients. But white cotton table cloths and snooty service weren’t enough to hide the mediocre food that lay therein. That’s why I used to relish my regular trips overseas, to eat at restaurants where the owners actually cared about what they were producing.
Jump forward 20 years and the landscape has changed dramatically. England is awash with top-end restaurants and Michelin Stars abound. Quality cooking now permeates popular culture, thanks to shows like MasterChef. This attitude has trickled down to neighbourhood bistros, mixing locally-sourced produce with the skill of the chef. As a result we’ve developed the vernacular and know when something doesn’t make the grade; we’ve basically become a nation of food critics.
We still have average restaurants, but they are few and far between. Instead, a rising tide has raised all boats. Even pubs, and more recently the humble pizza restaurant and burger joint, have gone gastro. The UK really is in the midst of a food revolution. So much so that I now look forward to returning from overseas trips, because of the food.
In this environment, it’s no wonder that a recent show on Netflix charting some of the best restaurants in the world was an immediate hit amongst my colleagues. The level of passion and craftsmanship the chefs demonstrated was amazing. These chefs sweated over every detail, from the provenance of the produce, to the service experience. Experimentation was key, and you could tell that every dish they produced looked and tasted fantastic, elevating cooking to an art form.
This focus on quality struck a chord with me as a designer. It’s an attitude that’s been baked into Clearleft from the outset, hiring people who really care about the details and want to go the extra mile, not just for our clients or their users, but for the field itself. Like great chefs, designers find it difficult to explain the extra effort that goes into an amazing composition. It’s actually fairly easy to knock up something palatable if you have the tools to hand. However it takes a huge amount of effort to craft something noteworthy.
Where quality is concerned, whether it’s with food or design, it usually takes 20% of the effort to deliver 80% of the quality, and a further 80% of effort to deliver the last 20% of quality. I call that the effort to quality curve, and most people stop where the differential is highest. But it’s the last 20% that elevates a dish from average to amazing.
Sadly the current design climate reminds me of 90s cooking. The big studios, like the big chain restaurants, are more interested in delivering a consistent experience rather than a quality one. So they put processes in place that ensure minimum quality, but do nothing to foster true creativity. Many agencies and individuals come off looking like fast food joints, using frameworks and templates to speed production and deliver a slew of me-too products lacking in love or a sense of craft.
By comparison, when I look around our studio—and others like ours—I see the similarities between a kitchen full of expert chefs. Each one with their own areas of expertise, but brought together through a passion for good design and quality code.
However in a world dominated by fast food and even faster design, it’s often difficult to explain the difference to customers—why a meal by a Michelin Star chef is worth more than one from a chain restaurant. It’s difficult because, unlike the restaurant world, most customers haven’t seen the effort required to deliver quality; haven’t sampled enough dishes to tell bad from good.
The only way to combat this is for designers to make their effort visible as well as their output; to educate customers on the importance of ingredients and technique; and to design like a Michelin Chef.