Craggy Island: The climbing gym that hates boulderers? | September 25, 2014

Over the last year I’ve got really into bouldering. I’m not especially good, but I enjoy the mental and physical challenge of solving bouldering problems over the tedium of a regular gym. I tried rope climbing once, but wasn’t keen on all the equipment or the need to climb in pairs. So I much prefer the freedom and flexibility that comes with bouldering.

When a work trip took me to Guildford, I decided to head down the evening before and check out Craggy Island. I’ve met a few people who climb there and rate it highly, so I was looking forward to my bouldering session for the whole drive down.

I’ve been to mixed climbing and bouldering places before, and have never had a problem. However, when I arrived at Craggy Island I was turned away at the door. You see, despite only wanting to boulder, the folks at Craggy Island won’t let boulderers into their gym unless they can also prove they know how to rope climb. I tried to explain that I only wanted to use the bouldering wall, but it fell on deaf ears.

Having been rope climbing once before, I decided to try my luck and give it a go. However, without the necessary muscle memory I hit a mental block and couldn’t remember how to tie a belay knot. As each successive person passed by on their way into the gym, I began to feel more and more humiliated.

I tried to reason with the guy on the front desk. After all, they were asking me to prove I could do something I had no intention of doing, just to get in. A little like being asked for proof you can high-dive, when all you want to do is swim a couple of laps of the pool.

I asked if he could jog my memory as I was almost there, but he wasn’t willing to help. Instead he suggested I come back another time to do a refresher course - something I obviously couldn’t do because I was only there for a day, and didn’t want to do because I was only interested in bouldering.

Rather than trying to help, or giving me the benefit of the doubt, the staff displayed a real “jobsworth” mentality. What if they let me in because I said I was going bouldering, but I lied and actually tried to scale the climbing wall without being correctly tied in? This reasoning would make sense had I not been to mixed bouldering and climbing gyms in more litigious countries like the US and faced no such problem. There you just sign a waiver, hand over your money and are trusted to do the right thing.

Walking out of the gym I felt angry, humiliated and dejected. What should have been the highlight of my trip turned out to be the low point, and put me in a bad mood for the rest of the day. Instead of a great climb I left with the feeling that boulderers aren’t welcome at Craggy Island, and I most definitely won’t be going back.

Comments (0)

Could the movies of your childhood be made today? | July 20, 2014

I’ve been thinking a lot lately about the effect digital technology is having on society. I’m especially curious about how it’s changing our most formative years, when the stories we tell about ourselves are generated and our identities are formed.

Looking back, my adolescence seems like a halcyon time, devoid of mobile phones and status updates. Heading into the big city was an adventure into the unknown, and even something as mundane as meeting up with friends was fraught with uncertainty and excitement.

A lot of the movie tropes of my childhood relied on these vagaries. For instance, the whole premise of Desperately Seeking Susan hinged on two individuals not being able to find each other, and the hilarity and intrigue that ensued. So how different would this movie have been if it were set in the modern day?

The lead characters would now be following each other on Facebook, Twitter and Foursquare, so only a check-in away. Similarly (spoiler alert) the people searching for Susan would only have needed to check her profile image to know if they’d got the right person. So the whole movie would have been reduced to “I wonder where Susan is?”, swipe, “oh, she’s there”.

A lot of 80s comedies relied on the fish-out-of-water scenario. However, can you imagine National Lampoon’s European Vacation in an age of TripAdvisor? No laughably bad hotels, terrible meals or getting lost on the backroads of Europe. Just highly rated B&Bs and top-class restaurants. Similarly, Mick Dundee would have had a much easier time navigating the cultural differences of New York with a Lonely Planet Pocket Guide on his Fire Phone. Call that a knife? I could buy that on Amazon for $9.99 with next-day shipping.

Getting from A to B was another common trope in these old movies, whether it’s the antics of Smokey and the Bandit, The Cannonball Run or Every Which Way But Loose. How different would all these movies have been with Waze telling you the bridge was out and your sat nav warning you of speed traps? Right turn in 200 yards, Clyde.

I wonder how Planes, Trains and Automobiles would have played out if Steve Martin had been a Zipcar user? Or how about Martin Scorsese’s After Hours if the main character could have called an Uber? Probably a lot better, come to think of it.

I always liked the “day off” or “parents out of town” fantasy. However, I wonder how different Risky Business would have been if Tom Cruise had advertised his activities on Craigslist or set up a Kickstarter campaign. And what about Ferris Bueller? No doubt his performance on the float would have gone viral and he’d now be making a living posting inane clips on YouTube.

Another popular conceit is the fallibility of memory. About Last Night? Check out my Instagram feed. Dude, Where’s My Car? Oh, my parking app says I left it over there.

Other famous plot lines would have been similarly shortened. So rather than the “will they, won’t they” romance of When Harry Met Sally, all Billy Crystal needed to do was fire up Bang With Friends to see if Meg Ryan felt the same way.

In fact most high school romances would have been rendered redundant thanks to services like this. For instance, would John Cusack have travelled across the country for a Sure Thing when he could have found one on campus with Tinder?

Similarly, The Breakfast Club would have been a movie about five outsiders playing Angry Birds while bitching to their friends on Facebook. No need to crawl through air ducts or share snatched conversations in the hallway to bond. Swipe left. Swipe right. Let’s meet in the janitor’s closet in 5. I’m sure what Judd Nelson saw under the table would have gone straight on Instagram.

So if that’s what would have happened to romantic comedies, what about sex? Would Sex, Lies and Videotape have been renamed Sex, Lies and Snapchat? Would the protagonists in 9 1/2 Weeks have started their own webcam channel, ordering ever more obscure food from Tesco Direct to sate their subscribers’ increasingly niche interests? Taramasalata, anyone?

I can think of hundreds of classic movies that would have been irrevocably altered in today’s environment. Sure, some of the examples are slightly artificial, but it’s an interesting thought exercise nonetheless. So hit me with your best shot and let me know which movies you think would have been changed by today’s technology, and how.

Comments (1)

My Advice to Young Designers and Developers | March 16, 2014

I meet them on a regular basis: tech-savvy teens who’ve been coding websites from an early age. They’ll often seek my advice about breaking into the industry. Should they continue their studies or jump straight into the labour market? I usually tell them that ability trumps education and that I don’t put much faith in the current raft of tech degrees. So I’d rather see three years of experience than three years of study.

That being said, I’ll also point out that University is about much more than just acquiring a skill. It’s a formative experience that will shape your attitudes for the rest of your life. It’s also a huge amount of fun, or at least it was in my day.

As University becomes more expensive, it’s understandable that people question its value. So it makes sense for many young designers and developers to skip higher education in favour of the workplace, and who can blame them? As somebody who was earning less than £300 a week throughout most of my twenties, I can’t imagine what it must be like for a twenty-year-old to earn that much a day freelancing.

However I worry that in the rush to join the establishment, people may be missing out on formative experiences they’ll never get back.

Now I don’t want to romanticise low-paid jobs or suggest poverty tourism for the soon-to-be tech elite, but there’s something to be said for getting by on minimum wage to foster a respect for money. There’s also something to be said for working behind a bar, in a call centre or in any number of service-based jobs to instil a sense of empathy for other people.

If you’re interested, I’ve worked variously at a chip shop, a supermarket, a warehouse, a watch factory, a restaurant kitchen, a bank, a call centre, a travel agent, a farm, a hostel and various dive centres around Australasia. In all of these situations I met interesting characters and learnt valuable life lessons.

The tech sector is a wonderful place to work, but it’s also a homogeneous and often self-entitled one. So I wonder if the young engineers making their way down to Silicon Valley might have been better equipped to handle the ire of ordinary San Francisco citizens if more of them had tried living in the city on minimum wage themselves.

When I’m asked for advice by school-age designers and developers about breaking into the industry, my answer is usually the same - ‘don’t rush into a career at 18, only to look back when you’re 28 or 38 and wonder where the time went.’ ‘Instead,’ I’ll suggest, ‘why not take a few years off to go travelling?’

I know it’s a cliché, but travelling really does broaden your horizons and expand your mind. Not only do you get to meet new people and experience different perspectives on life, but you also get to reflect on the choices you’ve made or are going to make. For instance, I was convinced at the age of 18 that I wanted to be a pilot, and even took flying lessons. However, it was only through travel that I realised what a mistake that would have been, and what I really wanted to do with my life.

By comparison, the majority of people I know who went straight into a career ended up hating what they did for a living, but only realised this once it was too late. There really is no rush to start down the career path, so I find it strange how many people settle into their careers so soon. It’s something I associate more with my parents’ or grandparents’ era, and it strikes me as oddly conservative.

More importantly, travelling is a lot of fun. It’s also something that gets harder to do as you progress in your career, buy a house, raise a family and settle down. So it’s something I always recommend people do while they’re young, or risk missing out on. After all, the tech industry will always be there, but you only live once, so you may as well make the time count.

Comments (6)

Specialism, Ego and The Dunning-Kruger Effect | February 19, 2014

Every few weeks I see a discussion emerge that tries to dismiss the need for specialists in our industry, or refute their existence entirely. It usually goes along the lines of “I’m a [insert discipline] and I do my own [insert activity], so [insert specialism] is unnecessary or doesn’t exist”.

While it’s great to have people with a broad range of skills and abilities, it’s also a little hurtful to people who have dedicated their careers to being good at a particular thing, as it implies that all their choices and hard work were a waste of time.

Sometimes this conversation spins into the area of job titles and their general inability to sum up exactly what an individual does. Other times it has us dismissing fairly well-understood disciplines, or trying to define the damned thing. The conversation usually ends with somebody saying something like “Well, I’m just going to call myself a Designer/Developer”, as if picking the broadest and most generic term adds clarity to the conversation.

The problem is that I really do understand the sentiment. If you’ve been working in the field of design for a very long time at a reasonably high level, everything starts to look the same. So when I’ve seen product designers, architects or moviemakers talk about their process, the similarities are uncanny. As such, it’s no surprise when very experienced people pronounce that it’s design (or development) all the way down.

However when you start to unpick each discipline, you discover that while the thought processes are similar, the individual activities and craft skills are often very different. You also realise that scale has a big influence.

If you’re working on relatively simple projects, it’s entirely possible for a talented individual or small team of generalists to create something amazing. You see this in everything from Web design and indie publishing to residential architecture and independent filmmaking.

For somebody who has built a successful career in this space, it’s very easy to look at large design agencies, architectural firms or film companies and boggle at all the specialists they have. After all, do Hollywood movies really need a best boy, key grip and clapper loader when you’ve just produced a short that you wrote, filmed and directed yourself?

It seems excessive, and maybe it is. I do think some industries have gone crazy with specialists, so there’s always room to assess whether a certain level of specialism is necessary for your particular requirements or situation.

However, therein lies the problem! People really do have a habit of making statements about the whole industry based on the small corner they inhabit. I know, as I’m as guilty of this as most. As such, we see lots of comments dismissing the need for, or even the existence of, certain specialisms - not because they don’t actually exist, but because those individuals just haven’t hit the limits of their abilities, where having those specialisms would help.

This is actually a fairly common cognitive bias known as the Dunning-Kruger effect, which sees people overestimate their own level of knowledge while failing to recognise the skills of others. So if you’ve ever watched a man try to build a fire, cook a BBQ or put up a shelf, you’ll know what I mean.

In its most passive state, the Dunning-Kruger effect manifests itself as naivety, and may actually be a key driver for learning. After all, isn’t it more enticing to think that if you start learning a new skill you’ll get good at it quickly, rather than realising that you’re going to need 10,000 hours to perfect your craft?

As such, we can help people get over this hump by expanding their worldview and explaining the deep specialisms of others. It doesn’t mean that you have to become a specialist yourself, but it’s useful to know when you’re reaching the limits of your own abilities, if only to inform where you go next.

However, I think the Dunning-Kruger effect can also play a more divisive role. I’m sure we’ve all come across the egotistical designer or creative director who rates their own abilities above all else. This approach often leads to really bad design decisions, to the detriment of the product or its users.

I’ve seen similar issues in other fields, like developers feeling they are the most qualified people to design the UX because they are expert technology users, or interaction designers believing they will make great visual designers because it’s just a case of learning Photoshop, right? This is an interesting area where Dunning-Kruger overlaps with the halo effect, making people think that because they do one thing really well, they must be good at doing other things equally well.

I think this attitude is also holding a lot of people back. I’ve met plenty of talented practitioners over the years who had the opportunity to be great, if it wasn’t for the fact that they already believed they were. A lot of this is environmental. For instance, if you happen to be great at creating mid-sized projects, or happen to be the best designer in an above-average agency, it’s easy to think that you’ve got it nailed. However, put that person in a world-class team or on a hugely complex project and watch them struggle.

I think this is why some of the best designers I know are going to work for big Silicon Valley tech companies. It forces them to move out of their comfort zone and up their game.

For me, the very best practitioners usually exhibit the opposite of the Dunning-Kruger effect, known as imposter syndrome. With this cognitive bias, people have so much visibility of the great work and skill going on in their sector that they never feel they measure up. So they quite literally feel that at some stage they will be unmasked as an imposter.

This bias has a number of benefits in our sector as it forces people to up their game and learn new things, while at the same time making them realise that they will never know everything. As such, people with imposter syndrome tend to specialise. It also means that people are incredibly critical of their work and are constantly striving to improve.

However, imposter syndrome also has its negative effects, like never believing that you’re worthy, or giving undue credit to people demonstrating Dunning-Kruger-like behaviour. In its worst manifestation, I’ve seen really amazing people stuck in mediocre jobs because they don’t truly believe how great they are.

As such, the key lesson is to try to develop a well-rounded view of the industry, the skills you have, and the skills and expertise of others. So please, no more tweets or articles like the ones described above decrying a particular skill, discipline or job title. It turns out they are very unhelpful, and more often than not wrong.

Comments (0)

Better design through Web Governance | February 8, 2014

I meet a lot of in-house designers in the course of my travels, and the same frustration keeps bubbling up: “how can I convince the company I work for to take my expertise seriously?” It seems that companies have a pathology of hiring highly talented people, then taking away the decision-making authority they need to do their jobs.

Quite often the people at the top of the business know what is broken and are trying desperately to fix it, while the people at the coal face can see the solutions but are unable to act. So what’s going on here?

It seems to me that there’s a middle layer of management responsible for turning strategy into tactics. It’s their job to understand the business goals and communicate them to the experts in a way that ensures the problems find a good solution. If this were their only responsibility, I think we’d be in a good place. However, a lot of the time this middle tier also starts filtering solutions, and this is where things start to go wrong.

I’m a firm believer that the people with the most experience in a particular facet of a business should be the ones making the decisions for that facet. It would be nonsensical for the tech team to be making core financial decisions, just as it would for the finance department to drive the technical infrastructure. So why do product managers, designers and UX practitioners constantly find their recommendations being overridden by managers from other departments with little experience in digital?

I think part of the problem lies in the hierarchical approach to management, which is a holdover from the industrial age. There has always been an assumption that as you rise up the hierarchy you gain more knowledge than the people below you, and are therefore more capable of making important decisions.

However, in the knowledge age this relationship is often reversed, with the people at the top forced to rely on the experts below them. Sadly, a lot of mid-level managers still believe they are operating in the former model, and end up prioritising their own opinions over the expertise of others.

This is one reason why I really like the idea of Web Governance. The idea is simple: put in place a governance strategy that explains how decisions get made in the digital sphere.

Web Governance allows an organisation to identify the experts in a range of different disciplines and cede responsibility for those areas to them, even if they happen to be lower in the organisational hierarchy. For instance, the governance document may state that a senior stakeholder has responsibility for delivering a set of business objectives and metrics, but that UI decisions are the ultimate responsibility of the head of UX.

Imagine working in an organisation where the head of UX had genuine responsibility for the user experience of the product, and could turn down poor ideas if they couldn’t be demonstrated to serve a specific set of business outcomes.

Of course, there will be times when these responsibilities clash, so the governance document also needs to set out who must be consulted on which decisions. However, the goal here is to encourage discussion and negotiation over blanket control based on status alone.

The main thing here is to clearly set out the roles and responsibilities of each individual, rather than have them implied by status or inferred by domain. It’s also about breaking out of the traditional corporate hierarchy and allowing experts to have decision-making responsibilities that can override more senior members in certain well-defined areas.

Web Governance feels like an effective solution to me, and all the documentation I’ve read on the subject so far seems extremely logical and positive. So if you’re struggling to get your expertise heard, maybe it’s time to start thinking about a governance strategy.

Comments (0)

Paying Speakers is Better for Everybody | August 16, 2013

When I attend a conference I’m not there for the food or the venue, I’m there for the content (and occasionally the after-parties). So it amazes me that conference organisers typically pay for everything but the thing people are there to see. That’s right: despite the often high ticket costs, very few events pay speakers for their time. I think this is bad for conference-goers, event organisers, speakers and the industry as a whole. I’ll explain.

When speakers don’t get paid for their time, it’s really hard for them to justify putting much effort into their talks. I’ve been to plenty of conferences where speakers have rushed their preparation and ended up delivering a mediocre performance. They’ll joke that they wrote the talk the evening before, and will duck out of the speakers’ dinner early to finish off their slides. This shows a certain amount of contempt for the audience, many of whom have had to fight for the budget to attend, or save up out of their own pockets. However, it’s really not the speakers’ fault. Even first-time speakers are busy people, and if you can’t justify spending the time to write, hone and practise your talk during working hours, the quality will suffer.

Another justified criticism I hear is that conferences are full of the same old voices. Interestingly enough, I believe paying all of the speakers, and not just the experienced ones, would help balance this out. Many first-time speakers give up after their first couple of attempts because they just don’t see the value in speaking. Maybe it took them much longer to write the talk than they expected and their work or home life suffered, or maybe the fame and fortune the conference organisers promised never materialised. If potentially great new speakers don’t see the conference circuit as a viable and sustainable ecosystem, they just won’t take part. I think this is a potentially huge loss.

From the organiser’s perspective, conferences are very expensive, so if they can avoid any additional costs, they will. The venue, catering and AV team most definitely won’t work for free, but it’s relatively easy to convince a speaker to do so, so many organisers will. The usual argument is that the conference isn’t making any money, so why should the speakers be paid? As a conference organiser myself, I don’t think this argument holds water, for the reasons already stated. Relative to the other costs involved, speaker remuneration is actually very low, and I’m sure most attendees would be happy to pay an extra £10 or £20 to ensure the speakers had enough time to write their talks and deliver good content.

The other argument is that speakers will gain exposure and possibly work. This may be true in a few instances, but I’ve never had anybody give me work as a direct result of a conference. I’m not saying it doesn’t happen, but it’s not as common as conference organisers would have you believe. In fact this argument is a bit like sleazy movie moguls offering young models screen tests in exchange for exposure and a shot at the big time - a shot that rarely ever comes.

In truth, it takes a speaker at least a week to prepare a new talk, if not longer. You’ve then got to add the time spent out of the office travelling to, and being at, the event. So even if you pay them $500 or $1,000, it’s unlikely they’ll be making a profit; it just makes the loss of income easier to justify. As such, exposure shouldn’t be used as an excuse not to pay - it’s just the icing on the cake if it happens.

As an organiser, I think paying speakers is actually a very good idea, whether they ask for it or not, because it changes the relationship from a voluntary one to a business one. When you’re not paying somebody, you really can’t expect them to put a lot of effort into their talk, help you promote the event or respond to your emails quickly (a constant bugbear for organisers). By paying speakers for their services, you set up a different relationship and a level of expectation that makes your life easier and the quality of your event better. We’re not talking huge piles of cash in unmarked bills, by the way. Sometimes a few hundred dollars or an Amazon voucher is enough to make a speaker feel valued.

Now I’m not saying that speakers should always charge to speak. Far from it. There are plenty of situations where it’s not practical or even desirable, such as small local community events or the local University. There are also plenty of speakers who are paid by their companies to speak as part of their jobs, so don’t expect payment. However, if an event organiser is charging for attendance and paying its other suppliers, I think it’s reasonable for speakers to expect to be treated similarly.

When you don’t pay your speakers, they will often try to get value back by other means, like pitching their product, service or upcoming book. This is especially common in the tech and start-up arena, where many speakers are promoting their companies, looking for investment opportunities or attempting to hire. I’m sure we’ve all sat through sessions that were essentially thinly veiled product pitches. I’m not saying this doesn’t happen when you pay people to speak, but it tends to be a lot less overt. Instead, folks tend to focus more on sharing useful content than on extracting additional value.

On a broader level, conference organisers wield a huge amount of influence in our community, and not paying speakers sends the wrong message about the value we place on a person’s time and expertise. It’s basically saying that your experience is worthless, and you should only get paid to push pixels or deliver code. This is the same problem I have with speculative design work, free “design competitions” and unpaid internships. As community leaders, I think it’s important for conference organisers to help define the industry they want to be part of, rather than simply save a few pounds because they know they can get away with it.

That being said, it’s also the responsibility of every speaker to ask for a fee and turn down the event if it’s not forthcoming, just as it’s your responsibility to be paid for your design work and turn down creative pitches if clients don’t want to pay. If you don’t behave this way, it’s not just yourself that you’re hurting, but every other speaker (or designer) out there. Conferences can get away with not paying their speakers because speakers allow it to happen.

When I first started speaking, it was very rare for anybody to actually offer to pay me. However, when I went back to conferences with a fee, they almost always agreed. At the very least it was the start of a negotiation. So I think speakers should be a little bolder and ask for speaking fees.

Ultimately, I think the default should be for speakers to expect to be paid, and for conference organisers to expect to be asked to pay. Not exactly a radical suggestion, I’m sure you’d agree. This creates a market and helps ensure quality and longevity. As things currently stand, most conference organisers expect everybody except the biggest names to speak for free, and do a good job of making people feel guilty if they ask. Consequently only a few people jump the chasm to become “big names” and end up speaking at every conference under the sun.

Want more quality and diversity in your conferences? Pay your speakers.

Comments (5)