I once had a web product that failed big-time. A major contributor to that failure was the tedium of getting new users through the sign-up process. Each screen they had to step through cost us 10 to 20% of the users. Reducing the friction of that process was key to survival. It is a thousand times easier to get a cell phone or a credit card than it is to get a passport or a learner’s permit. That wasn’t the case two decades ago.
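As a back-of-the-envelope sketch of why that loss rate is fatal: per-screen losses compound multiplicatively, so even a modest flow bleeds most of its users. The five-screen flow below is an assumed example; only the 10–20% per-screen figure comes from the anecdote above.

```python
# Compounding drop-off: if each screen loses a fixed fraction of users,
# the fraction surviving all n screens is (1 - loss) ** n.
def survivors(n_screens: int, loss_per_screen: float) -> float:
    """Fraction of the starting cohort that completes all n_screens."""
    return (1 - loss_per_screen) ** n_screens

# With 10-20% lost per screen, a hypothetical five-screen sign-up
# keeps only about a third to three-fifths of the users who started it.
for loss in (0.10, 0.20):
    print(f"loss per screen {loss:.0%}: {survivors(5, loss):.1%} complete")
```

At 10% loss per screen, 59% of users finish five screens; at 20%, only 33% do. Cutting one screen, or halving the loss on one screen, moves that number more than almost any other change.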
Public health experts have done a lot of work over the decades to create barriers between the public and dangerous items and to lower barriers to access to constructive ones. So we make it harder to get liquor, and easier to get condoms. Traffic calming techniques are another example of engineering that makes a system run more slowly.
I find these attempts to shift the temperature of entire systems fascinating. This is at the heart of what you’re doing when you write standards, but it’s entirely scale free… In the sphere of internet identity it is particularly puzzling how two countervailing forces are at work: one trying to raise the friction and one trying to lower it. Privacy and security advocates are attempting to lower the temperature and increase the friction. On the other hand there are those who seek, in the solution to the internet identity problem, a way to raise the temperature and lower the friction, so that more rather than fewer transactions would take place.
The idea of ‘process friction’ is especially pertinent as applied to architectures of control. Simply, if you design a process to be difficult to carry out, fewer people will complete it, since – just as with frictional forces in a mechanical system – energy (whether real or metaphorical) is lost by the user at each stage.
This is perhaps obvious, but is a good way to think about systems which are designed to prevent users carrying out certain tasks which might otherwise be easy – from copying music or video files, to sleeping on a park bench. Just as friction (brakes) can stop or slow down a car which would naturally roll down a hill under the force of gravity, so friction (DRM, or other architectures of control) attempts to stop or slow down the tendency for information to be copied, or for people to do what they do naturally. Sometimes the intention is actually to stop the proscribed behaviour (e.g. an anti-sit device); other times the intention is to force users to slow down or think about what they’re doing.
From a designer’s point of view, there are far more examples where reducing friction in a process is more important than introducing it deliberately. In a sense, isn’t this what usability is? Affordances are more valuable than disaffordances, hence the comparative rarity of architectures of control in design, but also why they stand out so much as frustrating or irritating.
The term cognitive friction is more specific than general ‘process friction’, but still very much relevant – as explained on the Cognitive Friction blog:
Cognitive Friction is a term first used by Alan Cooper in his book The Inmates are Running the Asylum, where he defines it like this:
“It is the resistance encountered by a human intellect when it engages with a complex system of rules that change as the problem permutes.”
In other words, when our tools manifest complex behaviour that does not fit our expectations, the result can be very frustrating.
Going back to the Ben Hyde article, the use of the temperature descriptions is interesting – he equates cooling with increasing the friction, making it more difficult to get things done (similarly to the idea of chilling effects), whereas my instinctive reaction would be the opposite (heat is often energy lost due to friction, hence a ‘hot’ system, rather than a cold system, is one more likely to have excessive friction in it – I see many architectures of control as, essentially, wasting human effort and creating entropy).
But I can see the other view equally well: after all, lubricating oils work better when warmed to reduce their viscosity, and ‘cold welds’ are an important subject of tribological research. Perhaps the best way to look at it is that, just as getting into a shower that’s too hot or too cold is uncomfortable, so a system which is not at the expected ‘temperature’ is also uncomfortable for the user.