Over the last couple of years, this site has examined, mentioned, discussed or suggested around 250 examples of ‘control’ features or methods designed into products, systems and environments – many of which have come from readers’ suggestions and comments on earlier posts. I’d resisted classifying them too much, since my original attempt wasn’t entirely satisfactory, and it seemed as though it might be better to amass a large quantity of examples and then see what emerged, rather than try to fit every example into a pre-defined framework.
As I start work on the PhD, though, it becomes more important to formalise, to some extent, the characteristics of the different examples, in order to identify trends and common intentions (and solutions) across different fields. My thinking is that while the specific strategies behind the examples may be completely disparate, there are, on some level, commonalities of intention.
Abstracting to the general…
For example, paving an area with pebbles to make it uncomfortable for barefoot protesters to congregate (as at the University of Texas at Austin), and a system which curtails a targeted individual’s mobility by remotely disabling a public transport pay-card, have very different specific strategies, but the overall intention in both cases is to restrict access based on some characteristic of the user, whether it’s bare feet or some data field in an ID system. In one case the intended ‘strength’ of the method is fairly weak (it’s more about discouragement); in the other the intended strength is high: this individual’s freedom must be curtailed, and attempted circumvention must be detected.
In the case of the pebbles, we might describe the method as something like “Change of material or surface texture or characteristic”, which would also apply to, for example, rumble strips on a road; the method of disabling the pay-card might be described as “Authentication-based function lockout”, which could also describe, say, a padlock, at least on the level of keyholder authentication rather than actual identity verification. (Note, though, that the rumble strip example doesn’t match the access-restriction intention, instead being about making users aware of their speed. Similar methods can be used to achieve different aims.)
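To make the abstraction concrete, here is a minimal sketch (in Python, and purely illustrative – the field names and the ‘strength’ scale are my own working assumptions, not a settled taxonomy) of how each example might be recorded along these lines, so that examples sharing a method, or an intention, can be grouped automatically:

    from dataclasses import dataclass

    @dataclass
    class ControlExample:
        # One observed example, classified along hypothetical axes
        name: str
        intention: str   # the general aim behind the design
        method: str      # the general technique used to achieve that aim
        strength: str    # intended strength: 'weak' (discouragement) to 'strong' (enforcement)

    examples = [
        ControlExample("Pebbled paving (U Texas, Austin)",
                       "Restrict access based on user characteristic",
                       "Change of material or surface texture", "weak"),
        ControlExample("Remotely disabled transport pay-card",
                       "Restrict access based on user characteristic",
                       "Authentication-based function lockout", "strong"),
        ControlExample("Rumble strips",
                       "Make users aware of their speed",
                       "Change of material or surface texture", "weak"),
    ]

    # Grouping by method shows the same method serving different intentions
    # (grouping by intention would instead cluster the pebbles with the pay-card).
    by_method = {}
    for ex in examples:
        by_method.setdefault(ex.method, []).append(ex.name)
    print(by_method)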
…and back to the specific again
Of course, this process of abstracting from the specific example (with its specific strategy) to a general principle (both intention and method) can then be reversed, with a different specific strategy in mind: the specific strategy is independent of the general principle. Readers familiar with TRIZ will recognise this approach – from this article on the TRIZ Journal website:
TRIZ research began with the hypothesis that there are universal principles of creativity that are the basis for creative innovations that advance technology. If these principles could be identified and codified, they could be taught to people to make the process of creativity more predictable. The short version of this is:
Somebody someplace has already solved this problem (or one very similar to it.)
Creativity is now finding that solution and adapting it to this particular problem.
…
Much of the practice of TRIZ consists of learning these repeating patterns of problems-solutions, patterns of technical evolution and methods of using scientific effects, and then applying the general TRIZ patterns to the specific situation that confronts the developer.
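In the same spirit – and again purely as an illustrative sketch, with a made-up catalogue rather than anything drawn from TRIZ itself – the reversal can be thought of as a lookup: given a general intention, retrieve the general methods already known to serve it, each of which the designer can then re-specialise into a new strategy for the problem at hand:

    # Hypothetical catalogue mapping general intentions to general methods
    # observed to achieve them (entries invented for illustration).
    methods_by_intention = {
        "Restrict access based on user characteristic": [
            "Change of material or surface texture",
            "Authentication-based function lockout",
        ],
        "Make users aware of their speed": [
            "Change of material or surface texture",
        ],
    }

    def suggest_methods(intention):
        # Return the general methods known to serve an intention; turning
        # each one into a new *specific* strategy is the creative step.
        return methods_by_intention.get(intention, [])

    for method in suggest_methods("Restrict access based on user characteristic"):
        print(method)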
So, following on from the above examples, where else is restricting access based on some characteristic of the user ‘useful’ to some agency or other? (Clearly there are many instances where most readers will probably feel that restricting access in this way is very undesirable, and I agree.) But let’s say, from the point of view of encouraging / persuading / guiding / forcing users into more environmentally friendly behaviour (which is the focus of my PhD research), that it would be useful to use some characteristic of a user to restrict or allow access to something which might cause unnecessary environmental impact.
An in-car monitoring system could adjust the sensitivity (or the response curve) of the accelerator pedal so that a habitually heavy-footed driver’s fuel use is reduced, whilst not affecting someone who usually drives economically anyway. (A persuasive, rather than controlling, alternative would be a system which monitors driver behaviour over time and gives feedback on how to improve economy, such as the Foot-LITE being developed at Brunel by Dr Mark Young.) Or perhaps a householder who throws away a lot of rubbish one week (recorded by the bin itself) is prevented from throwing away as much the next: each taxpayer is given a certain allocation of rubbish per year, enforced by an extension of the ‘bin-top spy’ already being introduced, which prevents the bin being opened once the limit has been reached. (OK, cue massive fly-tipping: it’s not a good idea – but you can bet someone, somewhere, has thought of it.)
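To take the accelerator-pedal idea a step further, here is a rough sketch of the kind of response-curve adjustment involved: the mapping from pedal position to throttle demand is flattened for a driver whose monitored ‘economy score’ is low. Everything here – the score, the exponent, the numbers – is an invented illustration, not how Foot-LITE or any real drive-by-wire system works:

    def pedal_response(pedal_input, economy_score):
        # Map pedal position (0..1) to throttle demand (0..1).
        # economy_score is 1.0 for a consistently economical driver (giving
        # a linear response); lower scores flatten the curve, so the same
        # pedal travel demands less throttle. All values are illustrative.
        score = max(0.0, min(1.0, economy_score))
        exponent = 1.0 + 2.0 * (1.0 - score)  # from 1.0 (linear) up to 3.0
        return pedal_input ** exponent

    print(pedal_response(0.5, 1.0))  # economical driver: 0.5 (linear response)
    print(pedal_response(0.5, 0.3))  # heavy-footed driver: ~0.19 (softened)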
Both of the above ‘control’ examples strike me as technical overkill: unnecessarily intrusive and unnecessarily coercive. But thinking on a simpler level, and extending the ‘characteristic of the user’ parameter to include characteristics of an object borne by the user (such as the padlock key mentioned earlier), we might include everything from the circular slots and flaps on bottle banks (which make it more difficult to put in other types of rubbish – restricting access based on a characteristic of what the user’s trying to deposit) to narrower parking spaces or physical width restrictions which prevent (or discourage) wider vehicles, such as 4x4s, from being used in city centres.
At this stage, these thoughts are fairly undeveloped, and I’m sure the methods of classification will evolve and mature, but even writing a post such as this helps to clarify the ideas in my mind. The real test of any such system is whether it can be used to suggest or generate worthwhile new ideas, and I haven’t reached that stage yet.