Freedom to Tinker – The Freedom to Tinker with Freedom?


[Image: an open bonnet]

At Freedom to Tinker, David Robinson asks whether, in a world where DRM is presented to so many customers as a benefit (e.g. Microsoft’s Zune service), the public as a whole will be quite happy to trade away its freedom to tinker; whether the law needs to intervene; and, if so, on which side: ensuring the freedom to tinker, or outlawing it in order to enshrine the business model that “most people” will be portrayed as wanting, given the numbers who sign away their rights in EULAs and so on.

“Many of us, who may find ourselves arguing based on public reasons for public policies that protect the freedom to tinker, also have a private reason to favor such policies. The private reason is that we ourselves care more about tinkering than the public at large does, and we would therefore be happier in a protected-tinkering world than the public at large would be.”

Many of the comments – and those on the follow-up post – look in more detail at the legal issues, drawing some very interesting analogies to freedom of expression and making points about the impact on innovation – which benefits everyone – when power users are prevented from innovating.
I felt I had to comment, since this is an issue central to the architectures of control research; here’s what I said:

“I think I’d ask the question, “Even if it becomes illegal to tinker with a device, what is there to stop someone doing it?”
If it is purely the fear of getting caught, then tinkering will be stifled, to some extent. But power users will form groups just as they do now, and some tinkering will still go on. (If the tinkering is advanced enough, it will be too difficult for law enforcement to detect or understand anyway.)
At present much file-sharing activity is illegal, but it still goes on in vast quantities. The fear of getting caught is a major deterrent to that activity, I’d suggest; there may also be an ethical component to the decision in many people’s minds. They’re told it’s analogous to stealing a CD from a store, and they believe it, or are at least partially persuaded by it: it seems immoral or unethical.
But does anyone seriously believe that tinkering with devices is unethical? (There are probably a few people who do, e.g. ZDNet’s Adrian Kingsley-Hughes.)
Tinkering with devices will never seem immoral or unethical to the vast majority of the public, hence the only barriers to stop them doing it are a) fear of getting caught and b) lack of knowledge or desire. Most people don’t bother tuning up their cars or tinkering with their computers, even though they could.
Power users do, and in a future where tinkering is illegal, it will again only be power users who do it, and fear of getting caught will be the only reason for not doing it.
So what about this fear of getting caught? How likely is it that one’s modifications or tinkering will be detected by some kind of enforcement agency? The only way I can see that this could be carried out in any kind of systematic way would be if observation/reporting devices were embedded in every product, e.g. every PC reporting home every few hours to squeal if it’s been modified.
But we already have that! Or at least we will soon, and therefore it seems irrelevant whether or not it becomes illegal to tinker with devices. If every computer is ‘trusted’ and spies and reports on its user’s behaviour, whether it reports to Microsoft or a Federal Anti-Tinkering Agency is, perhaps, beside the point.
Architectures to prevent or stifle tinkering can be designed into products and technologies whether or not there is a law requiring them. The user agrees to have his/her behaviour and interactions monitored and controlled by the act of purchasing the device.
Even if the law went the other way, and there were a legally guaranteed right to tinker, all that would happen is that manufacturers would make it more difficult to do so through the design of the products themselves. Hoods (bonnets) would start to be welded shut, in Cory Doctorow’s phrase (the Audi A2 already has this, sort of), backed up by stringent warranty provisions. You might have a right to tinker with your device, but no law is going to compel the manufacturers to honour the warranty if you do so.
This, I think, is the crucial issue: the points Lessig makes about the designed structure of the internet – the code – superseding statute law as the dominant shaper of behaviour in that medium apply just as strongly to technology hardware. Architectures of control in design will control users’ behaviour, however the laws themselves evolve.”
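
As an aside, the kind of ‘phoning home’ integrity check mentioned in the comment above might look something like the minimal sketch below. It’s purely hypothetical – the component names, baseline hashes and report structure are invented for illustration, and no real trusted-computing platform works exactly this way – but it shows how little machinery a device needs in order to detect, and report, that it has been tinkered with.

```python
import hashlib
import json

# Hypothetical illustration only: a 'trusted' device measuring its own
# software components and flagging any deviation from a vendor-approved
# baseline. All names and hash values here are invented placeholders.

APPROVED_MEASUREMENTS = {
    "bootloader.bin": "9f2c1a...",   # placeholder baseline hashes
    "kernel.img": "41d8cd...",
    "player_app": "7b5200...",
}

def measure(path: str) -> str:
    """Return the SHA-256 hash of a component on disk."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def attest(baseline: dict[str, str]) -> dict:
    """Compare measured hashes against the approved baseline."""
    report = {"ok": [], "modified": []}
    for name, expected in baseline.items():
        try:
            actual = measure(name)
        except OSError:
            report["modified"].append(name)  # missing counts as tampered
            continue
        (report["ok"] if actual == expected else report["modified"]).append(name)
    return report

if __name__ == "__main__":
    result = attest(APPROVED_MEASUREMENTS)
    # In the scenario sketched in the comment, a report like this would be
    # sent every few hours to the vendor (or a hypothetical
    # 'Federal Anti-Tinkering Agency'), regardless of what the law says.
    print(json.dumps(result, indent=2))
```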