modified: October 15, 2017

draft

I am embarrassed by certain software tools — for instance, that a popular taxi app for Android expects authorization to surveil my phone and erase arbitrary data. It may be that I don’t understand the granularity of the permission grants, and that I am improperly assuming the worst from the words “read all SMS messages” and “modify or delete the contents of your USB storage”. It is clear at least that the platforms in question have done little to educate users about what these requests mean, and that their engineers take no responsibility for greedy requests that risk misuse of data. Since automatically verifying dataflow properties of programs is hard, they say, it is impossible to impose fine-grained constraints on app developers. Instead, users must cede coarse permissions.
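For concreteness: dialog text along those lines typically corresponds to permission constants such as READ_SMS and WRITE_EXTERNAL_STORAGE, and the unit of consent is the whole permission, not any particular use of the data. The Kotlin sketch below uses the real runtime-permission calls (available since API 23); the function name and request code are my own placeholders.

```kotlin
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager

// Placeholder request code; any constant the app chooses.
private const val SMS_AND_STORAGE_REQUEST = 42

fun requestGreedyPermissions(activity: Activity) {
    val wanted = arrayOf(
        Manifest.permission.READ_SMS,               // surfaced to the user as reading SMS messages
        Manifest.permission.WRITE_EXTERNAL_STORAGE  // surfaced as modifying or deleting storage contents
    )
    val missing = wanted.filter {
        activity.checkSelfPermission(it) != PackageManager.PERMISSION_GRANTED
    }
    if (missing.isNotEmpty()) {
        // Once granted, the platform places no further constraint on where
        // the SMS contents go or which files the app touches.
        activity.requestPermissions(missing.toTypedArray(), SMS_AND_STORAGE_REQUEST)
    }
}
```

Nothing in this mechanism lets the user say “read my messages, but never send them anywhere”; the grant is all or nothing.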

A user freely in control of their software could ask questions like “is my phone transmitting private data?” and halt that transmission immediately. They could specify which data-generating processes they consider private, and excise the application components that depend on those processes.
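To make that concrete, here is a purely hypothetical sketch, assuming no existing platform API: the DataSource, Component, and Policy names are illustrations of what a user-held policy might look like, where the user marks sources as private and any component that both reads a private source and transmits is excised.

```kotlin
// Hypothetical model: these types are illustrative, not an existing API.
// The user's policy, not the app, decides which components survive.
enum class DataSource { SMS, CONTACTS, LOCATION, STORAGE }

data class Component(
    val name: String,
    val reads: Set<DataSource>,
    val transmits: Boolean
)

data class Policy(val privateSources: Set<DataSource>) {
    // A component is excised if it both reads a private source and transmits.
    fun allows(component: Component): Boolean =
        !(component.transmits && component.reads.any { it in privateSources })
}

fun main() {
    val policy = Policy(privateSources = setOf(DataSource.SMS, DataSource.LOCATION))
    val components = listOf(
        Component("trip-history-upload", reads = setOf(DataSource.LOCATION), transmits = true),
        Component("offline-receipt-cache", reads = setOf(DataSource.STORAGE), transmits = false)
    )
    // Only the non-transmitting cache survives this user's policy.
    println(components.filter(policy::allows).map(Component::name))
}
```

The sketch waves away the hard part, which is knowing what each component actually reads and transmits; the point is only that the policy belongs to the user and is expressed over data, not over permission strings.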

Unfortunately, applications are generally abstract and inscrutable. Even when an application is “open source”, a user must be highly motivated to study the structure of its source code repository. Since there are practically no universal standards for software design, most of the knowledge they gain will not transfer to other applications.

We might try to name the vague property of software suggested above: call an app with a high facility for user modification “moldable”. Without a formal, social notion of moldability, it is hard for users to even conceive of their right to individually restrain or extend the software they use. Without better tools, building secure and moldable applications will remain impractical for developers. Constructing such tools and applications should be our ethical imperative.