From: CSBVAX::MRGATE!AWALKER@RED.RUTGERS.EDU@SMTP 11-NOV-1987 18:18
To: EVERHART
Subj: Why secure systems?

Date: Monday, 2 November 1987, 21:56-EST
From: Nick Papadakis <@eddie.mit.edu:nick@MC.LCS.MIT.EDU>
Subject: Why secure systems?
To: security@RUTGERS.EDU
ReSent-Date: 11 Nov 87 03:16:14 EST
ReSent-From: *Hobbit*
ReSent-To: Security: ;
ReSent-Message-ID: <12349694177.28.AWALKER@RED.RUTGERS.EDU>

In the interest of beginning a flame war (things have been too quiet lately ...), I offer the following text, which was written by Richard Stallman in 1983.

If I ignore for the moment RMS's interpersonal skills and concentrate on *what he is saying* rather than how he goes about persuading people of its truth (which has alienated a good many folks), I have to admit that it sounds to me as if he is on the right track.

What do you think?

	- nick

--- file is oz.ai.mit.edu:whyhack.text.10 ---

Recently the teen-age computer "hacker", the security cracker, has become a topic of national concern.  But the many articles on the subject have condemned the cracker without showing the galling aspects of the way of life he is rebelling against and without questioning its ethical foundation.  There is no hint that the confused cracker of today may be resisting, albeit ineffectually, a serious social problem of tomorrow.

If you look at the social organization of the users on a typical timeshared computer of today and compare it with other social groups, it most resembles the Soviet Union.  It is pervaded by suspicion, ruled arbitrarily by a small oligarchy, and hostile toward outsiders.  This arouses resentment, which inspires the security crackers.  But the authoritarian social organization itself is a worse problem than the crackers are.

Most computer users see no alternative.  I am fortunate in having experienced one.  At the MIT laboratory where I have worked as a researcher for ten years, our old computer system treated users as free equals with a responsibility to cooperate, and guests were welcome.  Our hospitality guided clever young people to become responsible engineers rather than crackers.

On the typical computer system, the activities of the ordinary users are regulated very narrowly and precisely by the elite, who are bound by no principle of fairness or due process and allow no appeal.  Which files you can read, which files you can write, how many files you can have, what programs you can run, how long you can use them, and when you can log in are under their control.  They can bump you off the computer at any time.  They can watch what you type as you use the computer; you cannot watch them.  They can make it very easy for you to do your job, if they like you, or if you curry favor.  Or they can obstruct you at every turn, making your life miserable.  You have no recourse.  They can use the commands that change a user's restrictions, and you cannot, because your restrictions don't permit it.

The users are suspicious of each other, and use "file protection" to deny each other access to files.  Often this means you cannot make progress in your work because you need to fix a program you cannot get at.  People with high morale become discouraged and cynical because of this.  The authorities are immune to file protection, however, and can easily erase your file if they do not like what it says.

People outside the organization are viewed with hostility and suspicion.  They are presumed to lack only an opportunity to delete or scramble all the files on the computer.
If the computer is idle, at night for example, its computing power goes to waste rather than allow an outsider to use it for a constructive purpose (such as learning to program).

Now imagine that one of the people outside the organization, the recipient of all this suspicion and hostility, is a hacker: a person who is curious, playful and enjoys clever humor.  (When computer researchers at MIT in the 1960's first began calling themselves "hackers", this is what they meant.  I am proud to call myself a hacker, and I call security-breakers "crackers" to emphasize the distinction.)

A hacker, finding a mysterious and complicated computer system, wants to understand it.  He would like to explore the computer system, to learn how to use it, or to learn how it works.  He knows in advance what reception he will get if he simply asks to use the computer when there is spare time.  And he senses intuitively that computer system authorities in general are amoral and do not deserve respect.  Naturally, he tries to sneak in and use the computer anyway.  He becomes a cracker.  If successful, he gets to explore and learn, and can be proud of his cleverness as well.  Beefed-up security measures only make the battle of wits more challenging and absorbing.

But if he is only a teen-ager, he is probably not used to the kind of thinking that would enable him to question the social system he is part of.  (The teen-agers who are politically aware are usually not the computer enthusiasts.)  He knows only that he has something to resent.  So he does not make a serious attempt to change the system.  The best he can manage is instinctive, furtive disobedience.

This is why the young cracker seems so unsure of the rightness of his actions, and occasionally may do minor damage, almost without noticing.  He has not asked the question of how he ought to behave, or how the computer owners ought to behave.

This is also why it is so easy to win a cracker over to the security-enforcing establishment with personal inducements.  Joining the authorities will end his direct personal difficulties and recognize his cleverness, even better than successfully evading them.  Without an ethical awareness, he does not see that he solves his own problem only by contributing to similar problems for others.

The software on most computer systems is designed to support the ruling class just as surely as the KGB is.  The software written and used by the hackers at MIT was designed to make users free and equal.  Our system had no restrictions that could be imposed on selected users; all users were treated alike.  Thus nobody could seize power by restricting everyone else.

We did not care whether a change to the files was authorized; we cared whether it was an improvement.  This can only be decided by human beings, on a case-by-case basis.  So, rather than having file protection to control changes, we called for discussion of any planned change.

And if a stranger came to the lab and wanted to play with the computer when it was not fully needed by us--we let him!  Chances are he would appreciate some of the value of our work, learn from it, and spread the knowledge to others.  At best, he would become enthusiastic for our software and our attitudes, join our lab, and contribute to our work.

People hearing about our lab usually took it for granted that our system would be destroyed by vandals.  Actually, vandalism was very rare, and the damage done by vandals was small compared with the damage caused by the inevitable computer malfunctions and our own mistakes.
Simple measures analogous to the glass window on a fire alarm discouraged dangerous activities, deliberate or accidental, without actually forbidding anything.  Ultimately it was rising commercialism that destroyed the lab and caused our old computer system to be junked.

The technology of computer security is not suited to any middle ground between the extremes.  Unless security is iron-fisted and dominates the lives of the users, it is easy to circumvent, and useless.  We should put military secrets, bank records and the like on computers with strict security.  For other activities, we should have computers that are free of security, and free of its burdens.

Then we need not attack the symptom of morally confused crackers with jail threats, security technology, or hiring them as security enforcers to breed more resentment and new crackers.  We can invite them to use computers openly on terms of mutual respect, and they will repay our friendship tenfold.  Their cleverness and curiosity are just what make for a creative engineer.

So far the issue of security versus freedom on computer systems affects mainly computer hackers.  But, in the future, computer systems will play a bigger and bigger role in everyone's life.  And these systems will be built on today's entrenched authoritarian tradition, unless we stop it.  The crackers are a warning sign of a problem that every American is going to face--soon.