News

Should Virus Distribution be Illegal? (436 comments)

mccormi writes "In a guest editorial on Newarchitect, Sarah Gordon looks at whether posting malicious code should be allowed and what steps could be taken to stop it. What's worrisome, though, is that restrictions on malicious code don't take into account who it's malicious against and what truly defines malicious." Note that she's not talking about actually infecting computers, but merely making the code available for others to examine (and for some of them, no doubt, to try to spread in the wild).
  • by NeoSkandranon ( 515696 ) on Friday April 12, 2002 @03:42PM (#3331172)
    Unless the law specified distribution of *machine readable* malicious code (i.e., binaries), then MS et al. could start nailing those who post proof-of-concept code to demonstrate the flavor-of-the-week exploit in IIS or WinXP or what have you... more security by obscurity, yippee.
  • by Anonymous Coward on Friday April 12, 2002 @03:43PM (#3331178)
    You take the good, you take the bad, that's the facts of life.
  • Well... (Score:5, Insightful)

    by IronTek ( 153138 ) on Friday April 12, 2002 @03:43PM (#3331179)
    Though no one likes to get a virus, and I often wonder who writes them and for what reasons, I do believe that there is probably much to be learned from examining them as far as system function goes. From a learning standpoint, those who write them, while having too much free time on their hands, are learning some hard-core programming concepts, as are those who fight them. For the casual programmer, taking a peek at their code every now and then can actually be beneficial. But, as always, it's the person that can make good code cause bad things and vice versa. As always, it comes down to the person, not the code. The code itself should not be illegal. Knowledge cannot be locked up, and if it is, it can break free in a dangerous way. Better to have it out in the open where the "good guys" can combat it if need be, and everyone can learn from it.
  • Of course not (Score:3, Insightful)

    by jvbunte ( 177128 ) on Friday April 12, 2002 @03:44PM (#3331190) Journal
    How is posting potentially harmful virus code any different than posting OS vulnerabilities and exploits? If this were to become law, how long would it take a certain OS manufacturer to extrapolate that same concept to cover all 'malicious' code fragments that could be used to target their OS?

    I don't like people who write viruses, and I like getting them even less; however, censoring the ability to post/review such code is just another step down the slippery slope toward censorship of other things.
  • Re:Well... (Score:1, Insightful)

    by Anonymous Coward on Friday April 12, 2002 @03:46PM (#3331204)
    Software is a form of speech. By not allowing me to distribute my software, be it a virus or otherwise, you're restricting my freedom of speech, and that's unacceptable.
  • by mistermoonlight ( 80842 ) on Friday April 12, 2002 @03:46PM (#3331205)
    If you're using malicious code for analysis so it can be diffused, yes.


    The more widely known the code becomes, the easier it is to counter.


    It also separates the wheat from the chaff in terms of IT employees: whoever keeps up is a valuable resource in a sea of lax workers.

  • Re:Hmm. (Score:3, Insightful)

    by 56ker ( 566853 ) on Friday April 12, 2002 @03:51PM (#3331232) Homepage Journal
    What about something along the lines of:

    If this virus causes you problems with your computer, the author cannot be held legally responsible.

    Do you agree [Y/Yes]?
  • by dryueh ( 531302 ) on Friday April 12, 2002 @03:51PM (#3331236)
    Well... this issue raises some interesting, and very classic, ethical questions.

    Freedom of speech is protected, and rightly should be, but there are limitations to that freedom and even --gasp-- responsibilities. Writing code for viruses, or supplying it to the public, isn't bad in itself--it's the use of it where the ethical complications come in. Thus, one could claim that simply posting the code for viruses is fine... the people to be blamed are the ones using that code for negligent purposes.

    The same could be true for yelling 'FIRE' in a crowded theatre, right? If an avalanche of trouble ensues, the fault must lie with the people who push over old ladies to get out of the theatre first, right? I mean, the person who yells fire may have played a role in facilitating all the chaos, but the ones actually causing the injury are those running around...

    Of course, these two scenarios (the virus code and yelling fire) are completely different, but they raise similar points. Freedom of speech doesn't make you free from responsibility for your chosen speech... whether that's yelling 'Fire' or writing/supplying code for viruses.

  • by RailGunner ( 554645 ) on Friday April 12, 2002 @03:52PM (#3331240) Journal
    The United States Constitution protects free speech, but virus writing and subsequent distribution aren't pure speech. Rather, they're speech plus action. The U.S. Supreme Court has recognized that speech and action, while closely intertwined, aren't one and the same. Thus, the act of putting virus code on the Internet isn't necessarily protected.

    I have to strongly disagree with this. Putting up information on the web that shows a person how to write a virus or a DoS bot or anything else is purely free speech, it's the free release of information. The action she's talking about here is the action of posting information, which is not malicious at all.

    To further illustrate her misguided logic by taking it to the absurd, let's apply this reasoning to other realms. By her logic, if you teach a person to use a gun, and that person takes that knowledge and shoots and kills someone, then you should go to prison for murder. Sorry, that doesn't fly. Knowing how to write a virus and teaching others how to write one isn't illegal; it only becomes criminal when you compile that source and make an effort to infect computer systems with it.

    Information, no matter what can be done with it, is never "good" or "bad" - it's what you do with that information, the actions you take, that are good or bad.

    Like it or not, even virus code should be protected under the First Amendment. For actually compiling a virus and releasing it into the wild, however, there should be stiff penalties.

  • by coyote-san ( 38515 ) on Friday April 12, 2002 @03:55PM (#3331258)
    Damn it, what part of "Freedom of Speech" do people not get?

    History has made it clear that people pay dearly when free speech, especially free speech regarding a matter of community security, is abridged. Not telling us that Acme locks are easily broken does not protect us from criminals who are too dumb to figure it out for themselves; it only serves to give us a false sense of security.

    (As an aside, this is also the foundation of some of the most damning condemnations I've seen of "child protection" laws. As some judges have observed, the true obscenity is attempting to protect minors from all adult concerns until their 18th birthday... at which point they are thrown to the wolves with absolutely no preparation for the very real challenges adults must face.)

    A virus exchange site is similar. Yes, there will be some idiots (who deserve to have the full wrath of the law come down on them for their acts) who will use those viruses for ill. But the same sites will also let others be warned that viruses against this specific software exist and are in the wild. No more Microsoft stonewalling about the existence of such attacks. No more trivializing them as highly specialized and not a concern for the average user.

    This is a bit scary... but that's part of being an adult. A child can go to bed at peace that the closet is empty of monsters, but part of being an adult is knowing that there are bad guys out there *and* that you've done everything you can to keep them away. I, for one, am getting damn tired of my self-appointed "betters" trying to infantilize me.
  • by dryueh ( 531302 ) on Friday April 12, 2002 @03:57PM (#3331269)
    By her logic, if you teach a person to use a gun, and that person takes that knowledge and shoots and kills someone, then you should go to prison for murder.

    No, that's not her argument. If you teach someone to shoot a gun, and they then go and kill someone, it's true that you shouldn't be held responsible for that person's actions.

    Her point is something different. If you give a loaded handgun to someone and they run out the door and shoot someone, you're an accessory...right? There's a difference between supplying someone with knowledge versus supplying them with a weapon.

    So, if we teach someone how to program and they use that programming knowledge to write virus code, that's not our fault. However, if we give someone the code for a virus program and they simply release it into the mainstream, I don't think many people would deny that we played a role in that destruction.

  • by bluGill ( 862 ) on Friday April 12, 2002 @04:01PM (#3331306)

    I can distribute instructions on how to turn a gun into a machine gun; that is legal.
    I can legally distribute instructions on how to make drugs.
    It is legal to distribute instructions on how to make bombs.
    I can join a club that intends to destroy the current government.
    I can legally plan a murder.

    In all of the above situations, following through and doing the act in question is illegal. However, knowing how to do it, and discussing it, is not. Once the act is done, though, not only is the act itself illegal, but possessing or doing any of the above turns what was a legal act into conspiracy, which makes the act a high crime.

    But we are not even talking about the above situations, where there is no legal reason to use that information. Instead we have:

    I can buy and use lockpicks.
    I can own and shoot a gun.
    I can own and use a car.
    I can drink alcohol.

    All of the above are legal and have legal uses; all can be used illegally.

    Likewise, there is benefit in distributing the source code for a virus. Programmers should study such things to understand how they work. Only through such understanding can we take the next step and write programs that prevent them from working. (This is an arms race; virus writers are getting better all the time, so we need to get better too.)

  • by Philbert Desenex ( 219355 ) on Friday April 12, 2002 @04:03PM (#3331327) Homepage
    Sarah Gordon may have some good points. It's hard to tell.

    She never bothers to define the term "virus" in a way that an arbitrary individual (me or an intellectual property lawyer or a World Court Judge) can use to determine whether or not some source code constitutes a "virus".

    If she follows Fred Cohen's definition ("sequences of instructions in machine code for a particular machine that make exact copies of themselves somewhere else in the machine" - "A Short Course on Computer Viruses," 2nd ed., ISBN 0-471-00769-2, John Wiley & Sons, 1994), which is pretty much an English transliteration of the mathematical definition, then even things like /bin/cat or /bin/cc become "viruses" under some circumstances.
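
    To make that concrete, here is a minimal C sketch (my own illustration, not anything from the article or from Cohen's book; the file name and usage are hypothetical): a program whose only behavior is to write an exact copy of a file -- typically its own binary -- somewhere else on the machine. Under a literal reading of the definition quoted above it "makes exact copies of itself somewhere else in the machine", yet it is no more malicious than /bin/cp or /bin/cat.

        /* self_copy.c -- a benign program that satisfies a literal
         * "makes exact copies of itself somewhere else" definition.
         * Typical use: ./self_copy ./self_copy /tmp/self_copy.clone
         */
        #include <stdio.h>

        int main(int argc, char **argv)
        {
            if (argc != 3) {
                fprintf(stderr, "usage: %s <source-binary> <destination>\n", argv[0]);
                return 1;
            }

            FILE *in = fopen(argv[1], "rb");   /* usually this program's own binary */
            FILE *out = fopen(argv[2], "wb");
            if (in == NULL || out == NULL) {
                perror("fopen");
                return 1;
            }

            int c;
            while ((c = fgetc(in)) != EOF)     /* byte-for-byte duplicate */
                fputc(c, out);

            fclose(in);
            fclose(out);
            return 0;
        }

    Any definition broad enough to catch this also catches cp, cat, cc, and every backup tool; any definition narrow enough to exclude them has to say something about intent or payload, which is exactly what the editorial never does.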

    Sarah Gordon is just fear-mongering at this point. Until she says "the term 'virus' means code that...", objecting to her editorial is just automatic: she's using a term that has (1) a specific technical or mathematical meaning (to Fred Cohen and many Slashdot readers) and (2) a vague "common sense" meaning (to Windows users, the general public, and a few Slashdot readers). She's arguing based on both meanings. She's hoping that emotional or poorly intellectualized reactions to meaning (2) will get code representing meaning (1) outlawed.

    It's crap. Give it up Sarah.

    And just for good measure: http://cm.bell-labs.com/cm/cs/who/doug/v101.ps Read it and weep Sarah. Neener neener neener!
  • by rtm1 ( 560452 ) on Friday April 12, 2002 @04:04PM (#3331331)
    It says in the article: "virus writing and subsequent distribution aren't pure speech. Rather, they're speech plus action."

    But it is never elaborated on at all. I do not understand how it can be said that posting something on the web is any more of an action than the physical act of mailing a letter to the editor, but we do say that mailing a letter to the editor falls squarely under free speech. How are we supposed to separate speech and action (something the article acknowledges are different) on the internet if the act of posting places your content beyond pure speech? How are we supposed to have free speech if we are prevented from speaking to others by posting our thoughts?

    There is a big difference between saying "This code will infect machines and do this to them" and then compiling that code and releasing it with malicious intent. One is speech, the other is action. It is the same as the difference between saying "I could break into your home by doing this" and then actually going out and doing it. One is not illegal, the other is.

    This reminds me of another issue. How long before distributing an MP3 player makes you an accomplice to copyright infringement because you haven't included draconian copy-protection schemes? The problem is social, not technological.

  • by HMC CS Major ( 540987 ) on Friday April 12, 2002 @04:06PM (#3331345) Homepage
    This sets a dangerous precedent.

    By outlawing the distribution/posting of software deemed "malicious", it becomes only a matter of time until someone attempts to apply the same law to security tools such as nmap, ethereal, and any/all proof-of-concept exploits.

    The distribution of "malicious" code should be regulated (or intentionally unregulated) much the same as file sharing should be: posting things for others should be legal; using things for illegal and malicious acts should not be.

    The problem, though, is the impossibility of catching everyone who uses a "malicious program" once it has been posted. Much like peer-to-peer file sharing, once something is online, it is difficult or impossible to contain. Hence, a paradox: legislators intelligently see that the only way to truly stop these nuisances is to stop them at the source, the single point of failure; unfortunately, that seems to violate fair use and free speech principles. The only way to stop these nuisances is to trample on protected principles.

    I, unfortunately, see no easy solution to this problem.
  • by gnovos ( 447128 ) <gnovos@NoSpAM.chipped.net> on Friday April 12, 2002 @04:07PM (#3331351) Homepage Journal
    ...and do a damn good job. Without an *ironclad* definition, you could make a case for things like, say, Outlook being "malicious". I don't mean to attack Microsoft; I mean that *anything* that unintentionally or intentionally causes damage could be considered malicious. Could "rm" be considered a "malicious" piece of code?
  • by DotComVictim ( 454236 ) on Friday April 12, 2002 @04:16PM (#3331404)
    I don't think it's possible to come up with a generally acceptable definition for "malicious code". Prove me wrong.

    Counterexamples:

    Internet Explorer and Netscape both trying to become the default system browser, with or without user knowledge. Are these pieces of code being malicious to each other?

    A trojan horse which requires willful (but not knowing) participation from the user to install.

    A piece of software which serves a controversial, but generally beneficial purpose. For example, a spam bot trap, or news cancellers.

    A script-kiddie-proof buffer overflow exploit (even if all it does is change /bin/sh to " bin sh" -- in hex, though).

    Anti-virus software which could produce false positives and stop software packages from running.

    A background ad-server which gets installed automatically, and unknowingly, by ISP or P2P client software. (Yes, I would like that to be considered malicious.)

    An auto-update server which gets installed automatically, and unknowingly, by the OS, and which transparently downloads new software components and security fixes as they become available. (That does serve a useful function, for some people.)

  • by bluprint ( 557000 ) on Friday April 12, 2002 @04:16PM (#3331406) Homepage

    After all, making things illegal is so effective.
    Can you get child pornography? No, it's illegal.
    Can you get cracked software? No, it's illegal. Can you get ripped music? No, it's illegal.
    Do servers ever suffer from DoS attacks? Do people ever make charges on other people's credit cards without the card's owner knowing? Do people ever hack into private networks?

    Of course not, it's all illegal. Logically, if we make viruses illegal to write, no one would write them... right?

  • Re:Hmm. (Score:4, Insightful)

    by dasmegabyte ( 267018 ) <das@OHNOWHATSTHISdasmegabyte.org> on Friday April 12, 2002 @04:24PM (#3331457) Homepage Journal
    Really? Well, I got this virus the other night that was intentionally installed along with KaZaa. The virus watches every packet I send across the internet and reports it back to the hackers that control it.

    Some people call it "ad ware" or "annoyance ware," but since I didn't want it, it reduces the effectiveness of my PC, and I wasn't alerted to its presence, I consider it a virus.

    Can I sue the manufacturers for damages?
  • by ebyrob ( 165903 ) on Friday April 12, 2002 @04:39PM (#3331555)
    In a guest editorial on Newarchitect, Sarah Gordon looks at whether criticizing large corporations for their mistakes and shoddy products should be allowed and what steps could be taken to stop it. What's worrisome, though, is that restrictions on criticism don't take into account who it's against and what truly defines criticism. Note that she's not talking about actually infecting computers, but merely making the criticism available for others to examine (and for some of them, no doubt, to use as a tool for damaging corporate profits).

    From the article:
    It's true that the scientific community encourages research, but only when it's conducted within the ethical boundaries of a given discipline.

    So let me get this straight... It's ethical to create software that has tons of security exploits and spies on unsuspecting users who purchase it, but it's unethical to give people the tools they need to test their systems for vulnerabilities and guarantee security for their own peace of mind. It might be OK to give such tools to large corporations, but private individuals just shouldn't need that kind of privacy...
  • Re:Hmm. (Score:3, Insightful)

    by Restil ( 31903 ) on Friday April 12, 2002 @04:39PM (#3331561) Homepage
    I know this was written somewhat in jest, but should the creator of the rm command be held liable because someone got careless with the -rf option?

    Some programs by design can, if used improperly, cause a great deal of damage. Certainly, someone using a program to delete files can't exactly claim ignorance if the program actually DELETES the files they told it to.

    So what if I download a program, and the EULA specifically warns me that running the program will spread itself to 100 people and promptly wipe all accessible hard drives? That's what the program was SUPPOSED to do, and it specifically stated that in a document that, by default, almost nobody reads.

    Outlook, or any email program for that matter, has features that allow you to forward messages to other people. So suppose someone receives a message, an executable attachment is automatically run (because the email program allows that function), a message pops up explaining that the user's computer "will now send 100 copies of the current message to anyone/everyone it can find, then wipe the disk -- press OK to continue"... and the idiot user presses OK without ever reading it. Who's to blame here?

    Yes, it's a virus (or a worm, if you will). Yes, its intent is malicious. But the user gave permission to execute it, just as surely as if he had given permission to erase his computer by running deltree /y \ instead.

    What's truly sad here is that a virus based on the previous model would probably spread just as well as your typical covert variety.

    -Restil
  • by cybermage ( 112274 ) on Friday April 12, 2002 @04:47PM (#3331597) Homepage Journal
    the Constitution is illegal to distribute!

    And you think the People in Charge (tm) have a problem with that?

    Did you know that there is a company in Texas (I've forgotten its name) that holds the copyright on a Standardized Municipal Code in use across the US, and that it doesn't allow licensees (i.e., cities) to publish it? In many places, if you want to read your city's laws, you need to pay for a license or go down to city hall and read their copy. I swear I'm not making this up.

    Ignorance of the law is no excuse. That'll be $20 for your copy.
  • Re:Hmm. (Score:3, Insightful)

    by Slynkie ( 18861 ) <jsalit&slunk,net> on Friday April 12, 2002 @04:50PM (#3331615) Homepage
    Heh, that made me imagine some little 1337 H4X0R kid running around stabbing people with pieces of trash or empty soup cans.

    Anyway, my intent was not to end the discussion by simply calling it "art". My point was, there -are- some reasons that distribution of virus code (note, I -do- say code and not executables) should not be made illegal, beyond the problems of "what constitutes malicious code" and "free speech". Virus code is -interesting-.

    Beyond that, though, I think this is very similar to the Anarchist's Cookbook argument: should writings detailing how to make bombs and other harmful objects be illegal to distribute? I certainly don't think so; it's way too much loss of freedom for an indeterminable amount of safety, in my book. And with that, we're possibly talking about real, physical harm to real people.
  • by Anonymous Coward on Friday April 12, 2002 @04:51PM (#3331616)
    Your theory is flawed. You teach someone to write code, not to code a virus. There is no reason to make that information specifically available. Here is an analogy: learning to code is like learning marksmanship. Writing a virus is like shooting someone. Wrongly applied knowledge.
  • by RatOmeter ( 468015 ) on Friday April 12, 2002 @04:55PM (#3331653)
    Posting, distributing or making available source code to viruses should be illegal? You mean, like this?

    CodeRed.zip at Eeye.com [eeye.com]

    and

    CodeRedII.zip at Eeye.com [eeye.com]

    Eeye.com has often posted proof-of-concept exploits as part of its advisories... is the author of the guest editorial saying eeye.com is doing wrong?

    Back when the original Code Red was stirring up a ruckus, I posted its disassembled code (from eeye) to alt.comp.virus.source, and a short discussion of several weird aspects (poor coding) of the code ensued. I don't think I did anything wrong by posting it. If some weasel used that post (or other such sources) to create CRII, so be it. IMO, by that time any servers that were still vulnerable to CR/CRII deserved to be hit and, better yet, TOS'd by their ISP.

    I just don't subscribe to the idea that suppressing potentially dangerous source code will do good in the long run. Having the source available and widely distributed has several advantages:
    - promotes understanding of exploit mechanisms, in order to avoid making the same mistakes in the future
    - promotes rapid deployment of fixes. There is no pressure greater than knowing every little script kiddie's got the code
    - raises awareness of code weaknesses/failure modes/common pitfalls (maybe *someday* CS courses will teach future coders to prevent buffer overflows! A small sketch of that habit follows below.)
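
    To make that buffer-overflow aside concrete, here is a small defensive sketch of my own (plain illustrative C, not code from eeye or the editorial; the names are made up): the classic overflow comes from an unbounded string copy into a fixed-size buffer, and the habit that prevents it is simply bounding every copy.

        /* bounded_copy.c -- illustrative only: the bounded-copy habit that
         * prevents the classic stack-smashing overflow.
         */
        #include <stdio.h>

        static void store_name(char *dst, size_t dstlen, const char *src)
        {
            /* strcpy(dst, src) would write past the end of dst whenever src
             * is longer than dstlen - 1 bytes; snprintf never writes more
             * than dstlen bytes and always NUL-terminates. */
            snprintf(dst, dstlen, "%s", src);
        }

        int main(void)
        {
            char buf[16];
            store_name(buf, sizeof buf, "input-far-longer-than-sixteen-bytes");
            printf("stored: %s\n", buf);   /* safely truncated, no overflow */
            return 0;
        }

    Teaching that habit (along with fgets over gets, and length checks before memcpy) is exactly the kind of openness about failure modes being argued for here.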

    I firmly believe that being open about software/network/OS weaknesses will gradually drive the state of the art in secure software to a much higher level. The "keep quiet", "head-in-the-sand" approach that M$ is promoting these days will only hinder such advances. I'll make a loose analogy to the old outlaws & guns argument: "If you outlaw virus source code, only outlaws will have virus source code."

    In fact, I think it is *imperative* that malicious source code NOT be suppressed. How else can we arm the next generations of app and OS coders to develop resistant code?

    Because it's difficult or impossible to define what exactly is "dangerous" speech. In fact, as soon as you start outlawing speech because it's "dangerous" rather than actually harmful (and even that is hard to define), you quickly get into definitions of "dangerous" that include "works against the status quo".

    For example, look at Napster - I dispute your argument that people wouldn't have broken those copyright laws anyway - how many people make copies of tapes for their friends? It's simply that Napster allowed it on a SCALE that hadn't been seen before. And I'm somewhat of the opinion that if the majority of people, when given the opportunity to break a law, would do so, then we need to re-think the law. Especially when breaking the law causes no direct harm to anyone.

    However, rather than considering that we might want to re-think copyright law into something more compatible with modern technology, they simply drop even heavier bombs and try to legislate it out of existence.

    This attitude toward speech is like the Victorian attitude toward sex - if you keep it in the dark where nobody can see it, we can all pretend it doesn't exist - but it still does. Keeping it in the open means that everyone knows it's there, and we can all talk about it. Yes, some people will abuse it - but I'd rather get hit by something I know about and can prepare for, than something which is kept secret and underground and that I don't even know about.

  • by Macrobat ( 318224 ) on Friday April 12, 2002 @05:08PM (#3331773)
    It is painful for me to hear people say that "a point of view is dangerous."

    First, we already have a lot of readily-available "dangerous" information, such as how to make napalm, pipe bombs, or homemade poisons. We have since before the advent of the internet. And I mean before 1969, not 1993. The information about how to kill one or several people is not hard to find, and never has been.

    Second, cracking and counter-cracking technologies are running an arms race, where exploits run a smaller chance of causing damage as time goes by. Some of the counter-cracking measures may advance because of altruism, but they are significantly hastened when a proof-of-concept demonstration is released to "arbitrary" parties (i.e., security-minded software consumers--the general public). They cannot afford the perception of sitting still while their security measures are overtaken.

    This is why your time-travel argument makes no sense, because you are deliberately speculating about an impossible scenario, one that does not exist in the world today or in a foreseeable future, and using it as a basis to restrict basic freedoms. Who's being dangerous now?

  • Re:Hmm. (Score:1, Insightful)

    by Anonymous Coward on Friday April 12, 2002 @05:20PM (#3331872)
    But I think there is still a difference, in that the book describes how to do these bad things, and the virus actually does these bad things.


    Actually, virus source code is also just telling you how to do those things; it will only do them if you take extra measures to make it do them, e.g., compilation and execution. Go ahead, try and convince Windows or Linux or -insert OS here- to execute asm source without compilation.
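
    As a trivial illustration of that "extra measures" point (my own sketch, hypothetical file name): a source file is inert text on disk, and nothing at all happens until someone deliberately compiles and runs it.

        /* inert.c -- just text sitting on disk; it only ever runs because
         * someone chose to compile and execute it:
         *   cc inert.c -o inert && ./inert
         */
        #include <stdio.h>

        int main(void)
        {
            puts("only executed because you compiled and ran me");
            return 0;
        }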
  • by arkanes ( 521690 ) <arkanes@NoSPam.gmail.com> on Friday April 12, 2002 @05:34PM (#3331959) Homepage
    Your ability to make those judgments about trust, and to change them if circumstances warrant, is what's at stake here. What if those conspiracy theories about Symantec engineers writing viruses in order to promote their own products are true? You'd want to be able to re-evaluate who you trust, right?

    Here's something to keep in mind. You know how, whenever an article comes up about unethical behavior by a corporation, someone always brings up the fiduciary responsibility thing? About how companies HAVE to make money, and they can be held liable if they don't do everything in their power to make money? Are you sure you want a company like that in charge of, well, anything? (Come to think of it, doesn't this mean that if Symantec ISN'T driving sales of Norton AV by releasing viruses, it should be?)

  • by Jerf ( 17166 ) on Friday April 12, 2002 @05:34PM (#3331962) Journal
    Without going into a point-by-point rebuttal: of course "that point of view is extremely dangerous". And of course much of what you said is plausible, inasmuch as whacked-out examples made for the purpose of outrage and extremism are plausible. (That's not sarcasm; it's a common rhetorical device that is seriously overused and abused, but it's still somewhat valid when understood correctly.)

    But you provide no evidence that of the two alternatives, yours is better. Your scenarios are for the most part equally applicable to the hiding case; instead of information spreading openly, it spreads covertly. Doesn't change much. You can't keep information from a determined person; people are just too smart.

    I'd say that the post you are replying to is much better constructed as an argument, because it says why the alternative is better: The good guys can find it and learn from it. How is your proposal better? The bad guys still find it*. Now maybe the good guys don't. The "demented person" scenarios remain.

    Step up a meta level. You're focusing too tightly on a small part of the problem, and missing the global implications.

    I say that both revealing and hiding the information is dangerous. The danger comes from people, and therefore cannot be removed from the equation. (This is what you implicitly try to do, by hiding the information. The problem is, the information is not the danger.) But of the two alternatives, open discussion is clearly the preferable choice, both in theory, and in practice.

    (*: Proof: Look at the real world. Happens all the time. This is undeniable.)
  • Code = Speech (Score:2, Insightful)

    by SoftwareJedi ( 461057 ) on Friday April 12, 2002 @05:43PM (#3332013) Journal
    If we are trying to defend the DeCSS code on the grounds that Code is Speech and therefore protected by the First Amendment, then we cannot say that distributing virus source code should not be allowed. That would restrict one form of speech but not another, and it would play right into the RIAA's and MPAA's hands.
  • by BoyPlankton ( 93817 ) on Friday April 12, 2002 @06:08PM (#3332172) Homepage
    It'd be great if information could always be free, but unless we restrict dangerous forms of it, we are simply giving up our safe way of life. Although one might *want* to give arbitrary individuals access to all information, you're essentially allowing arbitrary individuals the power to do anything they desire. This system will eventually lead to catastrophe, because you cannot make the entire world's population obey an honor system.


    The biggest problem with this line of thinking is that without the research being done on this stuff, there's no way to develop defenses. Someone is going to develop it eventually, and without the necessary defenses, everybody will be vulnerable. It's like you said: "you cannot make the entire world's population obey an honor system."
  • by rmassa ( 529444 ) on Friday April 12, 2002 @06:34PM (#3332301)
    Code isn't malicious, people are. Most virus code that is made public is posted expressly for the purpose of defending against viruses, not spreading them, at least where I frequent. Forgive the gun-control reference, but laws only affect the people who obey them. It's just as ludicrous as anti-circumvention laws, which just harm the people who aren't breaking the law in the first place. Why don't we spend all of this effort going after the real criminals/crackers instead of expending endless resources litigating useless laws that do much more harm than good? Knowledge of the enemy and the enemy's tactics is the best weapon.
