Should Virus Distribution be Illegal? 436
mccormi writes "In a guest editorial on Newarchitect, Sarah Gordon looks at whether posting malicious code should be allowed and what steps could be taken to stop it. What's worrisome, though, is that restrictions on malicious code don't take into account whom it's malicious against and what truly defines malicious." Note that she's not talking about actually infecting computers, but merely making the code available for others to examine (and for some of them, no doubt, to try to spread in the wild).
is spyware viral? (Score:3, Interesting)
making everyone a criminal (Score:2, Interesting)
Of course, the perfect virus in this case would be one that posts its own source code as it spreads.
Suddenly everyone who has ever been infected becomes a criminal for posting the virus's replication mechanism!
Sounds like a broadened DMCA... (Score:3, Interesting)
This is one of those issues where a law cannot be both effective and fair. And possibly not either.
I like the scientific analogy (Score:3, Interesting)
Researching biological viruses is legal, although people could attempt to spread said viruses maliciously. Those who deal with lethal viruses and diseases often can't just make samples and research easily accessible to anyone, even anonymous people. Why should virus "researchers" be able to do what is essentially the same thing?
Free speech is good, research is good... but so are ethics and responsibility.
mark
It is Our Constitutional Right (Score:2, Interesting)
It's our constitutional right, but it should be illegal?
Re:Hmm. (Score:5, Interesting)
Code is -art-.
When I was but a wee hacker, I used to LOVE reading virus source code. I would download all I could find (granted, at the time, it was from BBSes, or sneaker-net), and let me tell ya, I learned much more from those viruses than I ever learned in any mainstream assembler class I've taken.
And no, I -never- used the code for malicious purposes. It was just amazingly interesting to me.
To make it illegal to write ANY type of code is just insane; and if you distribute it without disguising it as something else, what's the real problem??
What if its intent was not to be malicious? (Score:3, Interesting)
Okay, so I write some code for a piece of test equipment. Let's just pick an example situation. I don't want to argue whether this is a good or bad idea, but say I did it anyway. Every once in a while the machine checks to see if it is slipping out of calibration. If it is, it contacts some server to say "hey, look at me." Then the server responds and says "yeah, I see you." Well, with my expansive programming skills I accidentally introduce a bug. Let's say instead of contacting the intended target, I just start contacting anything I can find. Well, another analyzer sees my cries for help and starts yelling too. See where I am going?
The code was never intended to broadcast huge amounts of useless traffic. It happened by accident. I picked this haphazard example to be similar to Code Red: the machines are basically messaging, like mad, between each other. So does this mean my company or I should have charges (civil or criminal) brought against us? I say no, but I'm sure a lawyer would scream yes.
GPL (Score:1, Interesting)
obfuscated code (Score:3, Interesting)
Re:What if its intent was not to be malicious? (Score:3, Interesting)
Re: Should Virus Distribution be Illegal? (Score:2, Interesting)
Posting the code should be legal because there are always new methods of attacking someone's computer, and people/companies working against this should have access to methods of distributing viruses that other people have thought of, the better to protect themselves/their customers.
An apt analogy is that people are allowed to buy guns, despite the fact that they can kill people--they also help protect people from being killed.
viruses are good for computers.... (Score:5, Interesting)
I think I subscribe to this to some extent. If we had no viruses, and didn't know what havoc they could wreak on our systems, we'd be completely unprepared for any such trouble -- whether it arose maliciously, or because someone's code happened to go wrong.
I don't think that you can place restrictions on what people write or do not write. I feel it's still the obligation of the system user to protect him/herself against problems and to be vigilant. It keeps us all in practice, and makes us more ready for whatever is out there, no?
That point of view is extremely dangerous (Score:2, Interesting)
The stance that it is somehow ideologically immoral to put constraints on the availability of dangerous information in our current society is not only without a rational defense, but completely ignores the reality that such information can directly lead to a massive amount of harm.
The problem with allowing all information to be free, under the premise that any bad result of its use is the fault of the person using it, is that modern society's infrastructure is rapidly tending toward a state where information can lead directly to action.
Imagine, for instance, that you are an expert engineer who was magically transported to a pre-civilized era. Would the vast body of knowledge that you possessed help you, in that era, take actions that effect any significant amount of change? Would you, in fact, be able to do anything with the advanced information that you possess in such a situation?
In earlier times, it was entirely OK to spread any and all information, because the worst that the information could do would be to change somebody's opinion on a political matter or teach somebody how to make a shoddy weapon (read: a stick) of minor consequence. In the near future, one will be able to transmit a digital specification for a weapon to be fabricated on one's personal fab-lab. The person won't require any knowledge of the specification or even of how a computer or fabrication machine works -- they will just have to buy the machine at Home Depot, download a spec for their weapon of choice from a web site, and possess the insanity to want to use the thing against society.
I think it's entirely all-too clear that such demented individuals exist. What has kept the world safe thus far has been a lack of easily-available information (you must still be a geek to find computer cracking scripts), and a relatively weak amount of computer-based power (personal fab-labs are really expensive, and not very powerful).
But this won't be the case in the future. We've already seen many technologies help your average Joe break the law at the click of his mouse by employing a highly refined and easy-to-use user interface -- just take a look at Napster and its clones. Clearly the very availability of Napster enabled millions to break laws that they would not have broken previously. The only difference between a Napster and a Code Red virus is that Napster let one violate a law in a way that is only arguably detrimental to society. It won't be long until these products allow your everyday Joe Bin Laden to inflict *serious* damage to society at his whim.
It'd be great if information could always be free, but unless we restrict dangerous forms of it, we are simply giving up our safe way of life. One might *want* to give arbitrary individuals access to all information, but doing so essentially allows arbitrary individuals the power to do anything they desire. This system will eventually lead to catastrophe, because you cannot make the entire world's population obey an honor system.
Re:I like the scientific analogy (Score:4, Interesting)
I feel fine letting Sun worry about Java.
I feel fine letting Microsoft worry about computer security.
I feel fine letting the LAPD internal affairs department worry about police corruption.
I feel fine letting the military worry about war.
In general, I feel fine about letting the fox worry about the henhouse.
NO! (Score:1, Interesting)
Six times in the last year I've come across indications of malicious code, while working for various clients.
Three of those times, I was unable to find anything *BUT* sourcecode as a mechanism for determining propagation mechanisms and possible damage. Ironically, all three were with a client who couldn't or wouldn't spend the money and/or downtime to rebuild servers from scratch to be REALLY REALLY sure they weren't infected (never mind that they paid me 80% of the cost of backup hardware).
What's more, I have resorted to reading source-code for a few other malicious bits of code (DDoS drones) to (in)validate a scan that claimed to find 'em in a sizeable intranet. Code helped me confirm that those were false alarms, so I dodged the cost/hassle/downtime of rebuilding those servers.
In the second case, I came across a tool *after* having read source to invalidate the scanner results. But in the first case, and no doubt in the future, I'll need to know more again.
This is a simplification, since I'd probably qualify for 'trusted access' with my credentials and work background... but it makes a barrier for entry for anyone else interested in security. And WHY would we want ANY people to have ANOTHER excuse for being idiots about any of these things: viruses, privacy, passwords, infosec, etc.?? That nearly always ends up being my strongest recommendation on any audit: educate your staff!
A last thought: this gets back into the same can of worms associated with banning books, banning encryption and banning anonymity. Those in favor of these ideas are usually being lazy and want us to work around their narrow-minded little short-cut ways of doing stuff.
Screw that.
--posted anonymously to protect my clients' confidentiality. Probably silly, but why risk it?
good for Symantec, bad for everyone else (Score:3, Interesting)
Security researchers who don't work for dominant companies like Symantec aren't in such a sweet position, and rely on public forums to learn about exploits. And it's not enough to be told "there is a new virus that attacks X", with the details held secret (e.g., known only by Microsoft, Symantec, and a few other giants). Security researchers need precise details of how the exploit works, and they need to see the virus code itself in order to write code for detecting that virus signature, or to protect against certain aspects of its behaviour.
Sarah's proposal is just a way to shut down the competition by criminalizing the only way that independent researchers have for getting information.
Doug Moen
Re:I like the scientific analogy (Score:3, Interesting)
The US has a "slippery slope" legal system.
I don't care what your high school English class told you about rhetoric; when speaking of law, a "slippery slope" argument is perfectly acceptable. It reflects the way that the system ACTUALLY WORKS.
...and good luck TRYING to hold Symantec accountable.
Forms of speech describing illegal action (Score:4, Interesting)
Follow the Money (Score:2, Interesting)
Symantec.
What does Symantec do?
It writes VIRUS DETECTION software.
What do large corporations like Symantec hate the most?
Competition.
If it is illegal to distribute the source code to viruses, then others clearly cannot examine the code in order to defeat it. Symantec, since it is a large corporation, will always be exempt from such law.
So what would such a law do? Reduce competition for Symantec by disallowing others to examine and write counter-virus software, lest they be labeled lawbreakers for distributing the virus!
Sneaky.
Disingenuous poster (Score:1, Interesting)
Fact is, it should be an acute embarrassment to most of the security industry that their adversaries have been more energetic, clever, and inventive than they for some years now. The number of companies that sell security solutions and have nobody who is worth spit in kernel-mode coding, nor anyone who has had an original thought in the last decade in the areas of access control, is amazingly large, both in absolute numbers and as a fraction of the industry.
Many would like to continue to be lazy and to somehow still get the drop on those who are not lazy and who work up novel things to do with software. For shame, gentlemen! The price of admission to the game, with a decent chance of winning it, is understanding the guts of your systems, including at the kernel level, and a willingness to do new things at that level.
Without examples coming out, by the way, you are blind and have no way to know where the threats will be coming from next. If you understand research that is going on (and yes, virus building is a kind of research into self-propagating code), you can figure out defenses before the attacks turn into widespread virii. If you understand what is being worked on and have access to it, you have, IF you are not too lazy, a chance to build your operating systems and applications not to be vulnerable to those weaknesses.
Don't whine that it is impossible. It has been done, repeatedly, by some of the more serious OS vendors and app vendors who treat their products as not being permitted to fail. Widen your universe, but realize that putting secure software together requires vast carefulness and attention to detail. Most people don't just churn out code like that on the first try; they refine it and test the living he** out of it.
I will add, too, that if someone posts some message, supposing it to be a C program, arguably it is only code if it can compile. If it begins with
#if 0
and ends with
#endif
then it does not compile, does it?
It is not code then.
This makes it so easy to post pure non-compilable comments (which might be able to be turned INTO compilable stuff, but are not as posted) that the argument about postings being "actions" shows forth as the nonsense it is.
The author would do better to learn to keep up with the technology rather than wish it didn't advance so fast.