Security By Obscurity — a New Theory 265
mikejuk writes "Kerckhoffs' Principle suggests that there is no security by obscurity — but perhaps there is. A recent paper by Dusko Pavlovic suggests that security is a game of incomplete information and the more you can do to keep your opponent in the dark, the better. In addition to considering the attacker's computing power limits, he also thinks it's worth considering limits on their logic or programming capabilities (PDF). He recommends obscurity plus a little reactive security in response to an attacker probing the system. In this case, instead of having to protect against every possible attack vector, you can just defend against the attack that has been or is about to be launched."
Remember it only talks about cryptography (Score:5, Informative)
Re:Remember it only talks about cryptography (Score:5, Funny)
This part of the summary is just great: "... is about to be launched"
Yes, having somebody sitting there as the attack is taking place and somehow guessing how the attacker will try to compromise your system makes it much easier to defend against the attack. Of course, just correctly guess sooner, and then you can fix the system beforehand and then you don't need someone sitting there....
Re: (Score:2)
This part of the summary is just great: "... is about to be launched"
Yes, having somebody sitting there as the attack is taking place and somehow guessing how the attacker will try to compromise your system makes it much easier to defend against the attack. Of course, just correctly guess sooner, and then you can fix the system beforehand and then you don't need someone sitting there....
It also assumes we can determine the capability or the resources the enemy is willing to employ. It's a lot safer to assume you don't know than to try and assume you know.
Re: (Score:2)
Not necessarily, if the money you spent trying to defend against all possible attacks means that you can no longer have seat belts.
OpenBSD: Only two remote holes in years (Score:5, Informative)
Of course, just correctly guess sooner, and then you can fix the system beforehand
One method to make such a guess is called a "code audit", and code auditing practices applied since mid-1996 [openbsd.org] are part of why OpenBSD has had only two remote vulnerabilities for over a decade.
Which is more than its coders got (Score:3)
Re: (Score:3)
Come on, you are way off topic here. You deserve the troll remark. We're talking about OpenBSD, not FreeBSD. You didn't read the comment you replied to and you don't know what you're talking about anyway. OpenBSD is rarely used, but when it is used, it is used because it is protecting something, and that means that the value of attacking it is very high; virtually every OpenBSD system not on some nerd's desk is guarding something important to someone.
Re: (Score:3)
Seriously, the main reason why OpenBSD had few remote vulnerabilities in the default install was because they only had one service running in the default install, namely OpenSSH. ( http://en.wikipedia.org/wiki/OpenBSD#Security_and_code_auditing )
If some idiot installed phpnuke/phpbb, or Apache with an outdated version of the app, PHP, etc., they'd be just as pwned whe
Re:Remember it only talks about cryptography (Score:5, Insightful)
The problem is that Security by Obscurity is the defense of lazy vendors who should damn well know better. On the one hand, it's "obscure" that a particular keyphrase known by trusted people will get you to a layer of network security. It is slightly less "obscure" to have your server up on an unresponsive IP address. It's technically a form of "obscurity" to think the hackers wouldn't notice that you left an FTP server up and running without realizing it, or that the default login was still viable. But when vendors use that form of the term obscurity, they're just masking the fact that they are selling you rubbish.
Any properly secured system should be able to proudly proclaim all of its pertinent information to the world, including source code to all available participants, and still be secure. ONLY THEN, should obscurity be layered on. But if your vendor or contractor starts talking about obscurity first, they don't have a clue what they're doing.
Obscurity is icing. Minimalist, properly protected system design with multiple layers of protection, iron-clad internal logging, and no routes to privilege escalation (especially social) is the route to security. Obscurity is a mildly nice icing that makes maintaining servers less problematic. It also usually leads to lazy vendors creating an illusion of security that ends in a massive privacy lawsuit.
I don't think they understood. (Score:3, Insightful)
Obscurity only makes your security "brittle". Once broken, it is completely broken. Like hiding your house key under a flower pot.
Which means that the real security is the lock on the door. All you've done is allow another avenue of attacking it.
Re:I don't think they understood. (Score:4, Interesting)
Imagine you have gold behind a locked door. Now imagine you have 50 locked doors.
This is your security through obscurity.
Re:I don't think they understood. (Score:4, Insightful)
Re: (Score:2)
Although these days CA authorities are becoming the weak link.
They will have to rethink centralized security, big time.
Re: (Score:2)
Re: (Score:3)
Think about it a little more and you'll see that it's the same thing. A number and its representation in a numeral system share a duality. Also, it's not 2^128 bits, it's 128 bits, but you probably meant that anyway.
Re: (Score:3)
No. Only the following two are true:
(a) A 128 bit certificate is the equivalent of 128 light switches that all have to be in the right setting (not 2^128),
(b) A 128 bit certificate is the equivalent of 2^128 doors, of which you have to find the right one.
Here an arrangement of 128 options with 2 choices each is equivalent to choosing one ordering number in a sequence of 2^128 elements.
Doors that turn clockwise or counterclockwise would halve the number of doors needed, since each rotation direction can be seen as a separate door. Or you
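The switch/door equivalence in (a) and (b) can be checked directly. A minimal sketch in Python (the numbers are the poster's, nothing here is from the paper):

```python
# Sketch: 128 two-way switches vs 2**128 doors.
# Each added switch doubles the number of distinct settings, so 128
# switches enumerate exactly 2**128 combinations -- one "door" each.

n_bits = 128
n_settings = 2 ** n_bits
print(n_settings)  # 340282366920938463463374607431768211456

# Equivalently, every 128-bit key picks one ordering number in a
# sequence of 2**128 elements, as the comment above puts it.
assert n_settings == 1 << n_bits
```

The point of the comment survives the arithmetic: 128 bits of key is a tiny object, but the search space it indexes is astronomically large.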
Re: (Score:2)
There is another way to look at this. Imagine you have gold behind a locked door. Now imagine you have 50 locked doors. This is your security through obscurity.
You hid the gold under the floorboards. Consider your security broken.
Re: (Score:2)
Series or parallel? (Score:2)
Does the attacker have to get through 50 doors to get the gold, not all locked with the same key? (etc) This is good security (unless locked with the same key and so forth).
..or..
Does the attacker have to get through ONE door that is NOT locked (the security depends upon the attacker not getting the right door)?
..or..
Does the attacker just have to check the doors for recent fingerprints to guess which door to attack?
Re: (Score:2)
Frankly, if it is that important to be connected to the internet, but requires high security, the cost is justified.
You can even set
I don't think that's correct. (Score:2)
Just as in my house key example. The attacker has to know WHICH flower pot has the house key.
The problem is that once that piece of information is uncovered, the entire security implementation is broken.
Yes, I understand the concept. I just don't agree with it. Again with the house key exa
Re: (Score:2)
What you want to do is keep it secure as possible, but give the potential intruder something else to work on that yields no results, but increases their risk of exposure.
Security through obscurity does not automatically assume that it is a door left wide open, just no one knows about it.
Consider things that are currently unknown to the public, such as Air Force One. Only a few people know about its defenses and potenti
Nope. That would be "obscurity". (Score:4, Informative)
No, that would be "security through obscurity".
But that does nothing to improve the security of the system. If the attacker chooses the correct door (or whatever) then you're left with only the defenses of that door.
No. The "security THROUGH obscurity" means that the door IS unlocked (or unlockable with the hidden key) and that the "security" comes from no one KNOWING that it is a way in. That's what the "through" part of that statement means.
I've always understood it. And you're making a very common mistake. Obscurity != Secret in "security through obscurity".
Re: (Score:3)
Nobody talks about security exclusively through obscurity. Secrecy is just an added layer.
The added security of many eyes reviewing your code makes up for the loss of security from having the code visible. <i>That</i> is why Linux is more secure than Windows. But security through obscurity is not useless.
Re: (Score:2)
Just as in my house key example. The attacker has to know WHICH flower pot has the house key.
The problem is that once that piece of information is uncovered, the entire security implementation is broken.
There are other ways to have obscurity.
What if you put the lock for the door underneath one of the many flower pots, and perhaps even have a completely non-functional keyhole on the door itself.
That is also "security through obscurity".
Moving the lock to an unusual place certainly doesn't make the system any
Not exactly. (Score:4, Interesting)
That isn't "obscurity" in the context of "security THROUGH obscurity". The word "through" is important there.
You can have a functional security system and add misdirection to that without reducing the overall security of the system. But the system, in the end, still depends upon the original security model. Once the correct key hole is known, the lock still must be cracked.
You can add obscurity without making the security dependent upon the obscurity.
Re: (Score:2)
There is another way to look at this.
Imagine you have gold behind a locked door. Now imagine you have 50 locked doors.
This is your security through obscurity.
That is *not* security through obscurity. There are 50 locked doors - that's about 6 more bits of password strength, but it's not obscure that you need to go through one of the doors.
Hiding your key under the flower pot is a better example of obscurity. As is hiding your money in the freezer, or in your sock drawer. Ask someone who has worked in a prison, or served time - most people tend to come up with the same banally unoriginal ways to hide stuff, and the bad guys are pretty good at figuring those metho
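The "about 6 more bits" figure in the comment above checks out. A one-line sketch in Python (illustrative only):

```python
import math

# Sketch: how much "password strength" 50 locked doors actually add.
# Choosing among 50 equally likely doors adds log2(50) bits of work
# for an attacker -- a little under 6 bits, as the comment says.

doors = 50
extra_bits = math.log2(doors)
print(round(extra_bits, 2))  # 5.64
```

Compare that to a single extra random password character from a 62-symbol alphabet, which adds log2(62), or almost exactly the same amount.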
Re: (Score:2)
Put up more doors with more locks... that'll fix it! (Just don't tell them about the hidden door into the basement...)
Re: (Score:3, Insightful)
Re: (Score:2)
In fact, viruses are developed based on obscurity. I mean, it is in our everyday lives. To believe that obscurity is somehow the Achilles heel is just crazy thinking.
You have it wrong. (Score:4, Informative)
You're confusing the "obscurity" portion of that statement.
Passwords should rely upon the difficulty in cracking them due to their complexity. The system is known. The password is not known.
Security through obscurity refers to the workings of the system being hidden. Such as the key under the flower pot opening the door. Once that information is discovered, the system is cracked.
Re: (Score:2)
So how is this different?
Time. (Score:2)
In the end, it all comes down to time.
If it takes you 20,000 years to crack my password with a password cracker, then the system is secure for 20,000 years. After which it is cracked (until I change my password again).
If the password is hidden on a post-it under my keyboard, then there is an easier, alternative avenue of attack. And the system is cracked in a minute.
So, having the "security through obscurity" resulted in a less secure system that was cracked a lot quicker than the original system.
That is wh
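The time argument in the comment above can be made concrete with a back-of-the-envelope sketch. All numbers below (guess rate, password length, alphabet) are illustrative assumptions, not figures from the thread:

```python
# Sketch: "secure for N years" as a function of the attacker's guess rate.
# The guess rate and password alphabet are made-up, illustrative values.

def years_to_exhaust(search_space: int, guesses_per_second: float) -> float:
    """Worst-case time to try every candidate, in years."""
    seconds_per_year = 60 * 60 * 24 * 365
    return search_space / guesses_per_second / seconds_per_year

# A random 12-character password over 62 symbols (a-z, A-Z, 0-9):
space = 62 ** 12
print(years_to_exhaust(space, 1e9))  # on the order of 1e5 years at 1e9 guesses/s

# A password found on a post-it under the keyboard: one "guess".
print(years_to_exhaust(1, 1e9))      # effectively zero
```

The parent's point survives the arithmetic: the post-it reduces a search measured in tens of thousands of years to a single lookup.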
Re: (Score:2)
> That is why you do not use "security through obscurity".
Well, if you define "security through obscurity" to such an absurd point, then of course there's no value to obscurity.
However, obscurity is an important part of any security system, but only an idiot would rely on obscurity as the only source of security, and only someone being obtuse would assume that that's what others mean.
Soldiers use "security through obscurity" by wearing camouflage. It's by no means their only means of security. It helps
Which is the whole point. (Score:2)
You may view it as "absurd" but it having no value is the whole point.
In these SPECIFIC instances, obscurity only REDUCES the security of a system.
The problem is that we're discussing computer security. Physical security is a different matter and has very limited usefulness as an analogy.
Re: (Score:3)
The idea of any security system is to reduce the number of fatal secrets. The minimum number is one. (Otherwise you have an open-access system.)
Your password, or key, should be that one. It shouldn't matter if the attacker gets everything else, they still can't get your data.
'Security Through Obscurity' is saying 'we've removed this fatal secret by hiding it from the attackers'. Um, no. All you've done is made it slightly harder for them to find. It's still a fatal secret. If you want to remove it fr
Re: (Score:2)
Security through obscurity refers to the workings of the system being hidden. Such as the key under the flower pot opening the door. Once that information is discovered, the system is cracked.
Security through obscurity doesn't mean that you hide the flaws instead of patching it (it can mean that, but it's a narrow definition). Even when you patch the holes, it's still worth it to make it as hard as possible for the attacker to figure out what the state of your system is - let him waste time looking for the flaws that simply aren't there. That's security through obscurity, too.
It's just another layer. Ignoring it makes sense only when you're absolutely confident that your other layers will hold.
Re: (Score:2)
No, because you can change the key, which is much easier than changing the cryptosystem. With a good source of entropy, I can generate large numbers of good keys all day long. Good cryptosystems are much harder to come by, so the cryptosystem is designed to make changing keys easy. Cryptosystems are also designed to minimize the impact of a single key being discovered. Forward secrecy, for instance, where stealing a key might not get you anything at all.
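The "keys are cheap, cryptosystems are not" point can be sketched with Python's standard `secrets` module (a minimal illustration, not a key-management scheme):

```python
import secrets

# Sketch: rotating a key is trivial given good entropy; the expensive,
# hard-to-replace part is the cryptosystem around it.

def new_key(bits: int = 128) -> bytes:
    """Draw a fresh random key from the OS entropy source."""
    return secrets.token_bytes(bits // 8)

old_key = new_key()
replacement = new_key()
assert len(replacement) == 16       # 128 bits
assert old_key != replacement       # collision is overwhelmingly unlikely
```

Generating a key takes microseconds; designing, reviewing, and deploying a replacement cipher takes years, which is exactly why good systems concentrate all the secrecy in the key.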
Re: (Score:2)
Re: (Score:3)
Secrecy is not identical to obscurity. The meaning of obscurity in "Security Through Obscurity" refers to the overall scheme and methods. The secured secrecy of keys and the like is assumed and does not mean that the security system is based on obscurity as understood in the context of discussing security through obscurity.
Re:I don't think they understood. (Score:4, Insightful)
> Which bank would you prefer?
And that is the key point. Real security can be audited without compromising it. Obscurity cannot be audited - you have to take their word that it is "obscure" enough. And what is obscure or inconceivable to some person may be perfectly obvious to another (such as a blackhat with actual security skills...).
Re: (Score:2)
Here is a real world example where getting a key gets you nothing. Let's say you're targeting someone specific to get their secret cookie recipe or their confession, and you've installed a wiretap on their net connection and you've been recording all of the traffic. The target has been chatting with their friends over some encrypted chat thing and you're sure they've been discussing the recipe/crime. So one day your goons stop the mark, steal their laptop which contains their private keys, and beat them w
Re: (Score:2)
Re: (Score:2)
No security we have isn't fundamentally based on obscurity. None.
Yes, we have no bananas. You didn't mean to use a double negative, did you?
Re:I don't think they understood. (Score:5, Interesting)
Which means that the real security is the lock on the door.
But that is also just obscurity in another form. The obscure part is that the attacker doesn't know the combination to the lock, or doesn't know how the tumblers specifically are keyed. Otherwise a key could be made up.
All security is obscurity, just different levels of it. In some schemes the obscure value is shared (hidden directory on the server that isn't crawled but can none the less be accessed by a direct link). Some obscure values aren't (public key encryption).
The hiding the key under the rock is analogous to using a weak form of obscurity to hide a strong one. Which in this case is no better than the obscurity of not letting anyone know that the door lock doesn't actually work anyway.
Secret != Obscure in this instance. (Score:2)
Nope. Similar to the use of "theory" in science. The common usage of the word is not the exact same as the usage in this context.
The system is designed so that it can only be opened by the correct secret (the key in this case). That does not mean that the key is "obscure" even though it is the "secret".
Obscurity refers to the system. The key is still the secret. The obscurity is the fact that you're hiding (obscuring) the secret under a flower pot.
To p
Re: (Score:2)
No. It's the usage. (Score:2)
No. It's the usage of the terms in the context.
The same as people complain about evolution being "just a theory". The words have multiple definitions, and using the wrong one in this context leads to confusion.
Re: (Score:2)
Re: (Score:2)
But, isn't the pattern to the very lock you describe a "secret" or obscure in as much that the lack of knowledge about how to duplicate that key is what keeps intruders out?
Most forms of security rely on some form of obscurity to decide which group of people is allowed access and which group of people is not. A password or a private key, if known to everybody would allow everybody into the system. Only those who hold that extra piece of information are able to access the system through the means by which
Re: (Score:2)
Re: (Score:2)
Many crypto schemes are provably secure.
However the implementation itself could be flawed, providing a side channel that can be exploited.
Re: (Score:2)
Yes, well, what if they can't find the lock?
Sure (Score:4, Insightful)
That's fine and all. If you want to create your security through incomplete information, or different tactics and strategy, that is a choice.
Just don't be a childish whining little bitch and run to the FBI to stop the big bad anti-social "hackers" from revealing your used-to-be incomplete information in security conventions and trying to have them arrested.
You get double whiny bitch points trying to invoke copyright to prevent the "leakage" of your incomplete information.
I certainly get the point of the article, but a system that is secured through well-thought-out and tested means will always trump a system where, "Golly Gee Willickers, Batman... I hope they don't find the secret entrance to our bat cave that is totally unprotected and unmonitored".
Re: (Score:2)
What's a password - or even a private key - if not incomplete information?
Re:Sure (Score:5, Insightful)
I don't think that is what they mean by incomplete information.
In the context of security through obscurity it has always, to me, seemed to mean that your method and process of providing security is not well understood and it is this fact that is providing the majority of the security. If somebody figures out the method or process, your security is greatly compromised.
A password, or private key, is not a good example in this case. I think a better example would be that passwords and private keys protect documents created by a certain well known company, but that their methods and processes were so laughable that you could create a program to bypass the keys themselves.
Or in other words........ the only thing keeping Wile E. Coyote (Super Genius) from getting to Bugs Bunny through the locked door is his complete lack of awareness that there is nothing around the door but the desert itself. Take two steps to the right, two steps forward, turn to your left, and there is Bugs Bunny. You did not even have to get an ACME locksmith to come out.
Re: (Score:2)
you attempted to redefine his terms, and then you attempted to change the topic. in other words, you don't have an answer
aka, incomprehensibility by affability
because the real answer would be to concede that icebraining is correct: it's just a matter of perspective of what security is, and what obscurity is, and, on some philosophical level, they are indeed the same concept after all. not that this is a mighty thunderclap of a realization, and not that it completely changes security paradigms. but it is i
Re:Sure (Score:5, Informative)
Uhhhhhh..... okay
I am not redefining terms here at all.
Granted, this is from Wikipedia:
Security through (or by) obscurity is a pejorative referring to a principle in security engineering, which attempts to use secrecy (of design, implementation, etc.) to provide security. A system relying on security through obscurity may have theoretical or actual security vulnerabilities, but its owners or designers believe that the flaws are not known, and that attackers are unlikely to find them. A system may use security through obscurity as a defense in depth measure; while all known security vulnerabilities would be mitigated through other measures, public disclosure of products and versions in use makes them early targets for newly discovered vulnerabilities in those products and versions. An attacker's first step is usually information gathering; this step is delayed by security through obscurity. The technique stands in contrast with security by design and open security, although many real-world projects include elements of all strategies.
icebraining is not correct here, and your assertion I am changing the definition from the norm and widely accepted definition is false. Security through obscurity, as a concept, is not something vague and a matter of perspective. It is a very well defined term in security and has been for quite some time.
According to the definition above, a password is not incomplete information, or information being obscured, as it is being presented in the context of the article and the principle of security through obscurity.
Making this a philosophical debate that a password is also obscurity at some level has nothing to do with the principles that are mentioned.
Re: (Score:2)
Whatever man. I am not resisting anything.
Passwords and secret keys don't have anything to do with the principles of security through obscurity.
I am getting the distinct impression I am feeding a troll, so the kitchen is closed. Come back tomorrow.
Re: (Score:2)
http://slashdot.org/comments.pl?sid=2455818&cid=37579932 [slashdot.org]
read it again
go "yeah sure, on an abstract and inconsequential level," and move on. why is that so difficult for you
Re: (Score:2)
mean that your method and process of providing security is not well understood and it is this fact that is providing the majority of the security. If somebody figures out the method or process, your security is greatly compromised.
Not necessarily. It may also mean using a public and well-understood method - but not telling which method you're using, so the attacker has to figure it out on his own.
Re:Sure (Score:4, Insightful)
Passwords and private keys are very specific pieces of information that use algorithms to make them mathematically (almost) impossible to figure out. Obscure processes and methods and algorithms, on the other hand, are negligibly easy to find out when it comes to computers. Computers are too powerful to hide something from them (with a few exceptions mentioned above). Relying on obscurity is a fool's game in those circumstances.
Nature disagrees (Score:3, Interesting)
Camouflage is the oldest and most natural form of security on the planet.
Re: (Score:2)
Carrying a bigger stick than your opponent is the oldest and most natural form of security.
Re: (Score:2)
Camouflage is the oldest and most natural form of security on the planet.
Carrying a bigger stick than your opponent is the oldest and most natural form of security.
Actually it's camouflage *plus* the bigger stick. The camouflage gives one the potential advantage of deciding if and when the bigger stick comes into play.
Misapplication of Kerckhoffs' Principle (Score:4, Interesting)
Kerckhoffs' Principle specifically applies to cryptosystems. Not only does TFA describe more of a generalized application to systems and code, but it's not really describing 'security through obscurity.' It's describing informational arbitrage, i.e., profiting (not necessarily financially) from an imbalance of knowledge on one side of a two-participant game.
The dynamic adaptive approach has its merits, particularly as it is increasingly clear that most security is only the illusion of security, maintained until it is breached. But traditional 'security through obscurity' refers to systems for which the only security measure in place is maintaining the secrecy of a protocol, algorithm, etc.
It seems to me the ideal approach is a balanced one, that embraces the UNIX philosophy: cover the 90% of most common attack vectors with proven security measures (and update practices as needed), and take a dynamic adaptive approach to the edge cases, because those are the ones most likely to breach if you've done the first 90% correctly.
Luck (Score:2)
Call it luck, or educated guess, call it fate for all I care. One miss, and you're screwed.
This is new? (Score:2)
SbO: lame (Score:3)
Re: (Score:2, Insightful)
Re: (Score:2)
Someone else can get in -- all they need is a little bit of information you've left out (like a key). Obscurity. Right there. Self defeating posts are self defeating.
If you have the key then all bets are off. But if the inner workings of the lock are completely known to the opponent and they still can't get in without the key then you can say your system is secure. If there is a flaw in your lock such that it is possible to get in without requiring the key then you have to obscure the inner workings of the lock, and you can't say your system is secure because it's always possible that someone could reverse engineer it and find the flaw, allowing them access to _all_ suc
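The "inner workings fully known, key secret" point above is Kerckhoffs' principle in a nutshell, and can be sketched with a standard primitive. A minimal illustration using Python's `hmac` module (the message and key are made up):

```python
import hashlib
import hmac
import secrets

# Sketch of Kerckhoffs' principle: the "lock" (HMAC-SHA256) is fully
# public and well studied, yet without the key an attacker cannot
# forge a valid tag -- knowing the mechanism alone gets them nothing.

key = secrets.token_bytes(32)          # the only secret in the system
message = b"open the vault"
tag = hmac.new(key, message, hashlib.sha256).digest()

# An attacker who knows the algorithm but guesses a wrong key fails:
forged = hmac.new(secrets.token_bytes(32), message, hashlib.sha256).digest()
assert hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())
assert not hmac.compare_digest(tag, forged)
```

This is the opposite of obscuring the lock's inner workings: the design is published, audited by everyone, and the security claim rests only on the key staying secret.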
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
"Open Source" doesn't buy you much. Sure, you can see what the program is "supposed" to do. But do you fully understand what the compiler does with it? Do you trust the compiler to be both bug free and non-malicious? I've filed far too many bugs against compilers to trust them to be bug free. Even if you assume they are, what about the compiler that was used to build your compiler? How do you know that the hardware on which the program is running doesn't leave it open to attack?
If you want "actual tru
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Anything server-side is never truly "open" in the sense that you can't truly know that it has your open software.
It is if YOU put the software there.
Missing the point? (Score:4, Interesting)
Well maybe I'm wrong, but I always thought the complaints of "security by obscurity" were not that obscurity couldn't be helpful to security, but that it was a bad idea to rely on obscurity.
It seems obvious to me that the more complete the attacker's knowledge, the greater the chance of a successful attack. If an attacker knows which ports are opened, which services are running, which versions of which software are running which services, and whether critical security patches have been applied, for example, it's much easier for them to find an attack vector if there is one. You're more secure if attackers don't know that information about your systems, because it forces them to discover it. That takes additional time and effort, and they may not be able to discover that information at all.
However (and here's the point), it's not a good idea to leave your systems wide open and insecure and hope that attackers don't discover the holes in your security. It's not smart to rely on the attacker's ignorance as the chief (or only) form of protection, because a lot of times that information can be discovered. It's true that "obscurity" is a form of security, but it's a fairly weak form that doesn't hold up over time. The truth tends to out.
Re: (Score:2)
You're more secure if attackers don't know that information about your systems, because it forces them to discover it. That takes additional time and effort, and they may not be able to discover that information at all.
But on the other hand if you find a back door to a security system, you now have access to all such security systems. Not publishing the intricate details about the security system doesn't add nearly as much security as people think.
Put it another way, if your security system is completely open and documented and nobody has ever discovered a backdoor that would allow them access without a key, then you can say it is secure with a great degree of confidence.
Re: (Score:2)
Unless there is strong incentive to not reveal knowledge of a backdoor if you find it, such as the desire to exploit it yourself. With open source, you're still trusting the people who really spent the time to look at and understand the code. How many of those people are there? How many of them do you trust absolutely?
Rational, but flawed. (Score:2)
Past performance IS a proper indication of how the future will be, if everything stays as expected. But reality is rarely fully what we expect it to be.
Defending against known threats is certainly part of the task of securing something, but the other part is observing what makes up the thing you're defending, looking for weaknesses, and working out from that how to react when those weaknesses are exploited. Not doing those last bits is one of the very bad parts of groupthink: complacency.
One of the best ways to de
Re: (Score:2)
I don't think you even read the article.
Re: (Score:2)
Who actually reads anything more than TFS?
"Security by obscurity" is misleading. (Score:2)
As an information security professional, I've always seen the whole "security by obscurity" issue as somewhat misleading. By repeating the mantra, I feel many people forgot its true meaning.
Security shouldn't RELY on obscurity. That's true. But it doesn't mean obscurity, by itself, doesn't provide security benefits.
There are many examples where this is obvious. For example, would you publish your network topology on your public website? Of course not. Even if you were convinced that its security and access co
anything that can be made by a man (Score:2)
can be unmade by another man
it's that simple
the rest is just an arms race to keep one slight step ahead in constant effort and constant motion
OK, great, but not at the expense of users (Score:2)
The entire concept of security by obscurity acts as a justification for keeping secrets. It often sweeps up information whose release will help users much more than it will help attackers. Once it becomes a sanctioned tool of security, instead of an objective of the security, those who set up and maintain the security lean on obscurity like a crutch.
I realize my argument is an appeal to the slippery slope, but I see it everywhere in society. People, organizations, and governments can get into frames of mind
Security thru (Score:2)
Security thru absurdity is just crazy enough to work.
Secrecy != Obscurity (Score:2)
In information security, secrecy does not equal obscurity.
Obscurity is if I give out access cards for the doors of my building, but all the magic of the card is a single magnet, and just changing the magnetic field at the reader will unlock the door.
Another example of obscurity: I give out access cards but encode them all to the same code and just tell people this one is only for these particular non restricted zones (this is more like DRM systems).
Layers (Score:2)
People, many of your implementation examples aren't "either/or" situations. From a practical standpoint you are usually better off with a layer of each: security and obscurity. For example, a strong vault that is hidden is better than the same one exposed. A steganographically-encrypted file is safer than that same file in the public domain. How much safer is open for debate, but you are probably safer with both layers in most individual *implementation* situations.
Where the debate comes alive is in two main ar
Real security by obscurity (Score:2)
What about true obscurity? What kind of OS or software runs on the computers in a nuclear missile silo? Do those computers even use an OS? The point is, with little or nothing published, an attacker who was able to access systems like those would have little realistic hope of hacking them. There are no 0-day lists, no marketplace to pick up working cracks, no books describing the internals of such a system.
Re: (Score:2)
Re: (Score:2)
But the OS in projects like that was probably a one-off written JUST for the application. And the software probably won't RESPOND to most packets, nor support modern networking methods. It's one thing if a true hacker who knows everything has something to work with. But if he doesn't know what computer it is he is trying to hack into, and even if he did he wouldn't be able to find any information about how it works, it being a one-off project with the books being top secret...
I am not saying tha
Re: (Score:2)
Re: (Score:2)
Yes but if you publish the code + proofs, and the mathematical analysis you used to formulate the proofs is flawed, and an attacker is able to see that but others aren't...Then you have just given him or her the means to break in.
Same goes for encryption. You can't generally crack an encryption algorithm, even a flawed one, if you only have the encrypted data and plaintext but no idea at all what algorithm was used.
interesting, but incomplete (Score:3)
Applying game theory is always an interesting approach.
However, this one misses what I consider an extremely important part: the multiplayer aspect. If obscurity is a part of your defense strategy, you cannot cooperate with other defenders. As you are competing with the attacker, that means obscurity is only advantageous if the additional cost to the attacker is higher than the benefit you could gain from such cooperation. In general, your security mechanism will not be so new, innovative and hard to crack that this is true. It does depend on the size and resources of your organisation, though. If you're a large organisation that can keep a secret (say, a secret service), it could have a net advantage. For almost everyone else, though, having more eyes on the problem will generally provide a better solution than the additional difficulty that obscurity provides for the attacker.
Re:Yet... (Score:5, Funny)
A new kind of goatse troll in which the troll commenter hides his actions by contributing to the thread in a positive manner.
*golfclap*
Re: (Score:3)
Re: (Score:3)
Wow that didn't even cross my mind. So in addition to contributing to the thread positively, the goatse troll is actually relevant to the topic at hand. Absolutely amazing. A technological marvel.
Re: (Score:2)
On top of that, he was logged in while an AC pointed out it's goatse.