Insider Threat 156
Ben Rothke writes "Thousands of computer security books have been published dealing with every conceivable security issue and technology. But Insider Threat is one of the first to deal with one of the most significant threats to an organization, namely that of the trusted insider. The problem is that within information technology, many users have far more access and trust than they truly should." Read the rest of Ben's review.
Insider Threat | |
author | Eric Cole and Sandra Ring |
pages | 397 |
publisher | Syngress |
rating | 9 |
reviewer | Ben Rothke |
ISBN | 1597490482 |
summary | Excellent overview of the insider threat to networks and information systems |
The retail and gambling sectors have long understood the danger of the insider threat and have built their security frameworks to protect against both the insider and the outsider. Shoplifters are a huge bane to the retail industry, exceeded only by thefts from internal employees behind the registers. The cameras and guards in casinos are looking at both those in front of and behind the gambling tables. Casinos understand quite well that when an employee spends 40 hours a week at their location dealing with hundreds of thousands of dollars, over time they will learn where the vulnerabilities and weaknesses are. A minority of these insiders will commit fraud, which is invariably much worse than anything an outsider could carry out alone.
Insider Threat is mainly a book of real-life events that detail how the insider threat is a problem affecting every organization in every industry. In story after story, the book details how trusted employees find weaknesses in systems in order to carry out financial or political attacks against their employers. It is the responsibility of the organization to ensure that its infrastructure is designed to detect these insiders and that its systems are resilient enough to defend against them. This is clearly not a trivial task.
The authors note that the crux of the problem is that many organizations tend to think that once they hire an employee or contractor, that person is now part of a trusted group of dedicated and loyal employees. Given that many organizations don't perform background checks on their prospective employees, they are placing a significant level of trust in people they barely know. While the vast majority of employees can be trusted and are honest, the danger of the insider threat is that it is the proverbial bad apple that can take down the entire tree. The book details numerous stories of how a single bad employee has caused a company to go out of business.
Part of the problem with the insider threat is that since companies are oblivious to it, they do not have a framework in place to determine when it is happening, and to deal with it when it occurs. With that, when the insider attack does occur, which it invariably will, companies have to scramble to recover. Many times, they are simply unable to recover, as the book details in the cases of Omega Engineering and Barings Bank.
The premise of Insider Threat is that companies that don't have a proactive plan to deal with insider threats will ultimately be victims of them. The 10 chapters in the book expand on this and provide analysis of each scenario described.
Chapter 1 defines what exactly insider threats are and provides a number of ways to prevent them. The authors note that there is no silver bullet solution or single thing that can be done to prevent an insider threat. The only way to do this is via a comprehensive program developed within the framework of the information security group. Fortunately, all of these measures are part of a basic information security program, including fundamental topics like security awareness, separation and rotation of duties, least privilege to systems, logging and auditing, and more.
The irony of all of the solutions suggested in chapter 1 is that not a single one of them is rocket science. All of them are security 101 and don't require any sort of expensive software or hardware. Part of the bitter irony is that companies oblivious to these insider threats will spend huge amounts of money to protect against the proverbial evil hacker while the nefarious accounts receivable clerk in the back office is draining the coffers.
One example the book provides is that many companies feel they are safe because they encrypt data. An excellent idea detailed in chapter 2 is to set up a sniffer and examine the traffic on the internal network to ensure that the data is indeed encrypted. The reliance on encryption will not work if it is not set up or configured correctly. The only way to know with certainty is to test it and see how it is transmitted over the wire. Many companies will be surprised that data that should be unreadable is being transmitted in the clear.
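The book doesn't prescribe a specific tool for this check, but the core of it can be sketched with a simple heuristic: encrypted traffic looks like uniformly random bytes, while cleartext is mostly printable ASCII. The helper below is hypothetical and purely illustrative; a real capture loop (tcpdump, Wireshark, scapy, etc.) would feed payloads through something like it.

```python
# Heuristic check for whether a captured payload looks like cleartext.
# Illustrative sketch only: properly encrypted data should resemble
# random bytes, so a high ratio of printable ASCII is a red flag.

def looks_like_cleartext(payload: bytes, threshold: float = 0.9) -> bool:
    """Return True if the payload is mostly printable ASCII."""
    if not payload:
        return False
    # Count printable characters plus tab/newline/carriage return.
    printable = sum(1 for b in payload if 32 <= b <= 126 or b in (9, 10, 13))
    return printable / len(payload) >= threshold

print(looks_like_cleartext(b"password=hunter2&user=alice"))  # cleartext-looking
print(looks_like_cleartext(bytes(range(256))))               # binary-looking
```

Anything the check flags on a link that is supposed to be encrypted warrants a closer look at the configuration on both ends.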
Some of the suggestions the authors propose will likely ruffle some feathers. Ideas such as restricting Internet, email, IM and web access to a limited number of users may sound absurd to some. But unless there is a compelling business need for a user to have these technologies, they should be prohibited. Not only will the insider threat threshold be lowered, productivity will likely increase also.
The authors also suggest prohibiting iPods or similar devices in a corporate environment. The same device that can store gigabytes of music can also be used to illicitly transfer gigabytes of corporate data.
Insider Threat provides verifiable stories from every industry and sector, be it commercial or government. The challenge of dealing with the insider threat is that it requires most organizations to completely rethink the way they relate to security. It is a challenge that many organizations would prefer to remain oblivious to, given the uncomfortable nature of the insider threat. But given that the threats are only getting worse, ignoring them is inviting peril.
The book's only shortcoming is that even though it provides a number of countermeasures and suggestions, they are somewhat scattered and written in an unstructured way. It is hoped that the authors will write a follow-up book that details a thorough methodology and framework for dealing with the insider threat.
Overall, Insider Threat is an important work that should be required reading for every information security professional and technology manager. The issue of the insider threat is real and only getting worse. Those who choose to ignore it are only inviting disaster. Companies that put office supplies and coffee under double lock and key while doing nothing to contain the insider threat are simply misguided and putting their organizations at risk.
Insider Threat is a wake-up call that should revive anyone who doubts the insider threat.
Ben Rothke, CISSP, is a New York City-based security consultant and the author of Computer Security: 20 Things Every Employee Should Know (McGraw-Hill, 2006). He can be reached at ben@rothke.com.
Very true (Score:5, Insightful)
Another problem I've seen is execs granting themselves and their assistants way more access than they really need to do their job. It's a power issue for some of them: "I run the company and should be able to get to anything."
That's not every company and SOX has made thinking about the consequences more attractive for the higher ups.
The only point I would disagree.. (Score:2, Insightful)
Re:The only point I would disagree.. (Score:2)
That's far too sensible. What's supposed to happen is, you find out that some vital system is broken and the two people who have passwords necessary to fix it are on holiday or at a conference.
Even better if it's something like desktop OS or firewall upgrades needing authorisation, when a new virus comes out.
Re:The only point I would disagree.. (Score:2)
You put the blame on whoever drew the policy or whoever didn't comply with it. If it is such a sensitive resource, how the heck is it that there's no way to recall it at any moment?
You can ask "what if the two knowing the password are out of office?" just the same you can ask "what if both pilot and copilot become intoxicated over Atlantic?" Bot
Re:Very true (Score:5, Interesting)
The company I work for for instance... EVERYONE has administrator rights to their desktop. Everyone from us lowly engineers in the back who bend our machines to their limits... up to the sales people who just use our proprietary apps (which do not require admin access) and Outlook.
Long ago, IT tried to restrict most users... unfortunately enough complained about not being able to do what they wanted (not always what they needed to do), and the policy was reversed.
This has of course enabled HR persons to install spyware that was suggested by a secretary.
I am still waiting for the day someone who didn't know any better runs a piece of malware that brings the entire network, and most of its users, to their knees.
Re:Very true (Score:2, Funny)
Why wait, just schedule it the night you leave for vacation.
Re:Very true (Score:2)
Huh? (Score:4, Insightful)
The keys need to be held by only a small group of people. "Too many cooks spoil the soup" applies very well to a corporate network, even down to the workstation configuration. It's possible to screw up the whole enchilada from that point too, or at least have some major negative effect, and if the intent is for it to be a managed network, it's much better for it to actually be managed, dammit. If not, it's a free-for-all.
Many of my users are very smart people. Unfortunately, they're good only with their own home PCs. They don't understand why we don't always do things the same way that they themselves do them, nor will they until they come to appreciate the demands that present themselves in trying to keep a 30,000 computer network up and functioning for everyone despite their different needs. Where I work, our network is supported by ten field and bench technicians, two data cabling technicians, two telephone system technicians, and four helpdesk persons as far as interface-with-the-user support is concerned. Our back end is four network engineers, four software specialists, one AS/400 administrator, two Computer Operators, and a slew of programmers to write the software that the users will do their jobs with. It's a very, very small department given the size of the organization, and if we had better, tighter control over the security of the workstations it'd be a much easier job.
Re:Huh? (Score:5, Insightful)
1. Tight control. In this method, the IT people keep the users from doing anything to break or fix the systems.
2. Hands off. In this method, the IT people say "fix it yourself".
In my opinion the first one rarely works for very long.
IT administrators should tell new employees from the very beginning that they will maintain the network, period. If somebody screws up their machine, the IT folks might help the user figure out how to fix it, but the person should have to do the actual work him/herself. This encourages people to take responsibility for their actions, which leads to people actually taking care of their work machines. That was the policy at my former employer (though they did help the marketing folks a bit). It's also the policy of my current employer. From what I have seen, it has worked extremely well.
Putting in a paranoid policy like not giving users admin rights to their own workstations only coddles the users and lulls them into a false sense of security. After all, the IT department is protecting them from breaking anything, so no matter what they do, if the software lets them, it must be safe. It leads to people doing utterly stupid things that they would never do with their own machines---precisely because on their own machines, they would have to fix it if they break it.
As for the premise that users will screw things up if they have any control, my experience has been exactly the opposite. I find that software lock-downs tend to be buggy and cause more problems than they solve. I've seen university computer labs run in a paranoid style and university labs with nearly identical machines run with an open policy. The paranoid lab constantly experienced weird crashes and generally unusable systems. The "do what you want" lab, to my knowledge, hasn't had any non-hardware-related service calls since I helped set it up in 1996.
It is my experience that trusting people until they prove to be idiots is always the best policy. If you trust someone and they betray your trust, you will never trust them again, and they know this. Thus, trusting someone tends to inspire trustworthy behavior. By contrast, paranoid information hiding, control hoarding, and other such authoritarian behavior tends to breed suspicion and contempt, which tends to lead to untrustworthy behavior.
For example, companies that tend to closely guard their secrets within the company, only providing information to people with a "need to know" tend to have much higher leak rates than companies that are open and trusting of their employees. This boils down to basic psychology. Secrecy breeds a feeling of power---that excitement over knowing something that no one else knows---and the only way to exercise that power is by proving to others that you do, in fact, know something that they don't know, which can only be done by leaking information. If you can share that information within the company, most people do so out of loyalty to the company. If you can't, the destination of the leaked information tends to be the press.
This isn't to say that monitoring for improper behavior isn't useful. It is always a good thing to find out quickly when someone is betraying your trust, allowing you to take immediate corrective action. In the field of IT, for example, you should have the ability to detect suspicious network activity, break-in attempts, etc. Centralized system logging can also be useful in this regard. However, if you trust people until they show reason not to do so, the vast majority of people will behave appropriately. If you distrust people until they earn your trust, the majority of people will do everything they can to work around you and subvert your control. That is not a healthy work environment.
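The monitoring capability described here can be sketched in a few lines: scan centralized logs for repeated failed logins and surface the accounts involved. The log format and threshold below are hypothetical, purely to illustrate the "detect quickly, then act" approach rather than any particular product.

```python
# Minimal sketch of after-the-fact monitoring: flag users with repeated
# failed logins in a centralized log. Log format here is made up.
from collections import Counter

def flag_suspicious(log_lines, threshold=3):
    """Return users with at least `threshold` failed login attempts."""
    failures = Counter()
    for line in log_lines:
        # Hypothetical format: "<timestamp> LOGIN_FAIL user=<name>"
        if "LOGIN_FAIL" in line:
            user = line.split("user=")[-1].strip()
            failures[user] += 1
    return [user for user, n in failures.items() if n >= threshold]

logs = [
    "09:01 LOGIN_OK user=alice",
    "09:02 LOGIN_FAIL user=mallory",
    "09:02 LOGIN_FAIL user=mallory",
    "09:03 LOGIN_FAIL user=mallory",
]
print(flag_suspicious(logs))  # ['mallory']
```

The point is that this kind of detection works alongside trust rather than replacing it: you notice the betrayal quickly without surveilling everyone up front.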
Personally, I've always said that the best way to stop press leaks from a company is to create a competing rumor site, see who submits information to it, and take corrective action. Introduce a situation where an une
Re:Huh? (Score:5, Insightful)
So what happens if they can't fix it? Do you just fire them, reload their computer, and hire the next guy?
What makes the most sense to me is to store all a user's data on the network, forcing them to do so if at all possible but at minimum making it easy to do so, and have a system image for each PC in your organization. If they scrag their computer somehow, then you can just reload from the image and move on with your life.
Re:Huh? (Score:2)
Re:Very true (Score:2)
So what? Each and every version of MS Office has codified escalation to admin privileges at some points - even when running with a restricted user account. MS has acknowledged this problem. Running as a restricted user is only a small part of defense-in-depth.
Re:Very true (Score:2)
Re:Very true (Score:2)
I run a small server 2003/win xp network and it's really pretty easy to do anything you want. You can even choose a default list of bookmarks to put on a user's web browser.
One problem I have noticed is shitty web based
Re:Very true (Score:2)
Only once I had the administrator password was I able to connect successfully.
As always, it's a conflict between ease of use and security..
Re:Very true (Score:2, Interesting)
This book reminded me of another good read, "Art of Deception" by Kevin D. Mitnick. You would be surprised how
Re:Very true (Score:3, Informative)
No, I wouldn't be surprised. I'm able to figure out any random user's password about 70% of the time just based on their pictures or other obvious habits. Couple that with organizations that give users full local computer administrator access (the bane of any kind of real security) and weak password schemes on remote systems and it's a wonder that there a
Re:Very true (Score:2)
This book (and especially Mitnick's book) points up the folly of infuriating password schemes where each user has 17 passwords that are each changed on different schedules with different r
Re:Very true (Score:2)
When I was a student at the university I used the X-terminals that they had in the computing sites. I didn't have any access more than any one else who had userlevel access to the UNIX
Re:Very true (Score:2)
Re:Very true (Score:2)
The two longest parts of the process are waiting for the win2k cd to boot, and loading the rainbow tables into RAM.
Re:Very true (Score:2)
And the process that I describe and use really doesn't take any skill with a computer.
Re:Very true (Score:2, Funny)
Re:Very true (Score:4, Interesting)
This book reminded me of another good read, "Art of Deception" by Kevin D. Mitnick. You would be surprised how easy it is to get information from people.
I was working for a large retailer about five years ago when I accidentally sent the wrong pricing file for a sign-making program to all 105 stores in our marketing area. So I needed to get into each store's computer via PC Anywhere and manually change the file. It went something like this:
Mgr or Asst. Mgr.: This is Mr./Mrs./Ms. Manager, how can I help you?
Me: Hi, I know that you don't know me but this is Joe from Advertising. I make up the signs and there's an error with next week's file that I need to fix.
Mgr or Asst. Mgr.: Oh, well we certainly don't need wrong information on our signs. What do you need me to do?
Me: Right click on Network Neighborhood, double-click the connection and read me your IP address.
Mgr or Asst. Mgr.: Okay, it's xxx.xxx.x.xxx
Me: Super. I will be in your computer changing some stuff for a few minutes so don't be alarmed if stuff starts happening on your screen.
Mgr or Asst. Mgr.: Okay, thanks.
The crazy thing about it is *not one person* in the 105 stores ever questioned whether I should have that information even though none of them knew me or could ascertain where I was calling from. Not even close--they all cheerfully did what I asked without hesitation. Scary!!
Re:Very true (Score:2)
Re:Very true (Score:3, Insightful)
And you should have been. You don't go "telling" your boss what he can or cannot have, he's your boss. If he tells you to do it, do it. It's then his liability.
Why are there so many IT people with zero interpersonal skills? Instead of flat out refusing, you could've simply explained why it wouldn't be a good idea. It's your job to present the facts, and you can even
Re:Very true (Score:2)
It's a tough call, but this could have had significant liability for the employee as well. Depending on the circumstances, this may have been a time to put the foot down.
Re:Very true (Score:2)
Sometimes there are liability issues too. If that box is your responsibility and your boss fscks things up... guess what, it is still your responsibility.
Feel free to write a memo 'confirming' what the boss has put in place. This is standard ass-covering tactics if you are forced to do something you don't like. Oh and
Re:Very true (Score:2)
The truth is that some managers are simply incapable of grasping the technical intricacies of IT work. This was clearly the case here: my boss at the time heard me say "root access" once in relation to one of our servers, and said that he wanted the ability to perform root tasks. He didn't need that ability, wouldn't have known how to appl
Re:Very true (Score:2)
I don't think your boss should've had root in the context you described, and I do agree that he's unsophisticated given the portrait you've given.
My only disagreement, and probably why you almost got disciplined, was the way you implied you handled it. You'll likely never lose your job in IT for being unpolished, but you'
BS (Score:3, Insightful)
Anyone that stands in the way of this should be fired.
If you can't trust your IT people with this access, then they should be fired.
As far as the owner having total access, well it's his f-ing place. HIS butt is on the line. He gets what he wants, always. Deal with it.
Re:BS (Score:2)
Re:BS (Score:2)
Re:BS (Score:2)
Data is another matter. IT does not need access to sensitive busin
Encrypted Data (Score:2)
Nope, no encryption allowed on my network, unless I hold the key.
If any data is missing, you will be accused regardless. You are the computer guy, remember; it's ALWAYS your fault.
Re:Encrypted Data (Score:2)
Also, if a company can't trust me to hold all the keys, I don't want to work for them in the first place.
And just for the record, at the place I work currently, I do hold all the keys. I'm entrusted with our network/data safety (as I have been at previous jobs). I also take that responsibility seriously.
Re:Encrypted Data (Score:2)
For most organizations, though, real security is not that important. For those instances where it is necessary, though, having a single employee in a position to co
Re:BS (Score:2)
If you can't trust your IT people with this access, then they should be fired.
Ahhh, now we agree! Sadly, that's not my call.
Translucent Databases (Score:2)
While I disagree with the whole of this statement, I disagree most vehemently with the part in bold, so I'll address that.
In a world that cared about data security, NO EMPLOYEE WOULD EVER BE GIVEN ACCESS TO CUSTOMER DATA THAT WAS ONLY USED TO DRIVE THE APPLICATION. Take a look at the ideas in the book Translucent Databases [wayner.org] (actually, even just read the summary on that page) a
All Data. No exceptions (Score:2)
If the CEO wants the data, it's his to have. Period. End of discussion.
Re:BS (Score:2)
I reckon this is rubbish. I reckon that user data should be encrypted, so that only the people the user wants to give access to get access, and that includes IT staff. If I get my way - and as an IT Manager I just might - I'll be putting in place systems that devolve authority to determine who reads what to the people who own the data, and that'
Re:BS (Score:2)
Re:Very true (Score:2)
Re:Very true (Score:3, Insightful)
Agreed (Score:5, Funny)
Re:Agreed (Score:2, Funny)
But I bet you found it again, taped under your keyboard.
Re:Agreed (Score:2)
Too much trust... (Score:5, Interesting)
Re:Too much trust... (Score:2)
Let's call that the "post-9/11 effect".
Re:Too much trust... (Score:2, Insightful)
It's VERY common to separate the administrative tasks of purchasing and renewing maintenance agreements from engineering.
another book... (Score:2)
(Although when I read the title, I kept thinking of detecting things that are extruded. WARNING! SILLY PUTTY FUN FACTORY [feelingretro.com] DETECTED.)
I hate books like this (Score:4, Insightful)
I hate posts like this (Score:2)
I hate to point it out to you, but company rules (and government laws, btw) are not written for those who are already doing good. They are written to limit the impact of someone who lacks your good behavior.
Other posts have commented about the balance involved, and it is a difficult one to strike. In many cases, the official geeks (i.e. IT staff charged with maintaining the systems, etc.) need greater access, but part of the company's process should include a method of documenting who gets such access,
Do you want that trust? (Score:3, Insightful)
woo,... (Score:4, Funny)
Yes, which is why we "need" Trusted Computing(tm) which will solve all of our problems.
Oblivious to the problem, or resigned to it? (Score:5, Insightful)
I doubt many companies are "oblivious" to the insider threat, it's just considered an acceptable cost of doing business. For example, a grocery store I used to work at knew perfectly well that their employees were lifting candy from the bulk candy dispenser (to pick an example). But they also knew the money they lost on that was significantly less than the cost of installing cameras and paying someone to review the tapes, or than the cost in lost sales of eliminating the bulk candy dispenser. So, when someone was caught red-handed, they were read the riot act (at least) or outright fired (at worst), but no special effort was made to catch people.
I don't think the owners of that grocery store were business prodigies, either. My guess is that the same sort of logic applies to most employers: the cost of preventing the infraction is higher than the cost of allowing it. The truth of this is reflected in which industries do protect themselves against the "insider threat": places like casinos, where a successfully criminal insider could lose them huge quantities of money.
Meanwhile, the book seems to make the same suggestion a lot of security experts do: if a user doesn't need the technology, then don't let them use it. This sounds good, but it carries costs, too. First, of course, the cost of setting up and maintaining a network that enforces such policies. But second, the cost in employee morale, which cannot be discounted. Another job I had not all that long ago was in an office that didn't allow its employees to listen to talk radio. Music was fine, but talk radio was too much of a distraction. Since you didn't need it to do your job, you weren't allowed to have it.
The effect on morale was, to put it mildly, negative. Honestly, it's one of the reasons I didn't have the job for very long. Email and internet access are similar: employees have become accustomed, rightly or wrongly, to some personal use of these technologies. Take that away, and you're sure to end up with disgruntled employees, no matter how rational your reasons.
Moreover, it's a question of trust. If you demonstrate to all your employees that you don't trust them, odds are good you'll increase the number of employees who will live up (or down, if you prefer) to your expectation. At best, you'll incur the costs associated with high turnover rates. At worst, you'll fall victim to even more pernicious crime than you otherwise might have.
I guess the point is, it's not necessarily ignorance or even apathy that causes businesses to be vulnerable to insiders, it's simple cost/benefit analysis.
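That cost/benefit logic is simple arithmetic. With made-up numbers for the grocery-store example (both figures are hypothetical, just to make the comparison concrete):

```python
# Illustrative cost/benefit arithmetic: prevention pays off only when
# its cost is below the expected loss it eliminates. Numbers are made up.
expected_annual_theft = 2_000.0   # hypothetical shrinkage from candy lifting
camera_cost_per_year = 8_000.0    # hypothetical cameras + someone to watch them

worth_preventing = camera_cost_per_year < expected_annual_theft
print(worth_preventing)  # False: cheaper to tolerate the loss
```

Flip the numbers (say, a casino where one crooked dealer can move six figures) and the same comparison says to pay for the surveillance.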
Be careful not to over simplify the issue (Score:2)
What I can say is that protecting a company's financial information or intellectual property is of much greater value to our company than some missing inventory. I also know that after having read this review, I am interested in understa
Re:Oblivious to the problem, or resigned to it? (Score:3, Insightful)
You don't want me to do some personal emailing from the work account? Fine, I'll make sure that I work exactly 8 hours a day, so that I get to have enough time to email from home. You expect me to d
Re:Oblivious to the problem, or resigned to it? (Score:2)
Paranoia the destroyer, and it goes like this... (Score:2)
These guys are right, but how am I supposed to trust them?
Re:Paranoia the destroyer, and it goes like this.. (Score:2)
Did the Submitter Even Read the Book? (Score:2)
whatever... (Score:5, Funny)
never mind
they have no idea! (Score:3, Interesting)
Re:they have no idea! (Score:3, Interesting)
1. Requiring special characters
2. Requiring a lower case and an upper case letter
3. Changing passwords every 30 days
4. No common words
This all leads to lower security with a post it note.
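Mechanically, rules 1, 2 and 4 above are just string checks (rule 3, rotation, is a server-side setting and can't be tested on the password itself). A sketch, with a deliberately tiny illustrative word list:

```python
# Sketch of the complexity rules above. The common-word list is
# illustrative only; real deployments use large dictionaries.
import re

COMMON_WORDS = {"password", "letmein", "qwerty"}

def meets_policy(pw: str) -> bool:
    if not re.search(r"[^A-Za-z0-9]", pw):   # 1. special character required
        return False
    if not (re.search(r"[a-z]", pw) and re.search(r"[A-Z]", pw)):  # 2. mixed case
        return False
    if pw.lower() in COMMON_WORDS:           # 4. no common words
        return False
    return True

print(meets_policy("Tr0ub4dor&3"))  # True
print(meets_policy("password"))     # False
```

Which is the grandparent's point exactly: the rules are trivial to enforce in software, but nothing in them stops the resulting unmemorable string from ending up on a post-it.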
Re:they have no idea! (Score:2)
Anyone with an IQ over room temperature can memorize a sequence of 8 alphanumeric characters.
If they can't, they shouldn't be working for you. Period.
Same for writing it down - it should be a terminable offense.
Re:they have no idea! (Score:2)
They shouldn't have to. What are the chances of an intruder getting a password from a brute force attack over a post it note?
Re:they have no idea! (Score:2)
Re:they have no idea! (Score:2)
Computer security in general is dangerously bad, damn right we should be intolerant.
Re:they have no idea! (Score:2)
That's the problem.
Re:they have no idea! (Score:2)
Re:they have no idea! (Score:2, Interesting)
All I have to remember is where to start and the pattern (which is easy).
Re:they have no idea! (Score:2)
I was at the railing overlooking the 'center' of the mall. By coincidence, the security desk was directly below me... I watched the security guy type in his password.
So much for strong password policy.
From a healthcare perspective (Score:5, Interesting)
The main issue is that most people can look at any patient. This is considered a "necessary evil" as sometimes unexpected clinicians might be looking at a patient's information and we don't want to block access in a life threatening situation. Instead, we review access after the fact, in addition to putting certain blocks in place:
Re:From a healthcare perspective (Score:2)
Comment removed (Score:3, Funny)
Re:"too much access ... than they should ... have" (Score:2)
TRUST NO ONE (Score:3, Interesting)
God I'd hate to live in the world you would create.
Re:TRUST NO ONE (Score:2)
>>God I'd hate to live in the world you would create.
Here's an idea. Start up company... say, retail, perhaps. Make sure that the data used in managing that business involves personnel records, credit card data, health insurance policies, bank info - all the usual stuff. And then hire a bunch of people, trusting all of them entirely to have access to everything. Let us kno
Re:TRUST NO ONE (Score:2)
Indeed. Who needs to do a google search for their programming problems anyway, when you could just spend 3 weeks trying to solve it by yourself?
Re:TRUST NO ONE (Score:2)
This has the added bonus that someone who sits down at someone else's desk has limited access. Without managing rights, someone unauthorize
OH, joy. Another anti-IT witch-hunting book. Yay! (Score:2, Interesting)
Meanwhile, over here IN REAL LIFE, people like me are running a company's entire business, with full access to everything, and yet, we don't break the law! We don
Re:OH, joy. Another anti-IT witch-hunting book. Ya (Score:2)
Who reads it? Management, which feels above reproach and won't consider itself to be a threat even if your book has a whole chapter entitled "Management: A Threat".
Who suffers from it? Mostly the IT department, because they're closest to the data, they're usually not politically connected at all, and Management doesn't know
Re:OH, joy. Another anti-IT witch-hunting book. Ya (Score:2)
Too much lockdown costs money too (Score:3, Interesting)
Re:Too much lockdown costs money too (Score:2)
most of our developers have full access to all the databases they need.
they also have a bit more access than they really should, usually because they're debugging something that requires it. most of the time it's not a huge problem,
but sometimes it can come back to bite your ass. recently, one service crashed, and none of the admins were there to fix it. one of our brilliant developers (who also happens to be a manager) decided he'd "fix" it himself, and he
Re:Too much lockdown costs money too (Score:2)
I have seen that before. You've got to know your limits. I am often given sysadmin privileges to an Oracle database. If I see any kind of weird Oracle error I don't even try to play with it. That's the DBA's problem.
Re:Too much lockdown costs money too (Score:3, Informative)
Re:Too much lockdown costs money too (Score:2)
And then I sit on my hands and post to Slashdot.
Now that's a clever trick...
I agree (Score:2)
my take on trust (Score:5, Funny)
...Is that Zen or what?
THE INTERNET IS NOT SECURE (Score:2)
The Internet is not secure.
And it does not need to be.
It was not designed so that large corporations could sell security services on it.
The Internet is an open field. A common.
If you want the Cone of Silence, you know where to find it.
THE BOOK IS NOT ABOUT THE INTERNET (Score:2)
The intranet is not the internet.
And the main subject of the book does not address the internet.
It was not designed to address internet security.
The intranet is not an open field. (P.S. Ever hear of the "tragedy of the commons"? Calling the internet a common is, to say the least, unnerving.)
If you want the award of irrelevance, continue commenting.
Re:THE BOOK IS NOT ABOUT THE INTERNET (Score:2)
And the internet is a common. The RFCs aren't standards except by consent of those choosing to stay within them.
Look in the mirror for that irrelevance you seek.
Security has a cost (Score:5, Insightful)
One of the wisest comments I've heard on security was: security is the tax that the rest of us pay because some people are immoral.
Security has a definite cost. Casinos are probably the extreme example. They tend to hire people paid an hourly wage who handle large amounts of money. Perhaps they have little choice but to watch them all the time. The people working at the casino are generally willing to put up with a total-surveillance work environment because the jobs pay better than most relatively unskilled jobs.
I have not read the book that was reviewed, but the reviewer seems to suggest that something like this kind of total-surveillance environment is desirable. The problem is that such an environment exacts a cost from the majority of honest and moral people in the hope that it will deter or catch those who are dishonest. A heavily restricted surveillance environment is likely to drive away many people who have other job options. As espionage scandals have shown, there is never any guarantee that any set of countermeasures will ensure that someone does not betray trust.
There always has to be a balance between risk and the cost of the security measures. Security "professionals" like the reviewer seem to forget this. After all, it is not their problem when people quit for a more pleasant environment or when the organization cannot attract highly qualified people who can choose to work elsewhere.
IT Security (Score:2, Interesting)
I'd have to say that this is actually blown a tad out of proportion.
I used to work as a HelpDesk Technician for a school. The job was a bit different from ordinary HelpDesk positions at other places: I didn't handle problems over the phone. I'd walk to the office and fix it there. To do my job I was told the password for the built-in admin account on every machine. I was just a volunteer, too.
However, I often needed to get into someone's office when that person was absent. So I had to call security and
A Better Solution (Score:3, Insightful)
- Hire good employees, who are relatively honest and straightforward people. This includes everyone -- IT, Sales, Administrative, etc. If they aren't honest, they shouldn't be working here. (This also tends to help with Corporate Responsibility -- how NOT to fudge the books in a crunch.) There are decent HR personality tests that can reasonably predict whether someone would be untrustworthy in different situations.
- Deal with your employees fairly and honestly, and be upfront. This will minimize the biggest source of insider problems -- disgruntled employees. For example, giving yourself a raise just after or just before laying off other employees is generally a Bad Thing (tm). Be honest with employees about their performance, what is expected, and what won't fly. Provide regular, upfront feedback. Follow through with action. Be Kind, Understanding, but Firm.
- Trust your employees to make sound decisions. The employee who is berated and treated as if they "can't be trusted" will eventually turn into the employee you fear them to be. If you don't trust them to start, then why should they care? Moreover, if you don't trust them, why did you hire them?
- Give people ample access to what they need, but not so much access that it impedes others. For example, the IT administrator should have access to quite a bit. Making them ask for a password to do their job is not only inefficient, it's demeaning and downright stupid. Do you trust the IT people you have hired? Do you believe them to be competent? If so, then let them do their job. If not, then why did you hire them, or why are they still working there? It's incredibly frustrating to employees to do what this book recommends -- lock down access. It's frustrating to the employee because they have to "ask" to do their job. And it's frustrating to management, which has to constantly hand-hold, entering passwords as the employee progresses. Cut the leash.
Overall, I think it's important for IT security people and Management to understand these risks and to watch for violations. But to base your company security policies on these types of ideas would be lunacy, and would kill any sort of company morale you might have had going for you. It's much easier to trust the people who work for you, pay them fairly and well, and treat them like human beings than it is to try to lock them down in every way to "prevent" bad things.
Certainly there are exceptions where even a very small percentage of bad employees can cause very large damage to the company. This should be dealt with appropriately within those industries -- and employees should know this DURING the application process, so they know what kind of BigBrother situation they are getting into.
B
Re:A Better Solution (Score:2)
Guilty till proven innocent? Or..? (Score:2)
The other way is to trust everybody - that tends to make people feel responsibility for the company, the team, the project or whatever. This doesn't mean that everybody should h
Re:Everyone's password is taped to their monitor . (Score:2)
This was the policy at one site I worked at. One day I had to ask a fellow worker to show me some bad data on their workstation. They had gone home for the day, but a nearby cube dweller helped me out.
The password for the month (in that entire office) had been agreed to be abcyyyymm or some such where abc is known to everybody and the rest is just the date.
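A shared "password of the month" like this is trivially predictable: once anyone learns the pattern, every password for the year is enumerable in advance. As a minimal sketch (the prefix `abc` is the commenter's placeholder, not a real value), the scheme described above amounts to:

```python
from datetime import date

def monthly_password(prefix: str, d: date) -> str:
    # Shared office password: fixed prefix + 4-digit year + zero-padded month,
    # matching the "abcyyyymm" pattern described in the comment.
    return f"{prefix}{d.year}{d.month:02d}"

# Anyone who learns the pattern can list all twelve passwords for a year.
candidates = [monthly_password("abc", date(2006, m, 1)) for m in range(1, 13)]
print(candidates[0])  # abc200601
```

Twelve guesses per year is far weaker than even a short random password, which is the point: rotating a predictable value adds no real security.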