Insider Threat

Ben Rothke writes "Thousands of computer security books have been published dealing with every conceivable security issue and technology. But Insider Threat is one of the first to address one of the most significant threats to an organization: the trusted insider. The problem is that within information technology, many users have far more access and trust than they truly should." Read the rest of Ben's review.
Insider Threat
author: Eric Cole and Sandra Ring
pages: 397
publisher: Syngress
rating: 9
reviewer: Ben Rothke
ISBN: 1597490482
summary: Excellent overview of the insider threat to networks and information systems


The retail and gambling sectors have long understood the danger of the insider threat and have built their security frameworks to protect against both insiders and outsiders. Shoplifters are a huge bane to the retail industry, exceeded only by theft by employees behind the registers. The cameras and guards in casinos watch both those in front of and those behind the gambling tables. Casinos understand quite well that an employee who spends 40 hours a week on-site handling hundreds of thousands of dollars will, over time, learn where the vulnerabilities and weaknesses are. A minority of these insiders will commit fraud, which is invariably far worse than anything an outsider alone could carry out.

Insider Threat is mainly a book of real-life events that show how the insider threat is a problem affecting every organization in every industry. In story after story, the book details how trusted employees find weaknesses in systems in order to carry out financial or political attacks against their employers. It is the responsibility of the organization to ensure that its infrastructure is designed to detect these insiders and that its systems are resilient enough to defend against them. This is clearly not a trivial task.

The authors note that the crux of the problem is that many organizations tend to think that once they hire an employee or contractor, the person becomes part of a trusted group of dedicated and loyal employees. Given that many organizations don't perform background checks on their prospective employees, they are placing a significant level of trust in people they barely know. While the vast majority of employees are honest and can be trusted, the danger of the insider threat is that the proverbial bad apple can spoil the whole barrel. The book details numerous stories of how a single bad employee has caused a company to go out of business.

Part of the problem with the insider threat is that because companies are oblivious to it, they have no framework in place to detect when it is happening or to deal with it when it occurs. Thus, when the insider attack does occur, which it invariably will, companies have to scramble to recover. Many times, they are simply unable to, as the book details in the cases of Omega Engineering and Barings Bank.

The premise of Insider Threat is that companies without a proactive plan to deal with insider threats will ultimately become victims of them. The 10 chapters in the book expand on this and provide analysis of each scenario described.

Chapter 1 defines what exactly insider threats are and provides a number of ways to prevent them. The authors note that there is no silver bullet, no single thing that can be done to prevent an insider threat. The only way to do this is via a comprehensive program developed within the framework of the information security group. Fortunately, all of the necessary elements are part of a basic information security program, including fundamentals such as security awareness, separation and rotation of duties, least privilege to systems, logging and auditing, and more.
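These controls are easy to sketch in code. As a minimal illustration (the role and permission names below are invented for the example, not taken from the book), least privilege just means denying by default and granting each role only an explicit whitelist:

```python
# Least-privilege sketch: deny by default, grant each role only an
# explicit whitelist of actions. Role and permission names are invented.
ROLE_PERMISSIONS = {
    "ar_clerk": {"invoices.read", "invoices.create"},
    "auditor":  {"invoices.read", "audit_log.read"},
    "sysadmin": {"users.manage", "audit_log.read"},
}

def is_allowed(role: str, action: str) -> bool:
    """Anything not explicitly granted to the role is denied."""
    return action in ROLE_PERMISSIONS.get(role, set())

# The accounts receivable clerk can touch invoices but nothing else.
assert is_allowed("ar_clerk", "invoices.read")
assert not is_allowed("ar_clerk", "users.manage")
```

Separation of duties falls out of the same structure: no single role here holds both `invoices.create` and `audit_log.read`, so the person creating records is never the one auditing them.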

The irony of all of the solutions suggested in chapter 1 is that not a single one of them is rocket science. All of them are security 101 and require no expensive software or hardware. Part of the bitter irony is that companies oblivious to insider threats will spend huge amounts of money protecting against the proverbial evil hacker while ignoring the nefarious accounts receivable clerk in the back office who is draining the coffers.

One example the book provides is that many companies feel they are safe because they encrypt data. An excellent idea detailed in chapter 2 is to set up a sniffer and examine the traffic on the internal network to verify that the data is indeed encrypted. Reliance on encryption will not work if it is not set up or configured correctly. The only way to know with certainty is to test it and see how the data actually travels over the wire. Many companies will be surprised to find that data that should be unreadable is being transmitted in the clear.
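A rough way to automate that spot check (a heuristic sketch only, and no substitute for inspecting captures from a real sniffer such as tcpdump): properly encrypted payloads look statistically random, so a byte-entropy test can flag captured payloads that are suspiciously plaintext-like.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte, 0..8. Encrypted or compressed data sits
    near 8; English text and most protocols sit well below 6."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def looks_encrypted(payload: bytes, threshold: float = 7.2) -> bool:
    """Crude check: flag a payload as plausibly encrypted if its byte
    distribution is close to uniform. The threshold is a judgment call."""
    return shannon_entropy(payload) >= threshold

# Cleartext form data fails the test; random bytes (a stand-in for
# ciphertext) pass it.
assert not looks_encrypted(b"acct=12345&ssn=078-05-1120&name=John+Doe" * 20)
assert looks_encrypted(os.urandom(4096))
```

Short payloads make the estimate noisy, so in practice you would only score packets above some minimum size and alert on flows that consistently fail.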

Some of the suggestions the authors propose will likely ruffle some feathers. Ideas such as restricting Internet, email, IM and web access to a limited number of users may sound absurd to some. But unless there is a compelling business need for a user to have these technologies, they should be prohibited. Not only will the insider threat threshold be lowered, productivity will likely increase as well.

The authors also suggest prohibiting iPods and similar devices in a corporate environment. The same device that can store gigabytes of music can also be used to illicitly transfer gigabytes of corporate data.

Insider Threat provides verifiable stories from every industry and sector, commercial and government alike. The challenge of dealing with the insider threat is that it requires most organizations to completely rethink the way they relate to security. It is a challenge that many organizations would prefer to remain oblivious to, given the uncomfortable nature of the insider threat. But given that the threats are only getting worse, ignoring them is inviting peril.

The book's only shortcoming is that even though it provides a number of countermeasures and suggestions, they are somewhat scattered and written in an unstructured way. It is hoped that the authors will write a follow-up book detailing a thorough methodology and framework for dealing with the insider threat.

Overall, Insider Threat is an important work that should be required reading for every information security professional and technology manager. The insider threat is real and only getting worse. Those who choose to ignore it are inviting disaster. Companies that put office supplies and coffee under double lock and key while doing nothing to contain the insider threat are simply misguided and putting their organizations at risk.

Insider Threat is a wake-up call that should revive anyone who doubts the insider threat.

Ben Rothke, CISSP, is a New York City-based security consultant and the author of Computer Security: 20 Things Every Employee Should Know (McGraw-Hill, 2006). He can be reached at ben@rothke.com"


You can purchase Insider Threat from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
  • Very true (Score:5, Insightful)

    by HangingChad ( 677530 ) on Friday January 06, 2006 @03:40PM (#14411163) Homepage
    The problem is that within information technology, many users have far more access and trust than they truly should.

    Another problem I've seen is execs granting themselves and their assistants way more access than they really need to do their jobs. It's a power issue for some of them: "I run the company and should be able to get to anything."

    That's not every company, and SOX has made thinking about the consequences more attractive for the higher-ups.

    • is that I would want access in case, for whatever reason, I had to throw the admin out the door and get someone else to do his job.
      • "I would want access in case, for whatever reason"

        That's far too sensible. What's supposed to happen is, you find out that some vital system is broken and the two people who have passwords necessary to fix it are on holiday or at a conference.

        Even better if it's something like desktop OS or firewall upgrades needing authorisation, when a new virus comes out.
        • "What's supposed to happen is, you find out that some vital system is broken and the two people who have passwords necessary to fix it are on holiday or at a conference."

          You put the blame on whoever drew up the policy or whoever didn't comply with it. If it is such a sensitive resource, how the heck is it that there's no way to recall it at any moment?

          You can ask "what if the two people who know the password are out of the office?" just the same as you can ask "what if both the pilot and copilot become intoxicated over the Atlantic?" Bot
    • Re:Very true (Score:5, Interesting)

      by DaHat ( 247651 ) on Friday January 06, 2006 @03:46PM (#14411219)
      There can often be a trickle down effect of that as well... resulting in nearly the entire company having too much access.

      The company I work for, for instance... EVERYONE has administrator rights to their desktop. Everyone from us lowly engineers in the back, who bend our machines to their limits, up to the sales people who just use our proprietary apps (which do not require admin access) and Outlook.

      Long ago, IT tried to restrict most users... unfortunately enough complained about not being able to do what they wanted (not always what they needed to do), and the policy was reversed.

      This has of course enabled HR persons to install spyware that was suggested by a secretary.

      I am still waiting for the day someone who didn't know any better runs a piece of malware that brings the entire network, and most of its users, to their knees.
      • by diersing ( 679767 )
        "I am still waiting for the day someone who didn't know any better runs a piece of malware that brings the entire network, and most of its users, to their knees."

        Why wait, just schedule it the night you leave for vacation.

        • I don't know if that's such a bad thing all of the time, especially if you work on the road or from home. A lot of the people at our company and IT company are pretty good with computers. If a machine breaks, it's quicker and cheaper to be able to fix it right then and there rather than calling desktop support and having them charge you for the repairs. It's even better now that our help desk is outsourced to India. Not that that's bad, but sometimes it's hard to understand. Granted the way our install images are set
        • Huh? (Score:4, Insightful)

          by TWX ( 665546 ) on Friday January 06, 2006 @04:18PM (#14411496)
          I'm trying to figure out if you're attempting to be sarcastic in places or not, but I'm still not quite sure.

          The keys need to be held by only a small group of people. "Too many cooks spoil the soup" applies very well to a corporate network, even down to the workstation configuration. It's possible to screw up the whole enchilada from that point too, or at least have some major negative effect, and if the intent is for it to be a managed network, it's much better for it to actually be managed, dammit. If not, it's a free-for-all.

          Many of my users are very smart people. Unfortunately, they're good only with their own home PCs. They don't understand why we don't always do things the same way that they themselves do them, nor will they until they come to appreciate the demands that present themselves in trying to keep a 30,000 computer network up and functioning for everyone despite their different needs. Where I work, our network is supported by ten field and bench technicians, two data cabling technicians, two telephone system technicians, and four helpdesk persons as far as interface-with-the-user support is concerned. Our back end is four network engineers, four software specialists, one AS/400 administrator, two Computer Operators, and a slew of programmers to write the software that the users will do their jobs with. It's a very, very small department given the size of the organization, and if we had better, tighter control over the security of the workstations it'd be a much easier job.
          • Re:Huh? (Score:5, Insightful)

            by dgatwood ( 11270 ) on Friday January 06, 2006 @05:25PM (#14412110) Homepage Journal
            There are two methods of IT:

            1. Tight control. In this method, the IT people keep the users from doing anything to break or fix the systems.
            2. Hands off. In this method, the IT people say "fix it yourself".

            In my opinion the first one rarely works for very long.

            IT administrators should tell new employees from the very beginning that they will maintain the network, period. If somebody screws up their machine, the IT folks might help the user figure out how to fix it, but the person should have to do the actual work him/herself. This encourages people to take responsibility for their actions, which leads to people actually taking care of their work machines. That was the policy at my former employer (though they did help the marketing folks a bit). It's also the policy of my current employer. From what I have seen, it has worked extremely well.

            Putting in a paranoid policy like not giving users admin rights to their own workstations only coddles the users and lulls them into a false sense of security. After all, the IT department is protecting them from breaking anything, so no matter what they do, if the software lets them, it must be safe. It leads to people doing utterly stupid things that they would never do with their own machines---precisely because on their own machines, they would have to fix it if they break it.

            As for the premise that users will screw things up if they have any control, my experience has been exactly the opposite. I find that software lock-downs tend to be buggy and cause more problems than they solve. I've seen university computer labs run in a paranoid style and university labs with nearly identical machines run with an open policy. The paranoid lab constantly experienced weird crashes and generally unusable systems. The "do what you want" lab, to my knowledge, hasn't had any non-hardware-related service calls since I helped set it up in 1996.

            It is my experience that trusting people until they prove to be idiots is always the best policy. If you trust someone and they betray your trust, you will never trust them again, and they know this. Thus, trusting someone tends to inspire trustworthy behavior. By contrast, paranoid information hiding, control hoarding, and other such authoritarian behavior tends to breed suspicion and contempt, which tends to lead to untrustworthy behavior.

            For example, companies that tend to closely guard their secrets within the company, only providing information to people with a "need to know" tend to have much higher leak rates than companies that are open and trusting of their employees. This boils down to basic psychology. Secrecy breeds a feeling of power---that excitement over knowing something that no one else knows---and the only way to exercise that power is by proving to others that you do, in fact, know something that they don't know, which can only be done by leaking information. If you can share that information within the company, most people do so out of loyalty to the company. If you can't, the destination of the leaked information tends to be the press.

            This isn't to say that monitoring for improper behavior isn't useful. It is always a good thing to find out quickly when someone is betraying your trust, allowing you to take immediate corrective action. In the field of IT, for example, you should have the ability to detect suspicious network activity, break-in attempts, etc. Centralized system logging can also be useful in this regard. However, if you trust people until they show reason not to do so, the vast majority of people will behave appropriately. If you distrust people until they earn your trust, the majority of people will do everything they can to work around you and subvert your control. That is not a healthy work environment.

            Personally, I've always said that the best way to stop press leaks from a company is to create a competing rumor site, see who submits information to it, and take corrective action. Introduce a situation where an une
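The "detect suspicious activity, centralized logging" point above can start very small. A hedged sketch (the log format here is invented; a real deployment would parse syslog/auth.log or an IDS feed): count failed logins per source and flag anything that crosses a threshold.

```python
import re
from collections import Counter

# Invented log lines standing in for a centralized auth log.
LOG = [
    "2006-01-06 17:01:02 FAILED login user=alice src=10.0.0.7",
    "2006-01-06 17:01:05 FAILED login user=alice src=10.0.0.7",
    "2006-01-06 17:01:09 FAILED login user=root  src=10.0.0.7",
    "2006-01-06 17:02:14 OK     login user=bob   src=10.0.0.9",
    "2006-01-06 17:03:41 FAILED login user=bob   src=10.0.0.9",
]

FAIL_RE = re.compile(r"FAILED login user=(\S+)\s+src=(\S+)")

def flag_sources(lines, threshold=3):
    """Return the set of source addresses with >= threshold failed logins."""
    failures = Counter()
    for line in lines:
        m = FAIL_RE.search(line)
        if m:
            failures[m.group(2)] += 1
    return {src for src, n in failures.items() if n >= threshold}

assert flag_sources(LOG) == {"10.0.0.7"}
```

The point is not the ten lines of code but the posture: you trust people by default, yet the data needed to catch a betrayal quickly is already being collected.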

            • Re:Huh? (Score:5, Insightful)

              by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Friday January 06, 2006 @06:40PM (#14412763) Homepage Journal

              IT administrators should tell new employees from the very beginning that they will maintain the network, period. If somebody screws up their machine, the IT folks might help the user figure out how to fix it, but the person should have to do the actual work him/herself.

              So what happens if they can't fix it? Do you just fire them, reload their computer, and hire the next guy?

              What makes the most sense to me is to store all a user's data on the network, forcing them to do so if at all possible but at minimum making it easy to do so, and have a system image for each PC in your organization. If they scrag their computer somehow, then you can just reload from the image and move on with your life.

              • Very much agreed except for the forcing part. If you make it easy enough, you shouldn't need to force them, and if you find yourself needing to force them, something is wrong.

      • Long ago, IT tried to restrict most users... unfortunately enough complained about not being able to do what they wanted (not always what they needed to do), and the policy was reversed.

        So what? Each and every version of MS Office has codified escalation to admin privileges at some points - even when running with a restricted user account. MS has acknowledged this problem. Running as a restricted user is only a small part of defense-in-depth.
      • Actually, what you are saying is a poor excuse. IT obviously does not have enough experienced people who know how to actually implement group policies (on whatever OS you use), design a security hierarchy, etc. Once you have a framework in place, it's really quite easy to grant exceptions.

        I run a small server 2003/win xp network and it's really pretty easy to do anything you want. You can even choose a default list of bookmarks to put on a user's web browser.

        One problem I have noticed is shitty web based
      • The thing is: Windows doesn't work very well when you don't have administrator rights. Once I had lots of trouble roaming from one point to another with my laptop, as it took numerous reboots for the laptop to find the servers.
        Only once I had the administrator password was I able to connect successfully.

        As always, it's a conflict between ease of use and security.
    • Re:Very true (Score:2, Interesting)

      by wiz31337 ( 154231 )
      Very true, some of the most knowledgeable people at a company are its administrative assistants. They sit in on meetings and soak up the knowledge, and they need access to many different files (or servers as the case often is) so they can update files, or post notes. They are not often the highest paid either, so if someone offers them a lot of money to get some information, they just may crack.

      This book reminded me of another good read, "Art of Deception" by Kevin D. Mitnick. You would be surprised how
      • Re:Very true (Score:3, Informative)

        by TWX ( 665546 )
        "This book reminded me of another good read, Art of Deception by Kevin D. Mitnick. You would be surprised how easy it is to get information from people."

        No, I wouldn't be surprised. I'm able to figure out any random user's password about 70% of the time just based on their pictures or other obvious habits. Couple that with organizations that give users full local computer administrator access (the bane of any kind of real security) and weak password schemes on remote systems and it's a wonder that there a
        • I've worked for large corporations for most of my adult life. I've always had admin access to my own machines, and I doubt I'd work at a company that thought it was necessary to remove it. A network admin who can't keep the network usable, despite all the crazy shit users get up to, just isn't very good at his job, IMO.

          This book (and especially Mitnick's book) points up the folly of infuriating password schemes where each user has 17 passwords that are each changed on different schedules with different r
          • "I've worked for large corporations for most of my adult life. I've always had admin access to my own machines, and I doubt I'd work at a company that thought it was necessary to remove it. A network admin who can't keep the network usable, despite all the crazy shit users get up to, just isn't very good at his job, IMO."

            When I was a student at the university I used the X-terminals that they had in the computing sites. I didn't have any access more than any one else who had userlevel access to the UNIX

            • I guess my point, more concisely, is "admin access is dangerous - I'd be foolish to trust my loser network admins with it". Of course, network admins say the same thing about users, but what do they know? ;)
        • I'm usually able to get people's passwords with a win2k installation cd, a floppy disk and my dvd of rainbow tables.

          The two longest parts of the process are waiting for the win2k cd to boot, and loading the rainbow tables into RAM.
          • Yeah, but that's a lot more involved than just walking up to the Director of Audit Service's desk, looking at the back of the photo of his daughter, and typing the name that was written down into the password field...

            And the process that I describe and use really doesn't take any skill with a computer.
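The reason the rainbow-table approach above is so fast is worth spelling out: Windows LM/NTLM password hashes are unsalted, so an attacker can precompute hashes once and reuse them against every victim. A toy sketch of the same weakness (using MD5 over a tiny wordlist as a stand-in; real rainbow tables add a time/space trade-off on top of this idea):

```python
import hashlib

def unsalted_hash(password: str) -> str:
    """Stand-in for any unsalted hash scheme (as LM/NTLM are)."""
    return hashlib.md5(password.encode()).hexdigest()

# Precompute once over a wordlist. Conceptually this is what a rainbow
# table provides, minus its chain-based compression.
WORDLIST = ["password", "letmein", "daughter1", "hunter2"]
TABLE = {unsalted_hash(pw): pw for pw in WORDLIST}

def crack(stolen_hash: str):
    """O(1) lookup -- no per-victim brute force needed."""
    return TABLE.get(stolen_hash)

assert crack(unsalted_hash("hunter2")) == "hunter2"
```

A per-user random salt defeats the precomputation entirely, which is why modern password stores use salted, deliberately slow hashes.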
      • by Hockers ( 871149 )
        This book reminded me of another good read, "Art of Deception" by Kevin D. Mitnick. You would be surprised how easy it is to get information from people.
        You mean you actually believed what you read in a book called the 'Art of Deception' describing how easy it is to con people into believing things?
      • Re:Very true (Score:4, Interesting)

        by udderly ( 890305 ) on Friday January 06, 2006 @04:36PM (#14411654)

        This book reminded me of another good read, "Art of Deception" by Kevin D. Mitnick. You would be surprised how easy it is to get information from people.

        I was working for a large retailer about five years ago when I accidentally sent the wrong pricing file for a sign-making program to all 105 stores in our marketing area. So I needed to get into each store's computer via PC Anywhere and manually change the file. It went something like this:

        Mgr or Asst. Mgr.: This is Mr./Mrs./Ms. Manager, how can I help you?

        Me: Hi, I know that you don't know me but this is Joe from Advertising. I make up the signs and there's an error with next week's file that I need to fix.

        Mgr or Asst. Mgr.: Oh, well we certainly don't need wrong information on our signs. What do you need me to do?

        Me: Right click on Network Neighborhood, double-click the connection and read me your IP address.

        Mgr or Asst. Mgr.: Okay, it's xxx.xxx.x.xxx

        Me: Super. I will be in your computer changing some stuff for a few minutes so don't be alarmed if stuff starts happening on your screen.

        Mgr or Asst. Mgr.: Okay, thanks.

        The crazy thing about it is *not one person* in the 105 stores ever questioned whether I should have that information even though none of them knew me or could ascertain where I was calling from. Not even close--they all cheerfully did what I asked without hesitation. Scary!!

    • So, so true. I nearly got disciplined once for explaining to my boss that I wasn't going to give him root access on our Debian boxes.
      • Re:Very true (Score:3, Insightful)

        by sjwaste ( 780063 )
        So, so true. I nearly got disciplined once for explaining to my boss that I wasn't going to give him root access on our Debian boxes.

        And you should have been. You don't go "telling" your boss what he can or cannot have, he's your boss. If he tells you to do it, do it. It's then his liability.

        Why are there so many IT people with zero interpersonal skills? Instead of flat out refusing, you could've simply explained why it wouldn't be a good idea. It's your job to present the facts, and you can even
        • And you should have been. You don't go "telling" your boss what he can or cannot have, he's your boss. If he tells you to do it, do it. It's then his liability.

          It's a tough call, but this could have had significant liability for the employee as well. Depending on the circumstances, this may have been a time to put the foot down.

        • A good boss should be able to choke down his rage and look at what's being said objectively. Also, a lot of people don't feel the need to 'spin' things when they think the reasons are obvious.

          Sometimes there are liability issues too. If that box is your responsibility and your boss fscks things up... guess what, it is still your responsibility.

          Feel free to write a memo 'confirming' what the boss has put in place. This is standard ass-covering tactics if you are forced to do something you don't like. Oh and
        • I'm sorry it took me so long to get back to this post, but now that I've discovered it, I can't help but respond. It is, by far, one of the silliest things I've ever read.

          The truth is that some managers are simply incapable of grasping the technical intricacies of IT work. This was clearly the case here: my boss at the time heard me say "root access" once in relation to one of our servers, and said that he wanted the ability to perform root tasks. He didn't need that ability, wouldn't have known how to appl
          • I don't disagree with your motives, I disagree with "telling", as I've often seen IT people in this building "tell" people things. There are more diplomatic ways to handle the situation.

            I don't think your boss should've had root in the context you described, and I do agree that he's unsophisticated given the portrait you've given.

            My only disagreement, and probably why you almost got disciplined, was the way you implied you handled it. You'll likely never lose your job in IT for being unpolished, but you'
    • BS (Score:3, Insightful)

      by nurb432 ( 527695 )
      'IT' needs access to do its job. We need *total* access to all systems and data or we can't be effective and might as well not go to work.

      Anyone that stands in the way of this should be fired.

      If you can't trust your IT people with this access, then they should be fired.

      As far as the owner having total access, well, it's his f-ing place. HIS butt is on the line. He gets what he wants, always. Deal with it.
      • OK, I can't trust anyone with total access to all the data in my entire corporation, so I just fired my entire IT staff as per your instructions. What do I do now?
      • That's not true. Each IT person or group needs only enough access to maintain the systems it is responsible for maintaining. If your company is small, that may mean that the IT guy has access to all the systems. If it's larger, though, it is a very good idea to partition access. This is not just a question of trust -- it also forces the person responsible for each element to be involved in (and therefore aware of) any changes made to it.

        Data is another matter. IT does not need access to sensitive busin
        • And how many times has this been tried, and yet when it's most important you get the call "can you fix this file for me"? Most every time.

          Nope, no encryption allowed on my network, unless I hold the key.

          If any data is missing, you will be accused regardless. You are the computer guy, remember; it's ALWAYS your fault.
      • by lgw ( 121541 )
        IT certainly has no access to any machine that's important to me getting real work done! Like I'm going to risk the data janitors touching something that's actually valuable to the company.

        If you can't trust your IT people with this access, then they should be fired.

        Ahhh, now we agree! Sadly, that's not my call.

      • 'IT' needs access to do its job. We need *total* access to all systems and data or we can't be effective and might as well not go to work.


        While I disagree with the whole of this statement, I disagree most vehemently with the part in bold, so I'll address that.

        In a world that cared about data security, NO EMPLOYEE WOULD EVER BE GIVEN ACCESS TO CUSTOMER DATA THAT WAS ONLY USED TO DRIVE THE APPLICATION. Take a look at the ideas in the book Translucent Databases [wayner.org] (actually, even just read the summary on that page) a
        • Don't try coming to work for me is all I have to say about that. You wouldn't last 10 minutes. (If you even got hired.)

          If the CEO wants the data, it's his to have. Period. End of discussion.

      • Hi there,

        'IT' needs access to do its job. We need *total* access to all systems and data or we can't be effective and might as well not go to work.

        I reckon this is rubbish. I reckon that user data should be encrypted, so that only the people the user wants to give access to it get access to it, and that includes IT staff. If I get my way - and as an IT Manager I just might - I'll be putting in place systems that devolve authority to determine who reads what to the people who own the data, and that'

    • What you're really saying is that the executives are the real weak point in the organization. Considering how much overall operational power they're granted, combined with an equally large level of a lack of oversight, you're close to realizing the real problem.
    • Re:Very true (Score:3, Insightful)

      by bobdehnhardt ( 18286 )
      I had the rare opportunity of pulling our CEO's physical access to the data centers because he had no business need for it. He responded that he liked to take potential clients on tours of the facilities, and the data center part was very impressive to them. I countered that he could still do that (wince), but he and his party would have to be escorted; consider it an opportunity to point out to potential clients how serious we are about security. It worked - he's told me that he has received several commen
  • Agreed (Score:5, Funny)

    by dilute ( 74234 ) on Friday January 06, 2006 @03:40PM (#14411164)
    I thought of hiding my root password from myself
    • Re:Agreed (Score:2, Funny)

      by Anonymous Coward
      I thought of hiding my root password from myself

      But I bet you found it again, taped under your keyboard.

      • I tape it under my mouse to be secure through being more obscure. Unfortunately since I've been using a GUI on the box, the ink got smudged since I wrote it down a year ago.
  • Too much trust... (Score:5, Interesting)

    by RandoX ( 828285 ) on Friday January 06, 2006 @03:41PM (#14411184)
    I've experienced working at a place where an employee walked out with information (and was subsequently sued into oblivion). Afterwards, all computers were locked down to the point where it made it nearly impossible to get any work done. Ever try to troubleshoot a data issue when you have to get your supervisor to log you into the database server every time? It can be hard to find a happy medium.
    • Afterwards, all computers were locked down to the point where it made it nearly impossible to get any work done. Ever try to troubleshoot a data issue when you have to get your supervisor to log you into the database server every time?

      Let's call that the "post-9/11 effect".
  • Another recent book on the same topic: Extrusion Detection: Security Monitoring for Internal Intrusions [barnesandnoble.com]. Haven't read it yet, but looks interesting.

    (Although when I read the title, I kept thinking of detecting things that are extruded. WARNING! SILLY PUTTY FUN FACTORY [feelingretro.com] DETECTED.)
  • by UndyingShadow ( 867720 ) on Friday January 06, 2006 @03:43PM (#14411193)
    I hate books like this. Management reads stuff like this and starts making it difficult for employees to get any work done. Worse yet is if they start trying to take away the IT department's power. In every environment I've ever worked in, I've EARNED the trust of my fellow geeks and been given access gradually. I don't abuse it. A good IT department never fully trusts anyone. I never fully expect to be trusted. These kinds of books just complicate that delicate geek balance.
    • I hate to point it out to you, but company rules (and government laws, btw) are not written for those who are already doing good. They are written to limit the impact that someone who lacks your good behavior can have.

      Other posts have commented about the balance involved, and it is a difficult one to strike. In many cases, the official geeks (i.e. IT staff charged with maintaining the systems, etc.) need greater access, but part of the company's process should include a method of documenting who gets such access,

  • by P3NIS_CLEAVER ( 860022 ) on Friday January 06, 2006 @03:46PM (#14411227) Journal
    A good security policy protects administrators too... if something happens, you will be less likely to get blamed for something you didn't do.
  • woo,... (Score:4, Funny)

    by User 956 ( 568564 ) on Friday January 06, 2006 @03:48PM (#14411245) Homepage
    The problem is that within information technology, many users have far too much access and trust than they should truly have.

    Yes, which is why we "need" Trusted Computing(tm) which will solve all of our problems.
  • by Control Group ( 105494 ) on Friday January 06, 2006 @03:49PM (#14411265) Homepage
    This sounds bogus to me.

    I doubt many companies are "oblivious" to the insider threat, it's just considered an acceptable cost of doing business. For example, a grocery store I used to work at knew perfectly well that their employees were lifting candy from the bulk candy dispenser (to pick an example). But they also knew the money they lost on that was significantly less than the cost of installing cameras and paying someone to review the tapes, or than the cost in lost sales of eliminating the bulk candy dispenser. So, when someone was caught red-handed, they were read the riot act (at least) or outright fired (at worst), but no special effort was made to catch people.

    I don't think the owners of that grocery store were business prodigies, either. My guess is that the same sort of logic applies to most employers: the cost of preventing the infraction is higher than the cost of allowing it. The truth of this is reflected in which industries do protect themselves against the "insider threat": places like casinos, where a successfully criminal insider could lose them huge quantities of money.

    Meanwhile, the book seems to make the same suggestion a lot of security experts do: if a user doesn't need the technology, then don't let them use it. This sounds good, but it carries costs, too. First, of course, the cost of setting up and maintaining a network that enforces such policies. But second, the cost in employee morale, which cannot be discounted. Another job I had not all that long ago was in an office that didn't allow its employees to listen to talk radio. Music was fine, but talk radio was too much of a distraction. Since you didn't need it to do your job, you weren't allowed to have it.

    The effect on morale was, to put it mildly, negative. Honestly, it's one of the reasons I didn't have the job for very long. Email and internet access are similar: employees have become accustomed, rightly or wrongly, to some personal use of these technologies. Take that away, and you're sure to end up with disgruntled employees, no matter how rational your reasons.

    Moreover, it's a question of trust. If you demonstrate to all your employees that you don't trust them, odds are good you'll increase the number of employees who will live up (or down, if you prefer) to your expectation. At best, you'll incur the costs associated with high turnover rates. At worst, you'll fall victim to even more pernicious crime than you otherwise might have.

    I guess the point is, it's not necessarily ignorance or even apathy that causes businesses to be vulnerable to insiders, it's simple cost/benefit analysis.
    • The parent comment uses the example of bulk candy being lifted from a grocery store. This is an extreme example that doesn't accurately describe the situations that are addressed by the book. The comment also states that the issue at hand is a simple "cost/benefit analysis".

      What I can say is that protecting a company's financial information or intellectual property is of much greater value to our company than some missing inventory. I also know that after having read this review, I am interested in understa
    • In short, treat your employees like you would want them to treat you, and you'll be better off. I know this definitely applies to me. If my boss doesn't trust me and makes my life difficult because of that, I not only will not trust him, but will also make sure that something balances out the bad work atmosphere.

      You don't want me to do some personal emailing from the work account? Fine, I'll make sure that I work exactly 8 hours a day, so that I get to have enough time to email from home. You expect me to d
    • Having just read the review, I think the book agrees that security is a cost/benefit analysis, but that many companies screw up the analysis. They will throw good money after bad for external threats, but not secure themselves from internal ones. To pull another example from retail - our cashiers have "drive-offs", where someone does not pay for the gasoline they purchased and drives off. There are only three possible ways for this to happen: the customer paid a cashier who either forgot to ring in the sal
  • "But Insider Threat is one of the first to deal with one of the most significant threats to an organizations, namely that of the trusted insider. The problem is that within information technology, many users have far too much access and trust than they should truly have."

    These guys are right, but how am I supposed to trust them?
  • He says there are 10 chapters, but I only see two mentioned in the summary... and not even a suggestion of what may be in the other eight.
  • whatever... (Score:5, Funny)

    by revery ( 456516 ) <charles&cac2,net> on Friday January 06, 2006 @03:53PM (#14411287) Homepage
    This book is total crap and their conclusions about trusted insiders are all wrong. I know this because a friend of mine who worked at the publishing house leaked me a copy a few months early...

    never mind

  • they have no idea! (Score:3, Interesting)

    by firesuite ( 932268 ) on Friday January 06, 2006 @03:55PM (#14411307) Homepage
    I've worked as a tech for 3 different companies since I moved over here to the States 2 years ago, and in every single company the CEO has his logon password on a post-it note (or equivalent) stuck to his monitor.. now that's secure! Not saying it's an American thing so please don't flame me :P I'm sure it happens worldwide.. maybe Gates does the same thing.. haha
    • It was probably caused by some crazy password policy that makes remembering the password impossible.

      1. Requiring special characters
      2. Requiring a lower case and an upper case letter
      3. Changing passwords every 30 days
      4. No common words

      This all leads to lower security with a post it note.
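      The rules listed above are easy to sketch as a checker. Below is a toy Python version; the specific thresholds, the common-word list, and the helper name `check_password` are all illustrative assumptions, not anything from the book or this thread:

```python
import re

# Toy validator for the kind of "crazy" password policy described above.
# The rules (8+ chars, mixed case, special character, common-word blacklist)
# are illustrative assumptions, not a real policy.
COMMON_WORDS = {"password", "letmein", "welcome", "admin"}

def check_password(pw: str) -> list[str]:
    """Return the list of policy violations (empty means the password passes)."""
    problems = []
    if len(pw) < 8:
        problems.append("shorter than 8 characters")
    if not re.search(r"[a-z]", pw):
        problems.append("no lower case letter")
    if not re.search(r"[A-Z]", pw):
        problems.append("no upper case letter")
    if not re.search(r"[^a-zA-Z0-9]", pw):
        problems.append("no special character")
    if any(word in pw.lower() for word in COMMON_WORDS):
        problems.append("contains a common word")
    return problems
```

      Note that nothing in a check like this stops a user from writing the resulting password on a post-it note, which is the poster's point: each added rule raises the odds of exactly that.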
      • Which part of that policy is impossible?

        Anyone with an IQ over room temperature can memorize a sequence of 8 alphanumeric characters.

        If they can't, they shouldn't be working for you. Period.
        Same for writing it down - it should be a terminable offense.
        • Anyone with an IQ over room temperature can memorize a sequence of 8 alphanumeric characters.

          They shouldn't have to. What are the chances of an intruder getting a password from a brute force attack versus off a post-it note?
        • I have serious problems with memorization. However, I am fairly bright. I learned crystal reports in a week (not a master, but enough to do about anything I need to do here) and I taught myself enough ECMAscript to do the client and server-side stuff for a bunch of fun stupid web tricks in a week. But, I cannot memorize anything without using it. I cannot memorize a password simply by reading and rereading it. Consequently, I write it down, and put it in my wallet. I have not lost my wallet since I was abou
          • and you are an unrealistic, intolerant so-and-so.
            We're talking about security, not "and how does that make you feel". As intolerance increases, so does security. Would you rather your password policy be something like "Ah, that's close enough, come on in"?
            Computer security in general is dangerously bad, damn right we should be intolerant.
        • Remembering 1 password is no problem. The trouble is, I have over 10 passwords. And some of them force me to change them once in a while.

          That's the problem.

      • When in an environment that demands those crazy passwords, the trick is not to use phrases/etc, but to use physical patterns on the keyboard. On, say, a 10 character crazy password, I'll have 5 keys pressed without shift pressed, in a pattern, being sure that at least one bit of the pattern crosses the number keys. Then I press shift and do another pattern, again hitting the number (now symbol) keys, to get my capitals and symbols.

        All I have to remember is where to start and the pattern (which is easy).
        • I was at a mall a few days ago, standing on the 2nd floor.

          I was at the railing overlooking the 'center' of the mall. By coincidence, the security desk was directly below me... I watched the security guy type in his password.

          So much for strong password policy.
  • by PIPBoy3000 ( 619296 ) on Friday January 06, 2006 @04:02PM (#14411367)
    I work in healthcare and one of my roles is to help in auditing.

    The main issue is that most people can look at any patient. This is considered a "necessary evil" as sometimes unexpected clinicians might be looking at a patient's information and we don't want to block access in a life threatening situation. Instead, we review access after the fact, in addition to putting certain blocks in place:
    • Unusual access is audited. This includes people looking at patients who happen to be employees, specific audits of local celebrities, and so on.
    • Random audits. Periodically, someone will check to see what a random person is doing.
    • Probation. New users are audited at certain points, to make sure they're not abusing their new power.
    • Hiding patients. Certain patients are hidden from most users - this might include celebrities, patients with legal issues, or patients who have requested it.
    I see trust as a necessary part of functioning within an organization, though trust must be tempered with watchfulness. I'm a big fan of letting people do what they want, and then "break their kneecaps" if they abuse that trust. In real terms, this means prosecution and the like. Of course, I don't decide such things - that gets passed on to our legal department and I try not to follow up after that.
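    The after-the-fact review described above could be sketched roughly as follows. This is a toy Python illustration only; the record fields, the VIP list, and the probation set are my assumptions, not the actual healthcare system the poster works on:

```python
import random

# Toy sketch of after-the-fact access review: every access is allowed
# (so care is never blocked), but some accesses are flagged for a human
# auditor to review later. All names and rules here are illustrative.
VIP_PATIENTS = {"patient-007"}       # celebrities, legal holds, opt-outs
PROBATION_USERS = {"new_hire_42"}    # users with recently granted access

def flag_for_audit(access: dict, sample_rate: float = 0.01) -> list[str]:
    """Return the reasons this access record should be reviewed (may be empty)."""
    reasons = []
    if access["patient_id"] in VIP_PATIENTS:
        reasons.append("access to a hidden/VIP patient")
    if access.get("patient_is_employee"):
        reasons.append("patient is also an employee")
    if access["user_id"] in PROBATION_USERS:
        reasons.append("user is on new-access probation")
    if random.random() < sample_rate:
        reasons.append("random spot check")
    return reasons
```

    The key design choice matches the comment: access is never denied at the point of care; the deterrent is the certainty of review, not a lock.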
  • TRUST NO ONE (Score:3, Interesting)

    by mary_will_grow ( 466638 ) on Friday January 06, 2006 @04:14PM (#14411473)
    The problem is that within information technology, many users have far too much access and trust than they should truly have.

    God I'd hate to live in the world you would create.

    • The problem is that within information technology, many users have far too much access and trust than they should truly have.

      >>God I'd hate to live in the world you would create.


      Here's an idea. Start up company... say, retail, perhaps. Make sure that the data used in managing that business involves personnel records, credit card data, health insurance policies, bank info - all the usual stuff. And then hire a bunch of people, trusting all of them entirely to have access to everything. Let us kno
    • "God I'd hate to live in the world you would create."

      Indeed. Who needs to do a google search for their programming problems anyway, when you could just spend 3 weeks trying to solve it by yourself?
    • We're talking about running a business here. People who don't need access to information shouldn't have access to information. It's called minimum necessary rights, and it's a basic tenet of security. If you grant all the capabilities you expect to need, and nothing else, then you wipe out a lot of potential attacks right off the bat without even knowing what they are.

      This has the added bonus that someone who sits down at someone else's desk has limited access. Without managing rights, someone unauthorize
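      A minimal sketch of minimum necessary rights (least privilege) as this comment describes it might look like the following; the roles, resources, and actions are invented for illustration:

```python
# Toy least-privilege check: each role is granted only the rights it
# needs, and everything not explicitly granted is denied by default.
# Roles, resources, and actions here are invented for illustration.
GRANTS = {
    "cashier":   {("register", "ring_sale")},
    "dba":       {("database", "run_script"), ("database", "read")},
    "developer": {("database", "read")},
}

def is_allowed(role: str, resource: str, action: str) -> bool:
    """Deny by default; allow only explicitly granted (resource, action) pairs."""
    return (resource, action) in GRANTS.get(role, set())
```

      The deny-by-default shape is what wipes out whole classes of attacks without enumerating them: an unknown role or an ungranted action simply falls through to False.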

  • Here we go again. Yet another book claiming that companies can't trust their employees, as if we're all crooked and evil (and not merely underpaid and mistreated, but that's another story). ANOTHER book justifying management treating us like shit. ANOTHER book telling the bosses what they want to hear. Hooray. And it's in a book so It Must Be True.

    Meanwhile, over here IN REAL LIFE, people like me are running a company's entire business, with full access to everything, and yet, we don't break the law! We don
  • by FriedTurkey ( 761642 ) on Friday January 06, 2006 @04:25PM (#14411568)
    I can't tell you how many times I have sat there doing nothing but billing a client because I didn't have security access to a system. There is always just one guy who can give you access, and he is on vacation. I can't tell you how many times I wasn't able to fix a production system because we needed some DBA to run some SQL script I wrote to fix the system. It's not like the DBA even looks at the scripts. I could've stuck in a statement to delete all the tables and he wouldn't have known. My last client tied security access to your MAC address and the server name. My motherboard fried, so my MAC address changed. Of course, the server guy was on vacation. Eight hours x $150/hr = where is the savings? I know the majority of /. is UNIX/NT admin guys and not programmers, so I probably won't get anybody to understand. It's safer for the admin guy's job to lock your system down than to worry about development costs. If management really knew the additional software costs, developers wouldn't be locked down. Often it seems the admin guys have some kind of power trip with access. Am I really more of a security threat than the admin guy with lots of Lord of the Rings crap all over his cube?
    • Sysadmin here, but I tend to agree with you.

      Most of our developers have full access to all the databases they need.
      They also have a bit more access than they really should have, usually because they're debugging something which requires it. Most of the time it's not a huge problem,

      but sometimes it can come back to bite your ass. Recently, one service crashed and none of the admins were there to fix it. One of our brilliant developers (who also happens to be a manager) decided he'd "fix" it himself, and he ended up turning what would have been a 5-minute fix into a 3-hour outage for 4000 customers, when he could have just waited 10 more minutes. He doesn't have access anymore.
      • one of our brilliant developers (who also happens to be a manager) decided he'd "fix" it himself, and he ended up turning what would have been a 5-minute fix into a 3-hour outage for 4000 customers, when he could have just waited 10 more minutes. He doesn't have access anymore.

        I have seen that before. You've got to know your limits. I am often given sysadmin privileges to an Oracle database. If I see any kind of weird Oracle error, I don't even try to play with it. That's the DBA's problem.
    • The problem really isn't the IT admin. Largely, they implement rules handed down to them by management. In your case (and I have first-hand experience of your problem as well), the problem is far more likely to reside with management, who are unable to do a cost-benefit analysis of a given situation. How much could it cost to give you access versus how much does it cost not to give you access? Personally, I try to make this as crystal clear as possible to management, and sometimes, I get through. Sometimes t
  • I have to agree. I work for a school district where I have complete access to every workstation as well as every server. True, I am a computer tech here, but still, the few things I do on a server shouldn't give me access to pretty much turn it into a FUBAR machine. Office staff and District Office personnel are even worse. They have full access to whatever they want on their machines. And all they do is use MS Office and a few programs for the district. Though it is kinda fun to search the server
  • by east coast ( 590680 ) on Friday January 06, 2006 @04:40PM (#14411684)
    Once I was asked by a friend's father who he could trust to run his IT department, and I told him "you can trust no one". He told me "East, every day I trust Jesus Christ as my Lord and Savior", and I simply asked him "Is Jesus your SysAdmin?". I don't think I ever spoke to the man again...

    ...Is that Zen or what?
  • The Internet is an enabling technology.

    The Internet is not secure.

    And it does not need to be.

    It was not designed so that large corporations could sell security services on it.

    The Internet is an open field. A common.

    If you want the Cone of Silence, you know where to find it.
    • The intranet is not the internet.

      And the main subject of the book does not address the internet.

      It was not designed to address internet security.

      The intranet is not an open field. (P.S. Ever hear of the "tragedy of the commons"? Calling the internet a common is, to say the least, unnerving.)

      If you want the award of irrelevance, continue commenting.

  • by wintermute42 ( 710554 ) on Friday January 06, 2006 @04:56PM (#14411794) Homepage

    One of the wisest comments I've heard on security was: security is the tax that the rest of us pay because some people are immoral.

    Security has a definite cost. Casinos are probably the extreme example. They tend to hire people paid an hourly wage who handle large amounts of money. Perhaps they have little choice but to watch them all the time. The people who are working at the casino are generally willing to put up with a total surveillance work environment because the jobs pay better than most relatively unskilled jobs.

    I have not read the book that was reviewed, but the reviewer seems to suggest that something like this kind of total surveillance environment is desirable. The problem is that such an environment exacts a cost from the majority of honest and moral people in the hope that it will deter or catch those who are dishonest. A heavily restricted surveillance environment is likely to drive away many people who have other job options. As espionage scandals have shown, there is never any guarantee that any set of countermeasures will assure that someone does not betray trust.

    There always has to be a balance between risk and the cost of the security measures. Security "professionals" like the reviewer seem to forget this. After all, it is not their problem when people quit for a more pleasant environment or when the organization cannot attract highly qualified people who can choose to work elsewhere.

  • IT Security (Score:2, Interesting)

    by peterfa ( 941523 )

    I'd have to say that this is actually blown a tad out of proportion.

    I used to work as a HelpDesk Technician for a school. This job was a tad different than ordinary HelpDesk positions at other places. I didn't handle problems over the phone. I'd walk to the office and fix it there. Now to do my job I was told the password for the built-in admin account on every machine. I was just a volunteer too.

    However, I often needed to get into someone's office when that person was absent. So I had to call security and

  • A Better Solution (Score:3, Insightful)

    by Brushfireb ( 635997 ) on Friday January 06, 2006 @05:55PM (#14412371)
    A Better solution is to do the following:

    - Hire good employees, who are relatively honest and straightforward people. This includes everyone -- IT, Sales, Administrative, etc. If they aren't honest, they shouldn't be working here. (This also tends to help with Corporate Responsibility -- how NOT to fudge the books in a crunch.) There are decent HR personality tests that can reasonably predict whether someone would be untrustworthy in different situations.

    - Deal with your employees fairly, honestly, and be upfront. This will minimize the biggest source of insider problems -- disgruntled employees. For example, giving yourself a raise just after or just before laying off other employees is generally a Bad Thing (tm). Try to be honest with employees about their performance, what is expected, and what won't fly. Provide regular, upfront feedback. Follow through with action. Be kind, understanding, but firm.

    - Trust your employees to make sound decisions. The employee who is berated and treated as if they "can't be trusted" will eventually turn into the employee you fear them to be. If you don't trust them to start, then why should they care? Moreover, if you don't trust them, why did you hire them?

    - Give people ample access to what they need, but not so much access that it impedes others. For example, the IT administrator should have access to quite a bit. Asking for a password to do their job is not only inefficient, it's demeaning and downright stupid. Do you trust the IT people you have hired? Do you believe them to be competent? If so, then let them do their job. If not, then why did you hire them, or why are they still working there? It's incredibly frustrating to employees to do what this book recommends -- lock down access. It's frustrating to the employee because they have to "ask" to do their job. And it's frustrating to management, who has to constantly hand-hold, entering passwords as the employee progresses. Cut the leash.

    Overall, I think it's important for IT security people and management to understand these risks. To watch for violations. But to base your company security policies on these types of ideas would be lunacy, and would kill any sort of company morale you might have had going for you. It's much easier to trust the people who work for you, pay them fairly and well, and treat them like human beings than it is to try to lock them down in every way to "prevent" bad things.

    Certainly there are exceptions where even the very small percentage of bad employees can cause very large damage to the company. This should be dealt with appropriately within those industries -- and employees should know this DURING the application process, so they know what kind of BigBrother situation they are getting into.

    B

    • Mod this up! There's too much focus on idiot technical measures and not enough on working with people. Whenever a disgruntled departing employee manages to do serious damage, the knee jerk reaction is to impose yet more restrictions ranging from the purely technical to the physical and legal such as having uniformed guards standing over the terminated employee to prevent any unauthorized action like touching of keyboards, and, of course, the escort off the premises. What about avoiding most of these ugly
  • In my 30+ years of experience working in IT, I've found that companies can be run in basically two ways when it comes to the question of security. One is to be anal retentive and implement restrictions on everything so that you can't breathe; this creates a climate where nobody trusts each other, and of course it breeds resentment, etc.

    The other way is to trust everybody - that tends to make people feel responsibility for the company, the team, the project or whatever. This doesn't mean that everybody should h
