
AMD CEO Dirk Meyer Resigns

angry tapir writes "Advanced Micro Devices has announced that Dirk Meyer has resigned from the post of CEO, and that the company is beginning to search for a new chief executive. Meyer resigned in a mutual agreement with the board of directors, and the company has appointed Thomas Seifert, the company's chief financial officer, as the interim CEO. Meyer was installed as CEO in 2008 as a replacement for Hector Ruiz, just as the company was making its way out of rough financial times. In October, AMD posted a third-quarter net loss of US$118 million."
  • by e4g4 ( 533831 ) on Tuesday January 11, 2011 @12:09AM (#34832620)
    With AMD CPUs left and right, how is AMD posting a loss?
    • by hedwards ( 940851 ) on Tuesday January 11, 2011 @12:19AM (#34832656)
      I haven't looked at the financial sheets, but I would venture to guess that a lot of it is R&D trying to catch up with Intel again.
      • by Futurepower(R) ( 558542 ) on Tuesday January 11, 2011 @07:27AM (#34834490) Homepage
        This article explains: Coup at AMD: Why was Dirk Meyer Pushed Out? [brightsideofnews.com] Quote:

        "Remember, Dirk Meyer's three deadly sins were:

        1) Failure to Execute: K8/Hammer/AMD64 was 18 months late, Barcelona was deliberately delayed by 9 months, original Bulldozer was scrapped and is running 22 months late.

        "2) Giving the netbook market to Intel [AMD created the first netbook as a part of OLPC project] and long delays of Barcelona and Bulldozer architectures.

        "3) Completely missing the perspective on handheld space - selling Imageon to Qualcomm, Xilleon to BroadCom."

        There is a comment at the bottom of this poor-quality article in the Inquirer [theinquirer.net] that says Dirk Meyer "was the lead engineer who designed the Athlon, Opteron and the DEC Alpha. Let's not forget that from 1999-2006, AMD actually had better processors than Intel, and this was due to Dirk Meyer's technology."
        • by import ( 40570 )

          1) Dirk wasn't CEO til 08, well after AMD64 and Barcelona. Bulldozer was scrapped *by* Dirk IIRC

          2) AMD had nothing to compete with the atom in that market. Sure, they were first, but with a Geode IIRC slower than an Atom which is already pathetically slow.

          3) This is arguable; it falls under the strategy that AMD needed to focus on its core business. Whether you buy that, or believe they should've switched core business, or whatever, is where it's arguable. I think from AMD's fiscal performance, Dirk

    • by RightSaidFred99 ( 874576 ) on Tuesday January 11, 2011 @12:20AM (#34832662)
      They're in a price war with a competitor who is a generation ahead of them in manufacturing technology. Their margins are getting slimmer and slimmer.
      • by Khyber ( 864651 )

        Ahead in manufacturing tech, maybe. Architecture? Who made x64? Who has one of the lowest power draw/highest performance CPU/GPU combo for mobile systems that would shit all over tons of current in-service desktop systems, with an even better revision coming soon?

        When it comes down to it, none of the hardware companies are really that impressive. The hardware right now is enough to do WAY BETTER performance-wise, the problem is that programs are becoming so bloated and unoptimized.

        It's coming down to where

        • by robthebloke ( 1308483 ) on Tuesday January 11, 2011 @09:13AM (#34835092)

          Ahead in manufacturing tech, maybe. Architecture? Who made x64?

          That's a bit like saying ford is the worlds greatest motor company because of the Model T, and then neglecting to notice the failures of it's recent business practices. x64 was something AMD did right 8 years ago. In the last 8 years however, things haven't gone as well for the business....

          Who has one of the lowest power draw/highest performance CPU/GPU combo for mobile systems that would shit all over tons of current in-service desktop systems, with an even better revision coming soon?

          Intel ATOM and Nvidia ION?

          • That's a bit like saying ford is the worlds greatest motor company because of the Model T, and then neglecting to notice the failures of it's[SIC] recent business practices.

            Indeed, unlike GM they failed to go bankrupt.

          • by Khyber ( 864651 )

            "Intel ATOM and Nvidia ION?"

            Not even close. Have you even seen the latest AMD Fusion? It shits all over Intel+ION.

            "That's a bit like saying ford is the worlds greatest motor company because of the Model T, and then neglecting to notice the failures of it's recent business practices."

            During the auto bailout and financial crisis, Ford was the only company actually able to afford to give loans to people. AMD isn't bankrupt.

            • Have you even seen the latest AMD Fusion?

              Unfortunately not. I requested an engineering sample from AMD about 2 years ago, and am still waiting......

              It shits all over Intel+ION.

              Whether it shits over ION or not entirely misses the point. I'm sure Fusion *could* shit all over ION, but the problem can be summed up with a quick glance here [scan.co.uk]. Where are these fabled Fusion netbooks? It's ATOMS, ATOMS, ATOMS and nothing but ATOMS in that market sector.....

              So returning to your original question:

              Who has one of the lowest power draw/highest performance CPU/GPU combo for mobile systems that would shit all over tons of current in-service desktop systems, with an even better revision coming soon?

              The ATOM/ION has been available for over 18 months. It has one of the lowest power/high

          • Ahead in manufacturing tech, maybe. Architecture? Who made x64?

            That's a bit like saying ford is the worlds greatest motor company because of the Model T, and then neglecting to notice the failures of it's recent business practices.

            That's a spectacularly ignorant statement.

            Ford is the only one of the "Big 3" US-based auto manufacturers that has not been put through a bankruptcy backed by government financing. They took steps well ahead of the financial meltdown to turn as much of their non-performing assets into liquidity (e.g. the blue oval trademark is collateral for a big fat loan, or put more harshly: they hocked the family jewels.) Bill Ford recognized that his name didn't mean he was the best strategic manager for the company

      • We'll see how that all turns out when the Fusion processors come out in laptops soon. I think they will probably take over the laptop market. The desktop market is still going to be Intel-dominated unless AMD can push its Bulldozer out quicker than expected and sell it at a lower margin.
        • Your problem is twofold. First, the vast majority of users don't need anything beyond the Intel integrated graphics. AMD has had superior chips in the past (Pentium 4/X2 days) and they didn't "take over" anything, and not because of mean old Intel cheating.

          This leads us to your second problem: AMD doesn't have the manufacturing might to make chips at the same price Intel does. AMD's only choice is to cut prices to the bone, and Intel can still match them and make good margins.

          I expect some design wins f

    • by Kjella ( 173770 ) on Tuesday January 11, 2011 @03:18AM (#34833468) Homepage

      What does volume matter if you don't have margins?

      1. Intel has always been ahead on processing tech, often a generation in front, or if not, on a more mature process. That means AMD has bigger dies and lower yields, which directly inflate cost.
      2. A lot of the expense is R&D, and with Intel having ~80% of the microprocessor market, each AMD chip has to carry at least four times as much of the cost as each Intel chip.
      3. Intel has a processor to match every one of AMD's; the reverse is not true. Intel makes high margins where it is alone and squeezes AMD's margins where they compete.

      Seriously, take a look at something like 3D rendering [anandtech.com] performance, which is usually extremely multi-threading friendly. The 2500K, which sells for less than the 1100T, beats it in everything but the POV-Ray test. Never mind that it's much faster and better for everything that doesn't take advantage of six cores. The Opteron vs. Xeon battle looks the same; AMD had the advantage for a while, but they're struggling badly there now too. On the low end, Intel has the Atom, which is raking in money, meaning AMD is losing a lot of low-end sales. They're boxed in, and in every market they deliver "value" processors; in other words, low-income processors. So with low income and high costs, you post a loss.
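The die-size/yield and R&D-amortization arguments in this comment can be sketched with a toy model. All numbers below are illustrative placeholders, not actual AMD or Intel figures; the yield formula is the standard Poisson defect model, not anything specific to either company's fabs.

```python
import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    """Poisson yield model: yield falls exponentially with die area,
    so a bigger die on an older process means fewer good parts."""
    return math.exp(-defects_per_cm2 * die_area_cm2)

def cost_per_good_die(wafer_cost, dies_per_wafer, yield_fraction):
    """Wafer cost is spread only over the dies that actually work."""
    return wafer_cost / (dies_per_wafer * yield_fraction)

def rnd_per_chip(total_rnd, units_shipped):
    """Fixed R&D spend amortized over unit volume."""
    return total_rnd / units_shipped

# Illustrative: the same design needs a ~1.5x larger die on an older process.
newer = poisson_yield(1.0, 0.5)   # competitor on a newer, denser process
older = poisson_yield(1.5, 0.5)   # same design, larger die, older process
print(newer, older)               # the larger die yields fewer good parts

# Illustrative ~80/20 unit split with comparable R&D budgets:
# the smaller player carries 4x the R&D cost per chip sold.
print(rnd_per_chip(1e9, 20e6) / rnd_per_chip(1e9, 80e6))  # prints 4.0
```

The point of the sketch is that both effects compound: higher per-die cost and higher per-chip R&D land on the same price-constrained product.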

      • Honestly, I think where AMD went wrong is that while Intel has been spending money on companies like Dell to increase how many products contain Intel, AMD went and bought another company to try and improve themselves. They may be a bit behind due to financial struggles, but I'm one who definitely does not support those who play the marketing game. Has anyone here ever seen a TV ad for AMD?
      • Not always. The pentium pro line of processors had a heat problem that Intel "solved" by adding latency layers, and AMD began beating them in benchmarks. Right up until the Core2 processors, when Intel had solved the problem.

        • by Kjella ( 173770 )

          Yes, in benchmarks, but not in processing technology: lithography size and yield. Also, you mean the Pentium 4, not the Pentium Pro from the mid-90s.

    • by Xest ( 935314 )

      What AMD CPUs left and right? My work laptop is Intel, my Netbook is Intel, both my Cell phones have Qualcomm ARM chips.

      I don't even buy AMD chips at home now: while Intel remained more expensive, AMD chips always seemed to require excessive cooling and never seemed to give the performance I'd expect, yet the first Intel chip I bought in 15 years for my home PC did, straight out of the box.

      Apart from extremely low margin budget PCs and laptops from your local electronics superstore I

      • AMD chips always ended up seeming to require excessive cooling

        I'm really not sure what you've been smoking, but I suggest you stop.

        The last "hot" AMD chip I had was back in the original 1GHz Athlon days, circa 2000 or 2001. It got that reputation mostly because that was around the time Intel added on-chip thermal regulation and AMD hadn't yet. So if you pulled a heatsink off an Intel chip, it would clock itself down to compensate. The AMD chips, which didn't yet have that feature,
    • by tyrione ( 134248 )

      With AMD CPUs left and right, how is AMD posting a loss?

      They aren't. AMD's quarterly report is on the 20th of January for its 4th quarter. The 3rd-quarter reference is 3 months old.

    • With AMD CPUs left and right, how is AMD posting a loss?

      It's called "making it up on volume", did you sleep through the dot-com boom or something?

  • I for one am sorry to see him go. I think he has brought the company well through some rough times.
  • by syousef ( 465911 ) on Tuesday January 11, 2011 @12:19AM (#34832652) Journal

    Hey, that's one clever way to get your mind off the recession: Play musical chairs with the company execs. Did you see the job open up at Microsoft? Time to apply Dirk! Where she stops nobody knows....weeeehheeeeee! You poor schleps can lose your jobs, we'll just keep going round and round!

    • Re:Musical chairs (Score:5, Informative)

      by hedwards ( 940851 ) on Tuesday January 11, 2011 @12:29AM (#34832706)
      Actually, that's typically how it's handled when the CEO makes an abrupt departure, one of the other executive board members will step in while they find a replacement.

      Personally, I don't blame him, I blame it on Intel and its successful attempts to bribe major equipment integrators to not use AMD chips.
  • Re:Sorry.. (Score:4, Insightful)

    by Anonymous Coward on Tuesday January 11, 2011 @12:19AM (#34832658)

    I for one am sorry to see him go. I think he has brought the company well through some rough times.

    Some CEOs that are great for riding through the rough times aren't the CEOs you want when that stretch is over.

    • Re:Sorry.. (Score:4, Insightful)

      by 1984 ( 56406 ) on Tuesday January 11, 2011 @12:40AM (#34832758)

      Winston Churchill, meet Clement Attlee.

      Except I think that involved winning a war, not just surviving in a currently tenuous second position...

    • Re:Sorry.. (Score:5, Interesting)

      by Cornelius the Great ( 555189 ) on Tuesday January 11, 2011 @12:44AM (#34832770)
      I'll agree with this. AMD's been seeing some triumphs lately- their graphics division has been very successful, even despite a minor delay with the Radeon HD6900 GPU. Nvidia might have the performance crown this generation, but their previous generation has been shaky and their 40nm chips haven't been as available as AMD's, allowing AMD to gain considerable marketshare.

      I've noticed a few netbooks with AMD Bobcat cores appear at CES, and they have enough performance and power efficiency to give both Atom and Ion some serious competition.

      While Llano doesn't appeal to me personally, it's nice to see Fusion reaching the desktop shortly. I'm also anxious to see how the Bulldozer will perform once it's released in a few months.

      With the delay of Intel's Ivy Bridge into 2012, AMD has a lot of potential to make this year a profitable one.
      • I'll agree with this. AMD's been seeing some triumphs lately- their graphics division has been very successful, even despite a minor delay with the Radeon HD6900 GPU. Nvidia might have the performance crown this generation, but their previous generation has been shaky and their 40nm chips haven't been as available as AMD's, allowing AMD to gain considerable marketshare.

        I've noticed a few netbooks with AMD Bobcat cores appear at CES, and they have enough performance and power efficiency to give both Atom and Ion some serious competition.

        While Llano doesn't appeal to me personally, it's nice to see Fusion reaching the desktop shortly. I'm also anxious to see how the Bulldozer will perform once it's released in a few months.

        With the delay of Intel's Ivy Bridge into 2012, AMD has a lot of potential to make this year a profitable one.

        so the guy that brought AMD to a position where they're successfully launching 3 products in one year (which they've never done before) is not someone you want to keep around? Are you kidding? It's about momentum and inertia; this is a silly time to do something like this to a visionary like him.

        This is the same board that kept that dolt Hector Ruiz around for years while he ran the company into the ground. Color me unsurprised.

        • I hate to break it to you, but Fusion, Bobcat, and Bulldozer have been in development for quite a long time; all of these projects started when Hector was at the helm. Dirk can hardly be credited with these product releases, other than keeping AMD afloat long enough to allow these products to see the light of day.
        • Re:Sorry.. (Score:5, Informative)

          by Rockoon ( 1252108 ) on Tuesday January 11, 2011 @02:31AM (#34833236)

          so the guy that brought AMD to a position where they're successfully launching 3 products in one year (which they've never done before) is not someone you want to keep around? Are you kidding?

          Are you honestly asking this question? If you are going to pretend to know anything about the business world, then you should at least pretend to also know that some CEOs are specialists at bringing companies out of financial trouble and even bankruptcy.

          For example (from my industry), there is Scott Butera, a CEO who has brought more than one casino out of financial trouble and who has just been picked up by the Foxwoods Resort Casino in Connecticut because of its very serious financial troubles (billions in debt, defaulting on loans...).

          Often what these specialists bring to the table is their trusted contacts in the financial industries. The primary goal is often to maintain a credit line while the problems are resolved (because no large business can run without credit, regardless of how much cash they have.)

          • by Raenex ( 947668 )

            The primary goal is often to maintain a credit line while the problems are resolved (because no large business can run without credit, regardless of how much cash they have.)

            That doesn't make a whole lot of sense. Large companies like Google, Microsoft, and Apple have billions in cash reserves, and can probably operate for a year or two by spending just cash.

            • And they keep those cash reserves for dividends, acquisitions, and keeping their credit rating high.
              • by Raenex ( 947668 )

                But the companies mentioned don't pay dividends, don't *need* to acquire companies (and can afford to do so anyways and still operate on cash), and don't *need* credit if they have cash to spend. Remember the person I responded to said "no large business can run without credit, regardless of how much cash they have". It just isn't true.
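The disagreement in this subthread is ultimately simple arithmetic: how long a cash pile funds operations with zero credit. A back-of-the-envelope sketch, using purely illustrative figures rather than any real company's balance sheet:

```python
def runway_years(cash_reserves, annual_operating_expenses):
    """Years a company could operate on cash alone, with no credit line."""
    return cash_reserves / annual_operating_expenses

# Illustrative only: $40B in cash against $25B/year of operating
# expenses gives a bit over a year and a half of credit-free runway.
print(runway_years(40e9, 25e9))  # prints 1.6
```

Whether "no large business can run without credit" then comes down to whether that runway figure is large for the company in question, which is the point being argued above.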

  • I would resign too, AMD is always the bridesmaid never the bride.
  • Does this mean that in six months' time Intel CEO Paul Otellini will quit, but for a higher severance package?
  • Any word yet (Score:4, Insightful)

    by publiclurker ( 952615 ) on Tuesday January 11, 2011 @12:34AM (#34832734)
    on how many million they will be rewarding him for losing that $118 million?
    • Re: (Score:1, Funny)

      No idea, but at least being AMD they can calculate it correctly [wikipedia.org].
    • I suspect his golden parachute will look like a knotted napkin compared to the ones failed bankers get.

      Then again a banker would probably lose 100 times as much so it stands to reason he deserves more...

  • by DoofusOfDeath ( 636671 ) on Tuesday January 11, 2011 @12:46AM (#34832778)

    I'm just getting going on GPU programming. I was thinking to go with OpenCL (pushed by AMD/ATI ) over CUDA (pushed by nVidia) because I thought AMD looked more likely to survive in the long term. But now it's getting harder to tell which company is safer to rely upon.

    • by Anonymous Coward on Tuesday January 11, 2011 @12:59AM (#34832824)

      I'm just getting going on GPU programming. I was thinking to go with OpenCL (pushed by AMD/ATI ) over CUDA (pushed by nVidia) because I thought AMD looked more likely to survive in the long term. But now it's getting harder to tell which company is safer to rely upon.

      OpenCL works on both AMD and nVidia GPUs, so you should be safe there.

      • by Cornelius the Great ( 555189 ) on Tuesday January 11, 2011 @01:10AM (#34832874)
        Intel has also promised OpenCL support on Sandy Bridge and later integrated GPUs. Not to mention S3 and VIA support.

        I predict that CUDA will quickly become irrelevant and die a long, slow death (i.e. just legacy support, no new features), much like Cg did after GLSL and HLSL matured. No one wants to be stuck on a single hardware platform, despite performance advantages.
        • Intel has also promised OpenCL support on Sandy Bridge and later integrated GPUs.

          As far as I'm aware, Intel promised that they'd get OpenCL running on the CPU (for Sandy Bridge) as opposed to the GPU after Apple pressured them for OpenCL support. OpenCL on GPU is scheduled for Ivy Bridge.

        • by Twinbee ( 767046 )
          From what I've heard, Nvidia are planning to allow CUDA to compile to other processors such as Intel's x86 (and presumably the new ARMs).

          Remember, in the end, it's just a language, albeit a language that's biased slightly towards Nvidia's architecture. But like all other languages, there's nothing stopping them from making that same code run on any platform.

          It's also somewhat easier to use than OpenCL from what I've heard.
    • Someday MS might give us a standard wrapper for both.
      Oh, who am I kidding? MS only cares about their own DirectX product as the be-all/end-all. But they might need it as bait for coders in your same dilemma, as XP and DirectX 9 are still their own strongest rival for DX10 and upcoming DX11 adoption.

      In other words, if corporate America finds a killer app for CUDA "soon," MS could start selling the idea that XP/DX9 upgrades to Windows 7 and 8 are their only upgrade path to gain built-in CUDA-like support.

      • by macshit ( 157376 )

        Isn't the MS-only analogue of OpenCL/CUDA "DirectCompute" [wikipedia.org]?

        OpenCL is clearly the portable choice, but I presume Microsoft will try to gimp that on Windows to encourage people to use DirectCompute.

      • MS already has DirectCompute [wikipedia.org] which is an alternative to OpenCL/CUDA and already runs on existing NVIDIA and ATI GPUs.
        • by Rockoon ( 1252108 ) on Tuesday January 11, 2011 @02:52AM (#34833340)
          Just to add to this..

          ..the predecessor to DirectCompute was a little .NET library that came out of Microsoft Research called Accelerator [microsoft.com] which was initially available to the public in 2006.

          ..that's several years before CUDA (2008) and OpenCL (also 2008)

          Microsoft has actually been the innovator on this one.
          • CUDA was announced November 2006 [ixbtlabs.com], so Microsoft wasn't that far ahead. But mass-market GPGPU [gpgpu.org] really got started around 2000, culminating in the Brook project [stanford.edu] in 2004. Microsoft didn't start this trend, though they did jump on it quickly.

          • A quick Wikipedia search indicates the CUDA SDK [wikipedia.org] came out in Feb 2007 and the Streams SDK v1 [wikipedia.org] came out in Dec 2007. Following links from Streams gets you to BrookGPU [stanford.edu], an ATI project using the Close-to-Metal programming interface whose last major release was in 2004.

            Microsoft is only an innovator where it has a monopoly.
    • Actually NVIDIA was a major contributor to OpenCL http://www.realworldtech.com/page.cfm?ArticleID=RWT120710035639 [realworldtech.com]
  • Mobile Failure (Score:4, Informative)

    by DeadBugs ( 546475 ) on Tuesday January 11, 2011 @01:04AM (#34832840) Homepage
    http://www.brightsideofnews.com/news/2011/1/10/coup-at-amd-dirk-meyer-pushed-out.aspx [brightsideofnews.com] It seems that the selling off of their mobile business and the success of Tegra are behind this.
    • As far as I knew they already DID sell off the mobile business and it was partially Dirk Meyer's fault so they removed him.
  • by Anonymous Coward
    Thanks for steering us back from the brink, Boss. I really appreciated having both a capable engineer and a courageous leader at the helm.
  • I'm no silicon engineer, but I have to imagine that AMD has SURELY got something big in the works.

    Think about it. They've essentially had three chips in ten years. Athlon, Athlon64, and Phenom. Everything else is minor variation and process evolution. That's not a lot, really.

    As I see it, AMD is either biding their time, holding their market segment down with their really-stretched-to-the-limit Phenom architecture, while perfecting the next generation product...

    OR

    They've just been fucking off for the past t

    • by Trogre ( 513942 )

      You forgot Opteron. Where else can I get twelve physical cores per CPU? Not Intel, that's for sure.

      • Opteron is/was a derivative of Athlon, Athlon64 and Phenom. Just like Xeon is/was a derivative of Pentium2, Pentium3, Pentium4, Core2, and Core i7. They have additional cache, features to support multisocket and ECC memory for servers, and are tested to a different standard.
  • by madwheel ( 1617723 ) on Tuesday January 11, 2011 @02:45AM (#34833304)
    It's unfortunate, but regardless, I will be a die-hard AMD supporter. They've helped keep the market competitive, have much better business practices, and always have the end-user in mind with regards to their CPU socket configurations. Or should I say configuration? One socket for a massive range of CPUs. I like being in control of my upgrades. I can't stand that Intel changes MB socket types with damn near every CPU and expects it to be alright to fork over a couple hundred bucks in addition to the CPU price. AMD has never let me down since I switched during the K7 era. I for one cannot wait for Bulldozer. I know right now the new Sandy Bridge chip is simply amazing, but I can wait a few months.
    • Here are my predictions:

      1) Bulldozer will beat Sandy Bridge on integer performance (clock for clock) but will lag badly on floating point performance (clock for clock.)

      2) Nobody will complain about DRM in AMD processors because they don't have any.
    • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday January 11, 2011 @08:22AM (#34834766) Homepage Journal

      AMD has been the clear leader for consumer choice and value since the K6/3, which is a complete monster. Actually, the K6/2 is a beast as well, but it still has a crap FPU. Not that I expect anyone to do this today, but if you actually compile your full system for the K6 (hello, Gentoo!) then you will beat the pants off a Pentium II of the same clock rate and cache... not to mention that a K6/3 system with external cache effectively has an L3 cache because of the integrated L2. Unfortunately, they were saving their pennies for the upcoming K7 launch, so they didn't have the money to do a fully rebranded K6/3 launch as a new product which could actually compete with the P2 in the x86 market.

      New Intel processors are always astoundingly expensive until the next AMD processor comes out, so IMO it is safe to say that you should always wait for AMD to bring out a processor before a new purchase, even if you don't plan to buy AMD. If more people did that, Intel would drop their pricing and a new equilibrium would be found, but a lot of people who didn't understand (or won't forgive) the K6 FPU debacle will never forgive AMD for their one (very real) failure. If you look back at AMD's list of attempts to rival or even surpass Intel, they are all massive successes, including 40 MHz 386s, low-power 486s, and the oddly competitive yet overlooked 586...

      • by CAIMLAS ( 41445 )

        Yes, the K6/2 was an incredible system.

        I remember doing just that: beating the pants off friends and their Intel 233MHz machines with my same-clocked K6-2. Even with a worse graphics card, I got higher FPS in games like Quake. Windows operation was visibly better and more responsive as well. AMD had the edge in this department all the way up to the current Nehalem-based Intel systems, IMO. Unless I'm doing server work, I'll take an AMD Athlon64 CPU over a 5200 Xeon or similar.

  • by assemblerex ( 1275164 ) on Tuesday January 11, 2011 @06:14AM (#34834218)
    They all left; party over. Long story made short.
    • by Anonymous Coward on Tuesday January 11, 2011 @09:09AM (#34835060)

      As someone who was able to compare salaries when AMD bought ATI, I've got to take issue with that comment... AMD actually paid *very* handsomely compared to ATI (and that pay disparity still exists, and causes some resentment... but that's a separate issue). And of course, there are tons of highly competent, skilled, and creative engineers still in the company.

      A lot of "top" ex-ATI talent has gone elsewhere (also starts with 'A'), but that phenomenon is almost exclusively limited to Silicon Valley. In general, the hop from job-to-job culture is far less practical when there are only a handful of ASIC jobs to be had in certain areas. Many "top" CPU guys are still around too, so far as I can tell (not my department).

      The thing that I notice from all of these moves is that ex-ATI people are on the move upwards, largely displacing the CPU side. The poor execution of the latter group is a large part of that, no question. The trouble is, most of the moves upwards are being made by people in (you guessed it) Silicon Valley. The headquarters is still in Austin, but it's becoming a little San Francisco. The reason this is a problem? Well, it's building resentment in almost the entire remainder of the company, which is a rather large organization. The CPU guys are annoyed that everything is moving under formerly-graphics ownership (add that to the irritated sentiment that AMD overpaid for ATI...), and two-thirds of the GPU guys are annoyed that everything is moving under Silicon Valley ownership. Some changes are viewed as being unfair (such is life) and some are clearly undeserved (ATI had some big screw-ups too). The politics is actually pretty bad; it's much worse than any other place I've ever worked.

      We've got a group meeting about this announcement this afternoon. I wonder what the spin will be...

      captcha: tantrum

      haha...

  • Talk to the Oracle. (Score:4, Interesting)

    by Deathlizard ( 115856 ) on Tuesday January 11, 2011 @09:09AM (#34835054) Homepage Journal

    Considering their market cap and Oracle's interest in chip companies, it wouldn't surprise me if Larry Ellison is their next CEO.
