United Kingdom Medicine

UK Supercomputer Cambridge-1 To Hunt For Medical Breakthroughs

The UK's most powerful supercomputer, which its creators hope will make the process of preventing, diagnosing and treating disease better, faster and cheaper, is operational. The Guardian reports: Christened Cambridge-1, the supercomputer represents a $100m investment by US-based computing company Nvidia. The idea capitalizes on artificial intelligence (AI) -- which combines big data with computer science to facilitate problem-solving -- in healthcare. [...] Cambridge-1's first projects will be with AstraZeneca, GSK, Guy's and St Thomas' NHS foundation trust, King's College London and Oxford Nanopore. They will seek to develop a deeper understanding of diseases such as dementia, design new drugs, and improve the accuracy of finding disease-causing variations in human genomes.

A key way the supercomputer can help, said Dr Kim Branson, global head of artificial intelligence and machine learning at GSK, is in patient care. In the field of immuno-oncology, for instance, existing medicines harness the patient's own immune system to fight cancer. But it isn't always apparent which patients will gain the most benefit from these drugs -- some of that information is hidden in the imaging of the tumors and in numerical clues found in blood. Cambridge-1 can be key to helping fuse these different datasets, and building large models to help determine the best course of treatment for patients, Branson said.
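The kind of multimodal fusion Branson describes can be sketched in very simplified form: concatenate feature vectors from different modalities (imaging, blood work) and fit one model on the combined representation. Everything below — the feature counts, the synthetic data, the toy response signal — is illustrative, not GSK's actual pipeline:

```python
import numpy as np
from numpy.random import default_rng

rng = default_rng(0)

# Synthetic stand-ins for two modalities: tumour-imaging features
# and blood-marker measurements for 200 hypothetical patients.
n = 200
imaging = rng.normal(size=(n, 5))   # e.g. texture/shape descriptors
blood = rng.normal(size=(n, 3))     # e.g. cell counts, protein levels

# "Fusing" the datasets is, at its simplest, feature concatenation.
fused = np.hstack([imaging, blood])  # shape (200, 8)

# A toy response signal that depends on both modalities, so that
# neither dataset alone suffices -- the point of combining them.
response = (imaging[:, 0] + blood[:, 0] > 0).astype(int)

# Fit logistic-regression weights with plain gradient descent.
w = np.zeros(fused.shape[1])
for _ in range(500):
    p = 1 / (1 + np.exp(-(fused @ w)))
    w -= 0.1 * fused.T @ (p - response) / n

pred = (fused @ w > 0).astype(int)
accuracy = (pred == response).mean()
```

In a real pipeline the concatenation step would be replaced by learned per-modality encoders, but the principle — one model over joined representations — is the same.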

Comments Filter:
  • and we knew that 100 years ago already.

    • Vitamins aren't a panacea. Essentially, all they are is essential: something we need but can't produce ourselves. That's it. That's all there is to them. They're neither magical nor miraculous; they're simply something our body needs but can't make.

  • So this computer is being tasked to solve 'better, faster AND cheaper'?

    It's going to blow a valve and have a meltdown, for sure!

    • by gweihir ( 88907 )

      It can do faster and maybe cheaper. It cannot do "better". That is just a lie told to the public to glorify what is essentially an oversized abacus. Also, "AI"-based solutions are routinely much worse than ones designed by humans. The hope is that this thing can find some needles in some haystacks and that humans with actual intelligence can then improve on what was found.

      • Even a non-learning system that's basically a big database and some statistical analysis can do better than a doctor, because it can remember everything at once and a human can't.

        We're still better at many tasks, but we're not better at all tasks.

        The computer currently can't create a diagnosis alone because it can't do what a human does regarding inspection of and communication with the patient. But that's also only a matter of time before it can do it better than the doctor, with as much objectivity as it'
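The "big database and some statistical analysis" the comment above invokes can be made concrete with a toy case-retrieval table: the machine's only advantage is that it never forgets a single past case. Diseases and symptoms here are invented placeholders, not a real diagnostic model:

```python
from collections import defaultdict

# Toy case database: every (disease, symptoms) record the system "remembers".
cases = [
    ("flu", {"fever", "cough", "aches"}),
    ("flu", {"fever", "cough"}),
    ("cold", {"cough", "sneezing"}),
    ("cold", {"sneezing"}),
    ("flu", {"fever", "aches"}),
]

def most_likely(symptoms):
    """Rank diseases by symptom overlap with every remembered case."""
    tally = defaultdict(float)
    for disease, case_symptoms in cases:
        overlap = len(symptoms & case_symptoms)
        tally[disease] += overlap / len(case_symptoms)
    return max(tally, key=tally.get)

best = most_likely({"fever", "cough"})
```

A human diagnostician reasons far more richly than this, but cannot hold millions of such records in mind at once — which is the narrow sense in which the statistical lookup "does better".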

        • by gweihir ( 88907 )

          Even a non-learning system that's basically a big database and some statistical analysis can do better than a doctor, because it can remember everything at once and a human can't.

          Remember IBM Watson doing a bit better on average, but then also killing patients that a real doctor would never have killed? IBM abandoned that project because it could not be fixed. And that was pretty much the best possible solution.

          • Remember IBM Watson doing a bit better on average, but then also killing patients that a real doctor would never have killed?

            That's why, currently, you need a human in the loop. But there's no shortage of stories of human doctors denying patients care, telling them it's all in their heads when they have real problems. And this happens to women more than men, which points to doctor bias.

            • by gweihir ( 88907 )

              There is just the tiny problem that the human in the loop cannot be eliminated. Claiming this is a "not yet" is either dishonest or uninformed, because in actual reality nobody knows.

              • Claiming this is a "not yet" is either dishonest or uninformed because in actual reality, nobody knows.

                I disagree. I think it's inevitable, frankly. The limitations of the human senses compared to ever-evolving sensor systems, and the human ability to process information compared to clusters of computers which are continually improving in performance, all but guarantee it.

                Computers still lack real creativity, they need a person to tell them how to function in specific ways every time. But given that, they are fundamentally better at many very specific tasks already, and the number is ever-growing as new algo

                • by gweihir ( 88907 )

                  Well, I am sorry to go "Invalid Argument from Authority", but I am a PhD-level computer scientist and I have been following this field for around 30 years now. I see nothing definite and nobody claiming "it will work" with scientifically valid proof. Hence it is pretty much unknown. I can understand that even an expert at a lower level can get blinded by the truly astonishing computing power modern computers have. But it does not mean anything here. If all you have is apple trees, you will never be able t

                  • Also, there is no scientifically sound indication at all that computers can ever be "creative". None at all.

                    Computers perform several kinds of creative tasks [quora.com] (read answer by "Joel Chan") through mostly the same means that most people use to perform most of them; that is, imitation and comparison. What they don't do is come up with new kinds of things, but given enough computers there's no reason they can't do that either. Instead of making comparisons based on similarity, they would make them based on differences. It may require orders of magnitude more processing power for computers to be able to evolve their o

                    • by gweihir ( 88907 )

                      Ah well, if you are satisfied with a fake definition of "creativity", you will get fake creativity. Computers cannot do "imitation" and "comparison". They can do transformation without understanding and they can do bit-delta without understanding.

                      I am well aware many people desperately want computers to be "creative" and "intelligent" and one way to fake that is to redefine terminology. That gives nice fakes, but never the real thing. Once you dig a bit deeper, you find the utter meaninglessness of what com

                    • Ah well, if you are satisfied with a fake definition of "creativity", you will get fake creativity.

                      Most creativity is fake. Most people are just copying what someone else did.

                      Computers cannot do "imitation" and "comparison".

                      That's totally false. Computers are great at both of those things. In fact comparison is most of what computers do! Most instructions in most programs are comparisons and conditional jumps based on those comparisons. What they're not good at is innovation, because they can't do enough imitation and comparison fast enough.

                      And no, it is not a question of computing power.

                      More processing power will let them do more comparison, which lets them do more imitation. (Comparison is how you figure out whether you've imitated successfully.)
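The imitation-through-comparison loop argued for above can be illustrated with a toy hill-climber in the style of Dawkins' "weasel" demonstration: the only signal guiding it toward the target is a character-by-character comparison. The target string and alphabet here are arbitrary:

```python
import random

random.seed(42)
TARGET = "METHINKS"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def similarity(candidate):
    """The comparison step: count positions matching the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

# Start from random noise; keep any mutation that compares at least as well.
current = [random.choice(ALPHABET) for _ in range(len(TARGET))]
steps = 0
while similarity(current) < len(TARGET):
    i = random.randrange(len(TARGET))
    mutant = current.copy()
    mutant[i] = random.choice(ALPHABET)
    if similarity(mutant) >= similarity(current):
        current = mutant  # "imitation" improves only via comparison
    steps += 1

result = "".join(current)
```

The loop reliably converges because the acceptance rule never lets a matched position regress — which is exactly the sense in which comparison does the work and nothing in the program "understands" the target.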

                    • by gweihir ( 88907 )

                      Ah well, if you are satisfied with a fake definition of "creativity", you will get fake creativity.

                      Most creativity is fake. Most people are just copying what someone else did.

                      True. But imitating that does not make it non-fake. And in many situations only the real thing will work.

                      Computers cannot do "imitation" and "comparison".

                      That's totally false. Computers are great at both of those things. In fact comparison is most of what computers do! Most instructions in most programs are comparisons and conditional jumps based on those comparisons. What they're not good at is innovation, because they can't do enough imitation and comparison fast enough.

                      Nope. Computers cannot compare anything except bits. The reason is computers have no grasp of anything but bits.

                      And no, it is not a question of computing power.

                      More processing power will let them do more comparison, which lets them do more imitation. (Comparison is how you figure out whether you've imitated successfully.)

                      No. If it were a problem of processing power, we would have had actual intelligence and actual creativity in machines for quite a while now; it would just be very slow. We have no such thing.

                    • Nope. Computers cannot compare anything except bits. The reason is computers have no grasp of anything but bits.

                      Software can be trained to badly imitate humans' ability to recognize things, and make comparisons... based on their ability to compare bits. Humans are also prone to make comparisons based on imagined things, similarity to remembered situations and such, and our optics are impressive but limited.

                      The computer is only as good as its hardware and its training, in which it is much like a human. Its hardware is inferior to the human's. But it can be trained much more than the human can because it can learn from

  • It is not a miracle-machine. It is unusable for most research and where it is usable, it plays the role of a mere dumb tool, i.e. a helper. Stop fetishizing machines that push bits around.

  • 72 years late, Cambridge-1.

  • > will make the process of preventing, diagnosing and treating disease

    I'm concerned that the automation will reduce the collection of medical history and listening to the patient directly, which gathers information an automated survey will not.

  • Ah, I see Nvidia is trying to improve its image in the UK, while aiming to screw up ARM's business model.
  • by excelsior_gr ( 969383 ) on Thursday July 08, 2021 @09:59AM (#61562231)
    It seems to me that all we've achieved in the meantime is to add three zeros to the budget. From John D. Clark's "Ignition!":

    This was to get a computer, and to feed into it all known bond energies, as well as a program for calculating specific impulse. The machine would then juggle structural formulae until it had come up with the structure of a monopropellant with a specific impulse of well over 300 seconds. It would then print this out and sit back, with its hands folded over its console, to await a Nobel prize. The Air Force has always had more money than sales resistance, and they bought a one-year program (probably for something in the order of a hundred or a hundred and fifty thousand dollars) and in June of 1961 Hawkins and Summers punched the "start" button and the machine started to shuffle IBM cards. And to print out structures that looked like road maps of a disaster area, since if the compounds depicted could even have been synthesized, they would have, infallibly, detonated instantly and violently. The machine's prize contribution to the cause of science was the structure [ASCII-art filter does not accept chemical formulas!], to which it confidently attributed a specific impulse of 363.7 seconds, precisely to the tenth of a second, yet. The Air Force, appalled, cut the program off after a year, belatedly realizing that they could have got the same structure from any experienced propellant man (me, for instance) during half an hour's conversation, and at a total cost of five dollars or so. (For drinks. I would have been afraid even to draw the structure without at least five Martinis under my belt.)

    On a more serious note, as someone who runs simulations on a computer cluster for a living, I find it genuinely beneficial that we're using the technology for more than lolcat videos. But it's also hilarious how such stories have remained essentially unchanged for 60 years.

  • Well, yeah. A lot of things may not be fully used, as far as I understand. Such capacities could be used for more generally useful purposes. Although for business, I would like such a computer. But the supercomputer is still cool.
