
MIT Research Amplifies Invisible Detail In Video

An anonymous reader writes "MIT researchers have invented an algorithm that can amplify motion in video too subtle for the naked eye — such as the motion of blood pulsing through a person's face, or the breathing of an infant. The algorithm — which was invented almost by accident — could find applications in safety, medicine, surveillance, and other areas. 'The system is somewhat akin to the equalizer in a stereo sound system, which boosts some frequencies and cuts others, except that the pertinent frequency is the frequency of color changes in a sequence of video frames, not the frequency of an audio signal. The prototype of the software allows the user to specify the frequency range of interest and the degree of amplification. The software works in real time and displays both the original video and the altered version of the video, with changes magnified.'"
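For readers who want a feel for the "equalizer for video" analogy, here is a minimal, illustrative sketch of per-pixel temporal band-pass amplification in Python/NumPy. It is not the MIT code; the function name, cutoff frequencies, and gain are invented values chosen only to show the shape of the idea.

```python
# Illustrative sketch only: band-pass each pixel's value over time,
# amplify that band, and add it back to the original frames.
# Cutoffs and gain below are invented, not the MIT prototype's settings.
import numpy as np
from scipy.signal import butter, filtfilt

def amplify_color_band(frames, fps, low_hz=0.8, high_hz=3.0, gain=30.0):
    """frames: float array of shape (num_frames, height, width[, channels])."""
    nyq = fps / 2.0
    b, a = butter(2, [low_hz / nyq, high_hz / nyq], btype="band")
    band = filtfilt(b, a, frames, axis=0)              # temporal filter, per pixel
    return np.clip(frames + gain * band, 0.0, 255.0)   # assumes 8-bit pixel values
```

The published system reportedly also decomposes each frame spatially before the temporal filtering; a toy like this works on raw pixels and is correspondingly noisier.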

This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Friday June 22, 2012 @03:47PM (#40415701)

    Wish youtube wasn't blocked at work.

    Although, if this looks like what I think it looks like, I could see this having a lot of potential in the movie industry as well. Specifically, enhancing things otherwise unnoticed could make for some very creepy footage.

    • No, that creepy footage in films is what FX departments are for - and they do it far more artistically.

      This is bringing all the creepy remote monitoring shit to your local and federal law enforcement departments, along with every other eye-in-the-sky system in use by government and industry.

  • Enhance! (Score:5, Funny)

    by Bill Hayden ( 649193 ) on Friday June 22, 2012 @03:49PM (#40415729) Homepage
    n/t
  • Everything (Score:5, Insightful)

    by Sparticus789 ( 2625955 ) on Friday June 22, 2012 @03:50PM (#40415741) Journal

    Now every recording device has the potential to become a lie detector.

  • by mooboy ( 191903 ) on Friday June 22, 2012 @03:51PM (#40415753)

    For employers, or even police: you could easily detect emotional flushes in someone's face when asked certain questions, i.e., a lie detector of sorts. Also, think poker players with this software built into their "Google Glasses".

    • Yeah, I was thinking that something like this could be applied rather easily to monitor breathing and heart rates for lie detection purposes. Though I hate to admit I saw it, the movie Salt had a lie detector that didn't require anything hooked up to the subject. It'd be neat if that technology could be real.

      The applications for poker are an interesting idea as well. Awesome thinking with that one. I could definitely see this sort of thing becoming part of quite a few augmented reality applications.

      • by Anonymous Coward on Friday June 22, 2012 @04:21PM (#40416129)

        I, for one, would love to see a poker tournament where all of this stuff was legal. It would have to take place on a separate circuit, but currently the top strategies are 'don't act emotional and wear dark sunglasses'. This would take things to the next level, so you might as well throw in real-time simulation outputs, probabilities, heart-rate monitors, histograms, etc, all available to each player in real time. Put a thin layer of lead paint on the backing of each card and you're good to go.

        • I can't remember which one, but I swear some poker tournaments on TV had heart-rate monitors on the players... only visible to the viewers, of course.

        • I've been saying this about professional wrestling for years. Let them 'roid up and actually beat the shit out of one another. The same would work for NASCAR or similar. Let them strap rocket boosters and time shifters to the bastards and bask in the entertainment.
        • Want to hide your excitement? Get beta blockers, they're your little sniper friendly pals.
      • by Khyber ( 864651 )

        You think lie detectors are worth half a shit....

        I pity you and your fantasy world.

      • "Yeah, I was thinking that something like this could be applied rather easily to monitor breathing and heart rates for lie detection purposes"

        I'd prefer if they directed their energy toward finally making a sports watch that could read a precise pulse without that darn chest-strap thingie.

      • Yeah, I was thinking that something like this could be applied rather easily to monitor breathing and heart rates ...

        There are iPhone apps that detect heart rate relatively accurately... Cardiograph [apple.com] is one that seems to work well as a pulse oximeter [wikipedia.org] ... and I was about to try to explain what that was... those things that check heart rate with a sensor on your finger ... I read the wiki article... don't ask me how it works... something about arterial hemoglobin.

    • by Anonymous Coward

      Forget that.

      This technology will be immensely useful for various emergency situations:

      1. Detecting aliens that have taken on human form.
      2. Detecting assassination robots attempting to infiltrate our ranks.

      I'm sure there are many more USEFUL applications that I haven't come up with yet. But you get the point!

      "lie detector". hah.

    • I also can fake such a reaction by thinking of my mother trying to go down on me. This may seem like a troll, but it is fact
    • Haven't looked at it in detail so I'm just guessing what they're doing here, but it's likely that this would only work well for periodic motion. In fact, the examples in their video are just that: pumping blood, breathing, guitar strings vibrating, etc. Of course, you can make anything periodic by essentially playing it in a loop, with a good windowing function...
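If that guess about periodicity is right, the underlying measurement can be as simple as a windowed spectrum of a region's brightness over time. A rough, hypothetical sketch (the function and parameters are mine, not from the paper):

```python
# Rough sketch: find the strongest temporal frequency in a per-frame signal,
# e.g. the mean brightness of a small image region tracked over time.
import numpy as np

def dominant_frequency(signal, fps):
    # Window the signal to tame the discontinuity you get from treating
    # a finite clip as if it looped.
    windowed = (signal - signal.mean()) * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[spectrum.argmax()]   # in Hz; ~1.2 Hz would be a resting pulse
```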

  • by cpu6502 ( 1960974 ) on Friday June 22, 2012 @03:54PM (#40415789)

    Here's iCarly.

    Here's iCarly enhanced..... you can see right through their shirts! (Watch; you'll see.)

  • If this won't detect replicants, then I don't want it
  • Oh fuck no! (Score:2, Informative)

    by Alex Belits ( 437 ) *

    could find applications in safety, medicine, surveillance, and other areas.

    So instead of highlighting areas where something worth looking at is detected, this thing produces a highly distorted, exaggerated version of the motion, adding its own bias based on naive attribution of moving areas to distinct objects? Then a human won't see important details behind things that software deemed worthy of emphasizing -- you might as well remove the humans from the process completely.

    • I understood the point was to be able to have humanless monitoring. The Laplacian calculations implied that the computer would be very aware if there was a visible frequency in the range of a human breathing, or heart rate. If this visible frequency disappeared, then either the subject obstructed itself from the camera, or the motion stopped. This could then set off an alarm if motion didn't continue within a short time frame.
      • If it was about finding the motions, there would be no attempts to produce a human-viewable video; it would just detect the motions and produce a structured response -- "this looks like breathing", "swing set is wobbling", etc. The video output is clearly intended for human post-processing, but while it may be useful for research -- to check if movement detection indeed detected the right kind of movement -- it's unusable for human post-processing.

        It's like removing a security camera, placing an artist in its

    • No, humans decide what portion of the signal to manipulate, although I suppose an AI could be taught to look for various things that might not be obvious to humans. But your concern is exactly the same problem you have with any imaging technique - it always looks at some subset of 'reality' and interpretation is needed to correlate the signal / video / whatever.

      • No.

        All other techniques have DIFFERENT SEMANTICS of the input and output -- at the input, the raw video feed that contains only what was seen by the camera; at the output, two clearly separate layers -- video and highlighted/marked up/colored/schematically displayed/... areas where movement (any, or of some particular kind) was detected. The human sees schematically marked up areas and positions, and tries to determine what exactly is happening by looking into the details of the original video in and arou

    • Re:Oh fuck no! (Score:4, Insightful)

      by wonkey_monkey ( 2592601 ) on Friday June 22, 2012 @04:14PM (#40416037) Homepage
      Yes, clearly the demonstrated ability to remotely monitor a sleeping baby's pulse and breathing will have no practical use.

      adding its own bias based on naive attribution of moving areas to distinct objects? Then a human won't see important details behind things that software deemed worthy of emphasizing

      This isn't AI. It's actually fairly simple image processing. It has no bias or sense of worth. Yes, it can be tuned - by a human operator who will most likely know what they want to have their attention drawn to. How important are a few pixels in the background behind a barely breathing body when you're searching for a hypothermia victim?

      If the concerns you raise have any impact on a particular scenario, the operator can just use their own eyes or switch off the processing.

      • If the concerns you raise have any impact on a particular scenario, the operator can just use their own eyes or switch off the processing.

        How would he know to do that? There is no indication of what is exaggerated/emphasized and what is not.

        • I'm sure even I could add a function to stamp a little "image processing on" subtitle to the image. That's if the psychedelic cycling colours didn't give it away. This is research. The little niceties can get left aside until it's ready to roll out, or after user feedback.
          • But that's exactly what they are NOT doing! Look at the output, it's completely natural-looking except for objects wobbling as if they are made of jelly.

            • What? What is it they're not doing? They're demonstrating their new method for enhancing changes in a video, and you're complaining that they're not displaying an "image enhanced in post-production!" disclaimer?

              Look at the output, it's completely natural-looking except for objects wobbling as if they are made of jelly.

              I thought that was the whole point.

              • And this is wrong for all those applications because it prevents the human viewer from noticing anything else.

                • *facepalm* I just don't get why you insist on decrying this research as useless. The point is to enhance subtle motion/changes so that the viewer notices them. Yes, there may be times when this could draw the attention away from something else, but if that something is not moving/changing colour then it's probably (given that the enhancement is being used in the first place) not important.

                  You may as well pour scorn on infrared technology because it prevents you from noticing things that are the same temper

      • This isn't AI. It's actually fairly simple image processing.

        Oh yes, it is! Once you introduced contours detection or image partitioning, you are have decision-making embedded in the process -- that's out of filtering and into AI territory.

        • you are have decision-making embedded in the process

          s/are //

          (incomplete editing on my part)

        • Oh yes, it is! Once you introduced contours detection or image partitioning, you are have decision-making embedded in the process -- that's out of filtering and into AI territory.

          So any computer program that has an if-statement in it, is artificial intelligence. Ok, glad we got that cleared up.

          • So any computer program that has an if-statement in it, is artificial intelligence. Ok, glad we got that cleared up.

            In a signal processing algorithm (what this thing presented as) -- yes.

          • Admittedly I don't know the details of the algorithm, but would you also call edge detection AI?
              • A filter that precedes the decision is not AI -- it applies to the whole image uniformly and contains no conditionals (and for this reason can be SIMD'ed so easily). Once you start establishing cutoffs, smoothing and centering, and detecting contiguous edges and areas, it's AI.
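For what it's worth, the distinction being argued over is easy to show in code: a uniform filter runs the same arithmetic at every pixel with no branching, while introducing a cutoff turns it into a per-pixel decision. A toy contrast, hypothetical and not taken from TFA:

```python
# Toy contrast between a uniform, branch-free filter and a thresholded step.
import numpy as np
from scipy.ndimage import convolve

def uniform_blur(image):
    # Same arithmetic at every pixel, no conditionals -- trivially data-parallel.
    return convolve(image, np.full((3, 3), 1.0 / 9.0), mode="nearest")

def edge_mask(image, cutoff=10.0):
    # The moment a cutoff is applied, every pixel gets a yes/no decision.
    gy, gx = np.gradient(image)
    return np.hypot(gx, gy) > cutoff
```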

  • Worst Case (Score:4, Interesting)

    by TubeSteak ( 669689 ) on Friday June 22, 2012 @04:05PM (#40415935) Journal

    to long-range-surveillance systems that magnify subtle motions, to contactless lie detection based on pulse rate.

    This is the first thing they're going to do with it.
    All the other applications might come afterwards.

  • Uhm... I didn't RTFA or watch the video (good /.er and Flash disabled, respectively), but that sounds like an off-the-shelf bunch of audio effects pointed to a different array. Even TFS acknowledges this. "Applied to something no one else has" maybe, but hardly "invented". Really this is obvious stuff, but my guess is that everyone else just assumed a typical camera/video didn't have enough SNR for anything interesting to be amplifiable. I know I did when I had the idea of applying my audio filters to video
    • Re:Invented? (Score:5, Informative)

      by Bill Dimm ( 463823 ) on Friday June 22, 2012 @04:10PM (#40415997) Homepage

      I didn't RTFA or watch the video (good /.er and Flash disabled, respectively)

      If you click through to the article they have HTML5 videos served from YouTube there, so there is no need for Flash. Why Slashdot is still embedding videos as Flash is a mystery to me.

  • Sounds like an essential component for a Voight-Kampff machine.
  • Old (Score:2, Insightful)

    Good lord, hack-a-day featured this over 2 1/2 weeks ago [hackaday.com]. In fact, there's already a bloody iPhone app [apple.com]!
    • I've had an Android app for at least 6 months which can detect heartbeat rates from a person's face.

      I'm not sure what MIT "invented" here.

      • I think the novelty is in a new motion tracking technique. The video starts with color change tracking (probably because it's so dramatic) but switches to motion about halfway through. The MIT news report closes with a UC Berkeley professor's comments:

        "This approach is both simpler and allows you to see some things that you couldn't see with that old approach," Agrawala says. "The simplicity of the approach makes it something that has the possibility for application in a number of places. I think we'll see a lot of people implementing it because it's fairly straightforward."

    • Re:Old (Score:4, Insightful)

      by bill_mcgonigle ( 4333 ) * on Friday June 22, 2012 @05:31PM (#40416821) Homepage Journal

      I'm unclear - are you suggesting that Slashdotters should all be reading Hack-A-Day, know the Apple App Store inside and out, or that the information is time-sensitive enough to not be worth posting today?

    • Re:Old (Score:5, Informative)

      by catmistake ( 814204 ) on Saturday June 23, 2012 @03:55AM (#40419451) Journal

      In fact, there's already a bloody iPhone app [apple.com]!

      For the love of Pete!! Pulse oximetry [wikipedia.org] is not the same thing! Will ignorance ever tire of dismissively posting wildly inaccurate information to slashdot summaries??!!

      • No. People always ignorantly post things because of the Flynn effect, which is the scientific principle saying that people don't understand their own blind spots. Blind spots come from looking at sunspots because the sun is too bright. Duh.
  • by Anonymous Coward

    This technology works similarly to the Cookie Monster eye filter, as it selects only the frequency of the most crumbly of cookies, rejecting those that would fail to shower him with crumbs, except that the pertinent frequency is the frequency of color changes in a sequence of video frames, not the frequency of crumbly cookies for rapid ingestion.

  • 2005 (Score:2, Interesting)

    by Anonymous Coward

    Anyone else notice the Motion Magnification page was last edited September 12th 2005?

    • by Chrutil ( 732561 )
      Yeah, I actually saw this presentation at Siggraph back in '05 and it was really cool. When I saw this one I was wondering if they came up with something new, but it looks like exactly the same thing.
  • This might be useful for detecting people carrying concealed guns. It's known that when someone wearing a big, dense object steps up or down (a curb is sufficient), there's motion that can be noticed. Some cops are trained for this. Now it can be automated.

    • Great... with automation, non-trained security will be able to tell that my phone or sunglasses are actually a concealed weapon... while not noticing the sheath knife the guy next to me is wearing.

      I think detecting heart and respiratory rate would be much more useful -- assuming it doesn't take too much calibration, does not require the subject to be stationary, and can be used to scan a crowd instead of a single subject.

      • Dude, if your sunglasses or cell phone weigh as much as a handgun, I'd suggest ditching the 1990 phone and not buying the depleted uranium shades again.

    • by mr1911 ( 1942298 )

      This might be useful for detecting people carrying concealed guns.

      Why do you care? Not everyone carrying a concealed firearm has any intent to harm you. In fact, most do not.

      Hoplophobia can be cured, but the first step is admitting you have a problem.

  • There are already a few iPhone apps out there that do this: you put your finger over the lens, the flash LED lights up, and the app reads your heart rate in under ten seconds. They also do this by measuring color differences.
    So I don't really get what's new about this.

    Btw, the app is called 'Heart Rate' (www.instantheartrate.com). Damn, I feel like the MyPCAntiVirus-dude now...
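For the curious, the finger-over-the-lens trick those apps use can be approximated with nothing more than the average brightness of each frame and a band-pass around plausible pulse rates. A hypothetical sketch; the band limits and peak spacing are my own guesses, not anything taken from those apps or from MIT:

```python
# Hypothetical sketch of camera-based pulse estimation: average frame
# brightness, band-pass around plausible heart rates, count the peaks.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def estimate_bpm(frames, fps, low_hz=0.7, high_hz=3.5):
    """frames: array of shape (num_frames, height, width[, channels])."""
    brightness = frames.reshape(len(frames), -1).mean(axis=1)
    nyq = fps / 2.0
    b, a = butter(2, [low_hz / nyq, high_hz / nyq], btype="band")
    pulse = filtfilt(b, a, brightness - brightness.mean())
    peaks, _ = find_peaks(pulse, distance=max(1, int(fps / high_hz)))
    return 60.0 * len(peaks) / (len(brightness) / fps)   # beats per minute
```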
  • by slew ( 2918 ) on Friday June 22, 2012 @05:46PM (#40416937)

    I knew I saw this stuff before... Siggraph 2005 http://people.csail.mit.edu/celiu/motionmag/motionmag.html [mit.edu]

  • Is this to be an empathy test? Capillary dilation of the so-called blush response? Fluctuation of the pupil. Involuntary dilation of the iris?
  • 1) Implement algorithm.

    2) Set this video [youtube.com] as input.

    3) If you see any kind of motion before 0:07, then prepare to receive a Nobel prize. Does the detected motion occur before the dog runs away? (Also: Profit!)
