
Russian Satellite Takes Most Detailed 121-Megapixel Image of Earth Yet 123

Diggester writes "The satellite, known as Elektro-L No.1, took an image from its geostationary point over 35,000 kilometers above the Indian Ocean. This is the most detailed image of the Earth yet available, capturing the Earth in a single shot at 121 megapixels. NASA's comparable images are stitched together from pictures taken on multiple passes. The detail in the pic is just amazing."

This discussion has been archived. No new comments can be posted.

  • Sweet! (Score:4, Funny)

    by SailorSpork ( 1080153 ) on Monday May 14, 2012 @11:52AM (#39995559) Homepage
    Now all I need is a 2000 inch TV to view it on. I think Weird Al knows where to get one.
    • Re:Sweet! (Score:4, Insightful)

      by vlm ( 69642 ) on Monday May 14, 2012 @12:06PM (#39995777)

      That's only a square 11,000 pixels on a side. A 300 dpi laser printer would make a roughly one-yard (one-meter) printout.

      At a slightly higher resolution that would be metric A0 paper size. Printers that big do exist, but they're kinda expensive. Best to upload it to your local print/office store and let them print it instead of doing it yourself.
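The arithmetic above can be checked in a few lines of Python (a sketch; assumes a square 121-megapixel frame and a 300 dpi printer):

```python
import math

MEGAPIXELS = 121_000_000   # total pixels in the Elektro-L shot
DPI = 300                  # assumed laser-printer resolution

side_px = math.isqrt(MEGAPIXELS)   # 11000 pixels per side for a square image
side_in = side_px / DPI            # printed side length in inches
side_m = side_in * 0.0254          # same length in meters

print(f"{side_px} px per side -> {side_in:.1f} in ({side_m:.2f} m) at {DPI} dpi")
```

That comes out to about 0.93 m per side, matching the "roughly one yard/one meter" estimate; A0 paper (841 x 1189 mm) would indeed hold it at a slightly higher dpi.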

      • 300 dpi isn't even that good for a laser printer. Even cheap laser printers can do 600 and often 1200 dpi.
        • by trb ( 8509 )
          A photo image like this tends to have pixels that each store 24 bits of RGB color (one of about 16.7 million colors). A color laser printer lays down one of only four ink colors at each dot (cyan, magenta, yellow, or black). You can compare the pixels, but you shouldn't compare them one to one.
          • um, what?

            i'm pretty sure you can have solid reds, greens and blues.

            we have the technology to print more than one ink in the same place.

            the gamut is pretty different compared to the sRGB we're used to, but it's a stretch to say a printer can only do 1 colour at a time.

        • Re:Sweet! (Score:5, Informative)

          by vlm ( 69642 ) on Monday May 14, 2012 @01:22PM (#39996703)

          I specified that based on visual acuity limits. There's a lot of optical theory explaining why over 300 dpi is mostly useless for toner on paper. Unless your eyeball's lens diameter is 10 times bigger than the average human's, or your retinal cell layout is different from that of all known humans, it is not optically possible to resolve 3000 dpi or whatever on paper under normal conditions and lighting. Depending on how close you can hold the paper before you can't focus on it anymore, and tangentially on how bright the light is (the iris acts like a little pinhole camera), humans top out around 300 dpi.

          Now, projected through transparencies on an overhead, higher res works, if you have old-fashioned overhead projectors and sit close to the screen. Also there are ugly aliasing and anti-aliasing effects that can be avoided by higher res with real vector scaling. And high res allows better/smoother color mixing, in that blurring together 2**8 pixels of 2**16 colors is more or less the same as one 2**24-color pixel. There are also relative brightness/consistency effects, where a "line" that varies from 8 to 9 pixels wide looks a lot less consistent than a line that varies from 85 to 86 pixels wide at 10 times the res; look at the percentage variation of one pixel.

          If the lighting is really bad, there are strange shadow effects where you can perceive over 300 dpi if the shadows land just right. There are also some strange toner-based textural issues where the plastic surface of thinner lines literally looks different, and some 3-D effects of toner on paper. So over 300 dpi is not a complete waste of time, just mostly a waste with average pictures under average conditions. It would be extremely hard to justify over 1200 dpi even in the weirdest corner cases.
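The ~300 dpi acuity ceiling can be sanity-checked with a quick calculation (assumptions: normal vision resolves about 1 arcminute, and the page is held at a 12-inch reading distance):

```python
import math

ACUITY_RAD = math.radians(1 / 60)   # ~1 arcminute, typical visual acuity
VIEW_DIST_IN = 12                   # assumed close reading distance, inches

spot_in = VIEW_DIST_IN * math.tan(ACUITY_RAD)   # smallest resolvable spot on the page
dpi_limit = 1 / spot_in

print(f"~{dpi_limit:.0f} dpi resolvable at {VIEW_DIST_IN} inches")
```

Hold the page closer and the number rises, but it stays in the low hundreds of dpi under ordinary viewing conditions.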

          • Even though you can't tell the difference between 300 and 600 dpi black text on white paper, you sure as hell can tell the difference between 85 lpi (laser printer halftones) and 175 lpi (glossy magazine halftones). And you cannot pull off said 175 lpi (or even 150 lpi) with less than 1200 dpi, with 2400 dpi recommended. A 600 dpi printer just isn't exact enough.
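The dpi-versus-lpi point follows from how halftone cells work: each halftone dot is built from a small grid of printer dots, so the number of printable tones per cell is roughly (dpi/lpi)^2 + 1. A minimal sketch:

```python
def gray_levels(dpi: int, lpi: int) -> int:
    # A halftone cell is (dpi // lpi) printer dots on a side; the tone count
    # is the number of dots that can be switched on, plus pure white.
    cell = dpi // lpi
    return cell * cell + 1

for dpi in (600, 1200, 2400):
    print(f"{dpi} dpi at 175 lpi -> {gray_levels(dpi, 175)} tones")
```

At 600 dpi a 175 lpi screen gets only about 10 tones per cell, which is why 600 dpi is "not exact enough" for magazine-grade halftones.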
    • Re: (Score:2, Funny)

      by OzPeter ( 195038 )

      Now all I need is a 2000 inch TV to view it on. I think Weird Al knows where to get one.

      Yeah .. but Frank got the last one. And don't touch his remote either!

    • ...and 90,000 watts of Dolby sound!
    • What a lucky guy Frank is! I hear he got the last one in stock!
  • Looks terrible (Score:3, Insightful)

    by Anonymous Coward on Monday May 14, 2012 @11:59AM (#39995671)

    The detail is fascinating but visually it looks terrible because it includes the infrared spectrum. It looks like a dead rock with sick black oceans. Awful.

    • Agreed, why not just set the infrared "vegetation" band to some hue near green so that it at least looks a little like the real thing? Or maybe just leave the IR pass out altogether? I like my NASA-made "ghettopixel" blue marble image much better, thanks.

    • Re:Looks terrible (Score:5, Interesting)

      by Hadlock ( 143607 ) on Monday May 14, 2012 @01:01PM (#39996423) Homepage Journal


      chromatic aberration in MY 1.21 gigapixel space photo?

      this was NOT the future I was promised

      send it back

      • by Anonymous Coward

        Looking carefully, I think it's not chromatic aberration; it's a slight change in lighting conditions, cloud cover, and (perhaps) satellite orientation during the shot. I expect that each color-filter image was taken separately, several minutes apart. Any movement or color change between exposures leads to those edge effects you see.

        At least it doesn't have that fake, way-too-thick and bright "atmosphere" that the more naturally colored NASA image has (the famous one centered on North America).

    • by Anonymous Coward

      Well, if you look at how much is covered in concrete and asphalt, then have a look at where the Mississippi dumps into the Gulf, coupled with how drinkable most river water is... it's a dead rock with sick black oceans.

      When you zoom in a little on Google Maps, all the green stuff is life, all the grey stuff is cancer.

  • by deains ( 1726012 )

    The shadows are completely wrong. 'Nuff said.

    • by ravenspear ( 756059 ) on Monday May 14, 2012 @12:05PM (#39995761)

      It looks photoshopped because it includes false color data from an infrared cam. It's not photoshopped.

      • The one thing that does bother me is the chromatic aberration - especially at higher magnifications, the overlap between the colors is very jarring.

        I rather doubt the CA filters in Photoshop could handle this problem, but they would give you a more aesthetic result.

        • by nomel ( 244635 )

          This is the original image. You're free to do as many lossy operations on it as your heart desires.

      • I'm not a photography expert, especially when it comes to infrared imagery, but are there RAW files for this sort of thing where the post-processing could be done in an image editor? I think some people would prefer green for vegetation.
        This looks like we have plenty of Spice reserves.

  • by Anonymous Coward on Monday May 14, 2012 @12:01PM (#39995715)

    Also, I looked at the zoomable image, zoomed all the way in, and... saw mostly macroblocks? Is that still "amazing detail" in a sense that eludes me?

    • The article explains the color, maybe read it?

      It also says just how big an area each pixel covers.

    • Also, I looked at the zoomable image, zoomed all the way in, and... saw mostly macroblocks? Is that still "amazing detail" in a sense that eludes me?

      That particular Gigapan upload was 1.12 Gpix, which suggests they did some sort of interpolation to make it appear more grandiose. And the rust-orange is because that is the most creative thing the Russians could think of to use the IR band for (heck with making it some shade of dark green...)

      • by Zocalo ( 252965 )
        Yeah, I noticed the same thing. I would have thought that with 1 km resolution you might be able to pick out a vague smudge where some of the larger cities in India and China visible from the satellite are, but no, just massive amounts of chromatic aberration from the imaging method used. Clearly roads are going to be out, so I tried again with the 1080p video clip - thought that maybe Mumbai or Shanghai would show up as a brighter spot in the darkness - after all you can see city lights on much
    • by Anonymous Coward

      Also, I looked at the zoomable image, zoomed all the way in, and... saw mostly macroblocks? Is that still "amazing detail" in a sense that eludes me?

      Seriously dude, I was all excited about being able to zoom in on the cleavage of a few million Indian chicks. No such luck. :(

  • by Trepidity ( 597 ) on Monday May 14, 2012 @12:04PM (#39995751)

    One reason the NASA global-coverage image sets that were released in 2002 (with updates starting in 2005) have become the de facto standard source is that: 1) anyone can download them; and 2) they're in the public domain, so anyone can use them for any purpose. You can get a bunch of versions here and from the Visible Earth site linked at the bottom of that page.

    This one looks cool, but further use will be limited if the only thing I can do with it is look at it in this online zooming browser.

    • by glassware ( 195317 ) on Monday May 14, 2012 @12:06PM (#39995773) Homepage Journal

      Also, there seems to be a lot of chromatic distortion on the image. Check out the clouds - there are three separate registrations for each color in the cloud image. Were their optics not calibrated, or did they take each color picture separately?

      • by Baloroth ( 2370816 ) on Monday May 14, 2012 @12:10PM (#39995837)
        I came into the comments to say this. Holy hell, the chromatic aberration on that image is absolutely terrible. It looks a lot like they took the different color channels separately (that would explain why the clouds, which are moving, are especially bad), and TFA says the pictures take ~30 minutes each, so that's the only thing that makes sense to me.
        • Actually, yeah -- I bet that's it. It's not optical problems; it's a delay between the images of each color channel.

          Not sure why they should take 30 minutes each, though.

          • by tibit ( 1762298 )

            For processing such an image for publicity release, it'd be customary to estimate motion vector fields between each pair of consecutively taken images, and apply motion compensation to register the clouds with minimal aberration. They apparently didn't do that.
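A toy illustration (Python/NumPy, not the actual processing pipeline) of the registration step being described: finding the integer shift that best aligns one color channel to a reference before compositing. Real motion compensation estimates per-region motion vectors rather than one global shift.

```python
import numpy as np

def best_shift(ref: np.ndarray, ch: np.ndarray, max_shift: int = 3):
    """Brute-force search for the (dy, dx) shift of `ch` that best matches `ref`."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.mean((np.roll(ch, (dy, dx), axis=(0, 1)) - ref) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# Synthetic example: one channel misregistered by a known offset.
rng = np.random.default_rng(0)
ref = rng.random((32, 32))
shifted = np.roll(ref, (-1, 2), axis=(0, 1))   # clouds "moved" between exposures
print(best_shift(ref, shifted))                # recovers the (1, -2) correction
```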

            • by Trepidity ( 597 )

              If I'm understanding the article correctly, it sounds like they sent the raw data to "an educator named James Drake" on request. Presumably he's the one who did the overlay, but possibly doesn't have any specialist background in this area, so did it the quick-and-dirty way.

        • by Ihmhi ( 1206036 )

          I came into the comments

          I do hope you've cleaned up after yourself. Slashdot is messy enough as is.

        • TFA says the pictures take ~30 minutes each, so that's the only thing that makes sense to me.

          TFA says...

          The satellite takes a full image of Earth from its stationary point over 35,000 kilometers above the Indian Ocean every 30 minutes, providing the material for the video below.

          It doesn't say anything about shutter speed/exposure time or how long it takes to transfer a single image back to Earth.
          It only says the satellite takes a picture every 30 minutes.

      • by sootman ( 158191 )

        If only there were an article somewhere that described how they made the image...

        The image certainly looks different than what we're used to seeing, and that's because the camera aboard the weather satellite combines data from three visible and one infrared wavelengths of light, a method that turns vegetation into the rust color that dominates the shot.

    • I've used NASA's textures for my little OpenGL program. I must say it's very nice on their part to provide everything in great detail and in multiple layers.

  • Holy chromatic aberration. You'd think an organization capable of blasting things into space could do a bit better than that.

    • Re: (Score:3, Funny)

      by Anonymous Coward
      Clearly they bought that camera because it has "the most megapixels"
    • That's what happens when you spring for a high end body (to keep up with the Joneses) and then cheap out on crap lenses - and then don't bother creating a lens profile in Lightroom to correct CA in post. ;)

    • by necro81 ( 917438 )
      I suspect it is from how the image was composited. The article, if you'd bothered to read it, indicates that the camera takes shots using four color filters: RGB, plus an infrared filter. The image you see above is the composite of those four images (with the infrared given a reddish-brown tint, which makes all the vegetation look brown), and there may well be some registration error that wasn't accounted for.
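The composite described here resembles the classic false-color scheme where near-infrared is displayed in the red channel; a sketch on toy data (this is the standard technique, not necessarily the exact Elektro-L pipeline):

```python
import numpy as np

def false_color(nir, red, green):
    """NIR/R/G composite: near-IR shown as red, so NIR-bright vegetation
    comes out red/rust, much as in the Elektro-L image."""
    rgb = np.stack([nir, red, green], axis=-1).astype(float)
    return (255 * rgb / rgb.max()).astype(np.uint8)

# Toy 1x2 scene: a vegetation pixel (high NIR) next to a water pixel (dark in all bands).
nir   = np.array([[0.9, 0.05]])
red   = np.array([[0.1, 0.05]])
green = np.array([[0.2, 0.08]])
img = false_color(nir, red, green)
print(img)   # the vegetation pixel is dominated by its red (NIR) component
```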
  • It is upside down. We are all going to fall off!!!

  • by zentigger ( 203922 ) on Monday May 14, 2012 @12:09PM (#39995825) Homepage

    It still only has a resolution of 1 km per pixel, and the chromatic aberration is terrible.

    • by AmiMoJo ( 196126 )

      I would assume they have an optical zoom capability which they don't want to share with the internet, otherwise it would be kinda pointless.

  • sky watches You!
  • This is the 2nd Russian picture I have seen taken of the planet that makes it look kinda 'orangey'.

    NASA's blue marble photo is what I'm used to seeing

    So why does it look different in Russian photos? Which version is more accurate?

    • by idontgno ( 624372 ) on Monday May 14, 2012 @12:36PM (#39996113) Journal

      The "Blue Marble" image you're pointing at is based on EOS (Terra/Aqua) imagery. The most recent NASA Blue Marble (Blue Marble 2012) is a composite based on the new NPP Suomi spacecraft, with approximately a 1 km pixel resolution.

      As to "accurate"... I think the Blue Marble images (based on the visible-light band sensors of their respective spacecraft) are closer to what a naked eye in orbit would perceive than the Russian imagery, which seems to include false-color infrared. But "naked eye in orbit" is scientifically less useful than the multi-spectral IR and visible bands all of these spacecraft can sense.

    • "The image certainly looks different than what we're used to seeing, and that's because the camera aboard the weather satellite combines data from three visible and one infrared wavelengths of light, a method that turns vegetation into the rust color that dominates the shot."

  • According to the picture, everything around me (South-Eastern Europe) should be orange!
  • I expected the weather (by that I mean the clouds) to move faster. I may just be used to seeing weather maps on TV update in 4-hour slides. I'm sure the 100 mph winds I'm looking at are fast enough at that scale :)
  • Where's the real picture? I don't want a stinkin' Flash app. 16-bit PNG FTW!

  • Waldo too! (Score:4, Funny)

    by Ostracus ( 1354233 ) on Monday May 14, 2012 @01:18PM (#39996649) Journal

    The detail in the pic is just amazing.

    And they still can't find Carmen Sandiego.

  • by thrich81 ( 1357561 ) on Monday May 14, 2012 @01:26PM (#39996751)

    121 megapixels -- can any of the photo aficionados tell us how that compares with the shots of earth taken with the film cameras aboard the Apollo spacecraft? Some of those were mighty good.

    • by mk1004 ( 2488060 ) on Monday May 14, 2012 @02:08PM (#39997271)
      The cameras used in the Apollo program included a 70mm Hasselblad. IIRC, years ago, as digital cameras struggled to pass the 2 to 3 megapixel range, it was said that to be equivalent to 35mm you'd need 15-18 megapixels. That was, I believe, to match the grain density of 64 or 100 speed film. So scale that up about 4x to go from 35mm to 70mm. I'd say those Hasselblads did just fine.
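The scaling above can be made concrete; the frame dimensions and the 15-18 MP figure for 35mm are the assumptions here:

```python
# Rough film-to-digital equivalence, scaling by frame area (an estimate only).
FRAME_35MM = 36 * 24   # mm^2, standard 35mm frame
FRAME_6X6  = 56 * 56   # mm^2, typical Hasselblad square frame on 70mm film
MP_35MM = (15, 18)     # commonly quoted digital equivalent of fine-grain 35mm

scale = FRAME_6X6 / FRAME_35MM
lo, hi = (round(mp * scale) for mp in MP_35MM)
print(f"~{scale:.1f}x the area -> very roughly {lo}-{hi} MP per Apollo Hasselblad frame")
```

By that rough measure the 121 MP satellite shot out-resolves a single Hasselblad frame, though film-grain equivalences are much debated.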
  • by Anonymous Coward

    capturing the Earth in a single shot
    Wouldn't the satellite be able to capture only about half of Earth in a single shot?

  • by GOES_user ( 852842 ) on Monday May 14, 2012 @02:39PM (#39997597)
    Our current series of geostationary weather satellites operated by NOAA have been taking images at 1 km resolution for the visible band and 4 km for four IR bands since 1995. The primary difference with Elektro is that it has more bands: two visible bands at 1 km and 8 IR bands at 4 km (which is why it looks blocky when you zoom in). The image referenced in the article is a false color composite, which has been a common product from weather satellites (geostationary and otherwise) since we started using them decades ago. It shows vegetation more than we have seen from GOES because it has a near-IR band. GOES typically takes "full disk" images every three hours.

    The US has a new platform going up in 2016 with 16 bands - visible bands are at 0.5 km and IR at 1 km. That sensor will not be able to do true color (some of us fought hard for that...) but it can be simulated to an extent: the sensor will have red and blue wavelength sensing abilities, with a near-IR band allowing use of a look-up table to generate green; the surface under thin clouds, around coastal areas, and some other cases won't look quite right. Japan has bought the same sensor from the same vendor but swapped out a band and replaced it with green, so they will have true color images at roughly 22,000x22,000 pixels in the 2014-2015 time frame. This new sensor can take "full disk" images every 15 minutes (that is the scan schedule set for the US; it could go faster than that).

    The US took true color images from a geostationary camera on ATS-3 in the late 1960s. As far as I know no one has taken true color images from geostationary orbit since.

    I haven't looked closely at Elektro data, but the loop I've seen indicates light leaking into the telescope as the sun starts to light the Earth in the east (i.e. sunrise) - it looks like a lens flare. Many weather satellites have issues like this to some extent, but in this case it was more pronounced than I've usually seen it.
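The band resolutions quoted here translate directly into full-disk image sizes; a quick check (the Earth diameter and geostationary altitude are round figures, and the quoted resolutions apply at the sub-satellite point):

```python
EARTH_DIAMETER_KM = 12742   # mean Earth diameter
GEO_ALTITUDE_KM = 35786     # geostationary altitude

# Pixels across the visible disk at each ground resolution (km per pixel).
disk_px = {res: EARTH_DIAMETER_KM / res for res in (4, 1, 0.5)}
for res_km, px in disk_px.items():
    urad = 1e6 * res_km / GEO_ALTITUDE_KM   # angle subtended per pixel at nadir
    print(f"{res_km} km/px -> ~{px:,.0f} px across the disk ({urad:.0f} microradians/px)")
```

At 1 km per pixel the disk is roughly 12,700 pixels across, in the same ballpark as the ~11,000-pixel-wide Elektro-L frame.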
  • I zoomed in and saw the color separations, but I don't have my 3d glasses handy so I can't see the effect.
  • I say "way to go Rooskies!"

    Pay no mind to the fools who can't get past the spectral mapping.

  • There has to be one in there somewhere! Quick, get an army of volunteers to go over it! Not that I am trying to keep anyone from playing Diablo 3 tonight after midnight. Has nothing to do with this... honest.
  • That light-dark cycle has been going on for billions of years, ceaselessly, perfectly. An amazing machine.

    The importance of perspective is underscored as well. From the geostationary satellite, it looks as though the earth is still. And it is - from that perspective. From the perspective of other celestial bodies, however, the earth is moving.

    Kudos to the Russkies for capturing this perspective and to James Drake for creating the video.

  • Suddenly, I'm concerned that we don't have enough protection against mind worms.
