Open Source Graphics Software

Broadcom Releases Source For Graphics Stack; Raspberry Pi Sets Bounty For Port

One of the few but lingering complaints about the Raspberry Pi is that it relies on a proprietary GPU blob for communication between the graphics drivers and the hardware. Today, Broadcom released the full source for the OpenGL ES 1.1 and 2.0 driver stack for the Broadcom VideoCore IV 3D graphics subsystem running on one of its popular cellphone systems-on-a-chip. It's available under a BSD license, and Broadcom provided documentation for the graphics core as well. The SoC in question is similar to the one used on the Raspberry Pi, and Eben Upton says making a port should be 'relatively straightforward.' The Raspberry Pi Foundation has offered a $10,000 bounty for the first person who can demonstrate a functional port. (The test for functionality is, of course, being able to run Quake III Arena.) Upton says, 'This isn't the end of the road for us: there are still significant parts of the multimedia hardware on BCM2835 which are only accessible via the blob. But we're incredibly proud that VideoCore IV is the first publicly documented mobile graphics core, and hope this is the first step towards a blob-free future for Raspberry Pi.' Side note: the RPi is now two years old, and has sold 2.5 million units.
This discussion has been archived. No new comments can be posted.

  • by TheGratefulNet ( 143330 ) on Friday February 28, 2014 @03:51PM (#46370503)

    I have given up on the rpi and moved to the BBB since I could not stand the issues related to the broken-by-design usb system.

    since ethernet goes through usb and usb was flawed, this really put a damper on my networking use of this box.

    it's been at least a year since I checked in; have they fully and completely gotten around the usb 'elephant' bug yet?

    • I have a Raspberry Pi and I use both Ethernet and USB extensively with no issues. What bug are you talking about?
      • by ledow ( 319597 ) on Friday February 28, 2014 @04:05PM (#46370619) Homepage

        I'm not the OP, but:

        The one that still had a ticket open.

        If you hammer the USB / Ethernet buses simultaneously, things nowadays slow to a crawl. Why? Because if they don't make them slow to a crawl, you drop USB packets silently, which makes the driver stack crash.

        The bug was reported within weeks of release by a guy doing lots of USB / SD / Ethernet work simultaneously and I'm linked into it. Still unresolved, but they tweaked the "firmware" (really software) of the RPi to lessen the impact by degrading some performance.

        It's a timing issue on the shared bus that's part of the hardware design and can't be "resolved" without a redesign. They just worked around it so that the blindingly-obvious bug from when it was first released isn't so prevalent, but there's a cost.

        My pre-order RPi ended up in my loft for 6 months after I waited a year for them to fix it (and also - on the request of some of the RPi designers / distributors - I had sent off SD cards to some guy at Broadcom who worked on the RPi "in his spare time", who then later discovered that it was because of things like this that SD cards weren't reading, not that they were old / strange cards). It's a nice gadget, but it is basically a bodge-job, and for my use it was useless for over 18 months without sight of a permanent resolution.

        • thanks for the info. so, the fundamental issue cannot be fixed and only by degrading the usb performance can they give higher reliability.

          that just hammered the final nail into the rpi coffin, for me, anyway.

          back to the beaglebone black, it seems. it's fully open source (today), it runs both android and linux and has enough onboard flash that you don't need sdcards (but can still use them).

          a friend of mine also ported ubuntu 64 to the BBB. he claims it was not a simple or easy task but he actually did get

          • Your friend ported 64-bit Ubuntu to the 32-bit BBB?

            He's quite the wizard.

            • That's nothing. Here [slashdot.org]'s Ubuntu running on an 8-bit machine.

            • I'm simply repeating what he said, but I believe the BBB cpu is 64bit and he had to do a LOT more than just change some #defines. I don't know that cpu family (at all) but from what he described, it was very non-trivial.

              if I'm wrong, then I must have misunderstood what he said; but he's not the kind of person to just make things up and he was quite proud of getting this working.

              • I believe the BBB cpu is 64bit

                You believe in god too, right?

              • from what I remember, he explained that the arm runs things in parallel inside and he redid the way the instructions are broken down into parallel bits and then managed to get 64 bits of value into enough regs to run them at once and return the value to the caller, making the caller 'think' that a 64bit value was processed.

                at least that's how it was explained to me. I have zero experience with ARM cpu architectures so I may have gotten the details wrong.
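For context, the standard trick for 64-bit arithmetic on a 32-bit core like the BBB's Cortex-A8 is to propagate a carry across a pair of 32-bit registers (the ADDS/ADC instruction pair on ARM). Below is a toy Python sketch of that carry propagation - illustrative only, and not a claim about what the poster's friend actually did:

```python
MASK32 = 0xFFFFFFFF  # a 32-bit register can hold at most this value

def add64_on_32bit(a, b):
    """Add two 64-bit values using only 32-bit-sized pieces,
    propagating the carry from the low half to the high half
    (roughly what ARM's ADDS/ADC instruction pair does)."""
    a_lo, a_hi = a & MASK32, (a >> 32) & MASK32
    b_lo, b_hi = b & MASK32, (b >> 32) & MASK32

    lo = (a_lo + b_lo) & MASK32              # ADDS: add low words, set carry
    carry = 1 if a_lo + b_lo > MASK32 else 0
    hi = (a_hi + b_hi + carry) & MASK32      # ADC: add high words plus carry

    return (hi << 32) | lo
```

Compilers emit exactly this pattern for `uint64_t` on 32-bit targets, which is why handling 64-bit *values* is routine on 32-bit hardware - and why it's a very different thing from running a 64-bit OS, which needs a genuinely 64-bit CPU.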

              • by BobNET ( 119675 )

                I believe the BBB cpu is 64bit

                The processor in the BBB has a Cortex-A8 core. It's nicer than the processor in the RasPi in just about every way, but still only 32-bit.

                I installed SlackwareARM on my BBB; I don't really think 'port' is the right word to use, as all I really did was compile a kernel. And even then, I could have copied one from another distro...

          • by drosboro ( 1046516 ) on Friday February 28, 2014 @06:31PM (#46371879)
            Depending on what you're trying to do, you may or may not ACTUALLY have any performance trouble with this bug. I've been using an rpi as a router / firewall / proxy / etc. in my home for about 1.5 years now. I'm using the Ethernet port, plus a USB -> Ethernet adapter to get a second port. Performance may not be spectacular, but it's still good enough to saturate my home (15-20mbps) connection, with about 8-10 devices on the other side. Not bad, for a device that cost (including case, power supply, SD card, and ethernet dongle) about $60. Granted, there's lots of applications for which the rpi is not well-suited - but basic home-networking stuff doesn't necessarily have to be written off.
            • A router/firewall? With one ethernet port? Why would you do that?
              • by drosboro ( 1046516 ) on Friday February 28, 2014 @08:45PM (#46372813)
                Well, I started off trying it out just to make sure I could get the software running the way I wanted to. My plan was to trial it with the rpi, and then move to "proper" hardware with dual ethernet ports eventually. But, as I mentioned, I'm saturating my connection with the rpi and a USB->Ethernet adapter, so I haven't seen any reason to move "up". Works great, draws very little power, and gives me all the speed I need. So, why wouldn't I?
          • by qpqp ( 1969898 )

            back to the beaglebone black, it seems. its fully open source (today), it runs both android and linux and has onboard flash enough so that you don't need sdcards (but can still use them).

            So are the OLinuXinos [olimex.com], besides some of them having twice the RAM for those who need it.

            PS: feta bucks!

        • by Snospar ( 638389 )
          Totally off-topic but I had to reply, you're about the fifth person this week I've seen that stores computer components "in the loft".
          Here in Scotland, anything I store in the loft has to be mold, damp and mildew proof - and computer components definitely wouldn't fare well up there. It's not that we have a damp house, on the contrary it's a modern ventilated timber frame with a secure (non-leaking) roof... it's simply that it rains/snows/hails/sleets here a lot so we only get truly dry a couple of months a
          • by ledow ( 319597 )

            I live in the UK, but not Scotland.

            The loft is a dry, dusty, windy place. Water does not get in. In fact, it's so dry that even in a thunderstorm you will cough from all the dust etc.

            As such, mould, damp, mildew etc. aren't a problem in the loft (and, yes, we do have minor problems elsewhere in the house - like the porch - where condensation or water builds up. Not enough that you can see the water itself, but enough to promote some slow mould growth).

            My house is a 1930's, double-brick walls all the way

        • On a related topic, I had an issue with my motherboard, which hasn't been resolved. It's an Asrock Z87 Extreme4. Running Windows 7 - I notice that the first hyperthread of my i7 4770k is pegged at 50%. Lots of digging, it looks like it might be a faulty design, putting the intel management engine and the USB subsystem on the same interrupt. What do you lot think?

          http://forum.sysinternals.com/... [sysinternals.com]

        • Is this why everything slows to snail pace when using subshells? Hmmm...
    • by Anonymous Coward on Friday February 28, 2014 @04:11PM (#46370655)

      The OTG host is now relatively bomb-proof as far as USB2.0 high-speed devices (i.e. onboard network) are concerned. Of course, performance and total throughput are never going to be on a par with EHCI hosts because, well, BCM2835 has an ARM11 performing the job that the EHCI controller otherwise does.

      In the last few weeks a major rewrite has been pushed that, following some beta testing, should squash the remaining issues with the Achilles heel (USB1.1 devices on a USB2.0 host - via split transactions) and at least draw a line in the sand saying "these are the things that work flawlessly, and these are the things that will never work". The aim is to make the second set of devices an extremely small one.

      Disclaimer: I am the guy that's spent the last 6 months slaving over a dual-port USB2.0/OTG analyzer figuring out *ALL* the damn bugs.

      • by adri ( 173121 )

        Hi,

        Can you please email me? (adrian@freebsd.org) - we have a vested interest in getting the r-pi USB bugs ironed out on FreeBSD and would likely really benefit from the work you've done.

        Thanks!

        -adrian

    • by YesIAmAScript ( 886271 ) on Friday February 28, 2014 @05:50PM (#46371503)

      I gave up on BBB and went to rpi because BBB couldn't come up with a distro that worked.

      Got tired of dd'ing my SD storage space back to stock and starting over when the unit ceased to boot after installing another stock apt.

      And that's assuming it even worked when clean which it didn't, at least not at first (I got one of the first batch).

    • by rephlex ( 96882 )

      No, they haven't completely resolved the multiple bugs with USB on the Raspberry Pi and I don't expect they ever will. Some of them seem to be completely unsolvable in software.

      Incidentally, they've just started a beta test of the latest round of USB fixes: http://www.raspberrypi.org/php... [raspberrypi.org]

  • Communication? (Score:4, Insightful)

    by Anonymous Coward on Friday February 28, 2014 @03:58PM (#46370575)

    "One of the few but lingering complaints about the Raspberry Pi is that it relies on a proprietary GPU blob for communication between the graphics drivers and the hardware."

    It wouldn't be so bad if that were all it was. Unfortunately, the closed GPU core is the main processor in the device, and the CPU is in fact a small "slave core" subordinate to the GPU. Without closed blobs running on the GPU, you cannot even boot the CPU at all. An open OpenGL stack won't change that.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Some people are never happy. Fortunately, with 2.5M sold, a lot of people are happy, and don't really care about proprietary blobs.

      This news reduces the size of the closed source code, but more importantly now means Android can be ported.

      • Re:Communication? (Score:5, Insightful)

        by maevius ( 518697 ) on Friday February 28, 2014 @05:07PM (#46371035)

        At this point, I have concluded that many slashdotters are "hipster geeks"

        Anything that gains traction and is widely known outside of the normal geek circles becomes "uncool" and is slammed down. As you can see with the Raspberry Pi, although the things to bitch about are getting fewer and fewer, there are always things that slashdotters bitch about. I'm pretty sure that even if they resolve everything, slashdotters will bitch about its color.

        Now think what would happen if only a couple of thousand raspis were sold and only part of the geek community knew about it. It would be all the rage!

        • You deserve the Slashdot most insightful comment of the year award.
          I personally love the Raspberry PI. It is not the greatest computer ever made, but it might be the greatest computer you can buy for $35
          • by fisted ( 2295862 )
            I agree, but i hate the fucking color of it!
          • The Pi is a fine pedagogical computer. It's easy for anybody to hook one up. But if you're going to embed, it's better to stay closer to the silicon, because $35 is way, way high for an embedded controller. I prefer chips in the $2-5 range.

        • As you can see for raspberry, although the things to bitch about are getting fewer and fewer, there are always things that slashdotters bitch about.

          I don't know about you, but the idea that you have a board with a weak 700 MHz ARM core that you can program, and another 24-GFLOPS-capable, fully programmable core you can't program just because someone erected some stupid artificial hurdles always sounded like something bitching-worthy to me.
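For reference, the 24-GFLOPS figure falls out of the commonly quoted VideoCore IV configuration - 12 QPUs, each retiring a 4-wide multiply-add every cycle at 250 MHz. Those parameters are the usually published numbers, not something stated in this thread, so treat this as a back-of-the-envelope check rather than a spec:

```python
# Back-of-the-envelope check of the 24 GFLOPS figure for VideoCore IV.
# Assumed (commonly quoted) parameters: 12 QPUs, 4 SIMD lanes each,
# a fused multiply-add counted as 2 FLOPs, 250 MHz clock.
qpus = 12
lanes_per_qpu = 4
flops_per_lane_per_cycle = 2   # multiply + add
clock_hz = 250e6

peak_flops = qpus * lanes_per_qpu * flops_per_lane_per_cycle * clock_hz
print(peak_flops / 1e9)  # prints 24.0 (GFLOPS)
```

That peak assumes every lane does useful work every cycle; real shader workloads land well below it.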

          • Re:Communication? (Score:4, Insightful)

            by maevius ( 518697 ) on Friday February 28, 2014 @06:52PM (#46372079)

            Sure it is. I don't see you bitching about your phone, pc, car, tv, microwave oven though. You do realise that after this announcement, videocore is the most open core on an ARM chip ever, right?

            btw, http://www.broadcom.com/docs/support/videocore/VideoCoreIV-AG100-R.pdf [broadcom.com] here you go...hack away

            • I don't see you bitching about your phone, pc, car, tv, microwave oven though.

              I don't see you around me for you to see me. You might be surprised.

              You do realise that after this announcement, videocore is the most open core on an ARM chip ever, right?

              No, the most open core on an ARM chip is an ARM core (without Jazelle, of course). But in case you meant purely graphics cores, well, that's nice, but I'm sure the vendors will find other ways to screw any fully open development endeavor anyway - they always come up with some way (like not releasing everything [slashdot.org], apparently).

            • You do realise that after this announcement, videocore is the most open core on an ARM chip ever, right?

              Oooh, so now we can fix the buggy media decodes? Oh, wait, no, that's not open - just the GL/shader stuff.

        • I'm pretty sure that even if they resolve everything, slashdotters will bitch about its color.

          Nope. I spent good money on a handful of RPi's, wasted a few dozen hours on the beasts, just to finally turn up via searching specific error messages on Google that the USB/Ethernet stack is fatally crippled in design and that the GPU blob is secret-source and buggy and crashes on many media file decodes.

          Now, for being a '21st Century C=64' and learning computing for school children, the thing is fine. The proble

          • by maevius ( 518697 )

            I'm pretty sure that even if they resolve everything, slashdotters will bitch about its color.

            Nope. I spent good money on a handful of RPi's, wasted a few dozen hours on the beasts, just to finally turn up via searching specific error messages on Google that the USB/Ethernet stack is fatally crippled in design and that the GPU blob is secret-source and buggy and crashes on many media file decodes.

            I have a raspi in front of me, with the embedded ethernet, 1 bluetooth, 2 wifi devices (1 master, 1 monitor), 1 gps, 2 usb sticks in raid, and it's charging my galaxy nexus through a powered hub. The usb problems have become such a chewing gum for the slammers, all I can say is: bullshit. You had a problem with an early revision of rpi and now you love bitching about it although it is most probably resolved. You are welcome to post details though...

            Now, for being a '21st Century C=64' and learning computing for school children, the thing is fine. The problem comes from all the geek-chic folks who are hawking the RPi for media center devices, network devices, and a replacement for microcontrollers.

            I am guessing you were going to use it for rocket control of

    • Re:Communication? (Score:4, Interesting)

      by ssam ( 2723487 ) on Friday February 28, 2014 @04:11PM (#46370649)

      Looks like there are some parts (MPEG decoding) that will never be open. But there's a plan to make a open source firmware that is sufficient to boot.
      http://www.raspberrypi.org/arc... [raspberrypi.org]

    I am so god damn tired of this stupid argument. The CPU is not a "slave core". It is an ARMv6 RISC core (with MMU) and it is what runs Linux and all the applications, not the GPU. The GPU does control the L2 cache and the memory controller/arbitrator, which allows it to have the highest-priority access to memory and meet the video memory bandwidth requirements. I haven't had time to read the hardware documentation now that it has been released (making it no longer closed, contrary to your assertion otherwise
  • by skids ( 119237 ) on Friday February 28, 2014 @04:08PM (#46370631) Homepage

    If I spend days writing a GPU core port, I MIGHT get $10,000, unless someone beat me to it.

    I appreciate the injection of funds into the open source community, but that's no way to run an economy. Hire someone. If you want more than one implementation or you want to have it fast, hire multiple people and offer a bonus for completion. But if you do the latter, don't expect to actually use the first one you receive, as it will likely be the shoddiest, meeting the bare minimum of your specs.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Who said it was to run an economy? It's a competition! Why should the foundation hire someone and blow a load of money? They don't need an OS driver, they have the existing one (which does everything they need for the primary purpose of the Foundation). It's the OSS 'community' that wants an OSS driver. Now they have the documentation they need and an added incentive to write one.

    • I used to run competitively. I didn't always win, I just enjoyed doing it.
    • by Kjella ( 173770 )

      If I spend days writing a GPU core port, I MIGHT get $10,000, unless someone beat me to it.

      If you estimate days, not weeks, for a shot at $10k and you're complaining? Don't worry, keep doing your $100k+/year day job. My guess is that there won't be anyone trying to do this in secret anyway; if I was serious about it I'd probably announce it on the mailing list, and if there was anyone else thinking the same thing, probably one of us would back off or we'd join forces. The worst that could happen is probably that one project starts and then stalls, but they're so far along nobody else dares to start. My g

  • Upton sez: "But we're incredibly proud that VideoCore IV is the first publicly documented mobile graphics core,"

    Uh.. considering that the graphics cores in Baytrail tablet chips have had open source drivers in the mainline Linux kernel since at least early last year (the earliest commits may go all the way back to 2012), and considering that Intel's Gen7 graphics system is very well documented, I'd have to disagree there.

    • by Anonymous Coward

      Upton is a Broadcom employee so probably only considers non-Intel parts as mobile parts. It also seems peculiar that he, as a Broadcom employee, is championing a reverse engineering contest. Can't he just talk to a guy over in another cubicle to get the source?

    • 'mobile' as in powers a 'mobile phone'. You're comparing apples and oranges. Wake me up when Intel HD Graphics compete in the SoC space with Adreno and Lima.

      The Atom used in phones such as the Geeksphone Revolution still uses a PowerVR GPU.

  • by tedgyz ( 515156 )

    I love that Q3 is the test case. That was probably the greatest multiplayer FPS game ever created. A good chunk of my life was spent playing and administrating a server.

    My favorite "accomplishment" while playing was always "Rampage!".

  • How about : Compile that to an ASIC.

  • Not actually source (Score:5, Interesting)

    by david.given ( 6740 ) <dg@cowlark.com> on Friday February 28, 2014 @05:49PM (#46371491) Homepage Journal

    The Videocore IV on the Raspberry Pi (which totally kicks arse, BTW, it's a beautiful, beautiful processor. Did you know it's dual core?) currently doesn't have an open source compiler that's any good[*] that I'm aware of. I have tried porting gcc, and got a reasonable way into it, but ground to a halt because gcc. I know another guy who's similarly about half-way through an LLVM port. And Volker Barthelmann's excellent vbcc compiler has a VC4 prototype which makes superb code, but that's not open source at all.

    Without a compiler, obviously the source isn't much good, although the VC4-specific code is really interesting to look through.

    In addition, having done a really brief scan of the docs they've released, this isn't what the article's implying: what we've got here looks like the architecture manual for the QPU and the 3D engine. The QPU is the shader engine. Don't get me wrong, this is awesome, and will allow people to do stuff like compile their own shaders and do an OpenCL port, but I haven't seen any documentation relating to the VideoCore IV processor. The binary blob everyone complains about runs on that.

    It does look like the source dump contains a huge pile of stuff for the VC4, so maybe they're going to release more later. But even incomplete, this is a great step forward, and much kudos to Broadcom for doing this.

    [*] I have done a really crap port of the Amsterdam Compiler Kit compiler for the VC4. It generates terrible, terrible code, but I have got stuff running on the Raspberry Pi bare metal. It's all rather ground to a halt because there's still a lot of stuff to figure out in the boot process, but interested parties may wish to visit http://cowlark.com/piface [cowlark.com].

  • YIS! now maybe i can start scrypt mining on my rpi! only got like ~1kh on the cpu core, with the gpu maybe i can get it to 20! hellllllllllllllllllllllllllooooooooooooooooooooooooo 15 cents a day! ILL NEVER HAVE TO WORK AGAIN
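Taking the tongue-in-cheek numbers above at face value: scrypt mining revenue scales linearly with hashrate, so the implied payout can be sketched as follows (the $0.15/day at 20 kH/s figure is just the poster's own joke estimate, not real market data):

```python
# Linear scaling of scrypt mining revenue with hashrate.
# The $0.15/day at 20 kH/s baseline is taken from the comment above;
# real payouts depend on network difficulty and exchange rate.
def daily_revenue_usd(hashrate_khs, usd_per_day_at_20khs=0.15):
    return hashrate_khs * (usd_per_day_at_20khs / 20.0)

print(daily_revenue_usd(1))   # CPU-only 1 kH/s: $0.0075/day
print(daily_revenue_usd(20))  # hoped-for GPU rate: $0.15/day
```

In other words, even the optimistic GPU figure is less than the electricity cost of running the board - which is rather the joke.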
  • If the video core in the BCM21553 is so close to the one in the BCM2835 (Raspberry Pi CPU) that it's possible to port from one to the other, why can't they release the source for the BCM2835 bits so no port is necessary?

    Or is it too hard to disconnect all the video codec stuff (MPEG etc.) that they can't legally release from the OpenGL stuff in the Pi firmware?

  • Unfortunately, if you look at the "driver sources" carefully, it's just another shim to the real driver that does the heavy lifting. This implementation does not submit GPU instructions directly, nor does it expose the shader compiler where someone can trace how shaders are being transformed into native instructions. In the end, it's just a layer that submits user data to some specialized (probably proprietary) ioctl that exposes the functionality of the real driver implemented as a binary kernel blob a
