EEWeb Pulse - Issue 93



“We’ve been getting unanimous accolades from purists who are perfectly satisfied with our mode varieties.”

generated, then unless the creators deliberately put noise into it, there isn’t any. That means we can use a different set of artifact-removal tools, which gives users the option of putting it into Gaming mode.

We call the third mode Full Pop. With this one, you probably are going to see artifacts in the content, most notably in text. If you were to do a before-and-after comparison, you might notice a few things added on top of the image. This is not really the setting for the purists, but it works very well for lower-quality video.

We got the Darblet working in real-time in 2010. We hired our “boy genius” engineer who helped me do that, and in fact he can do many things that I’m not capable of doing. One of the things he’s a whiz at is graphics cards and FPGAs. He first got it running in real-time on an NVIDIA graphics card using NVIDIA’s CUDA parallel-processing platform. Not only had we successfully gotten rid of the knobs, which we had developed in software, but with CUDA we could run it in real-time on practical hardware. We knew we couldn’t ship it in the field on such graphics cards, because we used all of the card’s resources. But in fact the goal was never to run on a graphics card; it was to get it running on a single chip.

The next step was to get it onto an FPGA. We were able to do the port in just a couple of months, and it was clear after the first bit of work that it would all fit on a low-end consumer FPGA. In the time between first getting it to run and now, we have improved the resource utilization of the algorithm to the point where we use an astonishingly small amount of hardware: only 700 Kbits of RAM and about 80,000 gate equivalents, which is 17,000 logic elements (LEs). That’s all we use. We don’t use a processor, we don’t use a digital signal processor (DSP), and we don’t use any external memory such as a frame buffer. The total delay through our pipeline is only 200 microseconds. Processing the image in real-time with that small a set of resources makes it highly practical.

We are now in the process of turning the FPGA design into an application-specific integrated circuit (ASIC), which will bring the price down enormously. In about a year we should be able to offer aftermarket products at better consumer prices. More importantly to us, we can now offer it to third parties: OEMs that want to build it into their Blu-ray players and TVs, and private-label outfits that want their own brand of accessory product.

Is the Darblet available to purchase?

The Darblet is on sale now and has been since May of 2012. It uses an FPGA, which is not a cheap chip, as well as two other HDMI chips and a microcontroller to supervise the HDMI. So three of the chips are there just to handle the HDMI, while our algorithm runs entirely in one chip without any external components whatsoever. What we have now is a four-chip product which, because of its bill of materials, sells at $319 after it goes through distribution. We don’t sell it direct. We have dealers all over the world now, and we are getting some viral attention: when people first get a Darblet, they show it to their neighbors, and we get extra sales from word of mouth.

We’ve been getting unanimous accolades from purists who are perfectly satisfied with our mode varieties. They’ve been saying essentially what I’ve been saying all along: you look at good pictures and tell yourself they can’t be improved any further, but strangely enough, they can be. People are noticing that now through our Darblet. ■
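The resource figures quoted above can be sanity-checked with a little arithmetic. Here is a minimal sketch, assuming 1080p60 video at 24 bits per pixel with the standard 1125-line total frame timing (assumptions not stated in the interview), showing why 700 Kbits of RAM and a 200-microsecond pipeline delay are consistent with a line-buffer architecture rather than a frame buffer:

```python
# Sanity-check the quoted Darblet FPGA figures: 700 Kbits of on-chip RAM,
# a 200 us pipeline delay, and no external frame buffer.
# Assumptions (NOT from the interview): 1080p60 video, 24 bits per pixel,
# 1125 total lines per frame (standard 1080p timing, including blanking).

ACTIVE_WIDTH = 1920          # active pixels per line
ACTIVE_HEIGHT = 1080         # active lines per frame
BITS_PER_PIXEL = 24
TOTAL_LINES = 1125           # total lines per frame, including blanking
FRAME_RATE = 60.0            # frames per second

# A full frame buffer would need far more than 700 Kbits of storage:
frame_bits = ACTIVE_WIDTH * ACTIVE_HEIGHT * BITS_PER_PIXEL
print(f"full frame: {frame_bits / 1e6:.1f} Mbits")       # ~49.8 Mbits

# One active video line, by contrast, is tiny:
line_bits = ACTIVE_WIDTH * BITS_PER_PIXEL                # 46,080 bits
lines_in_700k = 700_000 // line_bits
print(f"lines that fit in 700 Kbits: {lines_in_700k}")   # ~15 lines

# Time per line at 1080p60, and how many line times 200 us spans:
line_time_us = 1e6 / (FRAME_RATE * TOTAL_LINES)          # ~14.8 us
print(f"200 us is about {200 / line_time_us:.1f} lines of delay")
```

Both estimates land at roughly a dozen video lines, which fits the claim: an algorithm that operates on a small sliding window of lines needs neither a frame buffer nor external memory, and its latency stays far below a single frame time (about 16.7 ms at 60 Hz).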
