Larrabee to use discrete chips for HD decoding?

When Intel presented details on its discrete graphics circuit Larrabee, it revealed that the x86 chip would have a lot more in common with a regular processor than with an ordinary GPU. The x86 architecture would make it possible for Larrabee to handle most workloads on its own, without dedicated specialized chips. Alas, it has now been revealed that Larrabee may not be able to get by on its own: Intel is said to have decided that the graphics card will use two HD video decoders.



This suggests that Larrabee does not impress when playing high definition video, something that is quite important today, and the need will hardly have decreased by the time the chip launches next year.


This doesn’t necessarily mean that Larrabee isn’t powerful enough to play high definition video; the issue is more likely how much power the circuit consumes while doing so. The most plausible explanation is that Larrabee simply draws quite a lot of juice when playing HD video, which may have forced Intel to implement technology from the older G45 circuit.



This addition seems like a last-minute thing, so it is probably pulled from Intel’s existing IP catalog. The most likely candidate is the G45 decoder block: it is small, low power, and (finally) has decent drivers. Then again, does die area really matter in a 700+ mm² chip?


It would be bold to claim that Larrabee isn’t powerful enough to decode HD video, but like a regular CPU its efficiency is nowhere near high enough to match even the decoding capabilities of the G45. The information on Larrabee’s HD decoders is unconfirmed.

