<div dir="ltr"><br><br><div class="gmail_quote">On Wed, Aug 20, 2008 at 2:27 PM, Barry Clearwater <span dir="ltr"><<a href="mailto:barryc@bcsystems.co.nz">barryc@bcsystems.co.nz</a>></span> wrote:<br><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
I take the point that a high-def signal is getting heavily compressed,<br>
but we *are* talking about well-known compression algorithms, are we not?</blockquote><div><br>Well known, but fairly new in terms of concerted use. There are also different levels of compression and different "features" used to encode material, and some of them are considerably more processor-intensive to decode than others.<br>
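<br>For example, CABAC entropy coding (used by the Main and High profiles) is noticeably more expensive to decode than the Baseline profile's CAVLC. Here is a minimal sketch, assuming a reasonably current FFmpeg (the libavformat/libavcodec API has changed a great deal since this thread), that reports which profile and level a file actually uses:<br><pre>
/* probe.c -- print codec/profile/level of the video streams in a file.
 * Sketch only: AVCodecParameters/codecpar assumes FFmpeg >= 3.1. */
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;
    if (argc < 2 || avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;
    avformat_find_stream_info(fmt, NULL);
    for (unsigned i = 0; i < fmt->nb_streams; i++) {
        AVCodecParameters *par = fmt->streams[i]->codecpar;
        const char *prof;
        if (par->codec_type != AVMEDIA_TYPE_VIDEO)
            continue;
        prof = avcodec_profile_name(par->codec_id, par->profile);
        printf("%s, profile %s, level %d\n",
               avcodec_get_name(par->codec_id),
               prof ? prof : "unknown", par->level);
    }
    avformat_close_input(&fmt);
    return 0;
}
</pre>
Build with: gcc probe.c -o probe $(pkg-config --cflags --libs libavformat libavcodec libavutil)<br>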
</div><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;"><br>
Should somebody be writing a more optimised piece of decompression code, </blockquote><div><br>Suggest you join the x264 or ffmpeg coding teams :-) A large part of their speed already comes from hand-written SIMD assembly; the sketch below shows the kind of inner loop involved.<br> </div>
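<div><br>As a hedged illustration (my own sketch, not code from either project): the sum of absolute differences is one of the hottest inner loops in any H.264 codec, and a single SSE2 instruction does the work of a dozen scalar operations. x264 and ffmpeg carry far more elaborate hand-tuned versions of routines like this:<br><pre>
/* sad16.c -- SAD over a 16-pixel-wide block, the motion-compensation
 * inner loop. Illustrative sketch, not actual x264/ffmpeg code. */
#include <emmintrin.h>   /* SSE2 intrinsics */
#include <stdint.h>

static int sad16_sse2(const uint8_t *a, const uint8_t *b,
                      int stride, int rows)
{
    __m128i acc = _mm_setzero_si128();
    for (int y = 0; y < rows; y++) {
        __m128i va = _mm_loadu_si128((const __m128i *)(a + y * stride));
        __m128i vb = _mm_loadu_si128((const __m128i *)(b + y * stride));
        /* psadbw: 16 absolute differences summed into two 64-bit
         * lanes, all in one instruction */
        acc = _mm_add_epi64(acc, _mm_sad_epu8(va, vb));
    }
    /* fold the two 64-bit partial sums into one result */
    return _mm_cvtsi128_si32(acc) +
           _mm_cvtsi128_si32(_mm_srli_si128(acc, 8));
}
</pre>
A plain-C version of that loop compiles to many times the instruction count, which is why the "assembler anybody?" instinct is right, and also why it is already being acted on.<br></div>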
<blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">or<br>
in a lower-level language? (cough - assembler, anybody!!??)<br>
It's just a 2D picture; our systems are capable of rendering 3D on the<br>
fly...<br>
I must be missing the point and oversimplifying it somewhere.<br>
<pride speaking><br>
No way should Windows boxes out-compute a GNU/Linux system.<br>
</pride></blockquote><div><br>Well, of course they do, if nVidia supplies Windows users with a driver for hardware H.264 decompression and not Linux users.<br> </div>
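<div><br>For anyone wanting to check this from code: the following is a sketch, and the FFmpeg hardware-device API it uses postdates this thread by years, but it enumerates the hardware decode paths (VDPAU, VAAPI, and so on) that a given Linux box can actually open. As of this writing the nVidia answer on Linux is simply "none":<br><pre>
/* hwcheck.c -- try to open each hardware acceleration device type.
 * Sketch assuming a current libavutil (av_hwdevice_* API). */
#include <libavutil/hwcontext.h>
#include <libavutil/buffer.h>
#include <stdio.h>

int main(void)
{
    enum AVHWDeviceType t = AV_HWDEVICE_TYPE_NONE;
    while ((t = av_hwdevice_iterate_types(t)) != AV_HWDEVICE_TYPE_NONE) {
        AVBufferRef *dev = NULL;
        int usable = av_hwdevice_ctx_create(&dev, t, NULL, NULL, 0) == 0;
        printf("%-10s %s\n", av_hwdevice_get_type_name(t),
               usable ? "usable" : "not available");
        av_buffer_unref(&dev);  /* safe on failure: dev stays NULL */
    }
    return 0;
}
</pre>
Until the driver exposes such an interface, every frame falls back to the CPU, which is exactly the asymmetry being complained about here.<br></div></div><br></div>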