Graphics card question

Status
Not open for further replies.
Yes and no. Usually not on YouTube (though it's supposedly possible with an H.264 browser plugin, or is that native in Firefox now?), but other video playback apps that can use DXVA or CUDA acceleration can decode much faster (though decoding only needs to be fast enough for realtime playback). This can help reduce tearing artifacts when upscaling to a larger window on a high-resolution display (like 4K), and it offloads processing from the CPU. It also allows higher decoding quality than software decoding on the CPU does, because a CPU is inherently limited in performance at something like this.

Additionally, when upscaling video, if you have the extra GPU horsepower you can apply shader filters in real time to improve image quality, such as denoising, edge sharpening, or more complex sharpening algorithms.
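To illustrate the idea, here is a minimal sketch of the kind of sharpening filter a GPU pixel shader applies per pixel, written in plain Python on a tiny grayscale grid (real players run this on the GPU, and production filters like unsharp masking are more elaborate):

```python
# Minimal 3x3 "sharpen" convolution - the same basic operation a GPU
# pixel shader performs per pixel when post-processing upscaled video.
KERNEL = [[0, -1, 0],
          [-1, 5, -1],
          [0, -1, 0]]

def sharpen(img):
    """Apply the sharpen kernel to a 2D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    # Clamp coordinates at the borders instead of wrapping.
                    sy = min(max(y + ky - 1, 0), h - 1)
                    sx = min(max(x + kx - 1, 0), w - 1)
                    acc += KERNEL[ky][kx] * img[sy][sx]
            out[y][x] = min(max(acc, 0), 255)  # clamp to the 8-bit range
    return out

# A flat region passes through unchanged; contrast across edges increases.
flat = [[100] * 4 for _ in range(4)]
print(sharpen(flat)[1][1])  # 100
```

The kernel weights sum to 1, so uniform areas are untouched while pixel-to-pixel differences are exaggerated, which is why over-applying it amplifies compression noise.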

The break point for making this worthwhile might be upscaling 720p to a 24" or larger 1080p screen, or anything 2K or lower up to a 4K screen. In other words, if you don't have a nice high-pixel-density screen, you should upgrade that first, keeping in mind that this is another benefit of a newer, more powerful card: some can now do 4K at 60 Hz via an HDMI 2.0 output. With only HDMI 1.4, let alone DVI, you are stuck at 30 Hz at 4K resolution, which is too low for comfort.
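A quick back-of-the-envelope check of why the HDMI version matters. The sketch below compares the raw pixel data rate against the published per-version video bandwidth limits; it deliberately ignores blanking intervals, which only make the real requirement higher:

```python
# Raw pixel data rate for uncompressed 24-bit video, ignoring blanking
# intervals (the real signaling requirement is somewhat higher).
def data_rate_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

HDMI_1_4_VIDEO_GBPS = 8.16   # 10.2 Gbps TMDS link minus 8b/10b overhead
HDMI_2_0_VIDEO_GBPS = 14.4   # 18.0 Gbps TMDS link minus 8b/10b overhead

uhd60 = data_rate_gbps(3840, 2160, 60)  # ~11.9 Gbps
uhd30 = data_rate_gbps(3840, 2160, 30)  # ~6.0 Gbps

print(f"4K60 needs ~{uhd60:.1f} Gbps -> fits HDMI 1.4: {uhd60 <= HDMI_1_4_VIDEO_GBPS}, "
      f"fits HDMI 2.0: {uhd60 <= HDMI_2_0_VIDEO_GBPS}")
print(f"4K30 needs ~{uhd30:.1f} Gbps -> fits HDMI 1.4: {uhd30 <= HDMI_1_4_VIDEO_GBPS}")
```

So 4K at 60 Hz overshoots the HDMI 1.4 video budget by a wide margin but fits within HDMI 2.0, while 4K at 30 Hz squeaks through the older link, which is exactly the 30 Hz limitation described above.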

I'm suggesting that if your monitor is as old as your video card, the monitor might be the place to start. And if you make it 4K, then you "need" a new video card for a 60 Hz refresh rate at everything: desktop use too, not just YouTube videos or gaming.
 
It's a perfectly adequate card for high-end graphics / architecture / scientific use, so there should be zero advantage to swapping it out for any desktop use. Even driving a TV it's perfectly fine. If you feel you need 4K / 8K etc., then spend your money on the TV / AV side. There is absolutely no performance advantage to 4K etc. on the majority of desktop monitors, as at a normal viewing distance you can't actually discern any improvement.

Note that Apple's "Retina" label is simply based on size, resolution, and viewing distance, and means you can't discern anything practically sharper or higher in resolution at that distance (although it must be said you didn't tell us what your monitor size is, as that factors into the equation. Still, your card can do that resolution at everything except truly massive monitor sizes, into the 30-plus inches diagonally).
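For reference, here is a rough sketch of that size / resolution / distance calculation. The ~1 arcminute figure for normal visual acuity is the usual rule of thumb behind the "Retina" idea; the 27" / 24" numbers below are just example values, not the poster's actual setup:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_ppi(viewing_distance_in, acuity_arcmin=1.0):
    """PPI beyond which one pixel subtends less than `acuity_arcmin`
    arcminutes at the given viewing distance (i.e. pixels become
    indistinguishable to a typical eye)."""
    pixel_pitch_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60))
    return 1 / pixel_pitch_in

# Example: a 27" 4K monitor viewed from 24 inches.
monitor = ppi(3840, 2160, 27)   # ~163 PPI
threshold = retina_ppi(24)      # ~143 PPI needed at a 24" distance
print(f"{monitor:.0f} PPI vs {threshold:.0f} PPI threshold: "
      f"{'retina' if monitor >= threshold else 'sub-retina'}")
```

On those example numbers a 27" 4K panel already exceeds the threshold at desk distance, which is why going even higher in resolution buys little visible sharpness there.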

Your card would put your desktop among the less than 1% of PCs at that performance level.

Gaming has its own rules, and if you want to be at the cutting edge you need to be using cards specifically designed for gaming. Even then, gaming cards and high-resolution work cards are different; if you're seeking "the best" you have to choose one application and go with it, and let the other fall where it may (although it would still be considered high performance). As always, there is tremendous value in staying one step behind, even with gaming.

Save your money would be my advice.
 
No complaints with the card, just wondering. The computer is an HP Z800 workstation: 2 × 6-core Xeons, 4 SAS 15,000 RPM drives, 24 GB of RAM (an off-lease special from eBay at $400), and the above-mentioned card.
Great bang for the buck; I just had to add an OS. Surprised they don't get looked at more often: industrial-strength components, cheaper than building your own.

Thanks for the replies!
 
Originally Posted By: 928
I currently have an HP Z800 workstation with an NVIDIA Quadro FX 4800 graphics card. I don't do any design work on it. Out of curiosity, would a more powerful card help with picture speed & quality? Not unhappy with it, but just wondering for things like YouTube etc.


http://www.nvidia.com/object/product_quadro_fx_4800_us.html


It can, but your card should already have full hardware acceleration for the videos on YouTube... maybe not the 4K stuff, not sure.
 
Won't help much. Plus, I wanted to add that graphics cards are overpriced now due to Bitcoin/altcoin GPU mining. The Z800 specs look beefy, so I'm not sure what degradation you might be noticing on YouTube!
 
Since YouTube and Google are in bed together, VP9 is their codec of choice. I doubt your almost-10-year-old Quadro has been updated for this, so most of the decoding will be performed by your CPU with a little bit of help from your browser. To do it more efficiently, there's an add-on called h264ify. It forces the ubiquitous AVC (H.264) standard to be used instead, which is hardware-accelerated by your GPU.
 