I'm not sure what you mean by 10-bit color. There is 8-bit color (JPEGs) and 16-bit color (TIFFs and other high-quality file formats). RAW files often do not record the full 16 bits, instead capturing 12-14 bits per channel. The 8-bit and 16-bit figures are per color channel, so an RGB image has 16 bits times 3 channels, or 48 bits per pixel, or 8 bits times 3 for 24-bit color.

Graphics cards might be capable of displaying 48 bits per pixel, but more likely it will be a smaller number like 30-bit color (10 bits per channel times 3 channels). Perhaps this is the 10-bit color you are referring to. This has been available on PCs since about 2006, and Adobe has supported it on PCs in Photoshop and perhaps LR as well. Only more recently has Apple made this available in OS X, and Adobe has updated the OS X version of PS CC to support 30-bit color too.
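If it helps, here is a minimal Python sketch just to make the arithmetic above concrete (the function name is my own, purely for illustration): bits per pixel is bits per channel times the number of channels, and each channel can represent 2^bits distinct tonal levels.

```python
def describe_depth(bits_per_channel: int, channels: int = 3) -> str:
    """Summarize an RGB bit depth: total bits per pixel and levels per channel."""
    bits_per_pixel = bits_per_channel * channels
    levels = 2 ** bits_per_channel
    return (f"{bits_per_channel}-bit/channel RGB -> "
            f"{bits_per_pixel} bits per pixel, {levels} levels per channel")

for depth in (8, 10, 12, 14, 16):
    print(describe_depth(depth))

# 8-bit/channel RGB -> 24 bits per pixel, 256 levels per channel
# 10-bit/channel RGB -> 30 bits per pixel, 1024 levels per channel
# 16-bit/channel RGB -> 48 bits per pixel, 65536 levels per channel
```

So the "30-bit color" a graphics card advertises and the "10-bit color" you mention are the same thing, counted per pixel versus per channel.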
As for graphics cards, the only issue is GPU acceleration, and that has no relation to how pixels are displayed. The GPU is used as an additional processor alongside the CPU, and not all graphics cards are compatible with GPU acceleration. Here is a link that explains this:
Adobe Lightroom GPU Troubleshooting and FAQ