Here's some data you never asked for, but which I'm going to give you anyway: when doing bilinear texture interpolation, how much precision does your GPU keep in the fractional part of the texel coordinates? On the hardware I've tried so far:
- Intel HD Graphics (i.e., 5th generation): 6 bits
- nVidia GTX 550 Ti: 8 bits
- Radeon HD 3850: 6 bits
(Results are probably the same for other card models from the same generation, and unlikely to change except across major redesigns.)
I was surprised to find that the answer was not “full floating point”.
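If you want to probe your own GPU, here's one way to do it: upload a 2×1 texture with one black and one white texel, stretch the span between the two texel centers across a wide quad with GL_LINEAR filtering, read the row back, and count the distinct output values. With N fractional bits you expect about 2^N distinct blend steps. This is a minimal sketch, not necessarily how I ran my tests; the GLFW setup, the legacy fixed-function GL, and the 1024-sample width are all just convenient choices:

```c
/* bilerp_probe.c: count distinct bilinear blend levels between two texels.
 * Build (Linux): cc bilerp_probe.c -lglfw -lGL -lm */
#include <GLFW/glfw3.h>
#include <math.h>
#include <stdio.h>

#define W 1024 /* samples across the blend; plenty for up to 8 observable bits */

int main(void)
{
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE); /* no need to show the window */
    GLFWwindow *win = glfwCreateWindow(W, 1, "bilerp probe", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    /* A 2x1 texture, one black texel and one white texel; GL_LINEAR makes
       the hardware blend between them. */
    unsigned char texels[2] = { 0, 255 };
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 2, 1, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, texels);
    glEnable(GL_TEXTURE_2D);

    /* Texel centers of a 2x1 texture sit at s = 0.25 and s = 0.75, so a quad
       spanning those coordinates covers exactly the blend between the two
       texels: pixel i samples a fraction of roughly (i + 0.5) / W. */
    glViewport(0, 0, W, 1);
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_QUADS);
    glTexCoord2f(0.25f, 0.5f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(0.75f, 0.5f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(0.75f, 0.5f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.25f, 0.5f); glVertex2f(-1.0f,  1.0f);
    glEnd();
    glFinish();

    unsigned char row[W];
    glReadPixels(0, 0, W, 1, GL_RED, GL_UNSIGNED_BYTE, row);

    /* N fractional bits give 2^N distinct blend steps, so we should see
       about 2^N distinct values (plus one for hitting both endpoints). */
    int seen[256] = { 0 }, levels = 0;
    for (int i = 0; i < W; i++)
        if (!seen[row[i]]++) levels++;
    printf("%d distinct levels => ~%.1f fractional bits\n",
           levels, log2((double)(levels > 1 ? levels - 1 : 1)));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

One caveat: an 8-bit framebuffer caps what you can observe at 8 bits, so distinguishing anything finer than that would take a higher-precision render target (e.g., a floating-point FBO) instead of the default backbuffer.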