Correct, encoding with 10-bit x264 means you won't produce
extra banding beyond what was already in the source, and it opens the door to dithering pre-processing actually being retained in the encode without an insane bitrate. I wasn't replying to you directly, just the thread in general, with something I had posted in the playback forum. Below are the preliminary results Dark Shikari posted, from back when 10-bit x264 was still being decoded to 8-bit without dithering. Things have likely improved with all the 10-bit related commits since then.
Source
Denoise/Deband -> 8-bit x264
Denoise/Deband -> 10-bit x264
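To illustrate the dithering point above with a toy sketch (this is just random dither in NumPy for demonstration, not what x264 or any deband filter actually does): adding sub-LSB noise before quantizing breaks up the flat bands that a smooth gradient otherwise collapses into.

```python
import numpy as np

rng = np.random.default_rng(1)

# A subtle luma ramp, the kind of content that bands at 8-bit.
gradient = np.linspace(0.40, 0.44, 256)

# Straight quantization to 8-bit: long runs of identical values (bands).
plain = np.round(gradient * 255).astype(int)

# Random dither: add up to +/- half a quantization step before rounding.
dithered = np.round(gradient * 255 + rng.uniform(-0.5, 0.5, 256)).astype(int)

def transitions(x):
    """Count sample-to-sample value changes; few changes = visible bands."""
    return int(np.count_nonzero(np.diff(x)))

print(transitions(plain), transitions(dithered))
```

The dithered version changes value far more often, trading smooth bands for fine noise; the catch, as above, is that an 8-bit encode needs a lot of bitrate to keep that noise from being smoothed back into bands.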
The technicalities of why this all works go over my head, but the basic idea is that moving from 8-bit to 10-bit eliminates
something like 75-95% of the error x264 accumulates while encoding, at which point higher bit depths (>=11-bit) show insignificant gains.
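A toy model of that accumulation (my own sketch, nothing to do with x264's actual internals): push a smooth signal through several lossy passes, rounding to 8 vs. 10 bits of intermediate precision after each one, and compare the error left at the end.

```python
import numpy as np

rng = np.random.default_rng(0)

# Smooth reference signal in [0, 1] (think of a gradient in a frame).
reference = np.linspace(0.2, 0.3, 1000)

def mean_error(bits, rounds=10):
    """Run several processing passes, rounding to `bits` of precision
    after each one, and return the mean absolute error vs. the source."""
    levels = 2 ** bits - 1
    x = reference.copy()
    for _ in range(rounds):
        # Tiny perturbation standing in for a lossy transform step.
        x = x + rng.normal(0.0, 1e-4, x.shape)
        # Quantize back to the working bit depth.
        x = np.round(x * levels) / levels
    return float(np.abs(x - reference).mean())

print(mean_error(8), mean_error(10))
```

The 10-bit run ends up with roughly a quarter of the 8-bit error, since each pass can only round to a much finer grid; that's the flavor of the effect, even if the real numbers come from x264's transform/quantization pipeline.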
Edit:
Why does 10-bit save bandwidth (even when content is 8-bit)? [pdf link]
Edit2:
http://screenshotcomparison.com/comparison/65953 (Steins;Gate Blu-ray NCED 01:30 | 23.5MB 10-bit x264 CRF=15.0 vs. 38.9MB 8-bit x264 CRF=15.0)