A brief update on this.
I have installed EMI filters and additional capacitors on the CODEC board's power rail, and I cut the long power strip so that each group of 3 (D/A and A/D) chips now gets its own filtered supply (will post pictures later).
Unfortunately, no major improvement to report. The minor improvement is that all channels now show a steady -93.4 dB peak noise reading with the original PSU, which is exactly 1 bit of noise (relative to 16 bits) with center panning. I think the power is now as clean as it can get, and the noise on all channels is pretty uniform regardless of sampling rate, so I'm okay with that. However, I'm not okay with the apparent 16-bit resolution, and I still think it should be possible to bring the noise down further.
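For anyone who wants to check that arithmetic: 1 LSB of a 16-bit signal sits at 20*log10(2^-15) ≈ -90.3 dBFS, and a -3 dB center pan law (my assumption about how this desk sums a center-panned channel to the stereo bus) brings that to about -93.3 dB, which matches the reading. A quick sanity check in Python:

```python
import math

BITS = 16
lsb_dbfs = 20 * math.log10(2 ** -(BITS - 1))  # 1 LSB relative to full scale
pan_law_db = -3.0                             # assumed center pan attenuation

print(f"1 LSB @ {BITS} bits:  {lsb_dbfs:.1f} dBFS")            # -90.3 dBFS
print(f"with center pan law: {lsb_dbfs + pan_law_db:.1f} dB")  # -93.3 dB
```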
I haven't investigated yet what the readout is on the digital tape out, but something tells me the input converters are running in 16-bit mode. Does anyone have any thoughts on this? I seem to remember reading somewhere that the main output sometimes delivers a 16-bit stream, and that at unity gain the tape outputs are truncated to 16 bits (without the OS patch installed, but I do have it installed); now, though, it looks to me like the actual channel input data itself is truncated to 16 bits. If I lower the channel fader, sample values on the main output do extend into the 24-bit range (the low bits become non-zero), so the mix engine and the SPDIF output do seem to work fine at 24 bits.
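In case anyone wants to reproduce the truncation test, here's a rough sketch of how a capture can be checked. It assumes you've recorded the digital out to a 24-bit PCM WAV (the file name is just an example): in little-endian 24-bit samples the least significant byte comes first, so if that byte is zero in every sample, the stream is effectively 16 bits.

```python
import sys
import wave

# Open the capture (24-bit little-endian PCM WAV assumed).
path = sys.argv[1] if len(sys.argv) > 1 else "tape_out.wav"  # example name
with wave.open(path, "rb") as w:
    assert w.getsampwidth() == 3, "expected a 24-bit capture"
    data = w.readframes(w.getnframes())

# LSB of every 3-byte sample, across all interleaved channels.
low_bytes = data[0::3]
nonzero = sum(1 for b in low_bytes if b)

print(f"{nonzero} of {len(low_bytes)} samples use the lower 8 bits")
print("effectively 16-bit" if nonzero == 0 else "genuine 24-bit content")
```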