I noticed in Studio One 4 that the roundtrip and instrument latencies in the Monitoring Latencies section of Options->Audio Setup->Processing are usually significantly higher than the sum of the input and output latencies shown on the Audio Device tab. The instrument latency also scales oddly as the buffer size increases.
I would expect roundtrip latency to equal input latency plus output latency exactly, unless plugins in the DAW's audio path are adding latency. However, I don't believe the Monitoring Latencies displayed account for plugin latency, since that would vary from channel to channel.
For example, my Quantum 2 at a buffer size of 128 with Minimum dropout protection (so the process block size is also 128) gives these numbers:
- Input Latency: 3.08 ms / 136 samples
- Output Latency: 3.27 ms / 144 samples
- Audio Roundtrip Latency: 9.25 ms / 408 samples
- Instrument Latency: 6.17 ms / 272 samples
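Just to make the arithmetic explicit, here is a quick sanity check on those reported values (this only restates the numbers above; the 128-sample coincidence is an observation, not an explanation):

```python
# Reported values from Studio One 4 (Quantum 2, 128-sample buffer,
# Minimum dropout protection). All values in samples, taken from the list above.
input_latency = 136
output_latency = 144
roundtrip = 408
instrument = 272
buffer_size = 128

# Naive expectation: roundtrip == input + output
naive = input_latency + output_latency          # 280
extra = roundtrip - naive                       # 128 unexplained samples
print(f"naive in+out: {naive}, reported roundtrip: {roundtrip}, extra: {extra}")

# Curiously, the unexplained extra equals one buffer (128 samples),
# and so does instrument - output (272 - 144 = 128). Whether one extra
# buffer of latency is actually what the DAW adds is the open question.
print(f"instrument - output: {instrument - output_latency}")
```

So at this one setting, both discrepancies happen to be exactly one buffer, but as the table below shows, that pattern doesn't hold at larger buffer sizes.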
Furthermore, the reported instrument latency doesn't change in a way that makes sense as the buffer size changes. Here are my reported instrument latencies at different buffer sizes; in all cases, Dropout Protection is set to Minimum to keep NLLM (native low-latency monitoring) out of the picture:
- @16: 3.27 ms / 144 samples (output latency is 32 samples)
- @32: 3.27 ms / 144 samples (output latency is 48 samples)
- @64: 3.27 ms / 144 samples
- @128: 6.17 ms / 272 samples (= 144 + 128)
- @256: 18.1 ms / 800 samples (why a ~3x jump in latency when the buffer size only doubles?)
- @512: 36 ms / 1568 samples
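To make the scaling oddity explicit, here is a quick tabulation of those measurements (the 144-sample floor is just the value observed at small buffers, not anything documented):

```python
# Reported instrument latencies (samples) at each buffer size,
# Minimum dropout protection, copied from the measurements above.
reported = {16: 144, 32: 144, 64: 144, 128: 272, 256: 800, 512: 1568}

FLOOR = 144  # apparent hard minimum, observed at buffer sizes 16-64

for buf, lat in reported.items():
    over_floor = lat - FLOOR
    # How many buffers' worth of latency sits above the 144-sample floor?
    print(f"@{buf:>3}: {lat:>4} samples, {over_floor:>4} over the floor "
          f"({over_floor / buf:.2f} buffers)")
```

The amount above the floor is exactly one buffer at 128 (128 samples), but roughly 2.5 buffers at 256 (656 samples) and roughly 2.8 buffers at 512 (1424 samples), so there is no obvious consistent formula here.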
So it appears there is a hard minimum of 144 samples for instruments. I'm not sure why, but I assume there's a good reason, and 3 ms is certainly not a problem. What particularly confuses me is the jump from 128 to 256: why would the latency roughly triple when the buffer size only doubles?