Questions & Answers

Delay in the Inspector panel to be automatable

–1 vote
asked Jul 30, 2020 in Recording by simonjameslane (1,320 points)
Make the Delay parameter in the Inspector panel automatable, and allow it to be assigned to a fader on the FaderPort series controllers or otherwise MIDI-mapped to a hardware MIDI device.

Go one better, and offer it as a native plugin with a scope view (à la Pipeline) that offers automatic polarity and phase adjustment, in addition to manual adjustment parameters, measured either against other concurrent events or against the grid (or grooves).

I will explain this one a little further, as I am sure many users would consider this a game-changer if it were implemented and done right at the DAW level.

Anyone who has recorded multitrack audio alongside virtual instruments and plugins on both types of tracks recognises the inherent offsets that are part of the recording process, and knows that automatic delay compensation only achieves so much. More to the point, it solves only one of the several problems at play in a situation like this.
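For context on that "only one problem": conventional plugin delay compensation just equalises processing latency inside the DAW by delaying every path to match the slowest one. A minimal sketch of that idea (the function name and latency figures are my own for illustration, not Studio One's internals):

```python
def pdc_offsets(track_latency_samples):
    """Conventional plugin delay compensation: pad every track so that
    all tracks align to the highest-latency signal path."""
    worst = max(track_latency_samples.values())
    return {name: worst - latency
            for name, latency in track_latency_samples.items()}

# e.g. a kick through a 64-sample lookahead limiter vs. a dry overhead:
# the dry track gets padded, the processed one does not.
offsets = pdc_offsets({"kick": 64, "overheads": 0})
```

Note that this says nothing about acoustic offsets between mics or MIDI roundtrip time, which is exactly the gap this request is about.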

I'm talking about acoustic delays, electrical/processing delays, and MIDI roundtrip delays: all of these offsets could be fixed with one plugin, either manually or automatically. These could be pegged to the grid in a quantised fashion, or to other tracks, so as to preserve the groove and feel in the room at the time of recording. It need only compensate the start of the track, but boy, would it save a lot of time and mojo when working on a song. Perhaps it could also account for ongoing timing fluctuations (e.g. wow and flutter), but IMO these are part of the charm of using such devices, and users may like them to remain welcome in a song. I certainly would. As it stands, I am often zooming in by huge factors, pulling events forward or backward, only to have to zoom back out and repeat the process on subsequent tracks.
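To make the "pegged to the grid in a quantised fashion" idea concrete, here is a hedged sketch (function name and parameters are my own invention, not an existing Studio One feature) of snapping a measured start offset to the nearest grid division rather than removing it outright:

```python
def snap_offset_to_grid(offset_samples, sample_rate, bpm, division=16):
    """Snap a measured start offset (in samples) to the nearest grid
    line, e.g. the nearest 16th note, rather than zeroing it out."""
    quarter = sample_rate * 60.0 / bpm      # samples per quarter note
    step = quarter / (division / 4.0)       # samples per grid division
    return int(round(offset_samples / step) * step)
```

At 48 kHz and 120 BPM a 16th note is 6000 samples, so a take that starts 6200 samples "late" would be nudged to exactly one 16th, keeping a deliberate pushed feel instead of flattening it.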

I would even go one further and say that if such a plugin or option could be toggled, "screenshotted", or compared across various states (such as when bypassing, i.e. hearing the original vs. the altered version), our brains tend to recognise changes and make more deliberate, confident decisions. Mike Stavrou talks about this in his book 'Mixing With Your Mind'.

As an example of the first two types of delay, imagine you are recording drums. You run the kick, snare, hi-hat, and toms through various different pres and outboard on the way in, but record the OH and room mics dry, as you intend to process them in the box later. The signal paths through the various outboard each have inherent processing delays, which then affect phase as heard through the neighbouring mics that pick up bleed from the same instrument(s). Upon playback, the drums may seem a little "off kilter", but it is hard to determine exactly why. Mic position isn't entirely to blame here, as that is usually easy to identify and rule out when monitoring the playback via the desk. Sure, changing the relative levels of your drums will affect the perceived timing and groove of the overall performance, but that comes at the cost of the mix balance and sound you were trying to achieve. By adjusting the relative phase correlation of all instruments/tracks in a given take or song, one could get a superior and naturally tighter performance without having to rely on tape bounces or external bus processing to paper over the cracks in terms of groove. In terms of mic acoustics, it could also serve as a quasi post-recording mic adjustment technique, something akin to how the Townsend Sphere and other modelling mics operate. I'm oversimplifying for sure, and there would be many (especially multitrack) scenarios where it simply wouldn't work as well, but phase can sure fix a lot in terms of performance, psychoacoustics, and overall vibe of a track.
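The "automatic polarity and phase adjustment" part of this is tractable: a plugin could estimate the offset between a bleed-carrying mic and a reference mic from the peak of their cross-correlation, and read polarity from the peak's sign. A minimal sketch under my own assumptions (NumPy, a static start-of-take offset only, not the ongoing wow-and-flutter case):

```python
import numpy as np

def estimate_offset(reference, target, sample_rate, max_lag_ms=20.0):
    """Estimate how many samples `target` lags `reference` (positive =
    target is late) and its polarity, from the cross-correlation peak
    searched within +/- max_lag_ms."""
    max_lag = int(sample_rate * max_lag_ms / 1000.0)
    corr = np.correlate(reference, target, mode="full")
    zero = len(target) - 1                       # index of zero lag
    window = corr[zero - max_lag: zero + max_lag + 1]
    peak = int(np.argmax(np.abs(window)))
    lag = max_lag - peak                         # samples target is delayed
    polarity = -1 if window[peak] < 0 else 1     # flipped if peak is negative
    return lag, polarity
```

A real implementation would want windowing, band-limiting, and sub-sample interpolation, but the core measurement is this simple, which is why it feels like something the DAW itself could offer.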

The third use case I can think of is mitigating some of the effects of roundtrip latency with sound-generating MIDI devices sequenced by S1. I haven't upgraded to V5 yet (I will when I can afford it), so please ignore this if it is already implemented with the new track recording features. Again, I envision this working most effectively if it is somewhat similar to Pipeline. One advantage in the scenario of MIDI being sequenced (or even simply received) by S1 is that such a phase-offsetting plugin could also receive the MIDI/clock/timecode signal (iZotope BreakTweaker manages to implement MIDI at the FX plugin/insert level). Basically, the MIDI being sent or received by S1 would be compared by this phase comparator plugin and either synchronised or offset in the same way as the audio above. As any other hardware MIDI folks will already be painfully aware, sync can be your greatest ally or your worst nightmare. It can in fact affect the whole vibe of a track during recording, either positively or negatively.
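On the roundtrip side, once the latency has been measured (for example with a loopback ping, as Pipeline does for audio), compensating a recorded take is just a shift. A hedged sketch with names of my own invention:

```python
import numpy as np

def compensate_roundtrip(recorded, sample_rate, roundtrip_ms):
    """Shift a recorded take earlier by a measured roundtrip latency,
    zero-padding the tail so the clip length is unchanged."""
    shift = int(round(sample_rate * roundtrip_ms / 1000.0))
    if shift <= 0:
        return recorded.copy()
    out = np.zeros_like(recorded)
    out[:len(recorded) - shift] = recorded[shift:]
    return out
```

The hard part the DAW would have to solve is the measurement itself, since MIDI transmit time, the hardware's own response time, and the interface's input latency all stack into that one number.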

My understanding of what really happens under the hood with delay compensation is zilch, but I trust my ears and eyes, and I can say there is still a lot of ground to be gained when it comes to phase and sync. If a DAW could "calibrate" and appropriately offset all of these effective timing differences inherent to hardware, multi-mic recordings, and all of the laggy processing that happens in our interfaces and computers, it would make many folks very happy indeed. (Me for one, hence the great lengths I am going to in writing this!)

IMO if you fix your monitoring, levels, and phase, that's 90% of your mix done. The rest then becomes pure joy.  

If someone wants to make a TL;DR be my guest! ha!

1 Answer

+1 vote
answered Nov 11, 2020 by simonjameslane (1,320 points)
Hi to the folks who down-voted my suggestion.

I would like to hear your reasoning as to why you disagree. Perhaps I am overlooking a current method of achieving this, or simply an alternative workaround.

Whatever the case may be, I would like to hear the case against this feature request from your perspective.