How 2 measure RTL Latency



mikebuzz
05-20-2008, 11:14 AM
How do you guys measure the latency of your system? From input to output and back in?

For our return-trip latency I did a test from one track to the next (record a click on trk 1, send it out the mains, return it to trk 2, and record). At 3x64 buffers I get near 2.9 ms latency, BUT I cannot figure out how to measure it accurately!

How would you measure the difference between tracks? ACCURATELY :D

Later
Buzz

Iain Westland
05-20-2008, 11:22 AM
Print a click track first.
Then record that using a mic lead from that output to the new track's input. Then switch the timeline into samples and press 1 on the keypad to take it to the smallest zoom. Measure just how many samples of difference there is.

Now go and adjust the audio box to compensate by that many samples and record again; if you're right, the new track will record in sync with the source track.

If you are looking to compensate, then I would put a mic about an inch from the cone of a speaker instead of a straight cable; it will allow for the headphones being a bit off from the head.
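Iain's sample-counting test can also be done numerically. A minimal Python sketch, assuming both tracks are available as mono NumPy arrays at the same sample rate (the function name is mine, not part of SAW):

```python
# Hypothetical sketch: estimate loopback latency in samples by
# cross-correlating the source click with the re-recorded track.
import numpy as np

def loopback_latency_samples(source, recorded):
    # Full cross-correlation; the peak gives the lag of `recorded`
    # relative to `source`.
    corr = np.correlate(recorded, source, mode="full")
    return int(np.argmax(corr)) - (len(source) - 1)

# Synthetic example: a click delayed by 102 samples.
click = np.zeros(1000)
click[100] = 1.0
delayed = np.roll(click, 102)
print(loopback_latency_samples(click, delayed))  # -> 102
```

With real recordings the correlation peak is broader than for this ideal impulse, but the peak position still gives the offset in whole samples.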

mikebuzz
05-20-2008, 11:54 AM
That was it: TIMELINE = SAMPLE MODE!

I tried it in the soundfile view but no luck.

Thanks
Buzz

Iain Westland
05-20-2008, 12:11 PM
I had a senior moment a couple of weeks ago with this. :D

I was setting up ready for a band coming in to track, so I thought: there's a mic set up in the live room, I'll whack a click through the PA and perform the tests.

Played the click and recorded it.
Played the recording and recorded that.

Measured the latency: 240-odd samples. :eek: I knew my system was nowhere near that large, so I'm looking all over the MT and mixer for secret plugs, VSTis patched in... nothing. I am now pulling my hair out as the band's due in 20 mins and I am clueless.

I look up and see my problem: the mic's feet from the speaker! Run into the room, reposition the mic, run out and hit record: 109 samples! Nope, that's wrong too. Look up again: in my haste I had not set the mic right; it had fallen and was just away from the speaker and off to its side!

Run back in, mic/speaker CAREFULLY positioned, hit record and voilà: 74 samples. That was roughly what I was expecting.

Now I am happy.

mikebuzz
05-20-2008, 12:24 PM
OK, I am using a direct line out of an ADA8000 into ch. 2 of the ADA8000, so no air/distance involved! I'm using a click region to test this.

I just did some tests and I keep getting around 102 samples regardless of the buffer setting (3x64, 1x64, 1x32).

How can this be?

Later
Buzz

Iain Westland
05-20-2008, 12:28 PM
What card, Mike?

Bob L
05-20-2008, 12:48 PM
That sounds possible due to AD/DA converters... but also... be aware... again I caution against these types of tests... it is not reality... SAWStudio lines up the recorded signal with the first outgoing sample position on the timeline... what is normally recorded is not going thru two conversions... be careful.

Bob L

mikebuzz
05-20-2008, 12:49 PM
It's a VSL2020 by Steinberg in a Dell 1.2 GHz PC w/ 512 MB. That should not matter, though; the card is reporting that it is fine. I'm not monitoring, so it might be crackling, but?

Bob, I tried getting some data on ADA8000 latency but found nothing.

BTW, this is a one-way test at this time. I'm trying to figure out my latency in LIVE MODE using SAW.

As I said, when I change the buffer settings there is NO change in the latency of the recorded track.

Later
Buzz

Iain Westland
05-20-2008, 12:58 PM
what is normally recorded is not going thru two conversions... be careful.

Bob L

?

I am looking at it like this: the audio is going through one set of converters to get into the earphones, then it has to go through the second set to get into the computer. Replace the return with the guitar part, and that's what I am trying to get into time. This method seems to sort that out very well.

Iain Westland
05-20-2008, 01:01 PM
Bob I tried getting some data on ADA8000 latency but found nothing ?


Using RME and the ADA8000 I get 64 samples on a direct wire. Something's not quite reet there, Mike.

AudioAstronomer
05-20-2008, 01:10 PM
Mike, all AD/DA converters have some inherent latency.

102 samples sounds about on par with a low-quality converter like the Behringer. Some converters/interfaces are as low as, or lower than, 32 samples.

The good thing is that it is constant: it will always be latent by the same number of samples, so it should not affect anything except hardware routing (this has been discussed before).

Iain Westland
05-20-2008, 01:17 PM
The AD/DA in the Behringer is pretty good; it's the pres that are not so good. And I have got 64 samples with the RME Hammerfall cards I use.

mikebuzz
05-20-2008, 01:40 PM
Yeah, ADA8000 + the VSL2020 card.
But at 3x64 shouldn't it be 192 samples?

Later
Buzz

Iain Westland
05-20-2008, 01:44 PM
No, the three is the number of buffers; the 64 is the samples bit. Is it a built-in card?
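A quick sanity check on the arithmetic (a sketch; it assumes the 3x64 setting means three queued buffers of 64 samples each):

```python
# Buffer-setting arithmetic for the "3 x 64" example in this thread.
# Assumption: the setting means 3 queued buffers of 64 samples each.
SAMPLE_RATE = 44100  # Hz

def buffers_to_ms(num_buffers, buffer_size, rate=SAMPLE_RATE):
    # Total queued samples, converted to milliseconds.
    return num_buffers * buffer_size / rate * 1000.0

print(3 * 64)                          # -> 192 samples queued
print(round(buffers_to_ms(3, 64), 2))  # -> 4.35 ms at 44.1 kHz
```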

Bob L
05-20-2008, 02:15 PM
The playback conversion to the headphones is not part of the input sync... as I said... SS compensates and places the input data at the proper position on the timeline... no matter how late it is coming back.

You will perform in sync with what you are hearing... we have been down this road many times before... and so far... I have never had a sync issue in thousands of recordings and performances... and neither will any of you most likely... if you just let SS do its thing. :)

Bob L

Iain Westland
05-20-2008, 02:32 PM
Now I am confused: how can SAW decide where my input should be placed on the timeline? I'm talking about overdubs etc. What if the bassist wanted to play just in front of the beat, or sit back in the pocket? I can't see how SAW would adjust the playing to make this happen.

And just out of interest, why include the record loopback latency adjust if SAW performs this task anyway? Not being funny or meaning to argue... or am I talking about a different thing to you?

Iain

Iain Westland
05-20-2008, 02:57 PM
I set this system up each recording. I work with kids and semi-talented (being very generous there at times) musicians. If any of them try to complain about the computer not recording right, I run the test again and show them the sync is perfect; they just need to adjust to a professional arena. (I'm not heartless enough to tell them a stopped watch keeps better time.)

Naturally Digital
05-20-2008, 03:07 PM
Yeah, ADA8000 + the VSL2020 card.
But at 3x64 shouldn't it be 192 samples?

Mike, when you perform the loopback test as you are doing, you are ONLY measuring the latency of the system from the output of the "driver" to the input of the "driver" (i.e. SAW).

The 102 samples does NOT include the buffers in SAW. It includes any latency after those buffers. IOW, that is basically the latency of your *hardware*.
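In other words, using the figures from this thread (the split into a constant hardware term and a software buffer term is an assumption based on the description above):

```python
# Decomposing round-trip latency: the loopback test only sees the
# constant hardware part; the buffer setting adds its own latency on top.
SAMPLE_RATE = 44100           # Hz
hardware_samples = 102        # measured via loopback: AD/DA + protocol
buffer_samples = 3 * 64       # the driver buffer setting, e.g. 3 x 64

total_ms = (hardware_samples + buffer_samples) / SAMPLE_RATE * 1000
print(round(total_ms, 2))     # -> 6.67 ms estimated total
```

This would explain why changing the buffer setting did not change the measured 102 samples: the test never included the buffer term.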

Iain Westland
05-20-2008, 03:12 PM
IOW

Que?

Iain Westland
05-20-2008, 03:41 PM
murky buckets

mikebuzz
05-20-2008, 04:04 PM
Dave, that makes more sense now! That's my HARDWARE delay/latency!
Which is fine; 2 ms is NO PROBLEM.

Bob, I'm just trying to get my head around this from a technical point of view. I am an electronics/mechanical engineer, so it's natural for me to want to know OR figure it out.

I think there are others here that are curious as well! Anyhow, I could not figure out why the latency did not change when changing buffers; I was under the threshold of the hardware latency (I think?).

Anyhow, it appears that with my VSL2020 card and an ADA8000 the latency is around 102 samples, or 2.3 ms @ 44.1 kHz :D

Later
Buzz

Iain Westland
05-20-2008, 04:22 PM
Ah, now I can see where there might be confusion.

It never crossed my mind that there might be latency within the program! I'm that used to the rock-steady beat of SAW that I forgot about this possibility. I have just been trying to compensate for manufacturers' hardware constraints. :D:D:D

UpTilDawn
05-20-2008, 04:28 PM
murky buckets

Like we're supposed to know what THIS means?! :D:D

Iain Westland
05-20-2008, 04:34 PM
is waiting, talk in italian


bananarama moment there, please bear with, reality shall return in 2 to the minus 11 seconds and counting

mikebuzz
05-20-2008, 04:50 PM
That 2.3 ms was a D/A/D conversion, missing the last A conversion; it should be about 2.9 ms.

Later
Buzz

Bob L
05-20-2008, 05:08 PM
The loopback latency is there as an option just for those users who felt it absolutely necessary because of discussions like this one. :)

But... if you leave it set at zero... and record and punch-in and do whatever else you need to do in a normal recording session... you will find that SS handles everything perfectly and your recordings will be as performed... your bass player can play ahead or behind what he is hearing... and SS will keep it that way.

Worrying about loopback latency issues is pretty much a waste of time in the SS environment.

Perhaps it has meaning in other systems... but then again... SS is not like other systems is it... in fact many people complain about that... but I find it to be quite refreshing. :)

Bob L

mikebuzz
05-20-2008, 05:20 PM
Bob, just for understanding's sake, I would like to know the data/audio flow for a system in a basic form.

My take (probably wrong, but?), where > = a step:
MIC > A/D > ADAT cable > PCI card > bus/SAW > bus/SAW out > PCI card > ADAT cable > D/A > amp > speakers (in a live setup, for reference)

I ASSume :D that most of the latency is in the A/D/A conversion in a good setup (RME etc.). All of the internal CPU-based processing will be dependent on CPU frequency + RAM quantity (faster CPU = lower process times?). The PCI card obviously plays into this equation as well (like my built-in card, 512 buffers minimum).
So if this is correct, then AD/DA conversion is the weak link nowadays?


Later
Buzz

Bob L
05-20-2008, 05:25 PM
Not really... in most cases AD/DA conversion becomes insignificant to the overall required buffer size setting for stable performance... the buffer size becomes the limiting factor.

There is no added latency for the internal processing and mixing... as long as the loop makes realtime before the next buffer is demanded by the soundcard... the size of that buffer pretty much determines everything... there are usually at least 3 internal buffers between the real signal input and the first output of the same data.

Bob L
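Bob's point can be sketched as a deadline check (hypothetical Python; the numbers are from this thread, and the function is illustrative only):

```python
# Internal processing adds no latency as long as each buffer's worth of
# mixing finishes before the card demands the next buffer.
SAMPLE_RATE = 44100   # Hz
BUFFER_SIZE = 64      # samples per buffer

# Time available to fill one buffer:
deadline_ms = BUFFER_SIZE / SAMPLE_RATE * 1000   # ~1.45 ms

def realtime_ok(processing_ms_per_buffer):
    # Beat the deadline: latency stays at the fixed buffer queue depth.
    # Miss it: you get dropouts, not extra latency.
    return processing_ms_per_buffer < deadline_ms

print(realtime_ok(0.9))  # -> True
print(realtime_ok(2.0))  # -> False
```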

Grekim
05-20-2008, 05:35 PM
My understanding has always been that a DAW knows when a sample at a point in time is hitting the soundcard (meaning the soundcard in communication with the DAW). The recorded sample coming into the soundcard at that instant is then aligned with the outgoing one in the DAW's timeline. There can be a ton of processing/time before a sample leaves to the D/A, but the DAW has to know when it leaves so it can sync to the incoming data.

Now, if for example you hook an Apogee up to your RME digital I/O, then latency in the Apogee is unknown to the DAW, but can sometimes be manually adjusted for by the user. If you run 1000 ft of cable to the headphones, that is again unknown to the DAW and not automatically compensated for.

Cary B. Cornett
05-21-2008, 05:02 AM
If you run 1000 ft of cable to the headphones, that is again unknown to the DAW and not automatically compensated for.

Have you checked out the speed of light lately? Let's see... about 186,000 miles per second. Or 186 miles per millisecond. Or 982,080 ft/ms. That would be just over 982 feet per microsecond.

Even if you take the velocity factor of the cable used into account, the total delay through 1000 feet of cable is probably less than two millionths of a second. :rolleyes: By contrast, for a 44.1k sample rate a delay of ONE SAMPLE is about 22.7 microseconds. :eek:

I don't know about you, but I personally would regard a delay time equivalent to 1 tenth of one sample period as negligible. :D:p:D:cool:

I would say Grekim has made a pretty good point... (if I understood him correctly).
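Cary's arithmetic, worked through (a sketch; the 66% velocity factor is the worst case quoted from the ARRL table later in the thread):

```python
# Propagation delay through 1000 ft of cable vs. one sample period.
SPEED_FT_PER_US = 982.08    # ~186,000 miles/s, expressed in feet per microsecond
VELOCITY_FACTOR = 0.66      # worst-case cable from the ARRL table

delay_us = 1000 / (SPEED_FT_PER_US * VELOCITY_FACTOR)
sample_period_us = 1_000_000 / 44100

print(round(delay_us, 2))           # -> 1.54 us through 1000 ft
print(round(sample_period_us, 1))   # -> 22.7 us per sample at 44.1 kHz
```

So even through slow cable, 1000 ft costs well under a tenth of one sample period.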

AudioAstronomer
05-21-2008, 05:24 AM
My understanding has always been that a DAW knows when a sample at a point in time is hitting the soundcard (meaning the soundcard in communication with the DAW). The recorded sample coming into the soundcard at that instant is then aligned with the outgoing one in the DAW's timeline. There can be a ton of processing/time before a sample leaves to the D/A, but the DAW has to know when it leaves so it can sync to the incoming data.

Now, if for example you hook an Apogee up to your RME digital I/O, then latency in the Apogee is unknown to the DAW, but can sometimes be manually adjusted for by the user. If you run 1000 ft of cable to the headphones, that is again unknown to the DAW and not automatically compensated for.


All the software needs to do is compensate for the buffer latency. Converter latency is constant.

Simply adjust the region's timestamp by the buffer size and you're set.

Once you add a second pass through the converters (with external I/O), you have doubled the converter and protocol latency, which the software does not compensate for.
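A sketch of the compensation being described (hypothetical function and names, not a real DAW API):

```python
# Shift a recorded region's timestamp back by the known latencies:
# the software buffer amount, plus any constant converter amount you
# choose to compensate manually (e.g. when reconverting externally).
def compensate_region_start(recorded_start, buffer_samples, converter_samples=0):
    # All values in samples; returns the corrected timeline position.
    return recorded_start - buffer_samples - converter_samples

print(compensate_region_start(10_000, 192, 102))  # -> 9706
```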

Grekim
05-21-2008, 05:42 AM
Have you checked out the speed of light lately? Let's see... about 186,000 miles per second. Or 186 miles per millisecond. Or 982,080 ft/ms. That would be just over 982 feet per microsecond.

Even if you take the velocity factor of the cable used into account, the total delay through 1000 feet of cable is probably less than two millionths of a second. :rolleyes: By contrast, for a 44.1k sample rate a delay of ONE SAMPLE is about 22.7 microseconds. :eek:

I don't know about you, but I personally would regard a delay time equivalent to 1 tenth of one sample period as negligible. :D:p:D:cool:

I would say Grekim has made a pretty good point... (if I understood him correctly).

Sure, even 1000 ft of cable (meant to be slightly humorous) would not matter much at all.
By the way, although I have never seen an actual speed or a way to calculate it, it is not exactly the speed of light. It is fast, though! It's not as if the electrons have a straight shot down the pipe. I found it best to think of current as a net disturbance propagating along, with the electrons in between having a lot of random directions. If you traced the path of one electron, it would take a relatively long time to find its way through the cable.

Grekim
05-21-2008, 05:52 AM
All the software needs to do is compensate for the buffer latency. Converter latency is constant.

Simply adjust the region's timestamp by the buffer size and you're set.

Once you add a second pass through the converters (with external I/O), you have doubled the converter and protocol latency, which the software does not compensate for.

I don't think you can assume that the time to perform an analog-to-digital conversion is the same as the time taken to do the digital-to-analog conversion. But likely I'm misunderstanding which latency we're talking about.

AudioAstronomer
05-21-2008, 07:18 AM
I don't think you can assume that the time to perform an analog-to-digital conversion is the same as the time taken to do the digital-to-analog conversion. But likely I'm misunderstanding which latency we're talking about.

The latency is a set amount in converters. Usually I/O latencies match, but they do not need to, since they are always constant. Once you know the latency, it does not need to be compensated for unless you are reconverting, in which case you can manually compensate with the known amount.

Grekim
05-21-2008, 08:21 AM
The latency is a set amount in converters. Usually i/o latencies match, but they do not need to since they are always constant. Once you know the latency it does not need to be compensated for unless you are reconverting, in which case you can manually compensate with the known amount.

Okay now I follow the constant part, thanks.

Cary B. Cornett
05-21-2008, 11:33 AM
By the way, although I have never seen an actual speed or way to calculate the speed, it is not exactly the speed of light.

That is what the "Velocity Factor" is for. I learned about this when I worked in broadcasting, where sometimes knowing the exact delay time through a transmission line is VERY important. Just for fun, I looked in my 1972 ARRL Handbook and found this:

"...electromagnetic fields travel more slowly in material dielectrics than through free space. ... [therefore] the physical length [of transmission line] corresponding to an electrical [free space] wavelength is given by

Length in feet = (984 V) / f
where
f = frequency in megahertz
V = velocity factor
The velocity factor is the ratio of the actual velocity along the line to the velocity in free space."

So, if you know the velocity factor of the cable used, it is easy to calculate the actual speed of signal travel in the line. The book I just quoted from also has a table that shows the velocity factors for several types of RF transmission lines, and the SLOWEST cable listed has a velocity factor of 66%, so it seems reasonable to expect that, with most common wiring, the speed of the signal would still be, at worst, more than half the speed of light.

Isn't science fun?? :D:eek::cool:
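The quoted formula as a one-liner (assumptions: f in MHz and the result in feet, per the Handbook text above):

```python
# ARRL Handbook formula: physical length of one electrical wavelength
# in a transmission line, given its velocity factor.
def wavelength_feet(freq_mhz, velocity_factor):
    # Length in feet = (984 * V) / f, with f in MHz.
    return 984 * velocity_factor / freq_mhz

print(round(wavelength_feet(100, 0.66), 2))  # -> 6.49 ft at 100 MHz, V = 0.66
```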

Grekim
05-21-2008, 12:59 PM
Neat! I had not run across anything like that. Science is indeed fun and keeps me sane.:D

mikebuzz
05-21-2008, 01:28 PM
OK, so why do manufacturers NOT list the latency of their AD/DA converters?

Anyhow, thanks all; got it sorted out in my head now!

Later
Buzz

DominicPerry
05-21-2008, 02:25 PM
Some manufacturers do; in fact, I think RME boast about the speed of the A/D and D/A in their latest high-end converters. And IIRC, the two figures are not the same.

Dominic

EDIT : From the ADI-QS page
"In the QS RME uses a high-class AD converter from Cirrus Logic, offering exceptional Signal to Noise and distortion figures. But the biggest difference to all other ADCs out there is its innovative digital filter, achieving for the first time a delay of only 12 samples in Single Speed (0.25 ms), 9 samples in Double Speed (0.09 ms), and 5 (!) samples in Quad Speed (0.026 ms).
These values are less than a quarter of those available from even much more expensive devices. They represent an important step in further reducing the latency in the computer-based recording studio. At DS and QS the added latency can simply be ignored. The DA-converter offers similar conversion in the range of 5 to 10 samples, turning analog digital monitoring into real analog-style monitoring!"
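For reference, converting the quoted RME figures back to milliseconds (assuming Single/Double/Quad Speed mean 48/96/192 kHz, which matches the quoted values):

```python
# Converter latency in samples -> milliseconds at a given sample rate.
def samples_to_ms(n_samples, rate_hz):
    return n_samples / rate_hz * 1000.0

print(round(samples_to_ms(12, 48000), 2))   # -> 0.25  (Single Speed)
print(round(samples_to_ms(9, 96000), 2))    # -> 0.09  (Double Speed)
print(round(samples_to_ms(5, 192000), 3))   # -> 0.026 (Quad Speed)
```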

sebastiandybing
05-21-2008, 02:42 PM
That is one of the reasons I am using the mistacy.

Sebastian

mikebuzz
05-21-2008, 02:55 PM
Good info !!
Later
Buzz