This month I am going to build on one of Ed's posts from earlier this month, titled "Using Binary Data Transfers to Improve Your Test Throughput". If you have not read it, go ahead and click on the link; I'll be here when you get back.
I want to reiterate how drastic the difference between these two data formats is when you are reading large amounts of data. I did some benchmarking a little while ago, and I want to share the results with everybody now.
Please note that these were quick tests that I did, and these are in no way official numbers. In fact, if you see anything wrong with my methods, please comment.
First, my method. I ran the test with an N6700B MPS mainframe and an N6781A SMU module. I wrote a program that set up the module to source 5 V and then take an array of voltage measurements. I set it for the maximum number of measurement points (524288) at the fastest sample rate (though the sample rate does not really matter for this experiment). Right before reading the data from the N6700B back to my PC, I started a programmatic stopwatch, and I stopped it as soon as the read completed. I looped 20 times and took the average.
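To make that concrete, here is a rough sketch of the setup portion in VB with VISA-COM. The VISA address and the channel number are placeholders, and the sweep commands are the N6700-series SCPI as I remember it, so check the programming guide before copying this:

```vb
' Rough sketch of the test setup (VB.NET, VISA-COM library).
' The address, channel list, and sweep SCPI below are illustrative.
Imports Ivi.Visa.Interop

Module BinaryReadBenchmark
    Sub Main()
        Dim rm As New ResourceManager()
        Dim inst As New FormattedIO488()
        ' LAN (VXI-11) address of the N6700B mainframe -- yours will differ
        inst.IO = CType(rm.Open("TCPIP0::192.168.1.100::inst0::INSTR"), IMessage)

        inst.WriteString("*RST")
        inst.WriteString("VOLT 5,(@1)")               ' source 5 V on the N6781A in slot 1
        inst.WriteString("OUTP ON,(@1)")
        inst.WriteString("SENS:SWE:POIN 524288,(@1)") ' maximum number of digitized points
        inst.WriteString("SENS:SWE:TINT MIN,(@1)")    ' fastest sample interval
    End Sub
End Module
```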
One thing that I highly recommend is using the Agilent VISA-COM IO library. VISA-COM offers a ReadIEEEBlock method that handles the binary block parsing and makes reading binary data really easy.
The screenshot below shows the relevant loop and the timing calculation. The program is written in VB, and I used LAN to communicate with the instrument.
One important piece that the screenshot does not show is that I set the data format to REAL using the FORM REAL command. When you use ASCII, the command is FORM ASCII (ASCII is also the default setting).
You can see the commented-out ReadString call that I swapped in when I used the ASCII data format. You can also see my extremely professional (and useful) "I am on line" counter, which I put in so that I knew my program was looping correctly.
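In case you cannot make out the screenshot, here is a minimal sketch of what that loop looks like. The variable names are mine, not necessarily what is in the screenshot, and I am assuming 4-byte floats here; whether FORM REAL returns 32-bit or 64-bit values depends on the instrument, so check the manual before picking the binary type:

```vb
        Dim sw As New System.Diagnostics.Stopwatch()
        Dim totalMs As Double = 0

        inst.WriteString("FORM REAL") ' binary readback; FORM ASCII is the default

        For i As Integer = 1 To 20
            inst.WriteString("MEAS:ARR:VOLT? (@1)") ' digitize and ask for the voltage array

            sw.Restart()
            ' Parse the IEEE 488.2 definite-length block into an array of singles.
            ' Use BinaryType_R8 instead if your instrument returns 64-bit reals.
            Dim data As Single() =
                CType(inst.ReadIEEEBlock(IEEEBinaryType.BinaryType_R4, True, True), Single())
            ' For the ASCII runs I swapped in this line (with FORM ASCII in effect):
            ' Dim reply As String = inst.ReadString()
            sw.Stop()

            totalMs += sw.Elapsed.TotalMilliseconds
            Console.WriteLine("I am on line " & i) ' the extremely professional loop counter
        Next

        Console.WriteLine("Average read time: " & (totalMs / 20).ToString("F1") & " ms")
```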
So now for the times. The ASCII format took around 100 s to read back all 524288 measurements into a string. When I switched to the binary format, the same read took under 5 s. That is a drastic difference: if you are reading back a lot of data from an instrument that supports the binary format, you really need to use it.
I also ran a few other experiments. I lowered the total number of points to 1000: the binary format took a little under 20 ms to read the data, while the ASCII format took about 125 ms. The last test used just 3 data points: the binary format took a little less than 15 ms, while the ASCII format took under 5 ms. So as you read back less and less data, the ASCII format catches up to the binary format and even beats it, presumably because the fixed overhead of handling the binary block outweighs the per-point savings at very small transfers.
The moral of the story: if you have more than a few points to read back, use the binary format, because it will save you a ton of time.
That's all I have this month and I will be back next month!