
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, VOL. 51, NO. 1, FEBRUARY 2002

Virtual Spectrum Analyzer Based on Data Acquisition Card

Piotr Bilski and Wieslaw Winiecki, Member, IEEE

Abstract—A virtual spectrum analyzer based on a data acquisition card is presented. The implemented functions are described and the properties of the device, in particular its accuracy and speed, are examined. The virtual analyzer is compared with an instrument equipped with a DSP. The integrated programming package LabView was used to design the analyzer.

Index Terms—Spectrum analyzer, virtual instrument.

Manuscript received November 21, 2001; revised December 8, 2001. The authors are with the Institute of Radioelectronics, Warsaw University of Technology, Warsaw, Poland. Publisher Item Identifier S 0018-9456(02)02289-1.

I. INTRODUCTION

The present measurement practice has undergone significant changes, mostly due to virtual measurement devices, designed as a combination of software and hardware. The most important feature of such equipment is its very low design and fabrication cost. Creating virtual instruments is a simple task, thanks to modern programming environments (such as LabWindows/CVI or LabView) and personal computers. Accuracy, next to the speed of computation, is one of the most important parameters of a virtual instrument, determining its usefulness, especially in comparison with a "classic" hardware instrument, which, while fast and accurate, is also rather expensive. Today, virtual instruments based on data acquisition cards can fill the role of a digital oscilloscope, waveform generator, multiplexer, or function generator. This type of card, in conjunction with specialized software, can give a combination with very good parameters.

There are three ways to obtain a virtual instrument. The first is using a ready-made virtual instrument software package and applying it to any available DAQ card (such as the eNVi or SigLab applications [5], or the HSVI3000 spectrum analyzer [4]). The second is using a DAQ card and designing the software oneself (with tools from, for example, National Instruments, SigLab, or Texas Instruments [9]). The third is using ready-to-use DAQ card and software sets, such as the packages from Oliver Worldclass Labs (Lab Volt) or PC Instruments ([5], [11]), which include both the hardware and the software. It is worth noting that manufacturers consider a Pentium 90 computer sufficient for basic audio analysis; a much better configuration enables the user to perform advanced analysis.

There are many virtual instruments such as multimeters, oscilloscopes, and waveform generators, but virtual spectrum analyzers are only rarely available on the market. The need to create such systems is a consequence of the high prices of hardware analyzers. Fortunately, the software packages designed to create such instruments, mentioned above, offer a variety of ready-to-use functions (e.g., filters, windows, or FFT algorithms), making the development of such an instrument much easier.

Some commercial solutions are the HSVI spectrum analyzer [4] and the RICS HP8563E [12]. These are ready-to-use virtual instruments for acoustic-range spectrum analysis which require only low-cost personal computers.

The main decisions during the design of a virtual instrument are choosing the data acquisition card and, equally important, choosing the set of its procedures. The choice of the data acquisition card has a great influence on the efficiency of the whole instrument. Particular DAQ cards differ in price and, furthermore, in their capabilities (sampling frequency, range of the acquired signal, etc.). After defining the assumptions of this project, a low-cost card was chosen. It enables a designer to examine how complex an instrument (in terms of functionality) can be created at minimum cost, which was the main aim of this project.

A separate problem is designing the software, which, in this case, is the main part of the instrument creation phase. Integrated programming environments are a relatively young branch of measuring-system design tools, which is why many issues connected with virtual instrument design have not yet found standard, well-explained solutions. A key problem in such work is assessing the instrument's capability to work in "real-time" mode, which means simultaneously acquiring samples and performing mathematical operations. This is possible only if neither of those modules introduces delays. The present commercial solutions in virtual instrumentation for spectrum analysis provide rather simple devices, none of them working in "real time." Some tricks can be applied, for example, sampling fast and then processing the gathered samples in off-line mode [15]. Another idea for creating a fast analyzer is applying only a few simple functions that are not time consuming [4], [5], [8]. Still, this is not "real-time" mode, because the DAQ card works faster than the program itself. Therefore, the main issue considered in this article is how to create a virtual analyzer that conducts its spectral operations in the same time as (or less time than) is required for gathering the samples.

In this paper, a few well-known measurement-domain acronyms are used: FFT for fast Fourier transform, DAQ for data acquisition, DFT for discrete Fourier transform, and DSP for digital signal processor.

II. PROJECT ASSUMPTIONS

The project assumed the most complete virtual spectrum analyzer created as a combination of a program written especially for this purpose, a personal computer, and the low-cost Lab-PC-1200 data acquisition card (National Instruments).



A computer with the Windows 95 operating system was a typical hardware platform, within the reach of any user; an AMD K6-2 350 MHz processor with 64 MB of operating memory (RAM) is by no means an advanced configuration with respect to present computer technology. The card parameters are comparable with those of similar devices of its class and are as follows.

— A sampling frequency of 100 kHz makes possible the acquisition of signals in the audio band and even above it (up to 50 kHz).
— A 12-bit resolution appears to be enough for many semi-professional applications and assures the accuracy typical of commercial virtual instruments offered, for example, over the Internet.
— A PCI bus, which enables communication with the other hardware components of the computer, assures compatibility with all currently existing mainboard standards (compliant with the PCI 2.2 specification).

One of the concurrent goals of the project was an experimental verification of the DAQ card's capabilities, which allows optimal working conditions to be assigned, especially for "real-time" operation, where processing delays are minimal or go unnoticed by the user. A comparison of the created virtual analyzer and a hardware analyzer was conducted. The following parameters of the hardware analyzer were used:

— a sampling frequency of 48 kHz, adapted to audio waveform acquisition;
— spectrum analysis conducted by sixth-order Chebyshev octave filters;
— a 128 x 128 pixel liquid crystal display (LCD).

A multifunction waveform generator was used as the signal source.

III. REALIZATION

According to the project assumptions, a large set of DSP functions useful for spectral analysis has been implemented. The created analyzer contains:

— a set of all types of digital filters (highpass, lowpass, bandpass, bandstop);
— a set of all common signal-fragment (windowing) functions (for example, Hamming, Hanning, Kaiser-Bessel, and others);
— spectrum calculation by the FFT or by an octave or 1/3-octave filter algorithm;
— the complex cepstrum;
— statistical parameters such as moments (mean square value, root mean square, etc.) or the median, including histogram drawing;
— level fetching, after the type and value of this operation are determined;
— two averaging methods (linear and exponential), applied in the time or frequency domain accordingly;
— display of harmonic components;
— software waveform generation and simulation;
— saving the results of analysis to a text file;
— graph zooming.

All of these operations are performed by a typical spectrum analyzer, as introduced in [4], [5], [7]–[9]. Descriptions of all functions and algorithms can be found in the literature ([1], [2], [10]).
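
To make the FFT path in the list above concrete (windowing, spectrum calculation, and the exponential averaging mode), the following is a minimal sketch in Python/NumPy rather than the authors' LabView code; the amplitude-scaling convention and the smoothing constant alpha are assumptions made for the example.

```python
import numpy as np

def amplitude_spectrum(x, fs, window="hanning"):
    """Single-sided amplitude spectrum of one real signal block."""
    n = len(x)
    w = np.hanning(n) if window == "hanning" else np.hamming(n)
    spectrum = np.fft.rfft(x * w)
    # compensate for the window's coherent gain so a sine reads its true amplitude
    amplitude = 2.0 * np.abs(spectrum) / np.sum(w)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, amplitude

def exponential_average(previous, new, alpha=0.3):
    """Exponential averaging of successive spectra (one of the two averaging modes)."""
    return new if previous is None else alpha * new + (1.0 - alpha) * previous
```

A linear average is obtained analogously by summing a fixed number of spectra and dividing by their count, as in the measurement-cycle sketch at the end of this section.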


The instrument's principle of operation divides the working time into two phases. In the first phase, the user keeps control over the whole system. He can then change the settings of the controls and prepare the analysis conditions, for example, by setting the sampling frequency, the length of the waveform vector taken for calculations, or the width of a filter's passband. After the appropriate button is pressed, fetching begins and the instrument takes control of the system and the front-panel controls, blocking the user's access to them. The measurement process follows. It consists of executing activities in a determined sequence: acquisition of the samples, introductory processing, averaging (if the user wishes), exact processing, displaying the results, and saving them to a text file. Because of the averaging procedure, the whole processing stage is repeated in a program loop whose iteration count is set by the user. These operations can be conducted both in the time and in the frequency domain, which is important for spectral analysis. The algorithm is shown in Fig. 1. During the measurement process, all data vectors are updated on-line.

Fig. 1. Algorithm of control and computations.

The term "introductory processing" covers two activities: checking whether the waveform level is in the proper range (with respect to the fetching level) and, if so, passing the vector of samples on for further processing; otherwise the instrument stops working and displays an error. This protection is necessary to prevent the program from hanging. "Exact processing," on the other hand, means calculation of the spectrum vector from the input samples (with one of the available methods), of the statistical properties, or of the cepstrum (those functions work interchangeably). As the operation proceeds, the complete results are drawn on the display and written into the appropriate fields of the panel. A sketch of this measurement cycle is given at the end of this section.

A separate problem was the design of the instrument's front panel [3]. In contrast to traditional devices, where the front panel can be of any size, a virtual instrument has to fit a typical monitor screen (that is, between 15 and 17 in). The designed analyzer is complex, with a large number of controls, and it was impossible to place all of them on the panel simultaneously, so some solution had to be applied. The first solution is exchanging some of the controls for others, so that only part of them is visible at a time (for example, statistical properties instead of spectrum characteristics; only one group is needed at a time). The second solution implements groups of functions as compound instruments working in separate windows. The first solution was chosen because of the greater user comfort and better clarity of the on-screen visualization. Finally, depending on the option chosen by the user, different indicators are shown in specified areas of the panel.

One of the inconveniences that became evident during the instrument design process was the inevitable code complication. The LabView package offers intelligible and easy-to-use graphical tools; although they are nontypical, using them demands a separate approach to code design (described in detail in [3]). Creating large applications is usually troublesome because of the very large area occupied by the graphical code. That is the main reason why enclosing all operations in hierarchically organized functions is so important. Such a method of graphical program design makes it easier for the developer to keep control over the written program. A second problem appeared when designing the radio buttons that allow choosing between different types of analysis. In contrast to other high-level languages offering object-oriented programming, such as Visual Basic or Visual C++, these elements are not so easy to implement; their application requires introducing delays into the main program loop, which causes a decline in the instrument's speed (an effect that is easy to notice when many iterations are run). The front panel of the designed virtual analyzer is shown in Fig. 2. A part of the source code in the graphical language G (LabView) is presented in Fig. 3.
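
The measurement cycle described above (acquisition, introductory processing, averaging loop, exact processing, saving to a text file) can be summarized as follows. This is a rough sketch under stated assumptions, not the authors' G code: `acquire` is a hypothetical stand-in for the DAQ card read, and a Hanning-windowed FFT is used as the exact-processing step.

```python
import numpy as np

def measurement_cycle(acquire, fs, n_samples, n_avg, fetch_level, out_path):
    """One measurement run: acquisition, introductory processing (level check),
    linear averaging, exact processing (FFT spectrum), and saving to a text file."""
    window = np.hanning(n_samples)
    accumulated = np.zeros(n_samples // 2 + 1)
    for _ in range(n_avg):                       # averaging loop, count set by the user
        x = acquire(n_samples, fs)               # sample acquisition (hypothetical DAQ read)
        # introductory processing: stop if the waveform never reaches the fetching level
        if np.max(np.abs(x)) < fetch_level:
            raise RuntimeError("signal below fetching level; measurement stopped")
        # exact processing: single-sided amplitude spectrum of the windowed block
        spectrum = 2.0 * np.abs(np.fft.rfft(x * window)) / np.sum(window)
        accumulated += spectrum
    averaged = accumulated / n_avg               # linear average over the iterations
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    np.savetxt(out_path, np.column_stack([freqs, averaged]),
               header="frequency_Hz amplitude_V")
    return freqs, averaged
```

In the real instrument the displayed data vectors are refreshed inside this loop; the sketch keeps only the final averaged spectrum.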



IV. DISCUSSION

The instrument that has been created fulfills the project conditions with respect to its functionality; that is, it executes all the operations that were planned and implemented. The important remaining issue was the examination of its speed and accuracy. The main problem of the first topic of investigation was assessing the cost of the operations conducted by the software, because the DAQ card speed is set by the manufacturer and cannot be changed by the user. The analysis time of each particular operation is of interest here. The results are gathered in Table I for a few different numbers of iterations. The number of iterations was chosen so as to allow an accurate measurement of the time of the whole process: LabView reports elapsed time in milliseconds, so a large number of loop iterations was set to obtain a more reliable time measurement. The measurements were performed on a PC with an AMD K6-2 350 MHz processor. The input signal parameters were as follows:

— number of samples: 1024;
— waveform frequency: 1 kHz;
— waveform: sine;
— waveform amplitude: 1 V.

The execution time of each phase of operation strongly depends on the processor frequency and the computer memory. As can be noticed, the most time-consuming operations are the spectrum calculation with the FFT algorithm (which is also a part of the cepstrum and power-spectrum functions) and the statistical calculations. The other operations (such as averaging, filtering, and windowing) engage the processor only minimally. That is also why octave filters are faster than the FFT (and, of course, than the DFT). The time consumption of such operations as changing attribute values and executing empty loops was also measured (the delay they introduce is marginal). Matrix operations, however, which are very common during analysis, depend strongly on the length of the sample vector. The results of these experiments are presented in Fig. 4.

Two methods of spectral analysis, using the FFT and octave filters, were examined and compared. Octave filters are used in many different devices, especially in those that should work very fast in "real time" (for example, the simple analyzers in hi-fi stereo sets). The DFT method was also added, although this kind of operation is so slow that its application is pointless. Octave filters are faster than the FFT, but only for some numbers of samples; the FFT seems to be better for larger numbers of samples, although this may be the result of a non-optimal octave filter design.

A very important task was comparing the created analyzer with a hardware DSP analyzer. The instrument chosen for this experiment was the SVAN 912 [10], which has two 40 MHz signal processors: a Motorola 56002 and a Texas Instruments 320C50. Its spectral analysis is based on a set of sixth-order Chebyshev type II filters. The comparison results are shown in Table II. That table shows the time consumption of the whole measurement process (including data acquisition, FFT processing, and result presentation) for 256 and 1024 samples of the 1 kHz sine input signal. One unfavorable factor during the comparison experiments was the variation of the measured analysis time from one run to the next (for the same waveform parameters); the reason is probably the Windows 95 architecture and its method of assigning memory and central processing unit time. The results included in both tables are therefore averages over five experiments.
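
To make the octave-filter alternative concrete, here is a minimal sketch of octave-band analysis built from Chebyshev type II band-pass filters, the filter family named above for the SVAN 912. It relies on SciPy's filter design rather than the authors' own implementation; the centre frequencies, stopband attenuation, and band-edge convention are assumptions made for the example.

```python
import numpy as np
from scipy.signal import cheby2, sosfilt

def octave_band_levels(x, fs, centres=(63, 125, 250, 500, 1000, 2000, 4000, 8000)):
    """RMS level (in dB) in each octave band of the signal block x."""
    levels = []
    for fc in centres:
        lo, hi = fc / np.sqrt(2.0), fc * np.sqrt(2.0)    # octave band edges
        sos = cheby2(6, 40, [lo, hi], btype="bandpass", fs=fs, output="sos")
        y = sosfilt(sos, x)
        rms = np.sqrt(np.mean(y ** 2))
        levels.append(20.0 * np.log10(rms + 1e-12))      # avoid log of zero
    return np.array(levels)

# example: 1 kHz sine sampled at 48 kHz, as in the hardware analyzer comparison
fs = 48_000
t = np.arange(1024) / fs
print(octave_band_levels(np.sin(2 * np.pi * 1000 * t), fs))
```

Note that a band-pass design of order six in SciPy yields a twelfth-order filter, so the settings here are purely illustrative.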


Fig. 2. Front panel of virtual analyzer.

TABLE I COMPARISON OF TIME CONSUMPTION FOR PARTICULAR OPERATIONS (MS) ON 350 MHZ PROCESSOR AND 1024 SAMPLES

Fig. 3. Part of source code in graphical language G (LabView).

As can be seen in Table II, the differences between the two devices are not as large as might be presumed. When the length of the sample vector is shorter, those differences are almost unnoticeable.

Fig. 4. Comparison of time consumption of operations for a 350 MHz processor and 1024 samples.

The large number of iterations was dictated by the time analysis and by the construction of the hardware analyzer: because there is no possibility of measuring the elapsed time directly in the DSP-equipped instrument, those experiments were conducted with the help of a precise stopwatch.



TABLE II COMPARISON OF VIRTUAL (BASED ON 350 MHZ PROCESSOR) AND HARDWARE SPECTRUM ANALYZERS (TIME IN MILLISECONDS); NS — NUMBER OF SAMPLES

Multiple trials at the measurement stand showed clearly that "real-time" operation of the instrument is limited by certain conditions, among others the sampling frequency (which depends on the DAQ card properties), the analysis mode, and, very importantly, the sample vector length. Because a common value of this parameter in the existing instrument is 1024, both the FFT and the octave filters seem acceptable, although for longer vectors the smaller number of computations gives the octave filters the advantage. Octave filters are the fastest of all the applied methods, but may still be too time-consuming to be useful during measurements; the reason is their work mode, as they must run simultaneously with the other functions during the whole working time. On the other hand, a faster computer will do such a job in a shorter time. Beyond that issue, the virtual instrument that was created appears to comply with all the requirements set at the beginning of the design process. It turned out to be quite a functional tool, serviceable in the analysis of most of the examined signals.

Predictions of how virtual instruments will perform in the future can be made on the basis of a comparison of the working speed on different computers. The required experiments were conducted, revealing a linear change in time consumption for some operations (for instance, the statistical operations or the cepstrum) and a nonlinear change for others (octave filters, power spectrum). In most cases, the increase in speed was better than predicted: while the processor speed was doubled, the speed of the mentioned operations even tripled. It appears that well-conditioned operations can take advantage of faster computers, and in the future such instruments can apparently become faster than hardware ones. The results are gathered in Table III.

The main conclusion at this point is the important role of the computer processor. Today's technology brings CPUs with frequencies of 1.5 GHz and higher; with them, the time for most operations falls below 1 ms and the whole sweep below 25 ms (that is, for the specific analyzer created during the project, which has many complicated functions, such as cepstrum, FFT, statistics, averaging, level fetching, zooming, filtering, and windowing). Hardware analyzers have no possibility of increasing their speed, because of their closed architecture; the only way to increase the speed of the measurement system in that case is to buy a better analyzer. The virtual instrument's speed depends on the CPU clock, so the same program will run faster on a more advanced computer. However, virtual instruments have an important restriction: the finite speed of gathering the samples, which remains constant for a given DAQ card. For the Lab-PC-1200, the time of gathering is tied to the sampling frequency (100 kHz), which means that the shortest time in which the card can gather a 1024-sample block is about 10 ms.
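
The acquisition-time budget mentioned above can be checked directly. The sketch below is an illustration rather than the paper's benchmark: it computes the time the card needs for one 1024-sample block at 100 kHz and compares it with the time a windowed FFT of that block takes on the host; the iteration count is an arbitrary choice made, as in the paper's measurements, only to obtain a reliable per-block estimate.

```python
import time
import numpy as np

fs = 100_000                     # Lab-PC-1200 sampling rate, Hz
n = 1024                         # samples per analysis block
t_acq = n / fs                   # time the card needs to gather one block (~10.24 ms)

x = np.sin(2 * np.pi * 1_000 * np.arange(n) / fs)   # 1 kHz, 1 V test sine
window = np.hanning(n)

reps = 1000                      # many iterations for a reliable per-block estimate
start = time.perf_counter()
for _ in range(reps):
    np.abs(np.fft.rfft(x * window))
t_proc = (time.perf_counter() - start) / reps

print(f"acquisition budget : {t_acq * 1e3:.2f} ms")
print(f"FFT per block      : {t_proc * 1e3:.3f} ms")
print("real-time feasible :", t_proc <= t_acq)
```

Whether the check passes depends entirely on the host processor, which is the point made throughout this section.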

TABLE III CHANGE IN TIME OF OPERATIONS IN ACCORDANCE WITH PROCESSOR SPEED (FOR 1024 SAMPLES)

TABLE IV RELATIVE ACCURACY OF VIRTUAL ANALYZER (NS — NUMBER OF SAMPLES, A — AMPLITUDE, F — FREQUENCY)

speed of the card’s work is about 10 ms. It is very difficult for today’s processors to conduct all operations in such a short time (in our case it appears that the required speed could be obtained by the processor of 4 GHz). That is why creating instruments on slower PCs is possible only on one condition. The instrument can not have too many advanced functions, which slow its work. Tests of the created analyzer show clearly that the time of sample gathering is much shorter than that of the whole analysis process, conducted on the 350 MHz processor, and does not have much influence on the instrument’s speed. It means that a very simple spectrum analyzer can consist only of FFT analysis supported by filters/windows, level fetching, or averaging. Such an instrument will work significantly faster (below 40 ms), but still not in “real-time” mode. Therefore, the only way to work in “real-time” mode in this case is applying faster computers (with frequency clock 1.5 GHz). Another important issue is assessing the accuracy of such virtual instrumentation. This problem is quite new and has not yet been solved in a satisfactory way. The main hindrance is uncertainty of partial hardware and software estimates. Every part of the instrument introduces some errors, though their separation still is not easy. During the test the standard method of accuracy appraisal, based on [14], was used. Experimental results are in Table IV. As can be noticed, the relative error did not exceed 2 per cent. Such accuracy is sufficient for semi-professional applications. The dynamic range of the designed instrument has also been estimated. This parameter is dependent on the number of bits of A/D and D/A converters. Experiments showed that it is about 60 dB, which is a typical value for acoustic analyzers. Although the number of bits shows that theoretically it could be about 75 dB, noises from electronic parts apparently lessen it. Therefore, for very small signals (less than 50 mV), A/D and D/A converters appear to be nonlinear. The results of linearity analysis are shown in Fig. 5. The last problem, which had to be overcome, was the virtual instrument calibration method elaboration. Because the whole device is a software and hardware composition, the issue can be solved only by separate procedures designed for both of them. The first approach refers to a DAQ card calibration, which should be conducted with the instructions from the manufacturer, that is, National Instruments (described in [9]). This operation requires


Fig. 5. Results of linearity measurement.
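
The two figures of merit discussed above, relative error and dynamic range, can be illustrated with a short sketch. The 6.02N + 1.76 dB expression is the standard ideal-converter estimate (consistent with the roughly 75 dB theoretical figure quoted for 12 bits); the numbers in the example lines are made up for illustration and are not taken from Table IV.

```python
def relative_error(measured, reference):
    """Relative error of a measured amplitude (or frequency) against a reference."""
    return abs(measured - reference) / abs(reference)

def ideal_dynamic_range_db(bits):
    """Theoretical SNR of an ideal converter for a full-scale sine: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

# illustrative values only (not the paper's data)
print(f"12-bit theoretical dynamic range: {ideal_dynamic_range_db(12):.1f} dB")  # ~74 dB
print(f"relative amplitude error: {relative_error(1.018, 1.000):.1%}")           # within the 2 % bound
```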

The last problem that had to be overcome was the elaboration of a calibration method for the virtual instrument. Because the whole device is a composition of software and hardware, the issue can be solved only by separate procedures designed for each of them. The first procedure concerns the DAQ card calibration, which should be conducted according to the instructions of the manufacturer, that is, National Instruments (described in [9]). This operation requires a standard signal source with an accuracy of 0.001 percent, which is ten times better than the accuracy of the card. The second procedure refers only to the software part and consists of comparing the sampled waveform values (from a standard source) with the result obtained by the hardware analyzer.

V. CONCLUSION

The project was designed to answer important questions about the usefulness, the accuracy, and the working speed of the virtual spectrum analyzer, especially in comparison with "classic" devices based on a DSP. The experiments that were performed showed that the hardware analyzer is the more accurate device, though it has to be said that the price of most instruments of this class is much higher than the price of a comparable virtual analyzer. On the other hand, personal computers become less expensive as time goes by, and also ever more advanced (every half year a new processor type is put on the market). The speed of work is better for machines with a DSP, because spectrum analysis calculations are faster when hardware digital filters are in action: the typical "universal" processor, besides running the virtual analyzer, also has to handle the whole operating system, while the digital signal processor executes only one task. However, the display of results is usually faster in virtual instruments (especially when the hardware device has a liquid crystal display).

The research has shown that creating large, complex, and at the same time useful virtual instruments is possible, and even advisable. It allows creating a system that can be attractive in some (not too broad) respects and able to compete with hardware instruments. That is the reason why virtual devices are a tempting alternative. For the analysis of acoustic waveforms in a semi-professional recording studio, the slowdown should be small and almost unnoticeable. This project also revealed that the LabView package allows one to create quite complex programs, although it has some disadvantages, such as the large graphical code and the necessity of a nontypical approach to programming.

It is clear that during the upcoming years the importance of virtual instruments will increase. Such a fusion of a computer and specialized software does not allow turning a basic PC into an industrial analyzer with a range of 20 GHz. However, it gives the possibility of creating a whole laboratory on one computer, making it a perfect diagnosis and test tool.


Reconfigurability plays the main role here and broadens the range of applications. The "real-time" work mode requires an accurately balanced computer and DAQ card. This mode allows truly simultaneous signal processing (conducted by the computer) and sample acquisition (conducted by the card); the overall performance of the analyzer therefore depends on both of these parts. The instrument cannot be faster than its DAQ card, so the computer processor should be only as fast as the card requires. However, today it is quite difficult to create a "real-time" virtual signal analyzer, because computers are not able to perform all the operations in the time needed by the DAQ card to acquire the samples (especially if the program is written in a high-level language, which is slower than, for example, assembler). It is possible to create a "real-time" analyzer with the fastest available processors (for example, an Athlon 1.5 GHz), but it cannot be too complicated and must contain only a few simple operations (averaging, FFT, filters/windows). Truly fast and accurate virtual spectrum analyzers will be the domain of the future (thanks to optical and molecular technologies). The strong dependencies mentioned give almost limitless possibilities for creating virtual instrumentation and are about to become a significant advantage over hardware instruments.

REFERENCES

[1] R. G. Lyons, Understanding Digital Signal Processing. Reading, MA: Addison-Wesley, 1997.
[2] A. V. Oppenheim, Digital Signal Processing. Warsaw, Poland: WKL, 1979.
[3] L. Wells and J. Travis, LabView for Everyone. Englewood Cliffs, NJ: Prentice-Hall, 1997.
[4] [Online]. Available: www.hyperception.com/j98htm/hsvi3000.htm
[5] [Online]. Available: www.oliverlabs.com/elec/virt_inst.htm
[6] [Online]. Available: odysseus.ieee.org
[7] [Online]. Available: www.picotech.co.uk/spectrum.html
[8] [Online]. Available: www.msaxon.com/envi.htm
[9] [Online]. Available: www.ni.com
[10] [Online]. Available: www.svantek.com.pl
[11] [Online]. Available: www.pcinstruments.com
[12] [Online]. Available: www.comprehensivesolutions.com
[13] ANSI Std. S1.11, 1986.
[14] Guide to the Expression of Uncertainty in Measurement, International Organization for Standardization, 1993.
[15] [Online]. Available: www.edi.lv/dasp-web/dasp-labsystem/dasplabsystem.htm

Piotr Bilski was born in 1977 in Olsztyn, Poland. He received the M.Sc. degree in 2001 from the Faculty of Electronics, Warsaw University of Technology, Warsaw, Poland. He is currently pursuing the Ph.D. degree at the Institute of Radioelectronics, Warsaw University of Technology. His interests are computer-aided measurements, diagnostics of electronic and mechatronic circuits, and relational and object databases.

Wieslaw Winiecki (M'00) received the M.Sc. and Ph.D. degrees from the Faculty of Electronics, Warsaw University of Technology, Warsaw, Poland, in 1975 and 1986, respectively. Since 1975, he has been with the Institute of Radioelectronics, Warsaw University of Technology. Since the beginning of his professional career, he has been involved in the activities of the Group on Computer-Aided Measurements, concerning hardware and software for measuring systems. Since 1987, he has worked as an Assistant Professor on the problems of measurement automation and interface systems. He is the author or co-author of two books and about 90 papers.
Dr. Winiecki is Head of the Computer-Aided Measurement Laboratory, Faculty of Electronics, Warsaw University of Technology; a member of the Measuring Systems Section of the Metrology and Instrumentation Committee, Polish Academy of Sciences; and Deputy Chairman of the Measurement Committee of the Polish Society for Measurement, Automatic Control and Robotics (POLSPAR).
