Data Acquisition Systems

Automation of many processes requires the use of various sensors. One such application is advanced industrial production lines. They operate autonomously by processing feedback from devices that monitor a number of characteristics describing the operating conditions and manufacturing parameters. This feedback serves both quality assessment and real-time adjustment of the production process. Automated monitoring can be found in almost every aspect of our lives: from monitoring the condition of power plants and delivery lines, through mass transportation systems and the weather, up to something very personal like the miniature health monitoring devices we carry in our pockets. All these applications follow a standard scheme consisting of sensors, measurement devices and electronics with software that processes the gathered data. A sensor detects a physical phenomenon and converts it into an electrical signal, which is then digitized by dedicated electronics. The result is processed to extract features of interest, present them in some form and archive them. The set of electronics, firmware and software needed to process the data from the sensors is called a Data Acquisition System (DAQ).

 

One can distinguish three levels when categorizing DAQ systems. The first level covers heavy-duty applications, such as systems used in mining, petroleum facilities or transportation. They are designed to operate under very harsh conditions, so robustness is their key feature; measurement precision, the number of sensors and the readout frequency are relatively low. The second level is dedicated to applications that require moderate precision and readout frequency with a limited number of input channels. The third level, exemplified by the systems used in physics experiments, imposes the highest requirements: it is reserved for applications that operate on hundreds of thousands or even hundreds of millions of sensors measuring with single-nanosecond resolution at megahertz frequencies.


All three levels, however, share some common functions. They all need a real-time data path from the sensors, through the digitizing device, to some form of processing unit. In these units, algorithms process the incoming data stream to extract features and generate triggers that have an immediate effect on the operation of the system. In applications where information loss due to a high readout rate is significant, this critical path and its processing time have to be minimized. The time needed to process a given portion of data, during which the system cannot accept any new input, is called the dead time.
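The impact of dead time on a readout system can be quantified with the standard non-paralyzable detector model, in which each accepted event blocks the system for a fixed time. The following sketch (an illustration added here, not part of the original text; the rates and dead time are hypothetical) shows how the recorded event rate falls below the true rate:

```python
# Sketch: effect of dead time on the recorded event rate, using the
# standard non-paralyzable model (illustrative values, not from the text).

def recorded_rate(true_rate_hz: float, dead_time_s: float) -> float:
    """Events actually recorded per second when each accepted event
    blocks the system for dead_time_s seconds."""
    return true_rate_hz / (1.0 + true_rate_hz * dead_time_s)

def loss_fraction(true_rate_hz: float, dead_time_s: float) -> float:
    """Fraction of incoming events lost while the system is busy."""
    return 1.0 - recorded_rate(true_rate_hz, dead_time_s) / true_rate_hz

# Hypothetical example: 1 MHz input rate, 500 ns dead time per event
rate = recorded_rate(1e6, 500e-9)
loss = loss_fraction(1e6, 500e-9)
print(f"recorded: {rate:.0f} Hz, lost: {loss:.1%}")
# → recorded: 666667 Hz, lost: 33.3%
```

The example makes the design pressure concrete: at megahertz rates even sub-microsecond dead times discard a large fraction of the data, which is why the critical path in high-rate systems must be minimized.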


Each of the levels presents a different set of requirements, which dictates the technologies used. Heavy-duty systems are often based on Programmable Logic Controllers (PLCs), modular computers typically used to control industrial processes; they can implement real-time algorithms driven by their input ports. More advanced features are offered by systems such as LabVIEW from National Instruments: complex solutions developed for measurement and monitoring systems, instrument control and validation. They are successfully used in many laboratory applications as small-scale, off-the-shelf measurement stations. Although they offer versatile functionality, their scalability is limited and the cost per channel is significant. The requirements imposed by applications from the third category force the development of dedicated, custom solutions fine-tuned for peak performance.


A particular example, and undoubtedly the most advanced in terms of technology and demands, are the systems used in particle physics experiments. Although each experiment addresses a different physics question, their overall structure is common. Experiments are usually located at particle accelerators, which boost projectiles to a specified energy and then collide them with either a stationary target or another beam. The reaction products are measured by a dedicated detector system. The response of the detectors is registered by specialized analog and digital readout electronics and transmitted to storage devices for further analysis. A properly designed data acquisition system, working together with the trigger system (the system that decides whether to store or discard a given reaction event), is a key element of efficient data collection.


Scheme 1: Building blocks of a single-channel readout chain. A physical event excites the detector to generate an analog signal that is processed by shapers, digitizers, data collectors and transmitters. The output is saved by Event Builders for offline analysis.
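The chain in Scheme 1 can be sketched as a toy software pipeline. All function names, parameters and signal values below are hypothetical illustrations of the stages, not an actual readout implementation:

```python
# Toy sketch of a single-channel readout chain: shaper -> digitizer ->
# trigger -> event builder. All names and values are hypothetical.

def shaper(samples, decay=0.5):
    """Crude pulse shaping: exponential smoothing of the raw signal."""
    shaped, acc = [], 0.0
    for s in samples:
        acc = decay * acc + (1.0 - decay) * s
        shaped.append(acc)
    return shaped

def digitizer(samples, bits=8, full_scale=1.0):
    """Quantize the shaped analog signal into ADC counts."""
    levels = (1 << bits) - 1
    return [round(max(0.0, min(s, full_scale)) / full_scale * levels)
            for s in samples]

def trigger(adc_counts, threshold=50):
    """Accept the event only if any sample crosses the threshold."""
    return max(adc_counts) >= threshold

def event_builder(channel_id, adc_counts):
    """Pack accepted data into a record for storage and offline analysis."""
    return {"channel": channel_id,
            "peak": max(adc_counts),
            "samples": adc_counts}

# A fake detector pulse riding on a small baseline
pulse = [0.02, 0.05, 0.60, 0.90, 0.70, 0.30, 0.10, 0.03]
adc = digitizer(shaper(pulse))
if trigger(adc):
    event = event_builder(channel_id=0, adc_counts=adc)
```

In a real system each stage is a separate piece of hardware or firmware running in parallel over many channels; the sketch only mirrors the data flow of Scheme 1 in software.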


Contact person:

Grzegorz Korcyl

grzegorz.korcyl (at) uj.edu.pl