You can sum the intensities of ions and plot them as a function of time (chromatographic retention time) to produce a total ion chromatogram (TIC), which looks much like the output of a spectrophotometer such as a UV detector. In the case of MS, one axis represents ion intensity; the other can be time or the digital sample taken at a particular time (i.e., a spectrum). You can also display each spectrum separately, much like a series of images acquired by modern digital video cameras, which are, in essence, a series of high-speed still photos.
Simple but very useful techniques are possible: for example, reducing the array of data to a selected ion chromatogram, or applying digital filters to reduce noise, for instance by displaying only the most intense peak of each digital sample (a base peak ion chromatogram, or BPI).
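The TIC and BPI described above are simple reductions of the same underlying array of scans. A minimal sketch, assuming each scan is just a vector of intensities (the data shapes and names here are illustrative, not any vendor's actual file format):

```python
import numpy as np

# Hypothetical acquisition: 50 scans (retention times), each with
# intensities recorded across 100 m/z channels.
rng = np.random.default_rng(0)
scans = np.abs(rng.normal(size=(50, 100)))

# Total ion chromatogram: sum every intensity in each scan.
tic = scans.sum(axis=1)

# Base peak ion chromatogram: keep only the most intense peak per scan.
bpi = scans.max(axis=1)

# Selected ion chromatogram: follow a single m/z channel over time.
sic = scans[:, 42]
```

Each result is a one-dimensional trace versus retention time, which is why all three plot like the output of a conventional single-channel detector.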
Software design has become a separate specialty over the years, not simply a means to set acquisition parameters. Today, operating and data systems permit intricate control of an instrument by its operator.
Significantly, these specialty software packages continue to evolve.
The demands of data management are fast outstripping the ability to meet them. High-resolution, mass-accurate acquisition can generate a prodigious 1 GB/h. Such enormous quantities of data are generated not only by life science investigators but, increasingly, by those working in industries that depend on high-volume processes, such as characterizing metabolites and their biotransformations. After 180 days of operation, five mass spectrometers, each producing 24 GB of data per day, will present you with the need to store, retrieve, sort, and otherwise make sense of 21.6 terabytes (TB).
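The arithmetic behind that figure is worth making explicit, since the same back-of-the-envelope calculation scales to any lab's instrument count:

```python
# Storage estimate from the figures in the text:
# five instruments, each producing 24 GB/day (1 GB/h), for 180 days.
instruments = 5
gb_per_day = 24        # 1 GB/h * 24 h
days = 180

total_gb = instruments * gb_per_day * days   # 21,600 GB
total_tb = total_gb / 1000                   # 21.6 TB
```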
The first question in any data scenario must address what we intend to do with the data we collect. Unlike e-mail, which imparts its message and thereafter serves little purpose, online data increases in value over time as biological, pharmaceutical, and physicochemical measurements continue to amass within a data file. But this increase in value comes with the cost of ensuring the data's accessibility. In view of the increasing size of data files, and the length of time over which they must be accessed, a solution might include some form of hierarchical storage management. Thus, some smaller percentage of the data would be immediately accessible, or "active," while the remainder, in successive stages, is in-process or earmarked for long-term archiving.
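A hierarchical storage policy of the kind described can be reduced to a tiering rule. The sketch below assumes tiers are chosen purely by file age; the thresholds and tier names are illustrative, and a real hierarchical storage management system would also weigh access frequency, cost, and retention requirements:

```python
from datetime import date

def storage_tier(last_accessed: date, today: date) -> str:
    """Assign a data file to a storage tier by how long ago it was used.

    Hypothetical age thresholds for illustration only.
    """
    age_days = (today - last_accessed).days
    if age_days <= 30:
        return "active"      # immediately accessible, fast storage
    if age_days <= 365:
        return "in-process"  # nearline, slower but still on hand
    return "archive"         # earmarked for long-term archiving

today = date(2024, 6, 1)
```

The point of the rule is economic: only the small "active" fraction pays for fast storage, while the bulk of the accumulating data moves through successively cheaper stages.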
See MS - The Practical Art, LCGC