This tutorial does not work with any recent versions of Teem. Sorry. Here are all the steps it took to go from a volume in non-NRRD format to an opacity function:
unu make -h -i ./engine.den -t uchar -s 256 256 110 -sp 1 1 1 -c engine \
  -bs 62 -e raw -o engine.den.nhdr
unu crop -i engine.den.nhdr -min 59 20 0 -max 207 227 M -o engine-crop.nrrd

From the preparing the data section, we saw how to generate a NRRD header for an existing data file, as well as how to crop it down to size, using unu.

gkms hvol -i engine-crop.nrrd -s f:1.0 p:0.005 p:0.6 -o engine-hvol.nrrd

This made the histogram volume. This is the slowest step.
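The histogram volume that gkms hvol builds is a 3-D joint histogram of data value, gradient magnitude, and a second derivative measure, accumulated over every voxel. The numpy sketch below shows the basic idea only; it is not Teem's implementation, and the bin counts, the inclusion ranges, and the choice of second derivative measure (here the second directional derivative along the gradient) are illustrative assumptions. Reading the NRRD files into numpy arrays is assumed to happen elsewhere.

    import numpy as np

    def histogram_volume(vol, bins=(256, 256, 256)):
        """Illustrative 3-D histogram of (value, gradient magnitude, 2nd derivative)."""
        # Central-difference first derivatives (assumes unit sample spacing).
        gz, gy, gx = np.gradient(vol.astype(float))
        gmag = np.sqrt(gx**2 + gy**2 + gz**2)
        # Second directional derivative along the gradient: (g^T H g) / |g|^2.
        hxz, hxy, hxx = np.gradient(gx)
        hyz, hyy, hyx = np.gradient(gy)
        hzz, hzy, hzx = np.gradient(gz)
        d2 = (gx*(hxx*gx + hxy*gy + hxz*gz) +
              gy*(hyx*gx + hyy*gy + hyz*gz) +
              gz*(hzx*gx + hzy*gy + hzz*gz)) / (gmag**2 + 1e-12)
        # Limit the derivative axes to an inclusion range (here: percentiles),
        # in the spirit of the "p:" specifications passed to gkms hvol above.
        gmax = np.percentile(gmag, 99.5)
        dmax = np.percentile(np.abs(d2), 99.4)
        samples = np.stack([vol.ravel(), gmag.ravel(), d2.ravel()], axis=1)
        ranges = [(vol.min(), vol.max()), (0.0, gmax), (-dmax, dmax)]
        hvol, _ = np.histogramdd(samples, bins=bins, range=ranges)
        return hvol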
- optional:
gkms scat -i engine-hvol.nrrd -o engine-vg2.png engine-vh2.png

Making the scatterplots wasn't necessary, but hopefully it was educational.
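The two scatterplots are 2-D projections of the histogram volume: data value versus gradient magnitude, and data value versus second derivative, with a log-style intensity mapping so that sparse bins stay visible. A possible sketch, assuming the histogram_volume() layout above (this is not how gkms scat is implemented):

    import numpy as np
    from PIL import Image

    def scatterplots(hvol):
        """Collapse the histogram volume into the two scatterplot images."""
        vg = hvol.sum(axis=2)   # value vs. gradient magnitude
        vh = hvol.sum(axis=1)   # value vs. second derivative
        def to_image(counts):
            img = np.log1p(counts)                      # log scaling
            img = 255.0 * img / max(img.max(), 1e-12)
            # value left-to-right, derivative axis bottom-to-top
            return Image.fromarray(np.flipud(img.T).astype(np.uint8))
        return to_image(vg), to_image(vh)

    # vg_img, vh_img = scatterplots(hvol)
    # vg_img.save("engine-vg2.png"); vh_img.save("engine-vh2.png")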
gkms info -i engine-hvol.nrrd -one -o engine-info1.nrrd
gkms info -i engine-hvol.nrrd -o engine-info2.nrrd

This distilled out the essential information from the histogram volume.
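What gets distilled is, for each data value (in the -one case) or each value/gradient pair, an average gradient magnitude g(v) and average second derivative h(v) taken over the corresponding bins of the histogram volume. A minimal sketch of the 1-D case, using the weighted mean as the distillation measure and assuming the axis layout of histogram_volume() above:

    import numpy as np

    def distill_1d(hvol, g_range, h_range):
        """Average gradient magnitude g(v) and second derivative h(v) per value bin.
        hvol axes: (value, gradient magnitude, second derivative)."""
        nv, ng, nh = hvol.shape
        g_centers = np.linspace(*g_range, ng, endpoint=False) + (g_range[1] - g_range[0]) / (2 * ng)
        h_centers = np.linspace(*h_range, nh, endpoint=False) + (h_range[1] - h_range[0]) / (2 * nh)
        counts = hvol.sum(axis=(1, 2)) + 1e-12          # hits per value bin
        g_of_v = (hvol.sum(axis=2) * g_centers).sum(axis=1) / counts
        h_of_v = (hvol.sum(axis=1) * h_centers).sum(axis=1) / counts
        return g_of_v, h_of_v

Swapping the weighted mean for a per-bin median or mode gives the alternative distillation measures mentioned further below.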
- optional:
gkms pvg -i engine-info2.nrrd -o engine-pvg.png

This made pretty pictures of the boundary characteristics of the dataset, as measured by the histogram volume.
gkms opac -s 1.5 engine-info1.nrrd engine-opac11.nrrd
gkms opac -s 1 -g 15 -m 1 engine-info2.nrrd engine-opac22.nrrd
These are two of the opacity function commands that were used; the sketch following the list of parameters below illustrates the kind of computation they perform.

This research has worked to reduce the number of parameter settings which are necessary to make a direct volume rendering. Of course, there are some parameters which remain. I view these three as the most important (in order of importance):
- The boundary emphasis function. This is required, because it is via this function that an actual opacity assignment is possible. One way to describe this research is that it has broken opacity function generation into two parts, a hard part and an easy part. The hard part is largely automated; the easy part is the primary user control.
- gthresh and sigma. These are parameters which control how the position function (distance map) is calculated from the histogram volume, and setting them to good values is pretty important in getting nice opacity functions out; the sketch after this list shows where they enter the calculation.
- Derivative inclusion. The histogram volume is not useful if it doesn't contain the important information about the boundaries, and this information is tied up in derivative measurements taken throughout the dataset. The fidelity with which these measurements are represented in the histogram volume is determined by what range of derivative values are included in it.
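To make the roles of these parameters concrete, here is a sketch of the computation the opac step performs, roughly following the formulation in the paper: a position function p is computed from the distilled g(v) and h(v) using sigma and gthresh, and the boundary emphasis function b() then maps position to opacity. This is an illustrative reimplementation, not the gkms code, and the tent-shaped boundary emphasis function with its width and max_alpha parameters is only an example of the user-controlled part.

    import numpy as np

    def position(g_of_v, h_of_v, sigma, gthresh):
        """Position function p(v) = -sigma^2 h(v) / max(g(v) - gthresh, 0):
        roughly, signed distance from the middle of the nearest boundary."""
        denom = np.maximum(g_of_v - gthresh, 0.0)
        with np.errstate(divide="ignore", invalid="ignore"):
            p = -sigma * sigma * h_of_v / denom
        return np.where(denom > 0.0, p, np.nan)   # undefined where gradient is too low

    def boundary_emphasis(p, width=1.0, max_alpha=1.0):
        """Example 'tent' b(p): full opacity at p = 0, falling to zero at |p| = width."""
        tent = max_alpha * np.clip(1.0 - np.abs(p) / width, 0.0, 1.0)
        return np.where(np.isnan(p), 0.0, tent)

    # opacity(v) = b(p(v)), e.g.:
    # alpha = boundary_emphasis(position(g_of_v, h_of_v, sigma=1.5, gthresh=0.0))

In the 2-D case (engine-info2.nrrd, engine-opac22.nrrd) the gradient axis of the histogram volume itself plays the role of g(v), so the opacity becomes a function of both value and gradient magnitude.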
Here are some other parameters which matter:
- Choice of second derivative measure. Actually, I view this as being part of a larger problem of determining the best way to measure complex higher order derivatives in volume data, and this is a research topic unto itself.
- Method of histogram volume distillation. I've usually used the mean, but experience has shown that the median or the mode work better on some datasets.
- Post-processing of the generated opacity function. The median filtering of the 2-D opacity function is one example of this; cropping out large regions by hand is another (a sketch of the filtering step follows below).

... More experience/suggestions on how to set the remaining parameters as time permits ...
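As an illustration of that post-processing, median filtering of a 2-D opacity function can be done with any image-processing toolkit; the sketch below uses scipy and assumes the opacity function has already been read into a 2-D float array (the window size of 3 and the index ranges in the comment are arbitrary examples).

    import numpy as np
    from scipy.ndimage import median_filter

    def postprocess_opacity(alpha, size=3):
        """Median-filter a 2-D opacity function to suppress isolated spikes,
        then clamp back to [0, 1]."""
        return np.clip(median_filter(alpha, size=size), 0.0, 1.0)

    # Cropping out a large region by hand, then filtering (hypothetical indices):
    # alpha[v_lo:v_hi, g_lo:g_hi] = 0.0
    # alpha = postprocess_opacity(alpha, size=3)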