• The data set is divided in half and corresponding scans are subtracted from each other. The set of scans taken over the total time interval, dT_total (no matter how many there are; it could be just 2 scans, or it could be 200), is divided in half, and the first scan of the first half is subtracted from the first scan of the second half, and so on. For example, if there are 100 scans [as is often the case for Rayleigh data on the XL-I], I subtract scan 1 from scan 51, scan 2 from scan 52, ... scan 50 from scan 100, so that the time interval between the members of each pair is always dT_total/2. If I had 1000 scans over the same interval, dT_total, I would have 10 times as many scans, each with the same S/N ratio as each scan in the smaller data set; but now, when I average these, the overall S/N ratio of the average is SQRT(10) times bigger than if I used only 100 scans. The value of dT_total (i.e., the time difference between the first and last scans of the entire set) is chosen by my rule of thumb that the boundary (assumed to be Gaussian and located in the middle of the cell) should move no more than 1 standard deviation during the time interval dT_total/2. The pairing scheme is sketched in the first example after this list.

  • After subtraction of pairs of concentration profiles to eliminate the baseline, the x-axis of each curve must be transformed to compensate for the different times of sedimentation. This transformation takes us from the stationary reference frame of the centrifuge cell to the moving reference frame of the sedimenting boundary (see the second sketch after this list).

  • The result is the averaged, baseline-corrected time-derivative pattern with reduced random noise. The noise is reduced in proportion to the inverse square root of the number of pairs of concentration profiles used.

  • The error bars are computed as the standard error of the mean at each point. They are the standard deviation (at each radial position) of all the dc/dt patterns computed as explained above, divided by the square root of the number of dc/dt patterns. Standard statistics. The averaging and error-bar computation are sketched in the last example after this list.
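
The scan-pairing scheme in the first bullet can be sketched roughly as follows. This is a minimal illustration, not the actual program: the array names `scans` and `times`, and the assumption that all scans are already interpolated onto a common radial grid and sorted by time, are mine.

```python
import numpy as np

def pair_and_subtract(scans, times):
    """Subtract scan i of the first half from scan i of the second half.

    scans : (n_scans, n_radii) array of concentration profiles, time ordered
    times : (n_scans,) array of scan times, in seconds

    Returns the difference curves and the time difference for each pair;
    by construction each time difference is close to dT_total / 2.
    """
    n = len(scans) // 2                 # number of pairs
    dc = scans[n:2 * n] - scans[:n]     # e.g. scan 51 minus scan 1, 52 minus 2, ...
    dt = times[n:2 * n] - times[:n]     # elapsed time for each pair
    return dc, dt
```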
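The reference-frame transformation in the second bullet is, in the usual dc/dt treatment, a mapping of each radial position to an apparent sedimentation coefficient, s* = ln(r/r_m) / (omega^2 t), evaluated at the mean time of each scan pair. The text does not spell out the formula, so the sketch below is an assumption based on that standard form; `r_meniscus` and `rpm` are hypothetical inputs.

```python
import numpy as np

def to_sed_coefficient(radius, t_mean, r_meniscus, rpm):
    """Map radius (cm) to apparent sedimentation coefficient s* (seconds).

    radius     : array of radial positions, cm
    t_mean     : mean time of the scan pair, seconds
    r_meniscus : meniscus radius, cm
    rpm        : rotor speed, revolutions per minute
    """
    omega = 2.0 * np.pi * rpm / 60.0                  # angular velocity, rad/s
    return np.log(radius / r_meniscus) / (omega**2 * t_mean)
```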
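The averaging and error-bar steps of the last two bullets amount to a pointwise mean and standard error over the set of dc/dt curves. The sketch assumes the curves have already been interpolated onto a common s* grid (that interpolation is not shown); the array name `dcdt_curves` is mine.

```python
import numpy as np

def average_dcdt(dcdt_curves):
    """Return the mean dc/dt pattern and its standard error at each point.

    dcdt_curves : (n_pairs, n_points) array, one dc/dt curve per row,
                  all on the same s* grid
    """
    n_pairs = dcdt_curves.shape[0]
    mean = dcdt_curves.mean(axis=0)
    # Standard error of the mean: sample standard deviation across the
    # n_pairs curves at each point, divided by sqrt(n_pairs).
    sem = dcdt_curves.std(axis=0, ddof=1) / np.sqrt(n_pairs)
    return mean, sem
```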