USAXS instrument at the Advanced Photon Source, Argonne National Laboratory
X-ray Science Division, beamline 15ID (ChemMatCARS)

A U.S. Department of Energy, Office of Science,
Office of Basic Energy Sciences national synchrotron x-ray research facility



Ultra-Small-Angle X-ray Scattering Facility

Instrument scientist: Jan Ilavsky, 630-252-0866

USAXS data reduction tutorial:
Size distribution of alumina polishing powder


Jan Ilavsky
and Pete Jemian, APS

Prepared for the National School on Neutron and X-ray Scattering.


Task Descriptions
Data collection

Samples and the instrument will be provided by the beamline scientists at the 32ID beamline, who will also describe the instrument function. Data collection will be performed with each group, assuming availability of X-rays and a functioning instrument. In any case, a set of measured data will be provided to everyone.

Each student will prepare a powder sample for the experiment.

Data reduction

Computers with data evaluation software will be provided to each student. A short description of the data evaluation procedure will be available at each station, and a few copies of the data evaluation manual will be available in the computer room.

SAS modeling

A model of scattering from polystyrene spheres – as well as measured data – will be available to the students on the computer. Students will be able to observe how the scattering pattern changes when the size of the scatterers is changed or when a distribution of sizes is present. The model results can be compared with the measured data.

Data collection will proceed at the beam line. This document describes the data reduction and SAS modeling tasks.

To start, double-click the USAXS.pxt link on the desktop. This will start Igor Pro with the proper macros loaded and configured.

Quick review of data reduction

  1. Get data from the SPEC file into an Igor Pro experiment
  2. Create USAXS data from the raw SPEC data
  3. Create rocking curve data (R data: Q, intensity, error)
  4. Subtract the blank run from the sample run and calibrate if desired (smeared data for the standard USAXS geometry: Q, SMR intensity, error).
  5. Desmear the data to get pinhole-collimated data (Q, DSM intensity, error).
  6. Plot and fit the data to learn something about the sample.
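Conceptually, steps 3 and 4 can be sketched in a few lines. The function names, the theta = angle/2 convention, and all numbers below are illustrative assumptions, not the actual Igor macro internals:

```python
import numpy as np

def rocking_curve_q(angle_deg, center_deg, wavelength_A):
    """Step 3 sketch: convert analyzer angle to Q (1/A) about the beam center,
    assuming the scattering angle 2*theta equals the angle from center."""
    theta = np.radians(angle_deg - center_deg) / 2.0
    return 4.0 * np.pi * np.sin(theta) / wavelength_A

def subtract_blank(i_sample, i_blank, transmission):
    """Step 4 sketch: remove the instrumental (blank) curve, scaling the
    sample by the measured transmission."""
    return i_sample / transmission - i_blank

angle = np.array([-0.01, 0.0, 0.01])          # degrees about the beam center
q = rocking_curve_q(angle, 0.0, 1.542)        # toy wavelength in Angstroms
smr = subtract_blank(np.array([2.0, 2.0, 2.0]),
                     np.array([0.5, 0.5, 0.5]), 0.5)
```

The real macros additionally propagate errors and handle amplifier ranges, but the Q conversion and blank subtraction have this basic shape.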

Introductory notes before you begin:

  1. Two sets of Igor Pro–based macros are provided free of charge to users of the USAXS instrument:

    USAXS data reduction macros. These reduce raw instrument data into Q, I, dI (columns of Q, intensity, and the estimated standard deviation of intensity, which is the standard small-angle scattering form).

    These macros are found in IgorPro under the USAXS menu.

    Important note: Do not run any “Indra” (= data reduction) macros concurrently. That is, do not open one macro (say, a graphing macro), run another macro while the first is still open, and then return to the first macro (graph) expecting it to continue working. At this time it will not.

    Small-angle scattering data analysis macros. These plot data from the USAXS instrument, as well as data from various other instruments, and provide rich data analysis tools for small-angle scattering, including the desmearing needed for USAXS.

    These macros are found in IgorPro under the SAS menu.

  2. What to do when you get an error message?

    Check that you have done the steps in the proper order. If you have, call for help if support is around! If no one is around, take the following steps: make notes on a piece of paper (or in a computer file, with a copy of the screen) about the message and exactly what you were trying to do, including as many steps back as you can recall. Save a copy of the Igor file (Save As) and keep it so the problem can be diagnosed later. Then restart the data evaluation from somewhere earlier in the procedure; for USAXS data, for example, start by re-evaluating the data from the R waves.

    Experience shows that most errors are caused by accidentally skipping necessary steps, so that required output waves/strings/variables were never created.
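The Q, I, dI three-column form mentioned in note 1 is plain text. A minimal sketch of writing and reading it (toy numbers, using an in-memory buffer in place of a file):

```python
import io
import numpy as np

# The standard small-angle scattering form: columns of Q, intensity, and
# estimated standard deviation of intensity.
q  = np.array([1e-4, 2e-4, 4e-4])     # 1/A
i  = np.array([1e6, 5e5, 1e5])
di = np.array([1e4, 5e3, 2e3])

buf = io.StringIO()                   # stands in for a data file on disk
np.savetxt(buf, np.column_stack([q, i, di]), header="Q I dI")
buf.seek(0)
q2, i2, di2 = np.loadtxt(buf, unpack=True)
```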

Data reduction

1 Importing data from SPEC to Igor and creating USAXS data

Select “Import RAW data” from the “Spec” menu. Select the data file:



The following dialog is presented:



This dialog allows you to select which data are imported from the spec file into the Igor experiment. As a rule, the data must be a consecutive range of spec scans – but this range can be as short as one scan and does not have to contain only USAXS scans.


There is no need to change any other parameters.


Choices for methods of loading data:

•  Load all data in the spec data file.

•  Load a range of spec scans – also select the starting and ending scan numbers.

•  Load one selected scan.


The first time the data are loaded you should see the following dialog:


The waves should be preselected correctly. Just check the names against the table in the figure or against the ones provided to you at the time of the experiment (these names may change for your current experimental data). Push “Continue” and the following dialog appears:


Click Continue here; the default answers are the right ones. Click OK in the dialog saying something about non-USAXS scans not being converted.


The following is the structure of the data created by this procedure. Note the logbook, which can be opened at any time by selecting “Open Logbook” from the “USAXS” → “Logging Notebook” menu. The notebook can be closed at any time – it cannot be killed. The macros make notes there about what was done. Further, note the History area at the lower end of the screen, where comments during import are written.



At this moment we have raw data imported into the Igor file.


The following data were created within the USAXS data folders:


Note the wave note in the lower window. This note is attached to each wave created by these macros and is routinely updated. It contains most of the history of the data – what has happened to the data to get to this point.



2 R data

The first step in the data evaluation is creating rocking curve data for the sample and blank. During this procedure the angular data (AR_encoder) and USAXS_PD counts (with range info) are used to generate the Q values and intensity in counts. The beam center in angle is found and set to Q = 0, and the PD data are corrected using the range and dark current information to account for amplifier gain and its changes.


Let's walk through this procedure for one run:

Select “USAXS” – “Create R wave”.


Keep the selection. Note that as you proceed through the data evaluation, the preselected folder in this menu changes to the next folder.


This dialog appears only the first time in an Igor experiment. It would allow a change of the PD size if a different PD were used. Up to now we have never used a different PD, so this value is set correctly. Unless instructed otherwise, do not change it.


This is the first graph of this procedure. Here the user needs to find the center of the beam. Three methods can be used: 1a, 1b, and 1c. The most likely appropriate is 1b, fitting a Gaussian function between the cursors. Using the 1a Lorentzian function may also be useful sometimes. The manual 1c method is for special cases only.

Move the cursors around as needed so the function describes the peak reasonably well. Note that both the peak center and peak height are used in the following procedures, so the fit needs to be reasonable.


Move the cursors around and push the 1b button. You may have to move the legend in the graph to see the top part of the curve. Note that the text box in the graph now shows the full width at half maximum, the maximum height, and the beam center in degrees. There is no need to make notes; the macros will shortly record them in the logbook and wave notes. When satisfied, push button 2. Continue.
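What button 1b computes is the beam center, peak height, and FWHM of a Gaussian fitted to the rocking-curve peak. A sketch on a noiseless toy peak: since the logarithm of a Gaussian is a parabola, a polynomial fit recovers the same parameters here; the macro itself runs a nonlinear Gaussian fit between the cursors.

```python
import numpy as np

# Synthetic, noiseless rocking-curve peak (all numbers are invented).
angle = np.linspace(-0.02, 0.02, 201)                 # degrees
true_center, true_height, sigma = 0.0031, 1.0e6, 0.002
counts = true_height * np.exp(-((angle - true_center) / sigma) ** 2 / 2.0)

# log(Gaussian) is a parabola a*x^2 + b*x + c, so a quadratic fit suffices
# for noiseless data.
a, b, c = np.polyfit(angle, np.log(counts), 2)
center = -b / (2.0 * a)                               # beam center (degrees)
height = np.exp(np.polyval([a, b, c], center))        # peak maximum
fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * np.sqrt(-1.0 / (2.0 * a))
```

With real noisy counts, the cursor-limited nonlinear fit the macro performs is the robust choice; this block only illustrates which three numbers the text box reports.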


This is the second graph of this procedure. It is here mostly to check the background for each range. Note that the ranges of the data are marked with colors, which are also used in the panel to aid the user. This graph can be zoomed in and out as needed using Igor commands to check how the ranges join together.

Let me demonstrate a problem with the background:


Here I have changed background 3 from 33 to 330 and zoomed into the area where ranges 3 and 4 merge. Note the discontinuity. Changing background 3 back to 33 fixes this discontinuity:

It is IMPERATIVE to work carefully here, because a wrong background setting will propagate through the subsequent data evaluation procedures and get amplified. It can significantly reduce the usefulness of the data.
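The range-join check can be illustrated numerically. In this toy model (gains, counts, and backgrounds are all invented, not the instrument's parameters), the same photodiode signal recorded on two amplifier ranges must agree after each range's background and gain are removed; a wrong background produces exactly the kind of jump shown above.

```python
def corrected(raw_counts, background, gain):
    """Remove the per-range background, then undo the amplifier gain."""
    return (raw_counts - background) / gain

signal = 1000.0                     # true signal at the range boundary
gain3, gain4 = 1.0e4, 1.0e6         # hypothetical amplifier gains
raw3 = signal * gain3 + 33.0        # recorded on range 3 (background 33)
raw4 = signal * gain4 + 5.0         # recorded on range 4 (background 5)

# Correct background: the two ranges join with no step.
good_jump = corrected(raw3, 33.0, gain3) - corrected(raw4, 5.0, gain4)
# Wrong background (330 instead of 33): a visible discontinuity appears.
bad_jump = corrected(raw3, 330.0, gain3) - corrected(raw4, 5.0, gain4)
```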

When done, push button 1. Save the R data. This will create the R_Int, R_Qvec, and R_Error waves in the appropriate USAXS folder. Note that this also dismisses the panel, since changing any values in it would no longer influence the already saved R data. If you need to change the R waves, you have to rerun the Create R wave procedure on the same data.

Finish up with all scans before continuing, including the instrumental curves (blanks).


3 Subtracting Blank from Sample

The next step is to subtract the instrumental curve (usually called the blank) from the sample measurement. Select “USAXS” – “Subtract Blank from Sample”:


Select the sample. Continue:


Select the blank closest to the sample and keep Calibrate as USAXS. I suggest always calibrating, since the only input the user needs is the sample thickness (the rest are instrumental parameters), and even when the thickness is not known, this procedure removes all the other variations of the instrument and makes it easier to compare samples.



Calibration parameters. Do not change anything here – there is no need to change the upper number, and the thickness is unknown for these powder samples.



This graph is for aligning the peaks so that their centers and heights match. The pre-alignment is done using the fitting parameters, so it is usually done well. If you do not like it, you can change the alignment using the transmission (which should always be smaller than 1!) or the sample shift in Q (which should be a really small number). The best transmission here should be around 0.52 or so.

When satisfied, push continue:


In this graph you can check the log-log plot of both sample and blank and still modify the Q shift and transmission. Modifying the Q shift is unlikely here – the readout on a log scale is poor – but it may sometimes (rarely) be useful to modify the transmission. When happy, push button 1. Subtract and save.


Note the new black line in the graph, plotted on the right axis. This is the sample with the blank subtracted; for regular USAXS these are smeared data. If the data were calibrated and the thickness of the sample was properly entered, these data are in cm⁻¹.
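Numerically, the step amounts to a transmission-scaled subtraction followed by normalization to the sample thickness. A minimal sketch, where the calibration constant, transmission, and thickness are invented illustration values:

```python
import numpy as np

def smeared_intensity(i_sample, i_blank, transmission, thickness_cm, cal=1.0):
    """Blank subtraction and absolute calibration sketch: net counts scaled
    to cm^-1. `cal` stands in for the instrumental calibration constant."""
    net = i_sample / transmission - i_blank
    return cal * net / thickness_cm

i_smr = smeared_intensity(np.array([4.0, 2.0]),     # sample R data (toy)
                          np.array([1.0, 1.0]),     # blank R data (toy)
                          transmission=0.5, thickness_cm=0.1)
```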

Finish all samples…

4 Desmearing

The data from a Bonse-Hart camera of the standard type are smeared by the slit width – in other words, in one direction (vertically) the Q resolution is very tight (defined by the rocking curve width of the 6-bounce channel-cut crystals in our case), while horizontally the Q resolution is defined geometrically by the width of the photodiode, the wavelength, and the sample-to-detector distance.

To convert data from slit-smeared geometry into standard pinhole-type data, the Indra2 macros provide an implementation of the Lake method.
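The effect being inverted is slit smearing: for slit length l0, the smeared curve is I_s(q) = (1/l0) ∫₀^l0 I(√(q² + l²)) dl. A numerical illustration with a toy pinhole curve exp(-q²), which has a closed form to check against (this is the forward smearing only, not the iterative Lake inversion):

```python
import numpy as np
from math import erf, sqrt, pi

def slit_smear(pinhole_curve, q, slit_length, n=4001):
    """Trapezoid approximation of (1/l0) * integral_0^l0 I(sqrt(q^2+l^2)) dl."""
    l = np.linspace(0.0, slit_length, n)
    vals = pinhole_curve(np.sqrt(q * q + l * l))
    return np.sum((vals[1:] + vals[:-1]) * np.diff(l) / 2.0) / slit_length

toy = lambda q: np.exp(-q * q)       # toy pinhole-collimated curve
l0 = 1.0                             # toy slit length
smeared = slit_smear(toy, 0.5, l0)
# Closed form for this toy curve: exp(-q^2) * (sqrt(pi)/2) * erf(l0) / l0
exact = np.exp(-0.25) * (sqrt(pi) / 2.0) * erf(l0) / l0
```

The Lake method iteratively adjusts an estimate of the pinhole curve until smearing it (as above) reproduces the measured data.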


Select “USAXS” – “Desmear”:


Select Sample 1, continue:


Note the slit length. It has been calculated from the experiment geometry used in the spec command and from the known X-ray energy. The macros calculate the value and paste it here from the wave note. If this procedure were used on data from other sources, it would be necessary to change the slit length here.



This is the graph in which the user needs to limit the data to the useful range. Note that a few data points at the beginning and end of the Q range are artifacts of the measurement and have to be removed before desmearing. The lower area of the graph indicates which points the cursors are on. Limit the Q range to the range with real data (the first 3 and last 3 points in the above graph are not physically reasonable, but it may be different in your case), then push the Trim button.


OK, this looks like a useful range of data; push the Continue button:


Here the user needs to select the best function to extend the data beyond the measured (and trimmed) range. In order to desmear the last few data points correctly, the data need to “virtually” exist at least one slit length beyond these points. So we need to create them there: the green data points are used for fitting a function whose parameters are then used to calculate and extrapolate the blue data points at high Q. Select the proper background function and Q range above the graph. This is the time for experimentation. The presented case is the simplest, since the data go flat at the end and a constant is a good extrapolation method. Select “flat” in the pull-down menu and the range as indicated above.
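The “flat” choice can be sketched in a few lines: fit a constant to the points in the user-selected high-Q range and extend the curve one slit length beyond the last measured point. The data, fitting range, and slit length below are invented for illustration:

```python
import numpy as np

q = np.linspace(0.001, 0.1, 200)                 # 1/A, toy grid
i = 1.0 / (1.0 + (q / 0.01) ** 4) + 50.0         # toy curve flattening at 50

fit_range = q > 0.08                             # the "green" fitting region
flat_level = i[fit_range].mean()                 # constant extrapolation value

slit_length = 0.02                               # toy slit length
q_ext = np.linspace(q[-1], q[-1] + slit_length, 20)
i_ext = np.full_like(q_ext, flat_level)          # "blue" extrapolated points
```

When the data do not go flat, the other functions in the pull-down menu (e.g., a power law) play the same role: fit in the green range, extrapolate one slit length beyond.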

When happy, push continue:


This is the fitting graph. Pushing the “Do one iteration” button runs the routine through once. Let's run it once:


The red points are the original data, the black crosses are the current estimate of the desmeared data, and the circles are normalized errors. Ideally, the normalized errors should be within +1 and –1. Note that the normalized errors are plotted against the right axis, which changes as the fit improves.


After 10 cycles, the desmearing looks reasonably good – some points in the low-Q area are noisy, which is often difficult to avoid. However, most of the Q range seems to be desmeared fine.

When done, push the Continue button to get the final screen:


The final graph shows the original data (red) and the desmeared data (black). The DSM_Int, DSM_Qvec, and DSM_Error waves were created in the sample folder. It is now possible to export the data using the Export DSM data button, or to evaluate them further using some built-in functions – fitting of size distributions or modeling.

To desmear another measurement, use the Start again button; to end, kill the window.


The settings from the last desmeared sample are remembered and used as defaults for the next sample. Most users come with similar samples, so this saves time.

Finish all samples.

Data analysis: particle size distribution

In the present case we will use the Irena1 set of macros, which has a built-in regularization method for fitting size distributions of spheroids. The macros also have the capability to run the external “Sizes” program by Pete Jemian, which adds the maximum entropy method to the built-in regularization.


Using Size distribution

This program uses one complex interface – a challengingly complex graph and panel. To start, load the “Irena 1 modeling macros” from the SAS menu. Then select “Size distribution” from the “SAS” menu.


The new panel on the left is the control panel. Check the “Use Indra 2 data structure” checkbox (top right corner). Select the waves with the data – it is possible to run the model on desmeared or smeared data. I strongly suggest using the desmeared data; the error estimates on those seem to give better results. Push the “Graph” button to generate the following graph.


Fill in the various fields as shown in this figure. NOTE THE POSITION OF THE CURSORS. Then push the “Run internal regularization” button and the following result should appear:


Sizes input panel description
Measured data
Data selection, slit-smeared data, and slit length – these are preselected by default in the proper form for 33ID USAXS data and should not have to be changed. Their presence here allows the user to apply this macro to data obtained on a different instrument, which may have a different name structure and may not contain the slit length in the wave note.
Distribution parameters
Minimum diameter (Å) Lower limit of the fitted distribution.
Maximum diameter (Å) Upper limit of the fitted distribution.
Bins in diameter How many bins to use for the size distribution.
Logarithmic binning

If checked, the bins are spaced logarithmically (evenly in a geometric series)
If unchecked, the bins are spaced linearly (evenly in an arithmetic series)

With log binning, the bins at small sizes are narrower and at large sizes wider, so the bins have equal width when plotted on a logarithmic axis. This is a very useful setting for the wide Q ranges measured with the USAXS instrument.
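The two binning schemes can be sketched directly (the diameter limits and bin count below are toy values):

```python
import numpy as np

d_min, d_max, n_bins = 25.0, 10000.0, 5          # Å, illustration only

# Log binning: edges spaced evenly in a geometric series, so every bin has
# the same edge ratio and looks equal-width on a logarithmic diameter axis.
log_edges = np.geomspace(d_min, d_max, n_bins + 1)

# Linear binning: edges spaced evenly in an arithmetic series, so every bin
# has the same absolute width.
lin_edges = np.linspace(d_min, d_max, n_bins + 1)

log_ratios = log_edges[1:] / log_edges[:-1]      # constant for geometric bins
lin_widths = np.diff(lin_edges)                  # constant for linear bins
```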

Fitting parameters

Background This is the flat background to be subtracted from the data. The red line in the graph shows the current value.

(delta rho squared)

If this is properly entered, the volume fraction is reported in absolute units.

Multiply errors by

Loosens the requirements on the fit by multiplying the reported estimated standard deviation of intensity by this factor.

Particle model
Particle shape model


Aspect ratio

Anything; 1 is for a sphere. If 1 is selected, the sphere model is used; otherwise the spheroid model is used.

Maximum entropy
number of iterations

The maximum number of iterations to allow MaxEnt before giving up with a report that no solution has been found. This prevents MaxEnt from iterating infinitely. Only used by MaxEnt analysis.

sky background

This is the value toward which the distribution will tend where you have no data. Generally, the distribution will take this value at the minimum and maximum diameter bins. Try to set it about two decades below the peak value in the distribution. Don't fine-tune it, though; only make adjustments by log decades (e.g., 1×10⁻⁸, 1×10⁻⁷, 1×10⁻⁶). Only used by MaxEnt analysis.

MaxEnt multiplier

Always set this to 1. Only used by MaxEnt analysis.

action buttons at bottom of panel

If you do not push this button, the fitted size distribution results will be discarded when you go to the next sample or close this window! When you push it, the results are copied (from a scratch area) into the sample folder.

Run regularization

Performs regularization analysis (maximum smoothness) of SAS data.

Run MaxEnt

Performs maximum entropy analysis (always positive) of SAS data.

This is a somewhat acceptable fit for the data in the graph – and sufficient for the purpose of describing this graph. To get this fit, set the values in the panel to those in the figure and push the Run internal regularization button. Note that the fitting will take a longer time, since the number of points and the number of bins are large – and the code calculates matrices whose size is the number of points × the number of bins.
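Those matrices come from the fact that the size-distribution fit is linear: I(q) ≈ Σⱼ G(q, Dⱼ) fⱼ, where the kernel G is built from the sphere form factor and the bin volumes. A minimal forward model with toy numbers (the r³ volume weighting and the Gaussian "distribution" are illustrative assumptions, not the Irena internals):

```python
import numpy as np

def sphere_form_factor(q, r):
    """Normalized sphere form factor P(q) = [3(sin(qr) - qr cos(qr))/(qr)^3]^2."""
    qr = q * r
    return (3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr ** 3) ** 2

q = np.linspace(1e-4, 1e-2, 50)                  # 1/A, toy range
radii = np.linspace(50.0, 500.0, 20)             # bin radii in Å, toy range
# Kernel matrix: (number of points) x (number of bins), as noted above.
G = np.array([[sphere_form_factor(qi, r) * r ** 3 for r in radii] for qi in q])
dist = np.exp(-(((radii - 250.0) / 50.0) ** 2))  # toy volume distribution
intensity = G @ dist                             # modeled scattering curve
```

Regularization (and MaxEnt) invert this relation: they search for a non-negative, smooth `dist` whose forward model matches the measured intensity within the errors.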

Now let's get to the explanations:

The green points are the original data points.

The red points (top part of the graph) are the points selected for fitting (without background).

The blue line (very difficult to see) is the fit obtained by the fitting routine.

The bar graph is the particle volume distribution (use the top and right axes).

In the lower graph:

The red dots are normalized residuals. Ideally these should be random within +1 and –1; this structure suggests misfits in some areas.
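The normalized residuals plotted as red dots are simply (data − model) / error; for a good fit with honest error estimates they scatter randomly between −1 and +1. A minimal sketch with toy values:

```python
import numpy as np

def normalized_residuals(i_data, i_model, i_error):
    """Residuals in units of the estimated standard deviation."""
    return (i_data - i_model) / i_error

res = normalized_residuals(np.array([10.0, 12.0, 11.5]),   # toy data
                           np.array([11.0, 11.0, 11.0]),   # toy model
                           np.array([1.0, 1.0, 1.0]))      # toy errors
within_band = np.abs(res) <= 1.0
```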


Try varying the parameters of the fit (size range, background, fitted data range, etc.) to see what these parameters do. Ideally the resulting size distribution SHOULD NOT be negative. Note that you need a relatively narrow range of data to obtain the solution well.



This page last modified: 2006-09-28 10:54 AM