Data collection will proceed at the beam line. This document describes the data reduction and SAS modeling tasks.
To start, double-click the USAXS.pxt link on the desktop. This starts Igor Pro with the proper macros loaded and configured.
Quick review of data reduction
Introductory notes before you begin:
1 Importing data from SPEC to Igor and creating USAXS data
Select “Import RAW data” from the “Spec” menu and select the data file:
The following dialog is presented:
This dialog allows you to select which data are imported from the spec file into the Igor experiment. As a rule, the data must be a consecutive range of spec scans – but this range can be as short as one scan, and it does not have to contain only USAXS scans.
No need to change any other parameters.
Choices for methods of loading data:
Load all data in the spec data file.
Load a range of spec scans – also select the starting and ending scan numbers.
Load one selected scan.
The first time the data are loaded you should see the following dialog:
The waves should be preselected correctly. Just check the names against the table in the figure, or against the names provided to you at the time of the experiment (these names may change for your current experimental data). Push “Continue” and the following dialog appears:
Click Continue here; the default answers are the right ones. Then click OK in the dialog saying that non-USAXS scans were not converted.
The following is the structure of the data created by this procedure. Note the Logbook, which can be opened at any time by selecting “Open Logbook” from the “USAXS”->“Logging Notebook” menu. The notebook can be closed at any time – it cannot be killed. The macros make notes there about what was done. Also note the History area at the lower end of the screen, where comments made during the import are written.
At this moment we have the raw data imported into the Igor experiment.
The following data were created within the USAXS data folders:
Note the wave note in the lower window. This note is attached to each wave created by these macros and is routinely updated. It contains most of the history of the data – what has happened to the data along the way.
2 R data
The first step in the data evaluation is creating rocking curve (R) data for the sample and the blank. During this procedure the angular data (AR_encoder) and the USAXS_PD counts (with range information) are used to generate the Q values and the intensity in counts. The beam center in angle is found and set to Q=0, and the PD data are corrected using the range and dark current information to account for amplifier gain and its changes.
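The conversion performed in this step can be sketched numerically. This is an illustrative Python sketch, not the Igor macro code; the wavelength, beam center, and all numerical values are assumptions for the example:

```python
import numpy as np

# Assumed example values -- the real ones come from the spec file and wave notes.
wavelength = 1.54  # X-ray wavelength in Angstroms (assumption)
ar_center = 8.5    # fitted beam center in degrees (from the Gaussian fit step)

ar_encoder = np.array([8.4, 8.5, 8.6])          # measured angles in degrees
pd_counts = np.array([1000.0, 50000.0, 900.0])  # raw photodiode counts
gain = np.array([1e4, 1e4, 1e4])                # amplifier gain for each point's range
dark_current = np.array([30.0, 30.0, 30.0])     # dark counts for each point's range

# Q = (4*pi/lambda) * sin(theta), where theta is half the deviation from the beam center
theta = np.radians(ar_encoder - ar_center) / 2.0
q = 4.0 * np.pi / wavelength * np.sin(theta)

# correct the counts for dark current and amplifier gain
intensity = (pd_counts - dark_current) / gain

print(q)          # the beam center maps to Q = 0
print(intensity)
```

Note how the beam-center point lands exactly at Q=0; everything else in the evaluation hinges on that fit being good.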
Let's walk through this procedure for one run:
Select “USAXS” – “Create R wave”.
Keep the selection. Note that as you proceed through the data evaluation, the preselected folder in this menu advances to the next folder.
This dialog appears only the first time in an Igor experiment. It allows changing the PD size if a different photodiode was used. To date we have never used a different PD, so this value is set correctly. Unless instructed otherwise, do not change it.
This is the first graph of this procedure. Here the user needs to find the center of the beam. Three methods can be used: 1a, 1b, and 1c. The most appropriate is usually 1b – fitting a Gaussian function between the cursors. The 1a Lorentzian function may also be useful sometimes; the manual 1c method is for special cases only.
Move the cursors as needed so that the function describes the peak reasonably well. Note that both the peak center and the peak height are used in the following procedures, so the fit needs to be reasonable.
Move the cursors around and push the 1b button. You may have to move the legend in the graph to see the top part of the curve. Note that the text box in the graph now shows the full width at half maximum, the maximum height, and the beam center in degrees. There is no need to make notes; the macros will shortly record them in the logbook and wave notes. When satisfied, push button “2. Continue”.
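The 1b step amounts to a Gaussian peak fit restricted to the region between the cursors. A minimal sketch of the same idea, using scipy rather than Igor's curve fitting and synthetic peak data (all values are assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, height, center, width):
    """Gaussian peak model used for the 1b beam-center fit."""
    return height * np.exp(-((x - center) ** 2) / (2.0 * width ** 2))

# synthetic rocking-curve peak; real data come from the AR stage scan (assumption)
angle = np.linspace(8.40, 8.60, 101)            # degrees
counts = gaussian(angle, 5.0e4, 8.503, 0.002)   # noiseless peak for illustration

# fit only between the "cursors", i.e. a restricted range around the peak
mask = (angle > 8.48) & (angle < 8.52)
popt, _ = curve_fit(gaussian, angle[mask], counts[mask], p0=[4.0e4, 8.5, 0.003])
height, center, width = popt
fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(width)

print(center, height, fwhm)  # beam center [deg], peak height, FWHM [deg]
```

The fitted center and height are exactly the two numbers the following procedures rely on, which is why a poor cursor placement here hurts everything downstream.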
This is the second graph of this procedure. It is here mostly to check the background for each range. Note that the data ranges are marked with colors, which are also used in the panel to aid the user. The graph can be zoomed in and out as needed using Igor commands to check how the ranges join together.
Let me demonstrate a problem with the background:
Here I have changed background 3 from 33 to 330 and zoomed into the area where ranges 3 and 4 merge. Note the discontinuity. Let's change background 3 back to 33 and zoom in some more. Changing it back fixes the discontinuity:
It is IMPERATIVE to work carefully here, because a wrong background setting will propagate through the further data evaluation procedures and get amplified. It can reduce the usefulness of the data significantly.
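The effect of a wrong per-range background can be shown numerically. This toy sketch mimics the 33 vs. 330 example above; the signal values and the range-4 background are assumed numbers:

```python
import numpy as np

# raw counts near the boundary between amplifier ranges 3 and 4; both ranges
# see the same true signal, so after correction the curves must join smoothly
true_signal = np.array([500.0, 400.0, 300.0])
background_3 = 33.0   # correct background for range 3 (value from the example)
background_4 = 40.0   # correct background for range 4 (assumed value)

raw_3 = true_signal + background_3
raw_4 = true_signal + background_4

good_3 = raw_3 - background_3          # correct setting: identical to raw_4 - background_4
bad_3 = raw_3 - 330.0                  # wrong setting, 330 instead of 33

step = (raw_4 - background_4) - bad_3  # constant 297-count step where the ranges join
print(step)
```

The wrong value produces exactly the kind of step at the range boundary shown in the figure, and that step survives every later processing stage.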
When done, push button “1. Save the R data”. This creates the R_Int, R_Qvec, and R_Error waves in the appropriate USAXS folder. Note that this also removes the panel, since changing any values in it would no longer influence the already saved R data. If you need to change the R wave, you have to rerun the Create R wave procedure on the same data.
Finish up with all scans before continuing, including the instrumental curves (blanks).
3 Subtracting Blank from Sample
The next step is to subtract the instrumental curve (usually called the blank) from the sample measurement. Select “USAXS” – “Subtract Blank from Sample”:
Select sample. Continue:
Select the blank closest to the sample and keep “Calibrate as USAXS”. I suggest always calibrating: the only input the user needs is the sample thickness (the rest are instrumental parameters), and even when the thickness is not known, this procedure removes all the other instrumental variations and makes comparing samples easier.
Calibration parameters. Do not change anything here – there is no need to change the upper number, and the thickness of these powder samples is unknown.
This graph is for aligning the peaks so that their centers and heights match. The pre-alignment is done using the fitting parameters, so it is usually good. If you do not like it, you can change the alignment using the transmission (which should always be smaller than 1!) or the sample shift in Q (which should be a very small number). The best transmission here should be around 0.52 or so.
When satisfied, push continue:
In this graph you can check a log-log plot of both the sample and the blank and still modify the Q shift and the transmission. Modifying the Q shift is unlikely here – the readout on a log scale is poor – but it may sometimes (rarely) be useful to modify the transmission. When happy, push button “1. subtract and save”.
Note the new black line in the graph, plotted on the right axis. This is the sample with the blank subtracted; for regular USAXS these are smeared data. If the data were calibrated and the sample thickness was properly entered, these data are in cm^-1.
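Conceptually, the subtraction performed here looks like the following sketch. The exact scaling in the Indra2 macros may differ; the intensities, thickness, and the precise normalization are assumptions for illustration:

```python
import numpy as np

# sample and blank rocking-curve intensities on a common Q grid (synthetic values)
q = np.array([1e-4, 1e-3, 1e-2])
blank_int = np.array([1.0e6, 1.0e3, 1.0e1])
sample_int = np.array([5.3e5, 2.0e3, 4.0e1])

transmission = 0.52   # sample transmission, must be < 1 (value from the text)
thickness = 0.1       # sample thickness in cm (assumption -- needed for cm^-1 units)

# subtract the transmitted blank from the sample; normalizing by transmission
# and thickness puts the result on an absolute-like scale
subtracted = (sample_int / transmission - blank_int) / thickness

print(subtracted)
```

This is why an unknown thickness only affects the overall scale: the shape of the subtracted curve, and thus sample-to-sample comparisons, survive intact.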
Finish all samples…
The data from a Bonse-Hart camera of the standard type are smeared by the slit width – in other words, in one direction (vertical) the Q resolution is very tight (defined by the rocking-curve width of the 6-bounce channel-cut crystal in our case), while horizontally the Q resolution is defined geometrically by the width of the photodiode, the wavelength, and the sample-to-detector distance.
To convert data from slit-smeared geometry into standard pinhole-type data, the Indra2 macros provide an implementation of the Lake method.
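The core of the Lake method can be sketched in a few lines: smear the current estimate of the pinhole curve, compare with the measured data, and correct the estimate by the residual. This is a toy Python sketch with a synthetic curve and a simplified slit-smearing integral, not the Indra2 implementation:

```python
import numpy as np

def smear(intensity, q, slit_length, n_slit=20):
    """Slit-smear a pinhole curve: average I at sqrt(q^2 + l^2) over the slit length l."""
    l = np.linspace(0.0, slit_length, n_slit)
    smeared = np.empty_like(intensity)
    for i, qi in enumerate(q):
        q_eff = np.sqrt(qi ** 2 + l ** 2)
        smeared[i] = np.mean(np.interp(q_eff, q, intensity))
    return smeared

q = np.linspace(1e-3, 0.1, 200)
true_pinhole = 1.0 / (1.0 + (q / 0.01) ** 4)    # synthetic pinhole curve (assumption)
slit_length = 0.02                              # assumed slit length in 1/A
measured = smear(true_pinhole, q, slit_length)  # the "measured" slit-smeared data

# Lake's iteration: correct the current estimate by the smearing residual
estimate = measured.copy()
for _ in range(10):
    estimate = estimate + (measured - smear(estimate, q, slit_length))
```

After a handful of iterations the estimate approaches the pinhole curve, which matches the "after 10 cycles" behavior described below. The sketch also shows why data must exist at least one slit length beyond the last point: the smearing integral at each q reaches out to sqrt(q^2 + slit_length^2).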
Select “USAXS” – “Desmear”:
Select Sample 1, continue:
Note the slit length. It has been calculated from the experimental geometry used in the spec command and from the known X-ray energy; the macros calculate the value and paste it here from the wave note. If this procedure were used on data from other sources, the slit length would need to be changed here.
In this graph the user needs to limit the data to the useful range. Note that a few data points at the beginning and at the end of the Q range are measurement artifacts and have to be removed before desmearing. The lower area of the graph indicates which points the cursors are on – limit the Q range to the range with real data (the first 3 and last 3 points in the above graph are not physically reasonable, but it may be different in your case), then push the Trim button.
OK, this looks like a useful range of data; push the Continue button:
Here the user needs to select the best function to extend the data beyond the measured (and trimmed) range. To desmear the last few data points correctly, the data need to “virtually” exist at least one slit length beyond these points, so we have to create them there: the green data points are used to fit a function whose parameters are then used to calculate the extrapolated blue data points at high Q. Select the proper background function and Q range above the graph. This is the time for experimentation. The presented case is the simplest, since the data go flat at the end and a constant is a good extrapolation method. Select “flat” in the pull-down menu and the range as indicated above.
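The “flat” choice amounts to fitting a constant to the high-Q tail and carrying it one slit length past the last point. A small sketch with synthetic data (the decay constant, background level, fit region, and slit length are all assumptions):

```python
import numpy as np

# trimmed data: a signal that decays onto a flat background (synthetic)
q = np.linspace(1e-3, 0.1, 100)
intensity = 100.0 * np.exp(-q / 0.005) + 5.0

# "flat" extrapolation: fit a constant to the high-Q region (the green points)
fit_region = q > 0.08
background = np.mean(intensity[fit_region])

# extend the data at least one slit length beyond the last measured point
slit_length = 0.02   # from the desmearing dialog (assumed value)
q_ext = np.linspace(q[-1], q[-1] + slit_length, 20)
intensity_ext = np.full_like(q_ext, background)

print(background)  # close to the true background of 5.0
```

If the tail is not flat, a power-law or other extrapolation function has to be fitted instead, which is exactly what the experimentation in this dialog is for.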
When happy, push continue:
This is the fitting graph. Pushing the “Do one iteration” button runs the routine through once. Let's run it once:
The red points are the original data, the black crosses are the current estimate of the desmeared data, and the circles are normalized errors. Ideally, the normalized errors should lie between +1 and –1. Note that the normalized errors are plotted against the right axis, which rescales as the fit improves.
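The normalized errors are presumably computed as the residual of each point divided by its uncertainty, so values between –1 and +1 mean the fit is within the error bars. A minimal sketch (all numbers are assumptions):

```python
import numpy as np

# hypothetical values for three data points (all numbers are assumptions)
measured = np.array([100.0, 50.0, 10.0])          # measured smeared intensities
error = np.array([10.0, 7.0, 3.0])                # their estimated uncertainties
smeared_estimate = np.array([105.0, 49.0, 13.0])  # current estimate, re-smeared

# normalized error: by how many error bars the current estimate misses each point
normalized = (smeared_estimate - measured) / error
print(normalized)  # ideally every value falls between -1 and +1
```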
After 10 cycles the desmearing looks reasonably good. Some points in the low-Q area are noisy – this region is often difficult due to noise – but most of the Q range seems to be desmeared fine.
When done, push the Continue button to get the final screen:
The final graph shows the original data (red) and the desmeared data (black). The DSM_Int, DSM_Qvec, and DSM_Error waves were created in the sample folder. It is now possible to export the data using the “Export DSM data” button, or to evaluate them further using built-in functions – fitting of size distributions or modeling.
To desmear another measurement, use the “Start again” button; to end, kill the window.
The settings from the last desmeared sample are remembered and used as defaults for the next sample. Most users come with series of similar samples, so this saves time.
Finish all samples.
Data analysis: particle size distribution
In the present case we will use the Irena1 set of macros, which has a built-in regularization method for fitting size distributions of spheroids. The macros are also capable of running the external “Sizes” program by Pete Jemian, which allows not only the built-in regularization but also the maximum entropy method.
Using Size distribution
This program uses a single, challengingly complex interface – one graph plus a control panel. To start, load the “Irena 1 modeling macros” from the SAS menu, then select “Size distribution” from the “SAS” menu.
The new panel on the left is the control panel. Check the “Use Indra 2 data structure” checkbox (top right corner). Select the waves with data – it is possible to run the model on desmeared or on smeared data. I strongly suggest using the desmeared data; the error estimates on those seem to give better results. Push the “Graph” button to generate the following graph.
Fill in the various fields as shown in this figure. NOTE THE POSITION OF THE CURSORS. Then push the “Run internal regularization” button and the following result should appear:
This is a somewhat acceptable fit for the data in the graph – and sufficient for describing this graph. To get this fit, set the values in the panel to those in the figure and push the “Run internal regularization” button. Note that the fitting will take longer, since the number of points and the number of bins are large – the code calculates matrices that are (number of points) x (number of bins) in size.
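The (points x bins) matrix mentioned above holds the scattering of one size bin per column, and regularization then solves for the bin populations. A toy sketch of this idea with plain Tikhonov regularization and synthetic sphere data – not the actual Irena code, and all numerical choices are assumptions:

```python
import numpy as np

def sphere_matrix(q, r):
    """Matrix of sphere scattering intensities, shape (number of points, number of bins)."""
    qr = np.outer(q, r)
    form = 3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr ** 3
    volume = 4.0 / 3.0 * np.pi * r ** 3
    return form ** 2 * volume ** 2

q = np.linspace(1e-3, 0.05, 150)          # fitted Q range (assumption)
r_bins = np.linspace(100.0, 1000.0, 60)   # size bins in Angstroms (assumption)
G = sphere_matrix(q, r_bins)              # the (points x bins) matrix from the text

# synthetic "measured" data generated from a known distribution peaked at 500 A
true_dist = np.exp(-((r_bins - 500.0) ** 2) / (2.0 * 50.0 ** 2))
data = G @ true_dist

# plain Tikhonov regularization: minimize |G f - data|^2 + alpha |f|^2
alpha = 1e-4 * np.trace(G.T @ G) / len(r_bins)
f = np.linalg.solve(G.T @ G + alpha * np.eye(len(r_bins)), G.T @ data)

print(r_bins[np.argmax(f)])  # expected to peak near the true 500 A
```

The cost of building and solving with these (points x bins) matrices is what makes the fit slow for many points and bins, as noted above.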
Now let's get to the explanations:
The green points are the original data points.
The red points (top part of the graph) are the points selected for fitting (without background)
The blue line (very difficult to see) is the fit obtained by the fitting routine
The bar graph is the particle volume distribution (use top and right axis)
In the lower graph:
The red dots are normalized residuals. Ideally these should be randomly distributed between +1 and –1; the structure seen here suggests misfits in some areas.
Try varying the parameters of the fit (size range, background, fitted data range, etc.) to see what these parameters do. Ideally the resulting size distribution SHOULD NOT be negative. Note that you need a relatively narrow range of data to obtain a good solution.