HFT Alignment Procedures


Introduction

The alignment of the silicon detectors (HFT) is based on minimization of the residuals between track projections and hit positions in all detectors, starting from the initial survey information. The method uses linear approximations to several dependencies between observable quantities and the related alignment parameters. This approximation requires that any misalignment be very small; thus the process is necessarily iterative.


Method



Software

Once logged into RCF, the very first step before starting the alignment procedure is to set the STAR library version to ".DEV2":
starver .DEV2
The alignment software consists of two packages located in $STAR/StRoot/StSvtPool: EventT and SvtMatchedTree.


The corrections obtained in the alignment steps are located in $STAR/StarDb/Geometry.


Usually, you will not have permission to write to these directories. In that case, the same directory structure must be created in the folder where the code is going to be run, i.e., create your own "StRoot/" and "StarDb/", copy the packages shown above into the respective folders, and put your results in "StarDb/":
mkdir -p StRoot/StSvtPool/EventT
mkdir StRoot/StSvtPool/SvtMatchedTree
mkdir -p StarDb/Geometry/ssd
mkdir StarDb/Geometry/svt
cp $STAR/StRoot/StSvtPool/EventT/* ./StRoot/StSvtPool/EventT/
cp $STAR/StRoot/StSvtPool/SvtMatchedTree/* ./StRoot/StSvtPool/SvtMatchedTree/
The code looks for the correction files first in the current directory and then, if it does not find them there, in $STAR/.

You must run ROOT with floating-point exceptions ignored:
setenv STARFPE NO

You also need to enable the QT tools to inspect the resulting histograms easily. For this, copy the .rootrc file (here to the working directory) and extend your path:
cp ~fisyak/.rootrc .
setenv PATH ${PATH}:/afs/rhic.bnl.gov/star/users/fisyak/macros/


Procedures

A basic view of the whole procedure is shown in the picture below. Of course, this is a very simplified description; all details of each step are presented afterwards.



This loop must be repeated as many times as needed to reach a "good" result, and separately for Global, Local, etc. Usually, convergence is achieved in one or two iterations. Once the results have converged, the geometry is frozen and a new sample is generated that includes the hits of the detector which has just been aligned.
The sequence to be followed for each detector is:
  1. SSD Alignment: (TPC Only)
  2. SVT Alignment: (TPC+SSD)
  3. Consistency Check: (TPC+SSD+SVT)
For each of the big steps mentioned above (1. SSD; 2. SVT; 3. Check) we define Passes. In the first Pass (SSD), a calibration sample with TPC-only tracks is used to calculate the projections on the silicon detectors. The alignment is performed, and the obtained geometry constants are used to generate a new sample which includes the SSD hits in the calculation of the track projections, increasing the resolution. Using this new sample, we go to a new Pass (SVT); after finishing that alignment and updating the geometry constants, a last Pass must be created to check the consistency of the results.

  1. Getting the Calibration Sample (st_physics_*event.root files): the calibration sample is produced by the standard reconstruction procedure, with SVT and SSD cluster reconstruction activated and track reconstruction with these detectors (de)activated. For the Cu+Cu sample (Run V) the following chain options were used:
    1. TPC only: "P2005,MakeEvent,ITTF,tofDat,ssddat,spt,-SsdIt,Corr5,KeepSvtHit,Hitfilt,skip1row,SvtMatTree,EvOutOnly"
    2. TPC+SSD: "P2005,MakeEvent,ITTF,tofDat,ssddat,spt,SsdIt,-SvtIt,Corr5,KeepSvtHit,hitfilt,skip1row,SvtMatTree,EvOutOnly"
    3. TPC+SSD+SVT: "P2005,MakeEvent,ITTF,tofDat,ssddat,spt,SsdIt,SvtIt,Corr5,KeepSvtHit,hitfilt,skip1row,SvtMatTree,EvOutOnly"
    and a job looks like:
    root4star -q -b 'bfc.C(9999,"chains_option","path_to_daq_file")'
    
    Important! You have to run the above command from the directory where you have your StarDb structure.
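    The reconstruction step above can be scripted over a whole set of daq files. A minimal sketch (the helper name submit_bfc_jobs and the daq-file directory are assumptions; with DRYRUN=1 it only prints the commands instead of running root4star):

```shell
# Run (or, with DRYRUN=1, just print) one bfc.C job per daq file.
# Usage: submit_bfc_jobs CHAIN_OPTIONS DAQ_DIR
submit_bfc_jobs() {
  chain=$1
  dir=$2
  for f in "$dir"/*.daq; do
    [ -e "$f" ] || continue   # skip cleanly if no daq files are found
    if [ "${DRYRUN:-0}" = "1" ]; then
      printf "root4star -q -b 'bfc.C(9999,\"%s\",\"%s\")'\n" "$chain" "$f"
    else
      root4star -q -b "bfc.C(9999,\"$chain\",\"$f\")"
    fi
  done
}

# Example: TPC-only pass, commands printed only (run from the directory
# holding your private StarDb/ structure, as required above).
DRYRUN=1 submit_bfc_jobs \
  "P2005,MakeEvent,ITTF,tofDat,ssddat,spt,-SsdIt,Corr5,KeepSvtHit,Hitfilt,skip1row,SvtMatTree,EvOutOnly" \
  ./daq
```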

  2. (Re)Generating the TTrees (Event*.root files): to generate the TTrees, run the StRoot/StSvtPool/SvtMatchedTree maker with the macro StRoot/StSvtPool/SvtMatchedTree/runSvtTree.C. These TTrees contain the primary-track predictions for each wafer and all hits on that wafer. During this step, a set of alignment constants stored in StarDb/Geometry/svt and StarDb/Geometry/ssd is used. You have to create a TTree for each of the st_physics*.event.root files, and the easiest way to do this is with the script subm.pl (link currently broken) to submit jobs. In this script you need to set the queue name to 'star_cas_short' or 'star_cas_prod' (lines 33 and 34; comment out one of them).
    During regeneration of the TTrees, the following are used:
    • the reconstructed track parameters obtained in the reconstruction above (which are not changed);
    • the local SSD coordinates obtained during cluster reconstruction, which allows the local-to-global transformation to be changed; and
    • the SVT drift time and anode, which allow the local coordinates (drift velocities) and the local-to-global transformation to be modified.

    The output files (Event*.root) and log files are saved in the folder from which you submitted the jobs. This step usually takes a long time.

  3. Histograms (Pass*.root files): once the jobs submitted in the previous step have finished, the histogram file can be created by running the macro makeHFTPlots.C. Run it from the directory where you want the histogram file to be saved. The command is the following:
    root.exe PATH_TO_TTREE_FILES/TTREE_FILES makeHFTPlots.C 

    See example below:


    You can run plain root instead of root4star. The macro does not take any arguments; you tell it which files to process on the root command line. The "| tee" (pipe to tee) command sends the output both to the screen and to the log file "mylog.log".
    The output file appears in the directory you run from; its name is formed from the path to the TTree files with "/" replaced by "_", for example:
    Pass128/TpcSsdSvt/021/Event*.root ===> Pass128_TpcSsdSvt_021PlotsNFP25rCut0.5cm.root
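    The naming convention above can be reproduced in the shell; a small sketch (the variable names are mine):

```shell
# The histogram file name is the TTree path with every "/" replaced
# by "_", followed by the fixed suffix.
ttree_dir="Pass128/TpcSsdSvt/021"
hist_file="$(printf '%s' "$ttree_dir" | tr '/' '_')PlotsNFP25rCut0.5cm.root"
echo "$hist_file"   # Pass128_TpcSsdSvt_021PlotsNFP25rCut0.5cm.root
```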
    
    The next step is to do the analysis of the histograms.


Analysis

The analysis of the histograms is done using the macro TDraw.C, which allows you to choose among the Global, Local, Drift and Anode analyses. The starting point is the Global alignment, i.e., the alignment of the shells (SVT) and sectors (SSD) in global coordinates.

  1. Global: in the directory of the Pass you are working on, create a new directory "Global".
    mkdir Global
    cd Global
    
    From here you can run the analysis for the Global alignment, TDraw.C+, which takes as input the histogram file created before:
    root.exe PATH_TO_HISTOGRAM_FILE HISTOGRAM_FILE TDraw.C+
    
    A "Results*" file is produced, along with a table of plots showing the fits of all alignment parameters. To view the plots one by one, go to "View > Zoomer" and middle-click the plot you want to see. A common practice is to save all these plots as web pages: go to "File > Save as Web Page", as in the picture below.


    Now you have to check the results. Some derivatives are ill defined and the corresponding plots will look odd, but do not worry. Grep the average results from the file produced and check whether the values satisfy the following criteria:
    • dX, dY, dZ < 10 μm (microns), or within the standard deviation;
    • α, β, γ < 0.1 mrad, or within the standard deviation.
    If so, congratulations: you have good results and can go to the next step. Otherwise, you will have to put these values into a correction file and run the whole loop again to try to improve the results. To do this, grep the average results (see the example below):
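    The numeric criterion can be checked mechanically; a small sketch with a hypothetical helper (no assumption is made about the Results file layout — you feed in the grepped value and its error by hand):

```shell
# Check one fitted alignment parameter against the convergence
# criterion: pass if |value| < limit (10 um for dX/dY/dZ, 0.1 mrad
# for the angles) or if |value| is within one standard deviation.
# awk does the floating-point comparison.
converged() {  # usage: converged VALUE SIGMA LIMIT -> prints yes/no
  awk -v v="$1" -v s="$2" -v l="$3" 'BEGIN {
    av = (v < 0) ? -v : v
    r  = (av < l || av < s) ? "yes" : "no"
    print r
  }'
}

converged 3.2 1.5 10   # dX = 3.2 um, within the 10 um limit: prints "yes"
```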


    Then, edit the macros,
    MakeSvtOnGlobal.C
    MakeSsdOnGlobal.C
    MakeShellSectorOnGlobal.C
    
    and insert the values into a correction table. The last two rows, "Average for All Svt" and "Average for All Ssd", must be put into MakeSvtOnGlobal.C and MakeSsdOnGlobal.C, respectively, and the rows corresponding to the SVT clam shells and SSD sectors must be put into MakeShellSectorOnGlobal.C. The next picture shows the structure of the table; it is the same for all three macros:


    In this structure, it is very important that the entries "date" and "time" agree with the validity period of your data set. The run number of your data set is in the name of the files st_physics_<run number>*.event.root; the information about the run date is on the
    STAR Online web page. The time stamp is used as a flag for the analysis loop: in the first analysis loop (Global) you set time=000000, in the second iteration time=000001, and in the nth iteration time=previous_time+1, and so on until the end of the whole alignment. It helps you keep track of what has been done.
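    The time-stamp sequence can be generated mechanically; a small sketch (the six-digit zero-padded format is an assumption based on the time=000000 example above; awk does the arithmetic so that leading zeros are not misread as octal):

```shell
# Derive the next time-stamp flag from the previous one.
# Usage: next_timestamp PREV -> prints PREV+1, zero-padded to 6 digits
next_timestamp() {
  awk -v t="$1" 'BEGIN { printf "%06d\n", t + 1 }'
}

next_timestamp 000001   # prints "000002"
```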
    After you have edited these macros, run them with root4star:
    root4star MakeShellSectorOnGlobal.C (example)
    
    Correction files will be generated; copy or move them to StarDb/Geometry/svt and StarDb/Geometry/ssd (the StarDb structure that you have created). Each macro produces:
    MakeSvtOnGlobal.C » SvtOnGlobal.<date>.<time>.C
    MakeSsdOnGlobal.C » SsdOnGlobal.<date>.<time>.C
    MakeShellSectorOnGlobal.C » ShellOnGlobal.<date>.<time>.C and SsdSectorsOnGlobal.<date>.<time>.C
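    This copy step can be scripted; a sketch (the helper name install_global_corrections is mine; it assumes the macros were run in the directory that also holds your StarDb/ structure):

```shell
# Move the freshly generated Global correction tables into the private
# StarDb structure so the next TTree regeneration picks them up.
install_global_corrections() {
  for f in SvtOnGlobal.*.C ShellOnGlobal.*.C; do
    if [ -e "$f" ]; then mv "$f" StarDb/Geometry/svt/; fi
  done
  for f in SsdOnGlobal.*.C SsdSectorsOnGlobal.*.C; do
    if [ -e "$f" ]; then mv "$f" StarDb/Geometry/ssd/; fi
  done
}
```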
    After that, the corrections obtained will be applied in the next iteration, so you have to start again from the step (Re)Generating the TTrees. It is good practice to check the log file to verify that your corrections were indeed found.
    If you are doing the global alignment for the SVT, the SSD geometry has already been frozen, so you must not use the SSD values, only the SVT ones.

  2. Local: once you have good results for the Global alignment, you can proceed to the Local alignment. In the directory of the Pass you are working on, create a new directory "Local".
    mkdir Local
    cd Local
    
    Run the analysis for the Local alignment, TDraw.C+(5):
    root.exe PATH_TO_HISTOGRAM_FILE/HISTOGRAM_FILE "TDraw.C+(5)"
    
    Results files will be produced for each of the barrels, along with windows with plots of the alignment parameters. As before, save the plots as web pages (View > Zoomer; File > Save as Web Page) and check the results. Again, if you have "good" results you can go to the next step; otherwise, you will have to generate correction files for the local geometry and run one more time.
    The Results* files produced in this case each have a companion, Results*.h, which has the format required by the macros; you can use these to edit the macros.


    Note that the table structure is a little different here. Although it does not include the "date" and "time" flags, you still have to set them in separate variables; remember that "time" must follow the sequence from the previous step and "date" must agree with the validity period of your data set.


    The macros used to generate the correction files in this case are:
    root4star MakeSsdLaddersOnSectors.C
    root4star MakeSvtLadderOnShell.C
    
    You have to run the first one if you are working on the SSD alignment, and the second one if you are working on the SVT alignment.
    MakeSsdLaddersOnSectors.C » SsdLaddersOnSectors.<date>.<time>.C
    MakeSvtLadderOnShell.C » LadderOnShell.<date>.<time>.C
    Copy or move the correction files for the SSD or SVT into the directories StarDb/Geometry/ssd or StarDb/Geometry/svt, respectively. Then go back to the step
    (Re)Generating the TTrees and repeat the procedure to try to improve the results.


Bookkeeping

It is very important to keep track of what is done; otherwise, one can get lost and the work becomes confusing and unproductive. A very easy way of bookkeeping is to write a file like this README (link currently broken) and put it where the final results are being saved, so people can check what has already been done.


References

Here are some references regarding the alignment method, procedures, and examples:
  1. Results for Cu+Cu data: http://www.star.bnl.gov/~fisyak/star/Alignment (link broken)
  2. How-to's: http://www.star.bnl.gov/~fisyak/star/Alignment/HowTo/HowToDoAlignment.html (link broken)
  3. Presentations (links broken):
    http://www.star.bnl.gov/~fisyak/star/alignment/SVT%20calibration%20issues%20and%20remaining%20tasks%20for%20the%20future.ppt
    http://www.star.bnl.gov/~fisyak/star/alignment/SVT%20+%20SSD%20calibration%20for%20run%20V.ppt
  4. "Alignment Strategy for the SMT Barrel Detectors", D. Chakraborty, J. D. Hobbs; smtbar01.ps
  5. "A New Method for the High-Precision Alignment of Track Detectors", V. Blobel, C. Kleinwort; hep-ex/0208021
  6. "Estimation of detector alignment parameters using the Kalman filter with annealing", R. Fruhwirth et al.; 2003 J. Phys. G: Nucl. Part. Phys. 29 561-574
  7. "The STAR Silicon Vertex Tracker: A large area Silicon Drift Detector", R. Bellwied et al.; SVT
  8. "The STAR silicon strip detector (SSD)", L. Arnold et al.; SSD
  9. "Sensor Alignment by Tracks", V. Karimaki et al.; CHEP03, La Jolla, California, March 24-28, 2003
  10. "Simulation of Misalignment Scenarios for CMS Tracking Devices", I. Belotelov et al.; CMS NOTE 2006/008
  11. "The HIP Algorithm for Track Based Alignment and its Application to the CMS Pixel Detector", V. Karimaki et al.; CMS NOTE 2006/018



Author: Spiros Margetis
Last modified: 11/17/2008 19:54:40