Gen01 Analysis

====>     D O N E     <====

final updates in progress...


Plots   Parameter Files   Results   syncfilter   Reference Information   Tools

the various iterations of Butcher are now here

also of interest may be: offline replay instructions and batch replay instructions.

Back to Gen01 Home Page


Plots         Overview,   Systematics,   Single Event Display     ---     butcher plots are now here
Overview Plots       more plots are in the syncfilter section
Histogram Scans (Q2 = 0.5) of HMS delta and HMS x FP

impact of syncfilter on beam current (Q2=0.5, BCM2)

SEM calibration verification (Q2=0.5, outlying cluster has been fixed!)

event rates by target (Q2=1.0)

Packing Fraction initial values for Q2 = 0.5 (HMS-I only)

target systematics via proton asymmetry (Q2=0.5)

Plots on Cut Systematics   Q2 = 0.5 only
The following plots show the impact of our cuts, separately for each relevant variable. One version of each plot shows the absolute changes, while the other one has the sets renormalized to highlight changes to the shape of the distribution. In all plots, the data shown in blue have all cuts applied, while those in red are lacking the indicated cut. The yellow line(s) indicate the cut(s) usually applied to the plotted variable (non-normalized plots only), if any.

ntrk_E   (normalized)
ntrk_t   (normalized)
ntrk_dy   (normalized)
W       (normalized)
hstheta   (normalized)
hsdelta   (normalized)
theta_pq   (normalized)
hszbeam   (normalized)
hcer_npe   (normalized)
Single Event Display Plots
bad double 1   bad double 2   double np   double maybe   nice p   plain p   noisy p   xmas tree   plain n


Parameter Files
Beam Polarization authoritative (per MZ)
Target Polarization offline v9 (details -- postscript!)
syncfilter Summary Report
(4th pass)
Q2 = 0.5 Q2 = 1.0 BCA has reversed sign convention compared to GEn data!
(we consider BCM2 to be authoritative) details on syncfilter below...
Background Contamination
Q2 = 0.5 Q2 = 1.0 (details)
Charge Exchange Correction
Q2 = 0.5 Q2 = 1.0 for Q2 = 0.5 the correction was determined in bins, for Q2 = 1.0 only an overall value was calculated; both have a separate 15N correction
calculation details are here
Obsolete Versions:
Q2 = 0.5 Beam Polarization

Target Polarizations:   Q2 = 0.5 ONLINE (2nd-to-last column),   Q2 = 0.5 offline v2 (9th column -- 8th is 1=top, 2=bottom),   Q2 = 0.5 offline v1,   offline v7 incl. target position, Helium level, and magnet field info

target position 1=top+ 2=top- 3=bottom+ 4=bottom- (now in polarization file)

2nd-pass syncfilter report summary Q2 = 0.5


Results         Offline Replay,   Errors     ---     butcher data are now here
Offline Replay Results
All offline replays were hosted on the Jlab work disk, in subdirectories of /work/hallc/e93026/Gen01/. The specific directory for each replay version is identified in the table below. Note that there was no third pass for the Q2 = 0.5 data; the naming of the fourth pass was chosen for parallelism with the Q2 = 1.0 data set.
At this point, most files have been moved to the tape archive, under /mss/hallc/e93026/analysis/. Most small files will have been consolidated into gzip'd tar archives, e.g. individual report types. These can be retrieved via jget or jcache -- see the Jlab CC's writeups on these.
All iterations as of pass 2 were conducted using the respective most current version of our tool BatchMan. In most cases, the replay directories (or the tape archive) will contain a file identifying reasons for changes to the list of good runs, e.g. replay.issues. Some of these may also contain details on runs that are still considered good but did encounter replay problems anyway.
All these directories only contain the results of the simple replay; any subsequent processing was done using our Butcher tools. The resulting detail data can be found here, and the final results are also available below.
              replay directory on work disk
              Q2 = 0.5      Q2 = 1.0      tape archive
Pass 1        FirstPass/                  firstpass/
Pass 2        SecondPass/   Q2=1pass2/    secondpass/
Pass 3        (none)        Q2=1pass3/    thirdpass/
Pass 4        FourthPass/   Q2=1pass4/    pass4/

Run Lists

Q2 = 0.5 good ND3 runs:  after pass 1,  after pass 2,  and after pass 4 (final)
all Q2 = 0.5 Carbon and empty runs, just Carbon and just MT runs, and good Q2 = 0.5 Carbon and empty runs
Q2 = 1.0 good ND3 runs:  after pass 1,  after pass 2,  and after pass 4 (final)
stick 3 Carbon and empty runs,   stick 4 Carbon and empty runs
raw ONLINE runlist (manual)
Butcher Results
are now here
Final Results

The secret summary page is here.


The standard way to analyze our data is now to pre-process them with syncfilter. This means you need to make sure that
  • you have the syncfilter executable
  • it is used in your REPLAY.PARM source file designation
  • you turn to the syncfilter.run_number report for
    • run time
    • beam charge, current
    • event count, rate
    • beam charge asymmetry (BCA)
Details on syncfilter and its usage can be found here.
Special Note
The syncfilter results for the 2nd pass of the Q2=0.5 data inadvertently used the calibration of BCM2 for the BCM1 data. As a result, the BCM1-based data are incorrect. Since BCM2 is used to measure the charge, the effect of this error is mostly limited to incorrect report values.

Specifically, the current and charge values based on BCM1 are 5% (relative) too small. The exact expression needed to correct the data is (within the calibration limits, in nA):

I_true = 1.054 * I_BCM1 - 0.3
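The correction above is a simple linear recalibration and can be applied directly to any BCM1-based current value. A minimal sketch (the function name is illustrative, not part of the actual analysis tools; currents in nA, valid only within the calibration limits stated above):

```python
def correct_bcm1_current(i_bcm1_na):
    """Apply the 2nd-pass BCM1 miscalibration fix from the syncfilter
    special note: I_true = 1.054 * I_BCM1 - 0.3 (currents in nA).
    Hypothetical helper; illustrative only."""
    return 1.054 * i_bcm1_na - 0.3
```

For example, a reported BCM1 current of 100 nA corresponds to a true current of about 105.1 nA, consistent with the roughly 5% (relative) underestimate noted above.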
Report Summaries
final summary reports (4th pass)
    Q2 = 0.5     Q2 = 1.0

C and empty runs time & charge

Plots   (EPS files, based on pass 4)
Beam Current Q2 = 0.5 Q2 = 1.0
Event Rates (cf. HMS pre-trigger rates) Q2 = 0.5 Q2 = 1.0
Beam Charge Asymmetry Q2 = 0.5 Q2 = 1.0
helicity bucket good/all ratio Q2 = 0.5 Q2 = 1.0


Reference Information
Beam Raster ADC Calibration

web page about the determination of the Charge Exchange (CEX) Contamination and the final report (postscript)

Background Study

syncfilter raw data pre-processor

BatchMan analysis management tool

target analysis tech note (postscript)

this and that
dedicated page all about the target, including a target event history.

Target composition details from mailing list msg00620.html

Glen's   nDet   and   checkout   histories, and error summary

Details on the Contents of the coincidence Ntuple

nDet Particle ID Definitions

Cut Definitions for Q2 = 0.5 and Q2 = 1.0 data   (obsolete first idea)

Results of study on impact of vertical raster offset (from mailing list), incl. plot: msg00770.html


Post-Replay Analysis Tools
After the raw data have been analyzed with the replay engine, the experimental information is contained in the events of the coincidence Ntuple. In order to extract a measured asymmetry from these, we need to parse the coincidence events, apply selection cuts, and tabulate event counts for each helicity state.
These counts then yield an asymmetry for each run; the per-run asymmetries are corrected for beam and target polarization and for the beam charge asymmetry, and then merged into an overall asymmetry. The task is characterized not by difficult calculations but by large data sets. The software tools described here were used to facilitate these post-replay calculations.
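The per-run arithmetic described above can be sketched in a few lines. This is a hedged illustration, not the actual theButcherShop code: the function name and variable names are invented here, and the real tools apply further corrections (e.g. dilution and background terms) not shown.

```python
def run_asymmetry(n_plus, n_minus, q_plus, q_minus, p_beam, p_target):
    """Form a polarization-corrected asymmetry from helicity-sorted
    event counts.  Charge-normalizing the counts removes the beam
    charge asymmetry; dividing by P_beam * P_target removes the
    polarization dilution.  Illustrative sketch only."""
    y_plus = n_plus / q_plus      # charge-normalized yield, helicity +
    y_minus = n_minus / q_minus   # charge-normalized yield, helicity -
    a_raw = (y_plus - y_minus) / (y_plus + y_minus)
    return a_raw / (p_beam * p_target)
```

With equal accumulated charge per helicity state, 1050 vs. 950 counts give a raw asymmetry of 0.05; at 80% beam and 25% target polarization the corrected asymmetry is 0.25.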

The sequence is that first theButcher is used to extract the helicity-based counts from each run. The specific result depends on the exact cuts applied, and this procedure provides for many alternative or complementary cuts (bins) to be handled together, see ButcherSchool.
As the processing done by theButcher needs to be carried out for each run separately, we use the batch farm to speed the process through parallelism: several jobs are running simultaneously, each processing a subset of the runs.
Upon completion, theFreezer merges the results into a single summary file, each row listing the results of one run. This merged file (the chunky beef file) is then processed by theButcherShop to produce the other beef files, ground (per-run asymmetries), mashed (per-run counts and rates), patties (run group asymmetries), and burger (overall results). In the process, the various per-run and overall corrections are applied.

theButcher Example
FORTRAN code;   applies cuts to coincidence Ntuple, event by event, and keeps count of the number of events passing each cut, separately for each beam helicity state;   additional helper tools are used to loop over all runs (on the batch farm);   results require some simple reformatting (manually or via theFreezer) prior to processing with theButcherShop;   use of ButcherSchool in creating this source code is highly recommended.
Compile with:
g77 -L/apps/cernlib/pc_linux/99/lib -lpacklib -lc -lm -ffixed-line-length-132 -g -o theButcher.exe theButcher.f

ButcherSchool Q2 = 0.5 Q2 = 1.0
TCL script;   creates the FORTRAN source of theButcher based on the cuts defined inside the script;   primary utility is the intermediate cut-grouping syntax which permits the un-applying of cuts and therefore significantly eases the definition of exceptions;   also provides a high level of consistency across the various cut groups that may be defined at one time, e.g. for protons and neutrons;   provides significant debugging feedback;   script needs to be edited so desired cuts are defined and applied;   expects two file names on command line, FORTRAN source code to be created and a file for debugging output:
theButcherSchool.tcl   <code>   <debug>
theButcherShop download
TCL script;   after the results from theButcher have been merged into a single chunky file, this script calculates the asymmetries, applies the corrections, and determines subgroup and overall averages, resulting in the files ground, patties and burger, respectively.

apprentice download
shell script (tcsh);   intended for batch farm;   reads list of run numbers and processes Ntuples with theButcher;   results need to be processed by theFreezer.   Note: split run list into multiple subsets and run apprentice for each set (at the same time in multiple batch jobs) to maximize processing speed.
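The splitting of the run list recommended in the note above is just a round-robin partition into roughly equal subsets, one per batch job. A minimal sketch (function name is hypothetical; the actual apprentice script is tcsh and expects the subsets as separate run-list files):

```python
def split_run_list(runs, n_jobs):
    """Partition a run list into n_jobs roughly equal subsets for
    parallel batch processing.  Round-robin assignment keeps subset
    sizes within one run of each other.  Illustrative only."""
    return [runs[i::n_jobs] for i in range(n_jobs)]
```

Each subset would then be handed to its own batch job running apprentice, so the runs are processed concurrently.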

theFreezer download
TCL script;   properly merges the results from theButcher (optionally via apprentice) into a single file in chunky format;   counts from multiple Ntuple segments are merged into required single entry per run;   preserves formatting and corrects header.
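The merging step theFreezer performs amounts to summing the helicity counts of all Ntuple segments belonging to the same run. A minimal sketch under simplified assumptions (the real chunky format carries many more columns and a header; names here are illustrative):

```python
from collections import defaultdict

def merge_segments(segment_counts):
    """Sum helicity counts from multiple Ntuple segments of the same
    run into the required single entry per run, as theFreezer does
    when building the chunky file.  Input: iterable of
    (run, n_plus, n_minus) tuples.  Illustrative sketch only."""
    totals = defaultdict(lambda: [0, 0])
    for run, n_plus, n_minus in segment_counts:
        totals[run][0] += n_plus
        totals[run][1] += n_minus
    return {run: tuple(v) for run, v in sorted(totals.items())}
```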

theDisplaycase download
TCL interface to GNUPLOT;   allows quick display of the "butchered" results;   text interface;   reads the standard beef files chunky, ground and patties, and decodes the file structure;   identifies data columns by label and permits plotting of any number of them, scale adjustments and output to file; probably most useful if used with the patties files;   includes online help.


This page is a moving target!
(frw) 7-2003