[Gate-users] loss of intensity at irregular time intervals (gjs)
Nick Laver
Nick.Laver at kromek.com
Mon Jan 16 18:23:10 CET 2012
Good evening Gaters,
I am seeing a loss of intensity in a number of "time slices" when running my model using the gjs (GATE job splitter), and I was wondering whether anybody has experienced anything similar?
I am unable to run my model without the gjs for comparison, as 500+ CPU hours are required per run to accumulate a significant number of detected counts.
I am splitting my simulation into 125 slices of 4 ms (0.5 seconds in total) across a 9-node simulation farm (on which 67 "cores" are available, with 5 kept spare to allow access via a single node for code development and for adding further runs to the PBS queue while simulations are running).
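For reference, the timing and splitter settings look roughly like the lines below (the macro file name is a placeholder, and the gjs option names are quoted from memory of the usage printout, so please check them against your own installation):

# timing in the main macro (values as described above)
/gate/application/setTimeSlice   4. ms
/gate/application/setTimeStart   0. s
/gate/application/setTimeStop    0.5 s

# submission through the job splitter to the PBS queue
gjs -numberofsplits 125 -clusterplatform openPBS MyScan.mac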
My data is output in ROOT format only. I have run the model with both moving and stationary samples and for an open-beam measurement, and I have repeated identical runs with a different seed (autoseed is always used).
My model is a version of the CTscanner system with 9 clusters and pixellated detectors (slightly modified in the source code to allow more than 3 clusters).
The loss of intensity does not seem to occur at regular time slices in each run, nor is it the result of incomplete runs or of my code crashing (verified from the log and err files produced at the end of each run and from the completeness of the output ROOT files).
In the ROOT files with reduced detected intensity there are approximately half the expected counts (compared with the 4 ms slices before and after, for an open-beam run), and an unusual amount of fluorescence in the output spectrum. The distribution of x-rays (singles) interacting with the crystal between the front and back rows of pixels is also unusual.
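For concreteness, a minimal PyROOT sketch of the sort of per-slice count check described above (the file name pattern is a placeholder for whatever naming your gjs setup produces, and it assumes the standard GATE "Singles" tree):

import glob
import ROOT

# Count singles per split output file and flag slices that sit well
# below the typical level (e.g. the ~50% deficit described above).
counts = {}
for path in sorted(glob.glob("output_*.root")):   # placeholder naming pattern
    f = ROOT.TFile.Open(path)
    if not f or f.IsZombie():
        print(f"{path}: could not be opened")
        continue
    tree = f.Get("Singles")                       # standard GATE singles tree
    if not tree:
        print(f"{path}: no Singles tree found")
        f.Close()
        continue
    counts[path] = tree.GetEntries()
    f.Close()

if counts:
    median = sorted(counts.values())[len(counts) // 2]
    for path, n in sorted(counts.items()):
        if n < 0.75 * median:                     # crude "low slice" flag
            print(f"{path}: {n} singles (median {median}) -- low")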
Apologies for the poor description of my problem.
Any advice/ideas are much appreciated.
Kind regards,
Nick