TWINPHOX version 1.0
Twinphox has been tested on DEC Alpha and on Linux PCs/laptops.
Before running the program,
the user is advised to read the documentation
concerning the limitations of applicability
of the code.
Structure of the program
- "twinphox": top-level directory, containing:
  - "working": directory which contains the files
    "param_histo.indat" and "start.pl"
  - directories containing all the source files
  - a directory containing the photon pdfs AFG and AFG02
  - a directory where the histograms will be stored
  - a directory where the ntuple can be stored
  - a directory containing the BASES/SPRING package
Download the TWINPHOX code as a gzipped tar file
and unpack it.
In the directory "working", there is a
Makefile (for GNU make).
The user has to verify that the Fortran compiler called in the Makefile
is the one installed on his machine.
The default is the DEC Fortran compiler:
FC = f77
On Linux PCs, use the GNU compiler g77.
Compilation options:
FFLAGS = -O
The flag -O corresponds to the default level of optimization used by the
f77 DEC compiler (it also works with g77). For any alternative choice of
optimization, please see your compiler's manual.
The user also has to verify that the Makefile chosen to build
the BASES library is the one corresponding to his machine.
One can choose between:
Makefile.dec, Makefile.f2c, Makefile.hiux, Makefile.hpux, Makefile.sgi,
Makefile.sun, Makefile.tpl, Makefile.linux
depending on the computer to be used. The default is
MAKEBASES = Makefile.dec
The program uses the CERN program library (cernlib), which is not provided
with the program package because it is very likely that the cernlib
is already installed on your system (if not, you can obtain it from the
CERN program library).
The user has to verify that the path to the CERN library on his system
has been set correctly.
Default: PATHCERNLIB = cern/pro/bin
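Taken together, the user-editable settings described above might look like the following sketch (the variable names are the ones quoted in this text; the actual layout of the Makefile may differ):

```makefile
# Fortran compiler: f77 on DEC Alpha, g77 on Linux PCs
FC = f77
# Default optimization level (works with f77 and g77)
FFLAGS = -O
# Makefile used to build the BASES library; choose the one for your machine
MAKEBASES = Makefile.dec
# Path to the CERN library on your system
PATHCERNLIB = cern/pro/bin
```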
- Run the Makefile once with the target "bases" ("gmake bases")
to create the BASES/SPRING library.
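The one-time build step might then look like this (assuming GNU make is installed as "gmake" and the Makefile lives in the "working" directory, as described above):

```shell
cd working       # directory containing the Makefile
gmake bases      # one-time build of the BASES/SPRING library
```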
To run the program
Fix the input parameters of your choice in the file "parameter.indat"
in the directory "working".
This is the main input file containing the parameters for
integration and event generation.
Further information on
these parameters is given directly in this file.
If you would also like to project events into histograms,
define the corresponding parameters in the file "param_histo.indat".
- Run the perl script "start.pl",
which creates the executable files: type "perl start.pl". This does the following:
The two files "parameter.indat" and "param_histo.indat"
containing the input parameters are read automatically.
In case you have several parameter files with different names,
you simply have to specify their names as options, e.g. type
"perl start.pl --parameter your_parameterfilename".
Note that "param_histo.indat"
will be read only if the flag "histo" in the file
"parameter.indat" has been selected.
start.pl automatically rewrites the Makefile
according to the options you chose in "parameter.indat"
and runs the compiler.
start.pl creates an executable file called "run_string.exe", where "string"
is the name of the run
defined in "parameter.indat". You then have to execute this file to run the actual calculation.
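A typical run sequence might therefore look as follows (the file and run names are examples only):

```shell
perl start.pl                          # reads "parameter.indat" (and "param_histo.indat" if histo is on)
perl start.pl --parameter myrun.indat  # alternatively, with your own parameter file
./run_myrun.exe                        # "myrun" = run name defined in "parameter.indat"
```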
- start.pl creates a directory called "resultstring".
In this directory, there will be a file containing
a summary of the input which "start.pl"
has read from "param_histo.indat".
The string NameHisto used to label the files containing the histograms
(or the ntuple) is defined in "parameter.indat".
The directory "resultstring"
will also contain subdirectories called "ddd", "drd", "rdd" and "rrd",
depending on which subprocesses you selected in your input.
The shorthand "ddd"
stands for the contribution where both the initial state photons and the final state photon
are direct. (Note that the final state photon is always direct when using Frixione's isolation criterion,
since the fragmentation part does not contribute.)
In "drd" the initial state photon travelling in the positive z-direction is
direct and the other one is resolved;
in "rdd" the reverse is true.
In "rrd" both initial state photons are resolved.
So the structure of the output is the following:
- "resultstring": directory containing
  - a file with the parameters read from "param_histo.indat"
  - the subdirectories "ddd", "drd", "rdd" and "rrd" (see above),
    each containing the output files:
    - "output.param": parameters read from "parameter.indat"
    - "sigmaint.res": physical integrated cross section
    - "integral.res": results of the integration of absolute values;
      of no physical meaning, but needed for the normalization of the
      generated events
    - "*.bs": results of the grid for each pseudo-cross section;
      needed for event generation
    - "twinphox.log": details of the integration by BASES
      (efficiency, convergence behaviour, accuracy, time per point, ...)
- With twinphox, we provide the AFG and the new AFG02 parton
distribution functions for the real photon,
but the user can also choose other pdfs from the CERN pdflib.
- In the input file "parameter.indat"
you have to choose between two options:
  - either to compute the integrated cross section over the phase space
    defined by your input parameters,
  - or to make the grid and generate events.
If you are not interested in integrated cross sections, set the
corresponding flag to "FALSE" and make the grids directly in order to save
integration time. You do not need to calculate the integrated
cross section first in order to make the grid and/or generate events.
- If you want to generate isolated photon events, you have to apply the
isolation cuts already when making the grids.
- Once you have produced the grids (stored in the *.bs files),
you can use them again to generate new events and enlarge an existing event
sample without having to recompute the grids. However, if you are interested
in a kinematic range which is not populated efficiently by the existing
grids, keeping these grids will generate many events that are later lost
in the histogramming stage. In such a case, the computation of a new
dedicated grid is much more efficient in the end.
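As a sketch of such a reuse (it is assumed here, not stated in the text, that the run executable picks up existing *.bs grid files from its working directory; the directory and run names are examples):

```shell
cp resultmyrun/ddd/*.bs working/   # reuse previously computed grids
cd working
./run_myrun.exe                    # generate more events without recomputing the grids
```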
If you chose the option "histo" in your input to produce histograms
directly instead of passing via ntuples, you can refine some cuts
already defined in "parameter.indat"
and specify the histograms you want in the file "param_histo.indat"
(a list of the defined cut parameters and variables is given in that file).
When running "start.pl", the corresponding Fortran files
to fill the histograms with the specified cuts are created automatically.
You can use the information given in the file "twinphox.log" to adjust the
parameters for BASES/SPRING such that the efficiency is optimized.
- Please contact the authors
in case you have any questions or comments.