This page is intended to help visualize how the choice of jet
definition impacts a dijet invariant-mass reconstruction at
the LHC.
The controls fall into 4 groups:
the jet definition
the binning and quality measures
the jet-type (quark, gluon) and mass scale
pileup and subtraction
The events were simulated with Pythia 6.4 (DWT tune) and
reconstructed with FastJet 2.3.
For more information, view and listen to the Flash demo, or click
on individual terms.
This page has been tested with Firefox v2 and v3, IE7,
Safari v3, Opera v9.5, Chrome 0.2.
If a graph does not appear here, it may be a sign of a problem with JavaScript.
kt algorithm
The inclusive,
longitudinally invariant kt algorithm is a pairwise
sequential recombination jet algorithm, with a distance measure that
relates to the relative transverse momentum between particles.
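For reference, the kt distance measures (with kti the transverse momentum of particle i) are:

```latex
% inclusive kt: interparticle and beam distances
d_{ij} = \min\!\big(k_{ti}^2,\,k_{tj}^2\big)\,\frac{\Delta R_{ij}^2}{R^2},
\qquad
d_{iB} = k_{ti}^2,
\qquad
\Delta R_{ij}^2 = (y_i - y_j)^2 + (\phi_i - \phi_j)^2 .
```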
Cambridge/Aachen algorithm
The Cambridge/Aachen algorithm is a pairwise sequential
recombination jet algorithm, whose distance measure is the
rapidity-azimuth distance between particles. The algorithm stops
when all objects are separated by more than R.
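In the same notation as above, the Cambridge/Aachen distances are purely geometrical:

```latex
% Cambridge/Aachen: no momentum weighting
d_{ij} = \frac{\Delta R_{ij}^2}{R^2},
\qquad
d_{iB} = 1 .
```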
anti-kt
The anti-kt
algorithm is a pairwise sequential recombination algorithm with
the property that the hard jets in an event tend to have a circular
profile in the rapidity-azimuth plane.
This algorithm is a good (infrared- and collinear-safe) replacement for
iterative and fixed-cone algorithms that implement
progressive removal (an example being the CMS iterative cone),
insofar as these also produce circular jets.
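All three algorithms above are members of the generalized-kt family,

```latex
% generalized kt: p = 1 (kt), p = 0 (Cambridge/Aachen), p = -1 (anti-kt)
d_{ij} = \min\!\big(k_{ti}^{2p},\,k_{tj}^{2p}\big)\,\frac{\Delta R_{ij}^2}{R^2},
\qquad
d_{iB} = k_{ti}^{2p},
```

with p = 1 giving kt, p = 0 Cambridge/Aachen and p = -1 anti-kt; for p = -1 the clustering is driven by the hardest particles, which is what produces the circular jet profiles.

A minimal FastJet sketch of how the three definitions can be run over an event (the two-particle toy event and the value of R are purely illustrative):

```cpp
#include "fastjet/ClusterSequence.hh"
#include <cstdio>
#include <vector>
using namespace fastjet;

int main() {
  // toy event: replace with the real particles (px, py, pz, E)
  std::vector<PseudoJet> particles;
  particles.push_back(PseudoJet(99.0,  0.1, 0.0, 100.0));
  particles.push_back(PseudoJet( 4.0, -0.1, 0.0,   5.0));

  double R = 0.7;
  JetAlgorithm algs[] = {kt_algorithm, cambridge_algorithm, antikt_algorithm};
  for (JetAlgorithm alg : algs) {
    JetDefinition jet_def(alg, R);
    ClusterSequence cs(particles, jet_def);
    std::vector<PseudoJet> jets = sorted_by_pt(cs.inclusive_jets());
    std::printf("%-60s %zu jets, hardest pt = %.1f GeV\n",
                jet_def.description().c_str(), jets.size(), jets[0].perp());
  }
  return 0;
}
```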
SISCone
The SISCone jet algorithm finds all stable cones in an event and
then runs a Tevatron Run II type split-merge procedure on them.
It is similar in many respects to the midpoint cone algorithm (used
at Tevatron) and other iterative cones with split-merge steps, but
without their infrared-safety issues.
Here it has been used with a split-merge overlap threshold of
f=0.75 (which limits monster-jet formation), no pt
threshold on stable cones, and an infinite number of passes.
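In FastJet, SISCone is provided as a plugin. A sketch of the corresponding jet definition with the settings just quoted (the value of R is illustrative):

```cpp
#include "fastjet/ClusterSequence.hh"
#include "fastjet/SISConePlugin.hh"  // needs FastJet built with SISCone support
using namespace fastjet;

double R          = 0.7;   // cone radius (illustrative)
double f          = 0.75;  // split-merge overlap threshold
int    n_pass_max = 0;     // 0 = unlimited number of passes
double ptmin      = 0.0;   // no pt threshold on stable cones

SISConePlugin siscone(R, f, n_pass_max, ptmin);
JetDefinition jet_def(&siscone);
// clustering then proceeds exactly as for the native algorithms:
//   ClusterSequence cs(particles, jet_def);
```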
Cambridge/Aachen with filtering
This consists of the Cambridge/Aachen algorithm with an
additional filtering
procedure: subsequent to the jet finding, each jet is
unclustered down to subjets at an angular scale xfilt·R, and
one retains only the nfilt hardest of the subjets. We
use xfilt=0.5 and nfilt=2.
Filtering is designed to limit sensitivity to the underlying
event and pileup, while retaining the bulk of perturbative
radiation. The parameters of the filtering might well be an
interesting subject for further study.
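The results on this page were produced with FastJet 2.3, where this step was coded by hand; FastJet 3 later packaged the same procedure as a tool. A sketch with the parameters above (R is illustrative):

```cpp
#include "fastjet/ClusterSequence.hh"
#include "fastjet/Selector.hh"
#include "fastjet/tools/Filter.hh"  // available from FastJet 3.0 onwards
using namespace fastjet;

double R     = 1.0;   // radius of the original Cambridge/Aachen finding (illustrative)
double xfilt = 0.5;   // subjet angular scale, as a fraction of R
int    nfilt = 2;     // number of hardest subjets to keep

// recluster each jet's constituents with C/A at xfilt*R,
// then keep only the nfilt hardest subjets
Filter filter(JetDefinition(cambridge_algorithm, xfilt * R),
              SelectorNHardest(nfilt));
// usage, for each C/A jet:  PseudoJet filtered = filter(jet);
```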
R
The jet-radius parameter for the jet definition.
For an event that has one hard particle and one soft one in a
common neighbourhood, R corresponds to the distance in the
rapidity-azimuth plane below which the two particles will be
combined into a single jet.
For situations involving two hard particles in a neighbourhood,
and for more complex events, the relation between R and the
clustering will depend on the details of the jet algorithm.
ρ_L
The extra factor in luminosity that is needed to obtain a
signal-to-background significance that is as good as that with the
best jet definition (for this process and energy scale, with no
pileup and no subtraction).
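A sketch of the underlying reasoning, assuming the signal in a mass window scales with the contained fraction f, the background with the window width w, and both with the luminosity L:

```latex
% significance of a peak in a window of width w containing a fraction f
\Sigma = \frac{S}{\sqrt{B}} \propto \sqrt{L}\,\frac{f}{\sqrt{w}}
\qquad\Longrightarrow\qquad
\rho_L = \left(\frac{\Sigma_{\rm best}}{\Sigma}\right)^{2}
\quad\text{(at equal $L$)} .
```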
Quality-measure plots
In the plots of quality measures versus R, the green line indicates
the best value obtained across all jet definitions for the given
process and energy scale (without pileup or subtraction).
Q^w_{f=z}
A quality measure defined as the width of the smallest mass
window that contains a fraction f of the generated massive
objects. Smaller values indicate a better jet definition.
The value used for the fraction f depends on the process, and
has been chosen so that one considers roughly 25% of the events that
pass the event selection (or if "x 2" is checked, 50%).
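A sketch of how such a measure can be computed, assuming one reconstructed mass entry per generated object (the function name is illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Q^w_{f=z}: width of the smallest mass window containing a fraction f
// of the generated massive objects (here: of the entries in `masses`)
double smallest_window_width(std::vector<double> masses, double f) {
  std::sort(masses.begin(), masses.end());
  const std::size_t n = masses.size();
  const std::size_t k = static_cast<std::size_t>(std::ceil(f * n));  // entries required
  double best = masses.back() - masses.front();
  // slide a block of k consecutive sorted masses, take the minimum spread
  for (std::size_t i = 0; i + k <= n; ++i)
    best = std::min(best, masses[i + k - 1] - masses[i]);
  return best;
}
```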
Q^{1/f}_{w=x√M}
For this quality measure, one takes a mass window of width w,
positioning it so as to maximise its contents. The quality measure
itself is given by 1/f, the inverse of the fraction of the generated
massive objects that are contained in the mass window. Smaller
values indicate a better jet definition.
The value used for the window width w is 1.25√M,
corresponding to a typical experimental energy resolution for jets.
If "x 2" is checked, the window width is doubled.
x 2
The values for the fraction of events f in
Q^w_{f=z} and for the window width w in
Q^{1/f}_{w=x√M} are, to some extent, arbitrary choices.
When "x 2" is checked, the default choices for those values are
doubled. This allows one to gauge the degree to which any physics
conclusions might depend on those choices.
In the few cases where the resulting best-R value changes
noticeably, an examination of the histograms usually provides
insight into what is occurring.
Rebinning
One's impression of the quality of the peak can depend on the
binning chosen for the histogram. If your intuition disagrees with
the quality measures as to what constitutes the best peak, try
choosing a binning whose width is of the same order as the width of
the shaded band.
qq
The qq case allows one to examine the mass reconstruction quality
for qq→Z'→qq events.
The Z', which has been given an
artificially small width, serves as a physically well-defined source
of mono-energetic quark jets.
gg
The gg case allows one to examine the mass reconstruction quality
for gg→H→gg events.
The H, which has been given an
artificially small width (and sometimes artificially high masses),
serves as a physically well-defined source of mono-energetic gluon
jets.
mass
Vary the mass to see how the jet-finding is affected by the energy
scale of the process.
A good jet definition at one given energy scale may not be optimal
at all energy scales. One reason for this is that the energy
resolution for reconstructing a jet is affected by an interplay
between underlying-event (UE) contamination and loss of perturbative
radiation from the "parton" that induced the jet
(cf. arXiv:0712.3014). The former should mostly be independent of
jet energy, but the latter isn't.
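Schematically (following arXiv:0712.3014, keeping only the parametric behaviour, with Λ_UE the UE transverse-momentum density per unit rapidity):

```latex
\langle \delta p_t \rangle_{\rm pert} \sim -\,\frac{\alpha_s}{\pi}\,p_t\,\ln\frac{1}{R},
\qquad
\langle \delta p_t \rangle_{\rm UE} \sim +\,\Lambda_{\rm UE}\,\frac{R^2}{2},
```

so the perturbative loss grows with the jet pt while the UE contamination does not, pushing the best R upwards at higher mass scales.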
pileup
Pileup degrades mass resolution because it adds extra noise to each jet.
To help appreciate the normalisation quoted for the pileup,
0.05 mb^-1/ev corresponds to an average of 5 minimum-bias
events added to each hard event, distributed according to a
Poissonian. This is foreseen to be the level during the first
years of running of the LHC.
0.25 mb^-1/ev, corresponding to an average of 25
minimum-bias events per hard event, is the level expected for the
high-luminosity phase of the LHC.
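The conversion is simply the luminosity per event times the minimum-bias cross section; the numbers quoted here correspond to taking σ_MB ≈ 100 mb:

```latex
\langle n_{\rm PU}\rangle = \mathcal{L}_{\rm ev}\,\sigma_{\rm MB}
= 0.05\;{\rm mb}^{-1} \times 100\;{\rm mb} = 5 .
```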
Subtraction
The LHC experiments will probably attempt some form of subtraction
of pileup contamination from their events. Various methods exist,
some of which are specific to a given detector.
To establish the impact of subtraction, here you can turn on the
(experiment-independent) area-based subtraction
of arXiv:0707.1378. This
uses an event-by-event estimation of the level of noise, which is
then used to correct each jet individually according to its area.
Note that subtraction here also removes part of the
underlying-event activity.
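The correction applied to each jet is, schematically, pt → pt − ρ·A_jet, with ρ the event-by-event noise level and A_jet the jet's area. The results here were obtained with FastJet 2.3; the sketch below uses the equivalent tools bundled with FastJet ≥ 3.0 (the radii, ghost settings and rapidity ranges are illustrative):

```cpp
#include "fastjet/ClusterSequenceArea.hh"
#include "fastjet/Selector.hh"
#include "fastjet/tools/JetMedianBackgroundEstimator.hh"
#include "fastjet/tools/Subtractor.hh"
#include <vector>
using namespace fastjet;

std::vector<PseudoJet> subtracted_jets(const std::vector<PseudoJet> & particles) {
  // cluster with active ghost areas so that each jet acquires an area
  JetDefinition jet_def(antikt_algorithm, 0.7);
  AreaDefinition area_def(active_area, GhostedAreaSpec(5.0));  // ghosts up to |y| = 5
  ClusterSequenceArea cs(particles, jet_def, area_def);

  // event-by-event noise level rho: median of pt/area over central jets,
  // estimated with the kt algorithm as in arXiv:0707.1378
  JetMedianBackgroundEstimator bge(SelectorAbsRapMax(4.0),
                                   JetDefinition(kt_algorithm, 0.6),
                                   area_def);
  bge.set_particles(particles);

  // correct each jet individually according to its area: pt -> pt - rho * A
  Subtractor subtractor(&bge);
  return subtractor(sorted_by_pt(cs.inclusive_jets()));
}
```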