diff --git a/ChangeLog b/ChangeLog
--- a/ChangeLog
+++ b/ChangeLog
@@ -1,4610 +1,4635 @@
+2015-03-16 Andy Buckley <andy.buckley@cern.ch>
+
+ * Adding Cuts-based constructor to PrimaryHadrons.
+
+ * Adding missing compare() method to HeavyHadrons projection.
+
+2015-03-05 Andy Buckley <andy.buckley@cern.ch>
+
+ * Adding Cuts-based constructors and other tidying in UnstableFinalState and HeavyHadrons
+
+2015-03-03 Andy Buckley <andy.buckley@cern.ch>
+
+ * Add support for a PLOT meta-file argument to rivet-cmphistos.
+
+2015-02-27 Andy Buckley <andy.buckley@cern.ch>
+
+ * Improved time reporting.
+
+2015-02-24 Andy Buckley <andy.buckley@cern.ch>
+
+ * Add Particle::fromHadron and Particle::fromPromptTau, and add a
+ boolean 'prompt' argument to Particle::fromTau.
+
+ * Fix WFinder use-transverse-mass property setting. Thanks to Christian Gutschow.
+
2015-02-04 Andy Buckley <andy.buckley@cern.ch>
* Add more protection against math domain errors with log axes.
* Add some protection against nan-valued points and error bars in make-plots.
2015-02-03 Andy Buckley <andy.buckley@cern.ch>
* Converting 'bitwise' to 'logical' Cuts combinations in all analyses.
2015-02-02 Andy Buckley <andy.buckley@cern.ch>
* Use vector MET rather than scalar VET (doh...) in WFinder
cut. Thanks to Ines Ochoa for the bug report.
* Updating and tidying analyses with deprecation warnings.
* Adding more Cuts/FS constructors for Charged,Neutral,UnstableFinalState.
* Add &&, || and ! operators for without-parens-warnings Cut
combining. Note these don't short-circuit, but this is ok since
Cut comparisons don't have side-effects.
* Add absetaIn, absrapIn Cut range definitions.
* Updating use of sorted particle/jet access methods and cmp
functors in projections and analyses.
2014-12-09 Andy Buckley <andy.buckley@cern.ch>
* Adding a --cmd arg to rivet-buildplugin to allow the output
paths to be sed'ed (to help deal with naive Grid
distribution). For example:
BUILDROOT=`rivet-config --prefix`;
rivet-buildplugin PHOTONS.cc --cmd | sed -e "s:$BUILDROOT:$SITEROOT:g"
2014-11-26 Andy Buckley <andy.buckley@cern.ch>
* Interface improvements in DressedLeptons constructor.
* Adding DEPRECATED macro to throw compiler deprecation warnings when using deprecated features.
2014-11-25 Andy Buckley <andy.buckley@cern.ch>
* Adding Cut-based constructors, and various constructors with
lists of PDG codes to IdentifiedFinalState.
2014-11-20 Andy Buckley <andy.buckley@cern.ch>
* Analysis updates (ATLAS, CMS, CDF, D0) to apply the changes below.
* Adding JetAlg jets(Cut, Sorter) methods, and other interface
improvements for cut and sorted ParticleBase retrieval from JetAlg
and ParticleFinder projections. Some old many-doubles versions
removed, syntactic sugar sorting methods deprecated.
* Adding Cuts::Et and Cuts::ptIn, Cuts::etIn, Cuts::massIn.
* Moving FastJet includes, conversions, uses etc. into Tools/RivetFastJet.hh
2014-10-07 Andy Buckley <andy.buckley@cern.ch>
* Fix a bug in the isCharmHadron(pid) function and remove isStrange* functions.
2014-09-30 Andy Buckley <andy.buckley@cern.ch>
* 2.2.0 release!
* Mark Jet::containsBottom and Jet::containsCharm as deprecated
methods: use the new methods. Analyses updated.
* Add Jet::bTagged(), Jet::cTagged() and Jet::tauTagged() as
ghost-assoc-based replacements for the 'contains' tagging methods.
2014-09-17 Andy Buckley <andy.buckley@cern.ch>
* Adding support for 1D and 3D YODA scatters, and helper methods
for calling the efficiency, asymm and 2D histo divide functions.
2014-09-12 Andy Buckley <andy.buckley@cern.ch>
* Adding 5 new ATLAS analyses:
ATLAS_2011_I921594: Inclusive isolated prompt photon analysis with full 2010 LHC data
ATLAS_2013_I1263495: Inclusive isolated prompt photon analysis with 2011 LHC data
ATLAS_2014_I1279489: Measurements of electroweak production of dijets + $Z$ boson, and distributions sensitive to vector boson fusion
ATLAS_2014_I1282441: The differential production cross section of the $\phi(1020)$ meson in $\sqrt{s}=7$ TeV $pp$ collisions measured with the ATLAS detector
ATLAS_2014_I1298811: Leading jet underlying event at 7 TeV in ATLAS
* Adding a median(vector<NUM>) function and fixing the other stats
functions to operate on vector<NUM> rather than vector<int>.
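A generic median over vector<NUM> can be sketched standalone as below (an illustration of the technique only, not the Rivet implementation; the real function lives in the Rivet tools headers):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Median of an arbitrary numeric vector: sort a copy and take the middle
// element, averaging the two central elements for even-length input.
template <typename NUM>
double median(std::vector<NUM> v) {
  std::sort(v.begin(), v.end());
  const size_t n = v.size();
  if (n % 2 == 1) return v[n/2];
  return 0.5 * (v[n/2 - 1] + v[n/2]);
}
```

Taking the vector by value keeps the caller's container unmodified by the sort.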
2014-09-03 Andy Buckley <andy.buckley@cern.ch>
* Fix wrong behaviour of LorentzTransform with a null boost vector
-- thanks to Michael Grosse.
2014-08-26 Andy Buckley <andy.buckley@cern.ch>
* Add calc() methods to Hemispheres as requested, to allow it to
be used with Jet or FourMomentum inputs outside the normal
projection system.
2014-08-17 Andy Buckley <andy.buckley@cern.ch>
* Improvements to the particles methods on
ParticleFinder/FinalState, in particular adding the range of cuts
arguments cf. JetAlg (and tweaking the sorted jets equivalent) and
returning as a copy rather than a reference if cut/sorted to avoid
accidentally messing up the cached copy.
* Creating ParticleFinder projection base class, and moving
Particles-accessing methods from FinalState into it.
* Adding basic forms of MC_ELECTRONS, MC_MUONS, and MC_TAUS
analyses.
2014-08-15 Andy Buckley <andy.buckley@cern.ch>
* Version bump to 2.2.0beta1 for use at BOOST and MCnet school.
2014-08-13 Andy Buckley <andy.buckley@cern.ch>
* New analyses:
ATLAS_2014_I1268975 (high mass dijet cross-section at 7 TeV)
ATLAS_2014_I1304688 (jet multiplicity and pT at 7 TeV)
ATLAS_2014_I1307756 (scalar diphoton resonance search at 8 TeV -- no histograms!)
CMSTOTEM_2014_I1294140 (charged particle pseudorapidity at 8 TeV)
2014-08-09 Andy Buckley <andy.buckley@cern.ch>
* Adding PromptFinalState, based on code submitted by Alex
Grohsjean and Will Bell. Thanks!
2014-08-06 Andy Buckley <andy.buckley@cern.ch>
* Adding MC_HFJETS and MC_JETTAGS analyses.
2014-08-05 Andy Buckley <andy.buckley@cern.ch>
* Update all analyses to use the xMin/Max/Mid, xMean, xWidth,
etc. methods on YODA classes rather than the deprecated lowEdge
etc.
* Merge new HasPID functor from Holger Schulz into
Rivet/Tools/ParticleUtils.hh, mainly for use with the any()
function in Rivet/Tools/Utils.hh
2014-08-04 Andy Buckley <andy.buckley@cern.ch>
* Add ghost tagging of charms, bottoms and taus to FastJets, and
tag info accessors to Jet.
* Add constructors from and cast operators to FastJet's PseudoJet
object from Particle and Jet.
* Convert inRange to not use fuzzy comparisons on closed
intervals, providing old version as fuzzyInRange.
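The behavioural difference can be sketched with standalone stand-ins (hypothetical simplified versions, not the MathUtils.hh implementations):

```cpp
#include <cassert>

// Exact closed-interval containment: no floating-point tolerance.
inline bool inRange(double x, double lo, double hi) {
  return x >= lo && x <= hi;
}

// Fuzzy variant: also accepts values within a small tolerance of the
// interval edges, as the pre-change inRange behaviour did.
inline bool fuzzyInRange(double x, double lo, double hi, double tol = 1e-9) {
  return (x > lo - tol) && (x < hi + tol);
}
```

A value sitting a rounding error above the upper edge fails the exact test but passes the fuzzy one.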
2014-07-30 Andy Buckley <andy.buckley@cern.ch>
* Remove classifier functions accepting a Particle from the PID
inner namespace.
2014-07-29 Andy Buckley <andy.buckley@cern.ch>
* MC_JetAnalysis.cc: re-enable +- ratios for eta and y, now that
YODA divide doesn't throw an exception.
* ATLAS_2012_I1093734: fix a loop index error which led to the
first bin value being unfilled for half the dphi plots.
* Fix accidental passing of a GenParticle pointer as a PID code
int in HeavyHadrons.cc. Effect limited to incorrect deductions
about excited HF decay chains and should be small. Thanks to
Tomasz Przedzinski for finding and reporting the issue during
HepMC3 design work!
2014-07-23 Andy Buckley <andy.buckley@cern.ch>
* Fix to logspace: make sure that start and end values are exact,
not the result of exp(log(x)).
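The fix can be sketched standalone (an assumed approach for illustration, not Rivet's actual code): compute the interior edges in log space, then overwrite the endpoints with the exact inputs, since exp(log(x)) can differ from x in the last bit.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Log-spaced bin edges with the endpoints pinned exactly, so that
// edges.front() == start and edges.back() == end bit-for-bit.
inline std::vector<double> logspace(size_t nbins, double start, double end) {
  std::vector<double> edges(nbins + 1);
  const double logstart = std::log(start), logend = std::log(end);
  for (size_t i = 0; i <= nbins; ++i)
    edges[i] = std::exp(logstart + i * (logend - logstart) / nbins);
  edges.front() = start;  // overwrite the round-tripped endpoint values
  edges.back() = end;
  return edges;
}
```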
2014-07-16 Andy Buckley <andy.buckley@cern.ch>
* Fix setting of library paths for doc building: Python can't
influence the dynamic loader in its own process by setting an
environment variable because the loader only looks at the variable
once, when it starts.
2014-07-02 Andy Buckley <andy.buckley@cern.ch>
* rivet-cmphistos now uses the generic yoda.read() function rather
than readYODA() -- AIDA files can also be compared and plotted
directly now.
2014-06-24 Andy Buckley <andy.buckley@cern.ch>
* Add stupid missing <string> include and std:: prefix in Rivet.hh
2014-06-20 Holger Schulz <hschulz@physik.hu-berlin.de>
* bin/make-plots: Automatic generation of minor xtick labels if LogX is requested
but data resides e.g. in [200, 700]. Fixes m_12 plots of, e.g.
ATLAS_2010_S8817804
2014-06-17 David Grellscheid <David.Grellscheid@durham.ac.uk>
* pyext/rivet/Makefile.am: 'make distcheck' and out-of-source
builds should work now.
2014-06-10 Andy Buckley <andy.buckley@cern.ch>
* Fix use of the install command for bash completion installation on Macs.
2014-06-07 Andy Buckley <andy.buckley@cern.ch>
* Removing direct includes of MathUtils.hh and others from analysis code files.
2014-06-02 Andy Buckley <andy.buckley@cern.ch>
* Rivet 2.1.2 release!
2014-05-30 Andy Buckley <andy.buckley@cern.ch>
* Using Particle absrap(), abseta() and abspid() where automatic conversion was feasible.
* Adding a few extra kinematics mappings to ParticleBase.
* Adding p3() accessors to the 3-momentum on FourMomentum, Particle, and Jet.
* Using Jet and Particle kinematics methods directly (without momentum()) where possible.
* More tweaks to make-plots 2D histo parsing behaviour.
2014-05-30 Holger Schulz <hschulz@physik.hu-berlin.de>
* Actually fill the XQ 2D histo, .plot decorations.
* Have make-plots produce colourmaps using YODA_3D_SCATTER
objects. Remove the grid in colourmaps.
* Some tweaks for the SFM analysis, trying to contact Martin Wunsch
who did the unfolding back then.
2014-05-29 Holger Schulz <hschulz@physik.hu-berlin.de>
* Re-enable 2D histo in MC_PDFS
2014-05-28 Andy Buckley <andy.buckley@cern.ch>
* Updating analysis and project routines to use Particle::pid() by
preference to Particle::pdgId(), and Particle::abspid() by
preference to abs(Particle::pdgId()), etc.
* Adding interfacing of smart pointer types and booking etc. for
YODA 2D histograms and profiles.
* Improving ParticleIdUtils and ParticleUtils functions based on
merging of improved function collections from MCUtils, and
dropping the compiled ParticleIdUtils.cc file.
2014-05-27 Andy Buckley <andy.buckley@cern.ch>
* Adding CMS_2012_I1090423 (dijet angular distributions),
CMS_2013_I1256943 (Zbb xsec and angular correlations),
CMS_2013_I1261026 (jet and UE properties vs. Nch) and
D0_2000_I499943 (bbbar production xsec and angular correlations).
2014-05-26 Andy Buckley <andy.buckley@cern.ch>
* Fixing a bug in plot file handling, and adding a texpand()
routine to rivet.util, to be used to expand some 'standard'
physics TeX macros.
* Adding ATLAS_2012_I1124167 (min bias event shapes),
ATLAS_2012_I1203852 (ZZ cross-section), and ATLAS_2013_I1190187
(WW cross-section) analyses.
2014-05-16 Andy Buckley <andy.buckley@cern.ch>
* Adding any(iterable, fn) and all(iterable, fn) template functions for convenience.
2014-05-15 Holger Schulz <holger.schulz@cern.ch>
* Fix some bugs in identified hadron PIDs in OPAL_1998_S3749908.
2014-05-13 Andy Buckley <andy.buckley@cern.ch>
* Writing out [UNVALIDATED], [PRELIMINARY], etc. in the
--list-analyses output if analysis is not VALIDATED.
2014-05-12 Andy Buckley <andy.buckley@cern.ch>
* Adding CMS_2013_I1265659 colour coherence analysis.
2014-05-07 Andy Buckley <andy.buckley@cern.ch>
* Bug fixes in CMS_2013_I1209721 from Giulio Lenzi.
* Fixing compiler warnings from clang, including one which
indicated a misapplied cut bug in CDF_2006_S6653332.
2014-05-05 Andy Buckley <andy.buckley@cern.ch>
* Fix missing abs() in Particle::abspid()!!!!
2014-04-14 Andy Buckley <andy.buckley@cern.ch>
* Adding the namespace protection workaround for Boost described
at http://www.boost.org/doc/libs/1_55_0/doc/html/foreach.html
2014-04-13 Andy Buckley <andy.buckley@cern.ch>
* Adding a rivet.pc template file and installation rule for pkg-config to use.
* Updating data/refdata/ALEPH_2001_S4656318.yoda to corrected version in HepData.
2014-03-27 Andy Buckley <andy.buckley@cern.ch>
* Flattening PNG output of make-plots (i.e. no transparency) and other tweaks.
2014-03-23 Andy Buckley <andy.buckley@cern.ch>
* Renaming the internal meta-particle class in DressedLeptons (and
exposed in the W/ZFinders) from ClusteredLepton to DressedLepton
for consistency with the change in name of its containing class.
* Removing need for cmake and unportable yaml-cpp trickery by
using libtool to build an embedded symbol-mangled copy of yaml-cpp
rather than trying to mangle and build direct from the tarball.
2014-03-10 Andy Buckley <andy.buckley@cern.ch>
* Rivet 2.1.1 release.
2014-03-07 Andy Buckley <andy.buckley@cern.ch>
* Adding ATLAS multilepton search (no ref data file), ATLAS_2012_I1204447.
2014-03-05 Andy Buckley <andy.buckley@cern.ch>
* Also renaming Breit-Wigner functions to cdfBW, invcdfBW and bwspace.
* Renaming index_between() to the more Rivety binIndex(), since that's the only real use of such a function... plus a bit of SFINAE type relaxation trickery.
2014-03-04 Andy Buckley <andy.buckley@cern.ch>
* Adding programmatic access to final histograms via AnalysisHandler::getData().
* Adding CMS 4 jet correlations analysis, CMS_2013_I1273574.
* Adding CMS W + 2 jet double parton scattering analysis, CMS_2013_I1272853.
* Adding ATLAS isolated diphoton measurement, ATLAS_2012_I1199269.
* Improving the index_between function so the numeric types don't
have to exactly match.
* Adding better momentum comparison functors and sortBy, sortByX
functions to use them easily on containers of Particle, Jet, and
FourMomentum.
2014-02-10 Andy Buckley <andy.buckley@cern.ch>
* Removing duplicate and unused ParticleBase sorting functors.
* Removing unused HT increment and units in ATLAS_2012_I1180197 (unvalidated SUSY).
* Fixing photon isolation logic bug in CMS_2013_I1258128 (Z rapidity).
* Replacing internal uses of #include Rivet/Rivet.hh with
Rivet/Config/RivetCommon.hh, removing the MAXRAPIDITY const, and
repurposing Rivet/Rivet.hh as a convenience include for external
API users.
* Adding isStable, children, allDescendants, stableDescendants,
and flightLength functions to Particle.
* Replacing Particle and Jet deltaX functions with generic ones on
ParticleBase, and adding deltaRap variants.
* Adding a Jet.fhh forward declaration header, including fastjet::PseudoJet.
* Adding a RivetCommon.hh header to allow Rivet.hh to be used externally.
* Fixing HeavyHadrons to apply pT cuts if specified.
2014-02-06 Andy Buckley <andy.buckley@cern.ch>
* 2.1.0 release!
2014-02-05 Andy Buckley <andy.buckley@cern.ch>
* Protect against invalid prefix value if the --prefix configure option is unused.
2014-02-03 Andy Buckley <andy.buckley@cern.ch>
* Adding the ATLAS_2012_I1093734 fwd-bwd / azimuthal minbias correlations analysis.
* Adding the LHCB_2013_I1208105 forward energy flow analysis.
2014-01-31 Andy Buckley <andy.buckley@cern.ch>
* Checking the YODA minimum version in the configure script.
* Fixing the JADE_OPAL analysis ycut values to the midpoints,
thanks to information from Christoph Pahl / Stefan Kluth.
2014-01-29 Andy Buckley <andy.buckley@cern.ch>
* Removing unused/overrestrictive Isolation* headers.
2014-01-27 Andy Buckley <andy.buckley@cern.ch>
* Re-bundling yaml-cpp, now built as a mangled static lib based on
the LHAPDF6 experience.
* Throw a UserError rather than an assert if AnalysisHandler::init
is called more than once.
2014-01-25 David Grellscheid <david.grellscheid@durham.ac.uk>
* src/Core/Cuts.cc: New Cuts machinery, already used in FinalState.
Old-style "mineta, maxeta, minpt" constructors kept around for ease of
transition. Minimal set of convenience functions available, like EtaIn(),
should be expanded as needed.
2014-01-22 Andy Buckley <andy.buckley@cern.ch>
* configure.ac: Remove opportunistic C++11 build, until this
becomes mandatory (in version 2.2.0?). Anyone who wants C++11 can
explicitly set the CXXFLAGS (and DYLDFLAGS for pre-Mavericks Macs)
2014-01-21 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* src/Core/Analysis.cc: Fixed bug in Analysis::isCompatible where
an 'abs' was left out when checking that beam energies do not
differ by more than 1 GeV.
* src/Analyses/CMS_2011_S8978280.cc: Fixed checking of beam energy
and booking corresponding histograms.
2013-12-19 Andy Buckley <andy.buckley@cern.ch>
* Adding pid() and abspid() methods to Particle.
* Adding hasCharm and hasBottom methods to Particle.
* Adding a sorting functor arg version of the ZFinder::constituents() method.
* Adding pTmin cut accessors to HeavyHadrons.
* Tweak to the WFinder constructor to place the target W (trans) mass argument last.
2013-12-18 Andy Buckley <andy.buckley@cern.ch>
* Adding a GenParticle* cast operator to Particle, removing the
Particle and Jet copies of the momentum cmp functors, and general
tidying/improvement/unification of the momentum properties of jets
and particles.
2013-12-17 Andy Buckley <andy.buckley@cern.ch>
* Using SFINAE techniques to improve the math util functions.
* Adding isNeutrino to ParticleIdUtils, and
isHadron/isMeson/isBaryon/isLepton/isNeutrino methods to Particle.
* Adding a FourMomentum cast operator to ParticleBase, so that
Particle and Jet objects can be used directly as FourMomentums.
2013-12-16 Andy Buckley <andy@duality>
* LeptonClusters renamed to DressedLeptons.
* Adding singular particle accessor functions to WFinder and ZFinder.
* Removing ClusteredPhotons and converting ATLAS_2010_S8919674.
2013-12-12 Andy Buckley <andy.buckley@cern.ch>
* Fixing a problem with --disable-analyses (thanks to David Hall)
* Require FastJet version 3.
* Bumped version to 2.1.0a0
* Adding -DNDEBUG to the default build flags, unless in --enable-debug mode.
* Adding a special treatment of RIVET_*_PATH variables: if they
end in :: the default search paths will not be appended. Used
primarily to restrict the doc builds to look only inside the build
dirs, but potentially also useful in other special circumstances.
* Adding a definition of exec_prefix to rivet-buildplugin.
* Adding -DNDEBUG to the default non-debug build flags.
2013-11-27 Andy Buckley <andy.buckley@cern.ch>
* Removing accidentally still-present no-as-needed linker flag from rivet-config.
* Lots of analysis clean-up and migration to use new features and W/Z finder APIs.
* More momentum method forwarding on ParticleBase and adding
abseta(), absrap() etc. functions.
* Adding the DEFAULT_RIVET_ANA_CONSTRUCTOR cosmetic macro.
* Adding deltaRap() etc. function variations
* Adding no-decay photon clustering option to WFinder and ZFinder,
and replacing opaque bool args with enums.
* Adding an option for ignoring photons from hadron/tau decays in LeptonClusters.
2013-11-22 Andy Buckley <andy.buckley@cern.ch>
* Adding Particle::fromBottom/Charm/Tau() members. LHCb were
already mocking this up, so it seemed sensible to add it to the
interface as a more popular (and even less dangerous) version of
hasAncestor().
* Adding an empty() member to the JetAlg interface.
2013-11-07 Andy Buckley <andy.buckley@cern.ch>
* Adding the GSL lib path to the library path in the env scripts
and the rivet-config --ldflags output.
2013-10-25 Andy Buckley <andy.buckley@cern.ch>
* 2.0.0 release!!!!!!
2013-10-24 Andy Buckley <andy.buckley@cern.ch>
* Supporting zsh completion via bash completion compatibility.
2013-10-22 Andy Buckley <andy.buckley@cern.ch>
* Updating the manual to describe YODA rather than AIDA and the new rivet-cmphistos script.
* bin/make-plots: Adding paths to error messages in histogram combination.
* CDF_2005_S6217184: fixes to low stats errors and final scatter plot binning.
2013-10-21 Andy Buckley <andy.buckley@cern.ch>
* Several small fixes in jet shape analyses, SFM_1984, etc. found
in the last H++ validation run.
2013-10-18 Andy Buckley <andy.buckley@cern.ch>
* Updates to configure and the rivetenv scripts to try harder to discover YODA.
2013-09-26 Andy Buckley <andy.buckley@cern.ch>
* Now bundling Cython-generated files in the tarballs, so Cython
is not a build requirement for non-developers.
2013-09-24 Andy Buckley <andy.buckley@cern.ch>
* Removing unnecessary uses of a momentum() indirection when
accessing kinematic variables.
* Clean-up in Jet, Particle, and ParticleBase, in particular
splitting PID functions on Particle from those on PID codes, and
adding convenience kinematic functions to ParticleBase.
2013-09-23 Andy Buckley <andy.buckley@cern.ch>
* Add the -avoid-version flag to libtool.
* Final analysis histogramming issues resolved.
2013-08-16 Andy Buckley <andy.buckley@cern.ch>
* Adding a ConnectBins flag in make-plots, to decide whether to
connect adjacent, gapless bins with a vertical line. Enabled by
default (good for the step-histo default look of MC lines), but
now rivet-cmphistos disables it for the reference data.
2013-08-14 Andy Buckley <andy.buckley@cern.ch>
* Making 2.0.0beta3 -- just a few analysis migration issues
remaining, but it's worth making another beta since there
were lots of framework fixes/improvements.
2013-08-11 Andy Buckley <andy.buckley@cern.ch>
* ARGUS_1993_S2669951 also fixed using scatter autobooking.
* Fixing remaining issues with booking in BABAR_2007_S7266081
using the feature below (far nicer than hard-coding).
* Adding a copy_pts param to some Analysis::bookScatter2D methods:
pre-setting the points with x values is sometimes genuinely
useful.
2013-07-26 Andy Buckley <andy.buckley@cern.ch>
* Removed the (officially) obsolete CDF 2008 LEADINGJETS and
NOTE_9351 underlying event analyses -- superseded by the proper
versions of these analyses based on the final combined paper.
* Removed the semi-demo Multiplicity projection -- only the
EXAMPLE analysis and the trivial ALEPH_1991_S2435284 needed
adaptation.
2013-07-24 Andy Buckley <andy.buckley@cern.ch>
* Adding a rejection of histo paths containing /TMP/ from the
writeData function. Use this to handle booked temporary
histograms... for now.
2013-07-23 Andy Buckley <andy.buckley@cern.ch>
* Make rivet-cmphistos _not_ draw a ratio plot if there is only one line.
* Improvements and fixes to HepData lookup with rivet-mkanalysis.
2013-07-22 Andy Buckley <andy.buckley@cern.ch>
* Add -std=c++11 or -std=c++0x to the Rivet compiler flags if supported.
* Various fixes to analyses with non-zero numerical diffs.
2013-06-12 Andy Buckley <andy.buckley@cern.ch>
* Adding a new HeavyHadrons projection.
* Adding optional extra include_end args to logspace() and linspace().
2013-06-11 Andy Buckley <andy.buckley@cern.ch>
* Moving Rivet/RivetXXX.hh tools headers into Rivet/Tools/.
* Adding PrimaryHadrons projection.
* Adding particles_in/out functions on GenParticle to RivetHepMC.
* Moved STL extensions from Utils.hh to RivetSTL.hh and tidying.
* Tidying, improving, extending, and documenting in RivetSTL.hh.
* Adding a #include of Logging.hh into Projection.hh, and removing
unnecessary #includes from all Projection headers.
2013-06-10 Andy Buckley <andy.buckley@cern.ch>
* Moving htmlify() and detex() Python functions into rivet.util.
* Add HepData URL for Inspire ID lookup to the rivet script.
* Fix analyses' info files which accidentally listed the Inspire
ID under the SpiresID metadata key.
2013-06-07 Andy Buckley <andy.buckley@cern.ch>
* Updating mk-analysis-html script to produce MathJax output
* Adding a version of Analysis::removeAnalysisObject() which takes
an AO pointer as its argument.
* bin/rivet: Adding pandoc-based conversion of TeX summary and
description strings to plain text on the terminal output.
* Add MathJax to rivet-mkhtml output, set up so the .info entries should render ok.
* Mark the OPAL 1993 analysis as UNVALIDATED: from the H++
benchmark runs it looks nothing like the data, and there are some
outstanding ambiguities.
2013-06-06 Andy Buckley <andy.buckley@cern.ch>
* Releasing 2.0.0b2 beta version.
2013-06-05 Andy Buckley <andy.buckley@cern.ch>
* Renaming Analysis::add() etc. to very explicit
addAnalysisObject(), sorting out shared_pointer polymorphism
issues via the Boost dynamic_pointer_cast, and adding a full set
of getHisto1D(), etc. explicitly named and typed accessors,
including ones with HepData dataset/axis ID signatures.
* Adding histo booking from an explicit reference Scatter2D (and
more placeholders for 2D histos / 3D scatters) and rewriting
existing autobooking to use this.
* Converting inappropriate uses of size_t to unsigned int in Analysis.
* Moving Analysis::addPlot to Analysis::add() (or reg()?) and
adding get() and remove() (or unreg()?)
* Fixing attempted abstraction of import fallbacks in rivet.util.import_ET().
* Removing broken attempt at histoDir() caching which led to all
histograms being registered under the same analysis name.
2013-06-04 Andy Buckley <andy.buckley@cern.ch>
* Updating the Cython version requirement to 0.18
* Adding Analysis::integrate() functions and tidying the Analysis.hh file a bit.
2013-06-03 Andy Buckley <andy.buckley@cern.ch>
* Adding explicit protection against using inf/nan scalefactors in
ATLAS_2011_S9131140 and H1_2000_S4129130.
* Making Analysis::scale noisily complain about invalid
scalefactors.
2013-05-31 Andy Buckley <andy.buckley@cern.ch>
* Reducing the TeX main memory to ~500MB. Turns out that it *can*
be too large with new versions of TeXLive!
2013-05-30 Andy Buckley <andy.buckley@cern.ch>
* Reverting bookScatter2D behaviour to never look at ref data, and
updating a few affected analyses. This should fix some bugs with
doubled datapoints introduced by the previous behaviour+addPoint.
* Adding a couple of minor Utils.hh and MathUtils.hh features.
2013-05-29 Andy Buckley <andy.buckley@cern.ch>
* Removing Constraints.hh header.
* Minor bugfixes and improvements in Scatter2D booking and MC_JetAnalysis.
2013-05-28 Andy Buckley <andy.buckley@cern.ch>
* Removing defunct HistoFormat.hh and HistoHandler.{hh,cc}
2013-05-27 Andy Buckley <andy.buckley@cern.ch>
* Removing includes of Logging.hh, RivetYODA.hh, and
ParticleIdUtils.hh from analyses (and adding an include of
ParticleIdUtils.hh to Analysis.hh)
* Removing now-unused .fhh files.
* Removing lots of unnecessary .fhh includes from core classes:
everything still compiling ok. A good opportunity to tidy this up
before the release.
* Moving the rivet-completion script from the data dir to bin (the
completion is for scripts in bin, and this makes development
easier).
* Updating bash completion scripts for YODA format and
compare-histos -> rivet-cmphistos.
2013-05-23 Andy Buckley <andy.buckley@cern.ch>
* Adding Doxy comments and a couple of useful convenience functions to Utils.hh.
* Final tweaks to ATLAS ttbar jet veto analysis (checked logic with Kiran Joshi).
2013-05-15 Andy Buckley <andy.buckley@cern.ch>
* Many 1.0 -> weight bugfixes in ATLAS_2011_I945498.
* yaml-cpp v3 support re-introduced in .info parsing.
* Lots of analysis clean-ups for YODA TODO issues.
2013-05-13 Andy Buckley <andy.buckley@cern.ch>
* Analysis histo booking improvements for Scatter2D, placeholders
for 2D histos, and general tidying.
2013-05-12 Andy Buckley <andy.buckley@cern.ch>
* Adding configure-time differentiation between yaml-cpp API versions 3 and 5.
2013-05-07 Andy Buckley <andy.buckley@cern.ch>
* Converting info file reading to use the yaml-cpp 0.5.x API.
2013-04-12 Andy Buckley <andy.buckley@cern.ch>
* Tagging as 2.0.0b1
2013-04-04 Andy Buckley <andy.buckley@cern.ch>
* Removing bundling of yaml-cpp: it needs to be installed by the
user / bootstrap script from now on.
2013-04-03 Andy Buckley <andy.buckley@cern.ch>
* Removing svn:external m4 directory, and converting Boost
detection to use better boost.m4 macros.
2013-03-22 Andy Buckley <andy.buckley@cern.ch>
* Moving PID consts to the PID namespace and corresponding code
updates and opportunistic clean-ups.
* Adding Particle::fromDecay() method.
2013-03-09 Andy Buckley <andy.buckley@cern.ch>
* Version bump to 2.0.0b1 in anticipation of first beta release.
* Adding many more 'popular' particle ID code named-consts and
aliases, and updating the RapScheme enum with ETA -> ETARAP, and
fixing affected analyses (plus other opportunistic tidying / minor
bug-fixing).
* Fixing a symbol misnaming in ATLAS_2012_I1119557.
2013-03-07 Andy Buckley <andy.buckley@cern.ch>
* Renaming existing uses of ParticleVector to the new 'Particles' type.
* Updating util classes, projections, and analyses to deal with
the HepMC return value changes.
* Adding new Particle(const GenParticle*) constructor.
* Converting Particle::genParticle() to return a const pointer
rather than a reference, for the same reason as below (+
consistency within Rivet and with the HepMC pointer-centric coding
design).
* Converting Event to use a different implementation of original
and modified GenParticles, and to manage the memory in a more
future-proof way. Event::genParticle() now returns a const pointer
rather than a reference, to signal that the user is leaving the
happy pastures of 'normal' Rivet behind.
* Adding a Particles typedef by analogy to Jets, and in preference
to the cumbersome ParticleVector.
* bin/: Lots of tidying/pruning of messy/defunct scripts.
* Creating spiresbib, util, and plotinfo rivet python module
submodules: this eliminates lighthisto and the standalone
spiresbib modules. Util contains convenience functions for Python
version testing, clean ElementTree import, and process renaming,
for primary use by the rivet-* scripts.
* Removing defunct scripts that have been replaced/obsoleted by YODA.
2013-03-06 Andy Buckley <andy.buckley@cern.ch>
* Fixing doc build so that the reference histos and titles are
~correctly documented. We may want to truncate some of the lists!
2013-03-06 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added ATLAS_2012_I1125575 analysis
* Converted rivet-mkhtml to yoda
* Introduced rivet-cmphistos as yoda based replacement for compare-histos
2013-03-05 Andy Buckley <andy.buckley@cern.ch>
* Replacing all AIDA ref data with YODA versions.
* Fixing the histograms entries in the documentation to be
tolerant to plotinfo loading failures.
* Making the findDatafile() function primarily find YODA data
files, then fall back to AIDA. The ref data loader will use the
appropriate YODA format reader.
2013-02-05 David Grellscheid <David.Grellscheid@durham.ac.uk>
* include/Rivet/Math/MathUtils.hh: added BWspace bin edge method
to give equal-area Breit-Wigner bins
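The equal-area construction amounts to mapping a uniform grid in cumulative probability through the inverse CDF of the Cauchy (non-relativistic Breit-Wigner) shape. A standalone sketch with a hypothetical signature (the real BWspace in MathUtils.hh may differ):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Equal-area Breit-Wigner bin edges between xmin and xmax: place edges at
// equally spaced cumulative probabilities using the Cauchy inverse CDF,
// x = mu + gamma * tan(pi * (p - 1/2)).
inline std::vector<double> bwspace(size_t nbins, double xmin, double xmax,
                                   double mu, double gamma) {
  const double pi = std::acos(-1.0);
  auto cdf = [&](double x) { return 0.5 + std::atan((x - mu) / gamma) / pi; };
  auto invcdf = [&](double p) { return mu + gamma * std::tan(pi * (p - 0.5)); };
  const double pmin = cdf(xmin), pmax = cdf(xmax);
  std::vector<double> edges(nbins + 1);
  for (size_t i = 0; i <= nbins; ++i)
    edges[i] = invcdf(pmin + i * (pmax - pmin) / nbins);
  edges.front() = xmin;  // pin the endpoints exactly
  edges.back() = xmax;
  return edges;
}
```

Each resulting bin then carries the same integrated Breit-Wigner probability.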
2013-02-01 Andy Buckley <andy.buckley@cern.ch>
* Adding an element to the PhiMapping enum and a new mapAngle(angle, mapping) function.
* Fixes to Vector3::azimuthalAngle and Vector3::polarAngle calculation (using the mapAngle functions).
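The angle-mapping conventions can be sketched in isolation; the names below mirror the Rivet ones, but these are simplified standalone versions for illustration, not the library code:

```cpp
#include <cassert>
#include <cmath>

// Map an angle in radians into [0, 2*pi).
inline double mapAngle0To2Pi(double a) {
  const double twopi = 2 * std::acos(-1.0);
  double r = std::fmod(a, twopi);       // fmod keeps the sign of the dividend
  if (r < 0) r += twopi;
  return r;
}

// Map an angle in radians into (-pi, pi].
inline double mapAngleMPiToPi(double a) {
  const double pi = std::acos(-1.0);
  double r = mapAngle0To2Pi(a);
  if (r > pi) r -= 2 * pi;
  return r;
}
```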
2013-01-25 Frank Siegert <frank.siegert@cern.ch>
* Split MC_*JETS analyses into three separate bits:
MC_*INC (inclusive properties)
MC_*JETS (jet properties)
MC_*KTSPLITTINGS (kT splitting scales).
2013-01-22 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fix TeX variable in the rivetenv scripts, especially for csh
2012-12-21 Andy Buckley <andy.buckley@cern.ch>
* Version 1.8.2 release!
2012-12-20 Andy Buckley <andy.buckley@cern.ch>
* Adding ATLAS_2012_I1119557 analysis (from Roman Lysak and Lily Asquith).
2012-12-18 Andy Buckley <andy.buckley@cern.ch>
* Adding TOTEM_2012_002 analysis, from Sercan Sen.
2012-12-18 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added CMS_2011_I954992 analysis
2012-12-17 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added CMS_2012_I1193338 analysis
* Fixed xi cut in ATLAS_2011_I894867
2012-12-17 Andy Buckley <andy.buckley@cern.ch>
* Adding analysis descriptions to the HTML analysis page ToC.
2012-12-14 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added CMS_2012_PAS_FWD_11_003 analysis
* Added LHCB_2012_I1119400 analysis
2012-12-12 Andy Buckley <andy.buckley@cern.ch>
* Correction to jet acceptance in CMS_2011_S9120041, from Sercan Sen: thanks!
2012-12-12 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added CMS_2012_PAS_QCD_11_010 analysis
2012-12-07 Andy Buckley <andy.buckley@cern.ch>
* Version number bump to 1.8.2 -- release approaching.
* Rewrite of ALICE_2012_I1181770 analysis to make it a bit more sane and acceptable.
* Adding a note on FourVector and FourMomentum that operator- and
operator-= invert both the space and time components: use of -=
can result in a vector with negative energy.
* Adding particlesByRapidity and particlesByAbsRapidity to FinalState.
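The subtraction caveat can be illustrated with a bare-bones stand-in (a hypothetical minimal struct, not Rivet's FourMomentum): subtracting a higher-energy vector leaves a negative time component.

```cpp
#include <cassert>

// Minimal four-vector stand-in: operator-= acts on the time (energy)
// component as well as the spatial ones.
struct FourVec {
  double E, px, py, pz;
  FourVec& operator-=(const FourVec& o) {
    E -= o.E; px -= o.px; py -= o.py; pz -= o.pz;
    return *this;
  }
};

inline FourVec operator-(FourVec a, const FourVec& b) { return a -= b; }
```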
2012-12-07 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added ALICE_2012_I1181770 analysis
* Bump version to 1.8.2
2012-12-06 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added ATLAS_2012_I1188891 analysis
* Added ATLAS_2012_I1118269 analysis
* Added CMS_2012_I1184941 analysis
* Added LHCB_2010_I867355 analysis
* Added TGraphErrors support to root2flat
2012-11-27 Andy Buckley <andy.buckley@cern.ch>
* Converting CMS_2012_I1102908 analysis to use YODA.
* Adding XLabel and YLabel setting in histo/profile/scatter booking.
2012-11-27 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fix make-plots png creation for SL5
2012-11-23 Peter Richardson <peter.richardson@durham.ac.uk>
* Added ATLAS_2012_CONF_2012_153 4-lepton SUSY search
2012-11-17 Andy Buckley <andy.buckley@cern.ch>
* Adding MC_PHOTONS by Steve Lloyd and AB, for testing general
unisolated photon properties, especially those associated with
charged leptons (e and mu).
2012-11-16 Andy Buckley <andy.buckley@cern.ch>
* Adding MC_PRINTEVENT, a convenient (but verbose!) analysis for
printing out event details to stdout.
2012-11-15 Andy Buckley <andy.buckley@cern.ch>
* Removing the long-unused/defunct autopackage system.
2012-11-15 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added LHCF_2012_I1115479 analysis
* Added ATLAS_2011_I894867 analysis
2012-11-14 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added CMS_2012_I1102908 analysis
2012-11-14 Andy Buckley <andy.buckley@cern.ch>
* Converting the argument order of logspace, clarifying the
arguments, updating affected code, and removing Analysis::logBinEdges.
* Merging updates from the AIDA maintenance branch up to r4002
(latest revision for next merges is r4009).
2012-11-11 Andy Buckley <andy.buckley@cern.ch>
* include/Math/: Various numerical fixes to Vector3::angle and
changing the 4 vector mass treatment to permit spacelike
virtualities (in some cases even the fuzzy isZero assert check was
being violated). The angle check allows a clean-up of some
workaround code in MC_VH2BB.
2012-10-15 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added CMS_2012_I1107658 analysis
2012-10-11 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added CDF_2012_NOTE10874 analysis
2012-10-04 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added ATLAS_2012_I1183818 analysis
2012-07-17 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Cleanup and multiple fixes in CMS_2011_S9120041
* Bugfixes in ALEPH_2004_S5765862 and ATLAS_2010_CONF_2010_049
(thanks to Anil Pratap)
2012-08-09 Andy Buckley <andy.buckley@cern.ch>
* Fixing aida2root command-line help message and converting to TH*
rather than TGraph by default.
2012-07-24 Andy Buckley <andy.buckley@cern.ch>
* Improvements/migrations to rivet-mkhtml, rivet-mkanalysis, and
rivet-buildplugin.
2012-07-17 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Add CMS_2012_I1087342
2012-07-12 Andy Buckley <andy.buckley@cern.ch>
* Fix rivet-mkanalysis a bit for YODA compatibility.
2012-07-05 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Version 1.8.1!
2012-07-05 Holger Schulz <holger.schulz@physik.hu-berlin.de>
* Add ATLAS_2011_I945498
2012-07-03 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Bugfix for transverse mass (thanks to Gavin Hesketh)
2012-06-29 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Merge YODA branch into trunk. YODA is alive!!!!!!
2012-06-26 Holger Schulz <holger.schulz@physik.hu-berlin.de>
* Add ATLAS_2012_I1091481
2012-06-20 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added D0_2011_I895662: 3-jet mass
2012-04-24 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixed a few bugs in rivet-rmgaps
* Added new TOTEM dN/deta analysis
2012-03-19 Andy Buckley <andy.buckley@cern.ch>
* Version 1.8.0!
* src/Projections/UnstableFinalState.cc: Fix compiler warning.
* Version bump for testing: 1.8.0beta1.
* src/Core/AnalysisInfo.cc: Add printout of YAML parser exception error messages to aid debugging.
* bin/Makefile.am: Attempt to fix rivet-nopy build on SLC5.
* src/Analyses/LHCB_2010_S8758301.cc: Add two missing entries to the PDGID -> lifetime map.
* src/Projections/UnstableFinalState.cc: Extend list of vetoed particles to include reggeons.
2012-03-16 Andy Buckley <andy.buckley@cern.ch>
* Version change to 1.8.0beta0 -- nearly ready for long-awaited release!
* pyext/setup.py.in: Adding handling for the YAML library: fix for
Genser build from Anton Karneyeu.
* src/Analyses/LHCB_2011_I917009.cc: Hiding lifetime-lookup error
message if the offending particle is not a hadron.
* include/Rivet/Math/MathHeader.hh: Using unnamespaced std::isnan
and std::isinf as standard.
2012-03-16 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Improve default plot behaviour for 2D histograms
2012-03-15 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Make ATLAS_2012_I1084540 less verbose, and general code
cleanup of that analysis.
* New-style plugin hook in ATLAS_2011_I926145,
ATLAS_2011_I944826 and ATLAS_2012_I1084540
* Fix compiler warnings in ATLAS_2011_I944826 and CMS_2011_S8973270
* CMS_2011_S8941262: Weights are double, not int.
* disable inRange() tests in test/testMath.cc until we have a proper
fix for the compiler warnings we see on SL5.
2012-03-07 Andy Buckley <andy.buckley@cern.ch>
* Marking ATLAS_2011_I919017 as VALIDATED (this should have
happened a long time ago) and adding more references.
2012-02-28 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* lighthisto.py: Caching for re.compile(). This speeds up aida2flat
and flat2aida by more than an order of magnitude.
2012-02-27 Andy Buckley <andy.buckley@cern.ch>
* doc/mk-analysis-html: Adding more LaTeX/text -> HTML conversion
replacements, including better <,> handling.
2012-02-26 Andy Buckley <andy.buckley@cern.ch>
* Adding CMS_2011_S8973270, CMS_2011_S8941262, CMS_2011_S9215166,
CMS_QCD_10_024, from CMS.
* Adding LHCB_2011_I917009 analysis, from Alex Grecu.
* src/Core/Analysis.cc, include/Rivet/Analysis.hh: Add a numeric-arg version of histoPath().
2012-02-24 Holger Schulz <holger.schulz@physik.hu-berlin.de>
* Adding ATLAS Ks/Lambda analysis.
2012-02-20 Andy Buckley <andy.buckley@cern.ch>
* src/Analyses/ATLAS_2011_I925932.cc: Using new overflow-aware
normalize() in place of counters and scale(..., 1/count)
2012-02-14 Andy Buckley <andy.buckley@cern.ch>
* Splitting MC_GENERIC to put the PDF and PID plotting into
MC_PDFS and MC_IDENTIFIED respectively.
* Renaming MC_LEADINGJETS to MC_LEADJETUE.
2012-02-14 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* DELPHI_1996_S3430090 and ALEPH_1996_S3486095:
fix rapidity vs {Thrust,Sphericity}-axis.
2012-02-14 Andy Buckley <andy.buckley@cern.ch>
* bin/compare-histos: Don't attempt to remove bins from MC histos
where they aren't found in the ref file, if the ref file is not
expt data, or if the new --no-rmgapbins arg is given.
* bin/rivet: Remove the conversion of requested analysis names to
upper-case: mixed-case analysis names will now work.
2012-02-14 Frank Siegert <frank.siegert@cern.ch>
* Bugfixes and improvements for MC_TTBAR:
- Avoid assert failure with logspace starting at 0.0
- Ignore charged lepton in jet finding (otherwise jet multi is always
+1).
- Add some dR/deta/dphi distributions as noted in TODO
- Change pT plots to logspace as well (to avoid low-stat high pT bins)
2012-02-10 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* rivet-mkhtml -c option now has the semantics of a .plot
file. The contents are appended to the dat output by
compare-histos.
2012-02-09 David Grellscheid <david.grellscheid@durham.ac.uk>
* Fixed broken UnstableFS behaviour
2012-01-25 Frank Siegert <frank.siegert@cern.ch>
* Improvements in make-plots:
- Add PlotTickLabels and RatioPlotTickLabels options (cf.
make-plots.txt)
- Make ErrorBars and ErrorBands non-exclusive (and change
their order, such that Bars are on top of Bands)
2012-01-25 Holger Schulz <holger.schulz@physik.hu-berlin.de>
* Add ATLAS diffractive gap analysis
2012-01-23 Andy Buckley <andy.buckley@cern.ch>
* bin/rivet: When using --list-analyses, the analysis summary is
now printed out when log level is <= INFO, rather than < INFO.
The effect on command line behaviour is that useful identifying
info is now printed by default when using --list-analyses, rather
than requiring --list-analyses -v. To get the old behaviour,
e.g. if using the output of rivet --list-analyses for scripting,
now use --list-analyses -q.
2012-01-22 Andy Buckley <andy.buckley@cern.ch>
* Tidying lighthisto, including fixing the order in which +- error
values are passed to the Bin constructor in fromFlatHisto.
2012-01-16 Frank Siegert <frank.siegert@cern.ch>
* Bugfix in ATLAS_2012_I1083318: Include non-signal neutrinos in
jet clustering.
* Add first version of ATLAS_2012_I1083318 (W+jets). Still
UNVALIDATED until final happiness with validation plots arises and
data is in Hepdata.
* Bugfix in ATLAS_2010_S8919674: Really use neutrino with highest
pT for Etmiss. Doesn't seem to make very much difference, but is
more correct in principle.
2012-01-16 Peter Richardson <peter.richardson@durham.ac.uk>
* Fixes to ATLAS_2011_S9225137 to include reference data
2012-01-13 Holger Schulz <holger.schulz@physik.hu-berlin.de>
* Add ATLAS inclusive lepton analysis
2012-01-12 Hendrik Hoeth <hendrik.hoeth@durham.ac.uk>
* Font selection support in rivet-mkhtml
2012-01-11 Peter Richardson <peter.richardson@durham.ac.uk>
* Added pi0 to list of particles.
2012-01-11 Andy Buckley <andy.buckley@cern.ch>
* Removing references to Boost random numbers.
2011-12-30 Andy Buckley <andy.buckley@cern.ch>
* Adding a placeholder rivet-which script (not currently
installed).
* Tweaking to avoid a very time-consuming debug printout in
compare-histos with the -v flag, and modifying the Rivet env vars
in rivet-mkhtml before calling compare-histos to eliminate
problems induced by relative paths (i.e. "." does not mean the
same thing when the directory is changed within the script).
2011-12-12 Andy Buckley <andy.buckley@cern.ch>
* Adding a command line completion function for rivet-mkhtml.
2011-12-12 Frank Siegert <frank.siegert@cern.ch>
* Fix for factor of 2.0 in normalisation of CMS_2011_S9086218
* Add --ignore-missing option to rivet-mkhtml to ignore non-existing
AIDA files.
2011-12-06 Andy Buckley <andy.buckley@cern.ch>
* Include underflow and overflow bins in the normalisation when
calling Analysis::normalise(h)
2011-11-23 Andy Buckley <andy.buckley@cern.ch>
* Bumping version to 1.8.0alpha0 since the Jet interface changes
are quite a major break with backward compatibility (although the
vast majority of analyses should be unaffected).
* Removing crufty legacy stuff from the Jet class -- there is
never any ambiguity between whether Particle or FourMomentum
objects are the constituents now, and the jet 4-momentum is set
explicitly by the jet alg so that e.g. there is no mismatch if the
FastJet pt recombination scheme is used.
* Adding default do-nothing implementations of Analysis::init()
and Analysis::finalize(), since it is possible for analysis
implementations to not need to do anything in these methods, and
forcing analysis authors to write do-nothing boilerplate code is
not "the Rivet way"!
2011-11-19 Andy Buckley <andy.buckley@cern.ch>
* Adding variant constructors to FastJets with a more natural
Plugin* argument, and decrufting the constructor implementations a
bit.
* bin/rivet: Adding a more helpful error message if the rivet
module can't be loaded, grouping the option parser options,
removing the -A option (this just doesn't seem useful anymore),
and providing a --pwd option as a shortcut to append "." to the
search path.
2011-11-18 Andy Buckley <andy.buckley@cern.ch>
* Adding a guide to compiling a new analysis template to the
output message of rivet-mkanalysis.
2011-11-11 Andy Buckley <andy.buckley@cern.ch>
* Version 1.7.0 release!
* Protecting the OPAL 2004 analysis against NaNs in the
hemispheres projection -- I can't track the origin of these and
suspect some occasional memory corruption.
2011-11-09 Andy Buckley <andy@insectnation.org>
* Renaming source files for EXAMPLE and
PDG_HADRON_MULTIPLICITIES(_RATIOS) analyses to match the analysis
names.
* Cosmetic fixes in ATLAS_2011_S9212183 SUSY analysis.
* Adding new ATLAS W pT analysis from Elena Yatsenko (slightly adapted).
2011-10-20 Frank Siegert <frank.siegert@cern.ch>
* Extend API of W/ZFinder to allow for specification of input final
state in which to search for leptons/photons.
2011-10-19 Andy Buckley <andy@insectnation.org>
* Adding new version of LHCB_2010_S8758301, based on submission
from Alex Grecu. There is some slightly dodgy-looking GenParticle*
fiddling going on, but apparently it's necessary (and hopefully robust).
2011-10-17 Andy Buckley <andy@insectnation.org>
* bin/rivet-nopy linker line tweak to make compilation work with
GCC 4.6 (-lHepMC has to be explicitly added for some reason).
2011-10-13 Frank Siegert <frank.siegert@cern.ch>
* Add four CMS QCD analyses kindly provided by CMS.
2011-10-12 Andy Buckley <andy@insectnation.org>
* Adding a separate test program for non-matrix/vector math
functions, and adding a new set of int/float literal arg tests for
the inRange functions in it.
* Adding a jet multiplicity plot for jets with pT > 30 GeV to
MC_TTBAR.
2011-10-11 Andy Buckley <andy@insectnation.org>
* Removing SVertex.
2011-10-11 James Monk <jmonk@cern.ch>
* root2flat was missing the first bin (and had a spurious last bin)
* My version of bash does not understand the pipe syntax |& in rivet-buildplugin
2011-09-30 James Monk <jmonk@cern.ch>
* Fix bug in ATLAS_2010_S8817804 that misidentified the akt4 jets
as akt6
2011-09-29 Andy Buckley <andy@insectnation.org>
* Converting FinalStateHCM to a slightly more general
DISFinalState.
2011-09-26 Andy Buckley <andy@insectnation.org>
* Adding a default libname argument to rivet-buildplugin. If the
first argument doesn't have a .so library suffix, then use
RivetAnalysis.so as the default.
2011-09-19 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* make-plots: Fixing regex for \physicscoor. Adding "FrameColor"
option.
2011-09-17 Andy Buckley <andy@insectnation.org>
* Improving interactive metadata printout, by not printing
headings for missing info.
* Bumping the release number to 1.7.0alpha0, since with these
SPIRES/Inspire changes and the MissingMomentum API change we need
more than a minor release.
* Updating the mkanalysis, BibTeX-grabbing and other places that
care about analysis SPIRES IDs to also be able to handle the new
Inspire system record IDs. The missing link is getting to HepData
from an Inspire code...
* Using the .info file rather than an in-code declaration to
specify that an analysis needs cross-section information.
* Adding Inspire support to the AnalysisInfo and Analysis
interfaces. Maybe we can find a way to combine the two,
e.g. return the SPIRES code prefixed with an "S" if no Inspire ID
is available...
2011-09-17 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added ALICE_2011_S8909580 (strange particle production at 900 GeV)
* Feed-down correction in ALICE_2011_S8945144
2011-09-16 Andy Buckley <andy@insectnation.org>
* Adding ATLAS track jet analysis, modified from the version
provided by Seth Zenz: ATLAS_2011_I919017. Note that this analysis
is currently using the Inspire ID rather than the Spires one:
we're clearly going to have to update the API to handle Inspire
codes, so might as well start now...
2011-09-14 Andy Buckley <andy@insectnation.org>
* Adding the ATLAS Z pT measurement at 7 TeV (ATLAS_2011_S9131140)
and an MC analysis for VH->bb events (MC_VH2BB).
2011-09-12 Andy Buckley <andy@insectnation.org>
* Removing uses of getLog, cout, cerr, and endl from all standard
analyses and projections, except in a very few special cases.
2011-09-10 Andy Buckley <andy@insectnation.org>
* Changing the behaviour and interface of the MissingMomentum
projection to calculate vector ET correctly. This was previously
calculated according to the common definition of -E*sin(theta) of
the summed visible 4-momentum in the event, but that is incorrect
because the timelike term grows monotonically. Instead, transverse
2-vectors of size ET need to be constructed for each visible
particle, and vector-summed in the transverse plane.
The rewrite of this behaviour made it opportune to make an API
improvement: the previous method names scalarET/vectorET() have
been renamed to scalar/vectorEt() to better match the Rivet
FourMomentum::Et() method, and MissingMomentum::vectorEt() now
returns a Vector3 rather than a double so that the transverse
missing Et direction is also available.
Only one data analysis has been affected by this change in
behaviour: the D0_2004_S5992206 dijet delta(phi) analysis. It's
expected that this change will not be very significant, as it is
a *veto* on significant missing ET to reduce non-QCD
contributions. MC studies using this analysis ~always run with QCD
events only, so these contributions should be small. The analysis
efficiency may have been greatly improved, as fewer events will
now fail the missing ET veto cut.
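The scalar-vs-vector ET distinction described above can be sketched as follows (a minimal illustration under the stated definitions, not the MissingMomentum code itself; particles are hypothetical (E, px, py, pz) tuples):

```python
import math

def scalar_et(particles):
    # Scalar sum of each particle's transverse energy Et = E * sin(theta).
    return sum(E * math.hypot(px, py) / math.hypot(px, py, pz)
               for E, px, py, pz in particles)

def vector_et(particles):
    # Vector sum in the transverse plane: each particle contributes a
    # 2-vector of magnitude Et pointing along its transverse direction.
    # (E*px/p = Et*cos(phi), E*py/p = Et*sin(phi).)
    sx = sy = 0.0
    for E, px, py, pz in particles:
        p = math.hypot(px, py, pz)
        sx += E * px / p
        sy += E * py / p
    return (sx, sy)

# Two back-to-back particles give zero vector Et but nonzero scalar Et,
# which is why the summed-4-momentum definition was wrong for vetoes.
```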
* Add sorting of the ParticleVector returned by the ChargedLeptons
projection.
* configure.ac: Adding a check to make sure that no-one tries to
install into --prefix=$PWD.
2011-09-04 Andy Buckley <andy@insectnation.org>
* lighthisto fixes from Christian Roehr.
2011-08-26 Andy Buckley <andy@insectnation.org>
* Removing deprecated features: the setBeams(...) method on
Analysis, the MaxRapidity constant, the split(...) function, the
default init() method from AnalysisHandler and its test, and the
deprecated TotalVisibleMomentum and PVertex projections.
2011-08-23 Andy Buckley <andy@insectnation.org>
* Adding a new DECLARE_RIVET_PLUGIN wrapper macro to hide the
details of the plugin hook system from analysis authors. Migration
of all analyses and the rivet-mkanalysis script to use this as the
standard plugin hook syntax.
* Also call the --cflags option on root-config when using the
--root option with rivet-buildplugin (thanks to Richard Corke for
the report)
2011-08-23 Frank Siegert <frank.siegert@cern.ch>
* Added ATLAS_2011_S9126244
* Added ATLAS_2011_S9128077
2011-08-23 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added ALICE_2011_S8945144
* Remove obsolete setBeams() from the analyses
* Update CMS_2011_S8957746 reference data to the official numbers
* Use Inspire rather than Spires.
2011-08-19 Frank Siegert <frank.siegert@cern.ch>
* More NLO parton level generator friendliness: Don't crash or fail when
there are no beam particles.
* Add --ignore-beams option to skip compatibility check.
2011-08-09 David Mallows <dave.mallows@gmail.com>
* Fix aida2flat to ignore empty dataPointSet
2011-08-07 Andy Buckley <andy@insectnation.org>
* Adding TEXINPUTS and LATEXINPUTS prepend definitions to the
variables provided by rivetenv.(c)sh. A manual setting of these
variables that didn't include the Rivet TEXMFHOME path was
breaking make-plots on lxplus, presumably since the system LaTeX
packages are so old there.
2011-08-02 Frank Siegert <frank.siegert@cern.ch>
* Version 1.6.0 release!
2011-08-01 Frank Siegert <frank.siegert@cern.ch>
* Overhaul of the WFinder and ZFinder projections, including a change
of interface. This solves potential problems with leptons which are not
W/Z constituents being excluded from the RemainingFinalState.
2011-07-29 Andy Buckley <andy@insectnation.org>
* Version 1.5.2 release!
* New version of aida2root from James Monk.
2011-07-29 Frank Siegert <frank.siegert@cern.ch>
* Fix implementation of --config file option in make-plots.
2011-07-27 David Mallows <dave.mallows@gmail.com>
* Updated MC_TTBAR.plot to reflect updated analysis.
2011-07-25 Andy Buckley <andy@insectnation.org>
* Adding a useTransverseMass flag method and implementation to
InvMassFinalState, and using it in the WFinder, after feedback
from Gavin Hesketh. This was the neatest way I could do it :S Some
other tidying up happened along the way.
* Adding transverse mass massT and massT2 methods and functions
for FourMomentum.
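For reference, the usual single-particle transverse-mass convention, mT^2 = E^2 - pz^2 = m^2 + pT^2, can be sketched like this (a hypothetical sketch of the convention, not Rivet's actual implementation):

```python
import math

def massT2(E, px, py, pz):
    # Transverse mass squared: mT^2 = E^2 - pz^2 (equivalently m^2 + pT^2).
    return E * E - pz * pz

def massT(E, px, py, pz):
    return math.sqrt(massT2(E, px, py, pz))
```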
2011-07-22 Frank Siegert <frank.siegert@cern.ch>
* Added ATLAS_2011_S9120807
* Add two more observables to MC_DIPHOTON and make its isolation cut
more LHC-like
* Add linear photon pT histo to MC_PHOTONJETS
2011-07-20 Andy Buckley <andy@insectnation.org>
* Making MC_TTBAR work with semileptonic ttbar events and generally
tidying the code.
2011-07-19 Andy Buckley <andy@insectnation.org>
* Version bump to 1.5.2.b01 in preparation for a release in the
very near future.
2011-07-18 David Mallows <dave.mallows@gmail.com>
* Replaced MC_TTBAR: Added t,tbar reconstruction. Not yet working.
2011-07-18 Andy Buckley <andy@insectnation.org>
* bin/rivet-buildplugin.in: Pass the AM_CXXFLAGS
variable (including the warning flags) to the C++ compiler when
building user analysis plugins.
* include/LWH/DataPointSet.h: Fix accidental setting of errorMinus
= scalefactor * error_Plus_. Thanks to Anton Karneyeu for the bug
report!
2011-07-18 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added CMS_2011_S8884919 (charged hadron multiplicity in NSD
events corrected to pT>=0).
* Added CMS_2010_S8656010 and CMS_2010_S8547297 (charged
hadron pT and eta in NSD events)
* Added CMS_2011_S8968497 (chi_dijet)
* Added CMS_2011_S8978280 (strangeness)
2011-07-13 Andy Buckley <andy@insectnation.org>
* Rivet PDF manual updates, to not spread disinformation about
bootstrapping a Genser repo.
2011-07-12 Andy Buckley <andy@insectnation.org>
* bin/make-plots: Protect property reading against unstripped \r
characters from DOS newlines.
* bin/rivet-mkhtml: Add a -M unmatch regex flag (note that these
are matching the analysis path rather than individual histos on
this script), and speed up the initial analysis identification and
selection by avoiding loops of regex comparisons for repeats of
strings which have already been analysed.
* bin/compare-histos: remove the completely (?) unused histogram
list, and add -m and -M regex flags, as for aida2flat and
flat2aida.
2011-06-30 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fix fromFlat() in lighthisto: it used to ignore histogram paths.
* flat2aida: preserve histogram order from .dat files
2011-06-27 Andy Buckley <andy@insectnation.org>
* pyext/setup.py.in: Use CXXFLAGS and LDFLAGS safely in the Python
extension build, and improve the use of build/src directory
arguments.
2011-06-23 Andy Buckley <andy@insectnation.org>
* Adding a tentative rivet-updateanalyses script, based on
lhapdf-getdata, which will download new analyses as requested. We
could change our analysis-providing behaviour a bit to allow this
sort of delivery mechanism to be used as the normal way of getting
analysis updates without us having to make a whole new Rivet
release. It is nice to be able to identify analyses with releases,
though, for tracking whether bugs have been addressed.
2011-06-10 Frank Siegert <frank.siegert@cern.ch>
* Bugfixes in WFinder.
2011-06-10 Andy Buckley <andy@insectnation.org>
* Adding \physicsxcoor and \physicsycoor treatment to make-plots.
2011-06-06 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Allow for negative cross-sections. NLO tools need this.
* make-plots: For RatioPlotMode=deviation also consider the MC
uncertainties, not just data.
2011-06-04 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Add support for goodness-of-fit calculations to make-plots.
The results are shown in the legend, and one histogram can
be selected to determine the color of the plot margin. See
the documentation for more details.
2011-06-04 Andy Buckley <andy@insectnation.org>
* Adding auto conversion of Histogram2D to DataPointSets in the
AnalysisHandler _normalizeTree method.
2011-06-03 Andy Buckley <andy@insectnation.org>
* Adding a file-weight feature to the Run object, which will
optionally rescale the weights in the provided HepMC files. This
should be useful for e.g. running on multiple differently-weighted
AlpGen HepMC files/streams. The new functionality is used by the
rivet command via an optional weight appended to the filename with
a colon delimiter, e.g. "rivet fifo1.hepmc fifo2.hepmc:2.31"
2011-06-01 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Add BeamThrust projection
2011-05-31 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fix LIBS for fastjet-3.0
* Add basic infrastructure for Taylor plots in make-plots
* Fix OPAL_2004_S6132243: They are using charged+neutral.
* Release 1.5.1
2011-05-22 Andy Buckley <andy@insectnation.org>
* Adding plots of stable and decayed PID multiplicities to
MC_GENERIC (useful for sanity-checking generator setups).
* Removing actually-unused ProjectionApplier.fhh forward
declaration header.
2011-05-20 Andy Buckley <andy@insectnation.org>
* Removing import of ipython shell from rivet-rescale, having just
seen it throw a multi-coloured warning message on a student's
lxplus Rivet session!
* Adding support for the compare-histos --no-ratio flag when using
rivet-mkhtml. Adding --rel-ratio, --linear, etc. is an exercise
for the enthusiast ;-)
2011-05-10 Andy Buckley <andy@insectnation.org>
* Internal minor changes to the ProjectionHandler and
ProjectionApplier interfaces, in particular changing the
ProjectionHandler::create() function to be called getInstance and
to return a reference rather than a pointer. The reference change
is to make way for an improved singleton implementation, which
cannot yet be used due to a bug in projection memory
management. The code of the improved singleton is available, but
commented out, in ProjectionManager.hh to allow for easier
migration and to avoid branching.
2011-05-08 Andy Buckley <andy@insectnation.org>
* Extending flat2aida to be able to read from and write to
stdin/out as for aida2flat, and also eliminating the internal
histo parsing representation in favour of the one in
lighthisto. lighthisto's fromFlat also needed a bit of an
overhaul: it has been extended to parse each histo's chunk of
text (including BEGIN and END lines) in fromFlatHisto, and for
fromFlat to parse a collection of histos from a file, in keeping
with the behaviour of fromDPS/fromAIDA. Merging into Professor is
now needed.
* Extending aida2flat to have a better usage message, to accept
input from stdin for command chaining via pipes, and to be a bit
more sensibly internally structured (although it also now has to
hold all histos in memory before writing out -- that shouldn't be
a problem for anything other than truly huge histo files).
2011-05-04 Andy Buckley <andy@insectnation.org>
* compare-histos: If using --mc-errs style, prefer dotted and
dashdotted line styles to dashed, since dashes are often too long
to be distinguishable from solid lines. Even better might be to
always use a solid line for MC errs style, and to add more colours.
* rivet-mkhtml: use a no-mc-errors drawing style by default,
to match the behaviour of compare-histos, which it calls. The
--no-mc-errs option has been replaced with an --mc-errs option.
2011-05-04 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Ignore duplicate files in compare-histos.
2011-04-25 Andy Buckley <andy@insectnation.org>
* Adding some hadron-specific N and sumET vs. |eta| plots to MC_GENERIC.
* Re-adding an explicit attempt to get the beam particles, since
HepMC's IO_HERWIG seems to not always set them even though it's
meant to.
2011-04-19 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added ATLAS_2011_S9002537 W asymmetry analysis
2011-04-14 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* deltaR, deltaPhi, deltaEta now available in all combinations of
FourVector, FourMomentum, Vector3, doubles. They also accept jets
and particles as arguments now.
2011-04-13 David Grellscheid <david.grellscheid@durham.ac.uk>
* Added ATLAS 8983313: 0-lepton BSM
2011-04-01 Andy Buckley <andy@insectnation.org>
* bin/rivet-mkanalysis: Don't try to download SPIRES or HepData
info if it's not a standard analysis (i.e. if the SPIRES ID is not
known), and make the default .info file validly parseable by YAML,
which was an unfortunate gotcha for anyone writing a first
analysis.
2011-03-31 Andy Buckley <andy@insectnation.org>
* bin/compare-histos: Write more appropriate ratio plot labels
when not comparing to data, and use the default make-plots labels
when comparing to data.
* bin/rivet-mkhtml: Adding a timestamp to the generated pages, and
a -t/--title option to allow setting the main HTML page title on
the command line: otherwise it becomes impossible to tell these
pages apart when you have a lot of them, except by URL!
2011-03-24 Andy Buckley <andy@insectnation.org>
* bin/aida2flat: Adding a -M option to *exclude* histograms whose
paths match a regex. Writing a negative lookahead regex with
positive matching was far too awkward!
2011-03-18 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* src/Core/AnalysisHandler.cc (AnalysisHandler::removeAnalysis):
Fixed strange shared pointer assignment that caused seg-fault.
2011-03-13 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Filling of functions now works in a more intuitive way (I hope).
2011-03-09 Andy Buckley <andy@insectnation.org>
* Version 1.5.0 release!
2011-03-08 Andy Buckley <andy@insectnation.org>
* Adding some extra checks for external packages in make-plots.
2011-03-07 Andy Buckley <andy@insectnation.org>
* Changing the accuracy of the beam energy checking to 1%, to make
the UI a bit more forgiving. It's still best to specify exactly the right
energy of course!
2011-03-01 Andy Buckley <andy@insectnation.org>
* Adding --no-plottitle to compare-histos (+ completion).
* Fixing segfaults in UA1_1990_S2044935 and UA5_1982_S875503.
* Bump ABI version numbers for 1.5.0 release.
* Use AnalysisInfo for storage of the NeedsCrossSection analysis flag.
* Allow field setting in AnalysisInfo.
2011-02-27 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Support LineStyle=dashdotted in make-plots
* New command line option --style for compare-histos. Options are
"default", "bw" and "talk".
* cleaner uninstall
2011-02-26 Andy Buckley <andy@insectnation.org>
* Changing internal storage and return type of Particle::pdgId()
to PdgId, and adding Particle::energy().
* Renaming Analysis::energies() as Analysis::requiredEnergies().
* Adding beam energies into beam consistency checking:
Analysis::isCompatible methods now also require the beam energies
to be provided.
* Removing long-deprecated AnalysisHandler::init() constructor and
AnalysisHandler::removeIncompatibleAnalyses() methods.
2011-02-25 Andy Buckley <andy@insectnation.org>
* Adding --disable-obsolete, which takes its value from the value
of --disable-preliminary by default.
* Replacing RivetUnvalidated and RivetPreliminary plugin libraries
with optionally-configured analysis contents in the
experiment-specific plugin libraries. This avoids issues with
making libraries rebuild consistently when sources were reassigned
between libraries.
2011-02-24 Andy Buckley <andy@insectnation.org>
* Changing analysis plugin registration to fall back through
available paths rather than have RIVET_ANALYSIS_PATH totally
override the built-in paths. The first analysis hook of a given
name to be found is now the one that's used: any duplicates found
will be warned about but unused. getAnalysisLibPaths() now returns
*all* the search paths, in keeping with the new search behaviour.
2011-02-22 Andy Buckley <andy@insectnation.org>
* Moving the definition of the MSG_* macros into the Logging.hh
header. They can't be used everywhere, though, as they depend on
the existence of a this->getLog() method in the call scope. This
move makes them available in e.g. AnalysisHandler and other bits
of framework other than projections and analyses.
* Adding a gentle print-out from the Rivet AnalysisHandler if
preliminary analyses are being used, and strengthening the current
warning if unvalidated analyses are used.
* Adding documentation about the validation "process" and
the (un)validated and preliminary analysis statuses.
* Adding the new RivetPreliminary analysis library, and the
corresponding --disable-preliminary configure flag. Analyses in
this library are subject to change names, histograms, reference
data values, etc. between releases: make sure you check any
dependences on these analyses when upgrading Rivet.
* Change the Python script ref data search behaviours to use Rivet
ref data by default where available, rather than requiring a -R
option. Where relevant, -R is still a valid option, to avoid
breaking legacy scripts, and there is a new --no-rivet-refs option
to turn the default searching *off*.
* Add the prepending and appending optional arguments to the path
searching functions. This will make it easier to combine the
search functions with user-supplied paths in Python scripts.
* Make make-plots killable!
* Adding Rivet version to top of run printout.
* Adding Run::crossSection() and printing out the cross-section in
pb at the end of a Rivet run.
2011-02-22 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Make lighthisto.py aware of 2D histograms
* Adding published versions of the CDF_2008 leading jets and DY
analyses, and marking the preliminary ones as "OBSOLETE".
2011-02-21 Andy Buckley <andy@insectnation.org>
* Adding PDF documentation for path searching and .info/.plot
files, and tidying overfull lines.
* Removing unneeded const declarations from various return by
value path and internal binning functions. Should not affect ABI
compatibility but will force recompilation of external packages
using the RivetPaths.hh and Utils.hh headers.
* Adding findAnalysis*File(fname) functions, to be used by Rivet
scripts and external programs to find files known to Rivet
according to Rivet's (newly standard) lookup rule.
* Changing search path function behaviour to always return *all*
search directories rather than overriding the built-in locations
if the environment variables are set.
2011-02-20 Andy Buckley <andy@insectnation.org>
* Adding the ATLAS 2011 transverse jet shapes analysis.
2011-02-18 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Support for transparency in make-plots
2011-02-18 Frank Siegert <frank.siegert@cern.ch>
* Added ATLAS prompt photon analysis ATLAS_2010_S8914702
2011-02-10 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Simple NOOP constructor for Thrust projection
* Add CMS event shape analysis. Data read off the plots. We
will get the final numbers when the paper is accepted by
the journal.
2011-02-10 Frank Siegert <frank.siegert@cern.ch>
* Add final version of ATLAS dijet azimuthal decorrelation
2011-02-10 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* remove ATLAS conf note analyses for which we have final data
* reshuffle histograms in ATLAS minbias analysis to match Hepdata
* small pT-cut fix in ATLAS track based UE analysis
2011-01-31 Andy Buckley <andy@insectnation.org>
* Doc tweaks and adding cmp-by-|p| functions for Jets, to match
those added by Hendrik for Particles.
* Don't sum photons around muons in the D0 2010 Z pT analysis.
2011-01-27 Andy Buckley <andy@insectnation.org>
* Adding ATLAS 2010 min bias and underlying event analyses and data.
2011-01-23 Andy Buckley <andy@insectnation.org>
* Make make-plots write out PDF rather than PS by default.
2011-01-12 Andy Buckley <andy@insectnation.org>
* Fix several rendering and comparison bugs in rivet-mkhtml.
* Allow make-plots to write into an existing directory, at the
user's own risk.
* Make rivet-mkhtml produce PDF-based output rather than PS by
default (most people want PDF these days). Can we do the same
change of default for make-plots?
* Add getAnalysisPlotPaths() function, and use it in compare-histos
* Use proper .info file search path function internally in AnalysisInfo::make.
2011-01-11 Andy Buckley <andy@insectnation.org>
* Clean up ATLAS dijet analysis.
2010-12-30 Andy Buckley <andy@insectnation.org>
* Adding a run timeout option, and small bug-fixes to the event
timeout handling, and making first event timeout work nicely with
the run timeout. Run timeout is intended to be used in conjunction
with timed batch token expiry, of the type that likes to make 0
byte AIDA files on LCG when Grid proxies time out.
2010-12-21 Andy Buckley <andy@insectnation.org>
* Fix the cuts in the CDF 1994 colour coherence analysis.
2010-12-19 Andy Buckley <andy@insectnation.org>
* Fixing CDF midpoint cone jet algorithm default construction to
have an overlap threshold of 0.5 rather than 0.75. This was
recommended by the FastJet manual, and noticed while adding the
ATLAS and CMS cones.
* Adding ATLAS and CMS old iterative cones as "official" FastJets
constructor options (they could always have been used by explicit
instantiation and attachment of a Fastjet plugin object).
* Removing defunct and unused ClosestJetShape projection.
2010-12-16 Andy Buckley <andy@insectnation.org>
* bin/compare-histos, pyext/lighthisto.py: Take ref paths from
rivet module API rather than getting the environment by hand.
* pyext/lighthisto.py: Only read .plot info from the first
matching file (speed-up compare-histos).
2010-12-14 Andy Buckley <andy@insectnation.org>
* Augmenting the physics vector functionality to make FourMomentum
support maths operators with the correct return type (FourMomentum
rather than FourVector).
2010-12-11 Andy Buckley <andy@insectnation.org>
* Adding a --event-timeout option to control the event timeout,
adding it to the completion script, and making sure that the init
time check is turned OFF once successful!
* Adding a 3600 second timeout for initialising an event file. If
it takes longer than (or anywhere close to) this long, chances are
that the event source is inactive for some reason (perhaps
accidentally unspecified and stdin is not active, or the event
generator has died at the other end of the pipe). The reason for
not making it something shorter is that e.g. Herwig++ or Sherpa
can have long initialisation times to set up the MPI handler or to
run the matrix element integration. A timeout after an hour is
still better than a batch job which runs for two days before you
realise that you forgot to generate any events!
2010-12-10 Andy Buckley <andy@insectnation.org>
* Fixing unbooked-histo segfault in UA1_1990_S2044935 at 63 GeV.
2010-12-08 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixes in ATLAS_2010_CONF_083, declaring it validated
* Added ATLAS_2010_CONF_046, only two plots are implemented.
The paper will be out soon, and we don't need the other plots
right now. Data is read off the plots in the note.
* New option "SmoothLine" for HISTOGRAM sections in make-plots
* Changed CustomTicks to CustomMajorTicks and added CustomMinorTicks
in make-plots.
2010-12-07 Andy Buckley <andy@insectnation.org>
* Update the documentation to explain this latest bump to path
lookup behaviours.
* Various improvements to existing path lookups. In particular,
the analysis lib path locations are added to the info and ref
paths to avoid having to set three variables when you have all
three file types in the same personal plugin directory.
* Adding setAnalysisLibPaths and addAnalysisLibPath
functions. rivet --analysis-path{,-append} now use these and work
correctly. Hurrah!
* Add --show-analyses as an alias for --show-analysis, following a
comment at the ATLAS tutorial.
2010-12-07 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Change LegendXPos behaviour in make-plots. Now the top left
corner of the legend is used as anchor point.
2010-12-03 Andy Buckley <andy@insectnation.org>
* 1.4.0 release.
* Add bin-skipping to compare-histos to avoid one use of
rivet-rmgaps (it's still needed for non-plotting post-processing
like Professor).
2010-12-03 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fix normalisation issues in UA5 and ALEPH analyses
2010-11-27 Andy Buckley <andy@insectnation.org>
* MathUtils.hh: Adding fuzzyGtrEquals and fuzzyLessEquals, and
tidying up the math utils collection a bit.
* CDF 1994 colour coherence analysis overhauled and
correction/norm factors fixed. Moved to VALIDATED status.
* Adding programmable completion for aida2flat and flat2aida.
* Improvements to programmable completion using the neat _filedir
completion shell function which I just discovered.
2010-11-26 Andy Buckley <andy@insectnation.org>
* Tweak to floating point inRange to use fuzzyEquals for CLOSED
interval equality comparisons.
* Some BibTeX generation improvements, and fixing the ATLAS dijet
BibTeX key.
* Resolution upgrade in PNG make-plots output.
* CDF_2005_S6217184.cc, CDF_2008_S7782535.cc: Updates to use the
new per-jet JetAlg interface (and some other fixes).
* JetAlg.cc: Changed the interface on request to return per-jet
rather than per-event jet shapes, with an extra jet index argument.
* MathUtils.hh: Adding index_between(...) function, which is handy
for working out which bin a value falls in, given a set of bin edges.
2010-11-25 Andy Buckley <andy@insectnation.org>
* Cmp.hh: Adding ASC/DESC (and ANTISORTED) as preferred
non-EQUIVALENT enum value synonyms over misleading
SORTED/UNSORTED.
* Change of rapidity scheme enum name to RapScheme
* Reworking JetShape a bit further: constructor args now avoid
inconsistencies (it was previously possible to define incompatible
range-ends and interval). Internal binning implementation also
reworked to use a vector of bin edges: the bin details are
available via the interface. The general jet pT cuts can be
applied via the JetShape constructor.
* MathUtils.hh: Adding linspace and logspace utility
functions. Useful for defining binnings.
* Adding more general cuts on jet pT and (pseudo)rapidity.
2010-11-11 Andy Buckley <andy@insectnation.org>
* Adding special handling of FourMomentum::mass() for computed
zero-mass vectors for which mass2 can go (very slightly) negative
due to numerical precision.
2010-11-10 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Adding ATLAS-CONF-2010-083 conference note. Data is read from plots.
When I run Pythia 6 the bins close to pi/2 are higher than in the note,
so I call this "unvalidated". But then ... the note doesn't specify
a tune or even just a version for the generators in the plots. Not even
if they used Pythia 6 or Pythia 8. Probably 6, since they mention AGILe.
2010-11-10 Andy Buckley <andy@insectnation.org>
* Adding a JetAlg::useInvisibles(bool) mechanism to allow ATLAS
jet studies to include neutrinos. Anyone who chooses to use this
mechanism had better be careful to remove hard neutrinos manually
in the provided FinalState object.
2010-11-09 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Adding ATLAS-CONF-2010-049 conference note. Data is read from plots.
Fragmentation functions look good, but I can't reproduce the MC lines
(or even the relative differences between them) in the jet cross-section
plots. So consider those unvalidated for now. Oh, and it seems ATLAS
screwed up the error bands in their ratio plots, too. They are
upside-down.
2010-11-07 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Adding ATLAS-CONF-2010-081 conference note. Data is read from plots.
2010-11-06 Andy Buckley <andy@insectnation.org>
* Deprecating the old JetShape projection and renaming to
ClosestJetShape: the algorithm has a tenuous relationship with
that actually used in the CDF (and ATLAS) jet shape analyses. CDF
analyses to be migrated to the new JetShape projection... and some
of that projection's features, design elements, etc. to be
finished off: we may as well take this opportunity to clear up
what was one of our nastiest pieces of code.
2010-11-05 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Adding ATLAS-CONF-2010-031 conference note. Data is read from plots.
2010-10-29 Andy Buckley <andy@insectnation.org>
* Making rivet-buildplugin use the same C++ compiler and CXXFLAGS
variable as used for the Rivet system build.
* Fixing NeutralFinalState projection to, erm, actually select
neutral particles (by Hendrik).
* Allow passing a general FinalState reference to the JetShape
projection, rather than requiring a VetoedFS.
2010-10-07 Andy Buckley <andy@insectnation.org>
* Adding a --with-root flag to rivet-buildplugin to add
root-config --libs flags to the plugin build command.
2010-09-24 Andy Buckley <andy@insectnation.org>
* Releasing as Rivet 1.3.0.
* Bundling underscore.sty to fix problems with running make-plots
on dat files generated by compare-histos from AIDA files with
underscores in their names.
2010-09-16 Andy Buckley <andy@insectnation.org>
* Fix error in N_effective definition for weighted profile errors.
2010-08-18 Andy Buckley <andy@insectnation.org>
* Adding MC_GENERIC analysis. NB. Frank Siegert also added MC_HJETS.
2010-08-03 Andy Buckley <andy@insectnation.org>
* Fixing compare-histos treatment of what is now a ref file, and
speeding things up... again. What a mess!
2010-08-02 Andy Buckley <andy@insectnation.org>
* Adding rivet-nopy: a super-simple Rivet C++ command line
interface which avoids Python to make profiling and debugging
easier.
* Adding graceful exception handling to the AnalysisHandler event
loop methods.
* Changing compare-histos behaviour to always show plots for which
there is at least one MC histo. The default behaviour should now
be the correct one in 99% of use cases.
2010-07-30 Andy Buckley <andy@insectnation.org>
* Merging in a fix for shared_ptrs not being compared for
insertion into a set based on raw pointer value.
2010-07-16 Andy Buckley <andy@insectnation.org>
* Adding an explicit library dependency declaration on libHepMC,
and hence removing the -lHepMC from the rivet-config --libs
output.
2010-07-14 Andy Buckley <andy@insectnation.org>
* Adding a manual section on use of Rivet (and AGILe) as
libraries, and how to use the -config scripts to aid compilation.
* FastJets projection now allows setting of a jet area definition,
plus a hacky mapping for getting the area-enabled cluster
sequence. Requested by Pavel Starovoitov & Paolo Francavilla.
* Lots of script updates in last two weeks!
2010-06-30 Andy Buckley <andy@insectnation.org>
* Minimising amount of Log class mapped into SWIG.
* Making Python ext build checks fail with error rather than
warning if it has been requested (or, rather, not explicitly
disabled).
2010-06-28 Andy Buckley <andy@insectnation.org>
* Converting rivet Python module to be a package, with the dlopen
flag setting etc. done around the SWIG generated core wrapper
module (rivet.rivetwrap).
* Requiring Python >= 2.4.0 in rivet scripts (and adding a Python
version checker function to rivet module)
* Adding --epspng option to make-plots (and converting to use subprocess.Popen).
2010-06-27 Andy Buckley <andy@insectnation.org>
* Converting JADE_OPAL analysis to use the fastjet
exclusive_ymerge_*max* function, rather than just
exclusive_ymerge: everything looks good now. It seems that fastjet
>= 2.4.2 is needed for this to work properly.
2010-06-24 Andy Buckley <andy@insectnation.org>
* Making rivet-buildplugin look in its own bin directory when
trying to find rivet-config.
2010-06-23 Andy Buckley <andy@insectnation.org>
* Adding protection and warning about numerical precision issues
in jet mass calculation/histogramming to the MC_JetAnalysis
analysis.
* Numerical precision improvement in calculation of
Vector4::mass2.
* Adding relative scale ratio plot flag to compare-histos
* Extended command completion to rivet-config, compare-histos, and
make-plots.
* Providing protected log messaging macros,
MSG_{TRACE,DEBUG,INFO,WARNING,ERROR} cf. Athena.
* Adding environment-aware functions for Rivet search path list access.
2010-06-21 Andy Buckley <andy@insectnation.org>
* Using .info file beam ID and energy info in HTML and LaTeX documentation.
* Using .info file beam ID and energy info in command-line printout.
* Fixing a couple of references to temporary variables in the
analysis beam info, which had been introduced during refactoring:
have reinstated reference-type returns as the more efficient
solution. This should not affect API compatibility.
* Making SWIG configure-time check include testing for
incompatibilities with the C++ compiler (re. the recurring _const_
char* literals issue).
* Various tweaks to scripts: make-plots and compare-histos
processes are now renamed (on Linux), rivet-config is avoided when
computing the Rivet version, and RIVET_REF_PATH is also set using
the rivet --analysis-path* flags. compare-histos now uses multiple
ref data paths for .aida file globbing.
* Hendrik changed VetoedFinalState comparison to always return
UNDEFINED if vetoing on the results of other FS projections is
being used. This is the only simple way to avoid problems
emanating from the remainingFinalState thing.
2010-06-19 Andy Buckley <andy@insectnation.org>
* Adding --analysis-path and --analysis-path-append command-line
flags to the rivet script, as a "persistent" way to set or extend
the RIVET_ANALYSIS_PATH variable.
* Changing -Q/-V script verbosity arguments to more standard
-q/-v, after Hendrik moaned about it ;)
* Small fix to TinyXML operator precedence: removes a warning,
and I think fixes a small bug.
* Adding plotinfo entries for new jet rapidity and jet mass plots
in MC_JetAnalysis derivatives.
* Moving MC_JetAnalysis base class into a new
libRivetAnalysisTools library, with analysis base class and helper
headers to be stored in the reinstated Rivet/Analyses include
directory.
2010-06-08 Andy Buckley <andy@insectnation.org>
* Removing check for CEDARSTD #define guard, since we no longer
compile against AGILe and don't have to be careful about
duplication.
* Moving crappy closest approach and decay significance functions
from Utils into SVertex, which is the only place they have ever
been used (and is itself almost entirely pointless).
* Overhauling particle ID <-> name system to clear up ambiguities
between enums, ints, particles and beams. There are no more enums,
although the names are still available as const static ints, and
names are now obtained via a singleton class which wraps an STL
map for name/ID lookups in both directions.
2010-05-18 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixing factor-of-2 bug in the error calculation when scaling
histograms.
* Fixing D0_2001_S4674421 analysis.
2010-05-11 Andy Buckley <andy@insectnation.org>
* Replacing TotalVisibleMomentum with MissingMomentum in analyses
and WFinder. Using vector ET rather than scalar ET in some places.
2010-05-07 Andy Buckley <andy@insectnation.org>
* Revamping the AnalysisHandler constructor and data writing, with
some LWH/AIDA mangling to bypass the stupid AIDA idea of having to
specify the sole output file and format when making the data
tree. Preferred AnalysisHandler constructor now takes only one arg
-- the runname -- and there is a new AH.writeData(outfile) method
to replace AH.commitData(). Doing this now to begin migration to
more flexible histogramming in the long term.
2010-04-21 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixing LaTeX problems (make-plots) on ancient machines, like lxplus.
2010-04-29 Andy Buckley <andy@insectnation.org>
* Fixing (I hope!) the treatment of weighted profile bin errors in LWH.
2010-04-21 Andy Buckley <andy@insectnation.org>
* Removing defunct and unused KtJets and Configuration classes.
* Hiding some internal details from Doxygen.
* Add @brief Doxygen comments to all analyses, projections and
core classes which were missing them.
2010-04-21 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* remove obsolete reference to InitialQuarks from DELPHI_2002
* fix normalisation in CDF_2000_S4155203
2010-04-20 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* bin/make-plots: real support for 2-dim histograms plotted as
colormaps, updated the documentation accordingly.
* fix misleading help comment in configure.ac
2010-04-08 Andy Buckley <andy@insectnation.org>
* bin/root2flat: Adding this little helper script, minimally
modified from one which Holger Schulz made for internal use in
ATLAS.
2010-04-05 Andy Buckley <andy@insectnation.org>
* Using spiresbib in rivet-mkanalysis: analysis templates made
with rivet-mkanalysis will now contain a SPIRES-dumped BibTeX key
and entry if possible!
* Adding BibKey and BibTeX entries to analysis metadata files, and
updating doc build to use them rather than the time-consuming
SPIRES screen-scraping. Added SPIRES BibTeX dumps to all analysis
metadata using new (uninstalled & unpackaged) doc/get-spires-data
script hack.
* Updating metadata files to add Energies, Beams and PtCuts
entries to all of them.
* Adding ToDo, NeedsCrossSection, and better treatment of Beams
and Energies entries in metadata files and in AnalysisInfo and
Analysis interfaces.
2010-04-03 Andy Buckley <andy@insectnation.org>
* Frank Siegert: Update of rivet-mkhtml to conform to improved
compare-histos.
* Frank Siegert: LWH output in precision-8 scientific notation, to
solve a binning precision problem... the first time we've noticed a
problem!
* Improved treatment of data/reference datasets and labels in
compare-histos.
* Rewrite of rivet-mkanalysis in Python to make way for neat
additions.
* Improving SWIG tests, since once again the user's build system
must include SWIG (no test to check that it's a 'good SWIG', since
the meaning of that depends on which compiler is being used and we
hope that the user system is consistent... evidence from Finkified
Macs and bloody SLC5 notwithstanding).
2010-03-23 Andy Buckley <andy@insectnation.org>
* Tag as patch release 1.2.1.
2010-03-22 Andy Buckley <andy@insectnation.org>
* General tidying of return arguments and intentionally unused
parameters to keep -Wextra happy (some complaints remain from
TinyXML, FastJet, and HepMC).
* Some extra bug fixes: in FastJets projection with explicit
plugin argument, removing muon veto cut on FoxWolframMoments.
* Adding UNUSED macro to help with places where compiler warnings
can't be helped.
* Turning on -Wextra warnings, and fixing some violations.
2010-03-21 Andy Buckley <andy@insectnation.org>
* Adding MissingMomentum projection, as replacement for ~all uses
of now-deprecated TotalVisibleMomentum projection.
* Fixing bug with TotalVisibleMomentum projection usage in MC_SUSY
analysis.
* Frank Siegert fixed major bug in pTmin param passing to FastJets
projection. D'oh: requires patch release.
2010-03-02 Andy Buckley <andy@insectnation.org>
* Tagging for 1.2.0 release... at last!
2010-03-01 Andy Buckley <andy@insectnation.org>
* Updates to manual, manual generation scripts, analysis info etc.
* Add HepData URL to metadata print-out with rivet --show-analysis
* Fix average Et plot in UA1 analysis to only apply to the tracker
acceptance (but to include neutral particle contributions,
i.e. the region of the calorimeter in the tracker acceptance).
* Use Et rather than pT in filling the scalar Et measure in
TotalVisibleMomentum.
* Fixes to UA1 normalisation (which is rather funny in the paper).
2010-02-26 Andy Buckley <andy@insectnation.org>
* Update WFinder to not place cuts and other restrictions on the
neutrino.
2010-02-11 Andy Buckley <andy@insectnation.org>
* Change analysis loader behaviour to use ONLY RIVET_ANALYSIS_PATH
locations if set, otherwise use ONLY the standard Rivet analysis
install path. Should only impact users of personal plugin
analyses, who should now explicitly set RIVET_ANALYSIS_PATH to
load their analysis... and who can now create personal versions of
standard analyses without getting an error message about duplicate
loading.
2010-01-15 Andy Buckley <andy@insectnation.org>
* Add tests for "stable" heavy flavour hadrons in jets (rather
than just testing for c/b hadrons in the ancestor lists of stable
jet constituents)
2009-12-23 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* New option "RatioPlotMode=deviation" in make-plots.
2009-12-14 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* New option "MainPlot" in make-plots. For people who only want
the ratio plot and nothing else.
* New option "ConnectGaps" in make-plots. Set to 1 if you
want to connect gaps in histograms with a line when ErrorBars=0.
Works both in PLOT and in HISTOGRAM sections.
* Eliminated global variables for coordinates in make-plots and
enabled multithreading.
2009-12-14 Andy Buckley <andy@insectnation.org>
* AnalysisHandler::execute now calls AnalysisHandler::init(event)
if it has not yet been initialised.
* Adding more beam configuration features to Beam and
AnalysisHandler: the setRunBeams(...) methods on the latter now
allows a beam configuration for the run to be specified without
using the Run class.
2009-12-11 Andy Buckley <andy@insectnation.org>
* Removing use of PVertex from few remaining analyses. Still used
by SVertex, which is itself hardly used and could maybe be
removed...
2009-12-10 Andy Buckley <andy@insectnation.org>
* Updating JADE_OPAL to do the histo booking in init(), since
sqrtS() is now available at that stage.
* Renaming and slightly re-engineering all MC_*_* analyses to not
be collider-specific (now the Analysis::sqrtS()/beams() methods
mean that histograms can be dynamically binned).
* Creating RivetUnvalidated.so plugin library for unvalidated
analyses. Unvalidated analyses now need to be explicitly enabled
with a --enable-unvalidated flag on the configure script.
* Various min bias analyses updated and validated.
2009-12-10 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Propagate SPECIAL and HISTOGRAM sections from .plot files
through compare-histos
* STAR_2006_S6860818: <pT> vs particle mass, validate analysis
2009-12-04 Andy Buckley <andy@insectnation.org>
* Use scaling rather than normalising in DELPHI_1996: this is
generally desirable, since normalising to 1 for 1/sig dsig/dx
observables isn't correct if any events fall outside the histo
bounds.
* Many fixes to OPAL_2004.
* Improved Hemispheres interface to remove unnecessary consts on
returned doubles, and to also return non-squared versions
of (scaled) hemisphere masses.
* Add "make pyclean" make target at the top level to make it
easier for developers to clean their Python module build when the
API is extended.
* Identify use of unvalidated analyses with a warning message at
runtime.
* Providing Analysis::sqrtS() and Analysis::beams(), and making
sure they're available by the time the init methods are called.
2009-12-02 Andy Buckley <andy@insectnation.org>
* Adding passing of first event sqrt(s) and beams to analysis handler.
* Restructuring running to only use one HepMC input file (no-one
was using multiple ones, right?) and to break down the Run class
to cleanly separate the init and event loop phases. End of file is
now neater.
2009-12-01 Andy Buckley <andy@insectnation.org>
* Adding parsing of beam types and pairs of energies from YAML.
2009-12-01 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixing trigger efficiency in CDF_2009_S8233977
2009-11-30 Andy Buckley <andy@insectnation.org>
* Using shared pointers to make I/O object memory management
neater and less error-prone.
* Adding crossSectionPerEvent() method [==
crossSection()/sumOfWeights()] to Analysis. Useful for histogram
scaling since numerator of sumW_passed/sumW_total (to calculate
pass-cuts xsec) is cancelled by dividing histo by sumW_passed.
* Clean-up of Particle class and provision of inline PID::
functions which take a Particle as an argument to avoid having to
explicitly call the Particle::pdgId() method.
2009-11-30 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixing division by zero in Profile1D bin errors for
bins with just a single entry.
2009-11-24 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* First working version of STAR_2006_S6860818
2009-11-23 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Adding missing CDF_2001_S4751469 plots to uemerge
* New "ShowZero" option in make-plots
* Improving lots of plot defaults
* Fixing typos / non-existing bins in CDF_2001_S4751469 and
CDF_2004_S5839831 reference data
2009-11-19 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixing our compare() for doubles.
2009-11-17 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Zeroth version of STAR_2006_S6860818 analysis (identified
strange particles). Not working yet for unstable particles.
2009-11-11 Andy Buckley <andy@insectnation.org>
* Adding separate jet-oriented and photon-oriented observables to
MC PHOTONJETUE analysis.
* Bug fix in MC leading jets analysis, and general tidying of
leading jet analyses to insert units, etc. (should not affect any
current results)
2009-11-10 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixing last issues in STAR_2006_S6500200 and setting it to
VALIDATED.
* Normalise STAR_2006_S6870392 to cross-section
2009-11-09 Andy Buckley <andy@insectnation.org>
* Overhaul of jet caching and ParticleBase interface.
* Adding lists of analyses' histograms (obtained by scanning the
plot info files) to the LaTeX documentation.
2009-11-07 Andy Buckley <andy@insectnation.org>
* Adding checking system to ensure that Projections aren't
registered before the init phase of analyses.
* Now that the ProjHandler isn't full of defunct pointers (which
tend to coincidentally point to *new* Projection pointers rather
than undefined memory, hence it wasn't noticed until recently!),
use of a duplicate projection name is now banned with a helpful
message at runtime.
* (Huge) overhaul of ProjectionHandler system to use shared_ptr:
projections are now deleted much more efficiently, naturally
cleaning themselves out of the central repository as they go out
of scope.
2009-11-06 Andy Buckley <andy@insectnation.org>
* Adding Cmp<double> specialisation, using fuzzyEquals().
2009-11-05 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixing histogram division code.
2009-11-04 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* New analysis STAR_2006_S6500200 (pion and proton pT spectra in
pp collisions at 200 GeV). It is still unclear if they used a cut
in rapidity or pseudorapidity, thus the analysis is declared
"UNDER DEVELOPMENT" and "DO NOT USE".
* Fixing compare() in NeutralFinalState and MergedFinalState
2009-11-04 Andy Buckley <andy@insectnation.org>
* Adding consistency checking on beam ID and sqrt(s) vs. those
from first event.
2009-11-03 Andy Buckley <andy@insectnation.org>
* Adding more assertion checks to linear algebra testing.
2009-11-02 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixing normalisation issue with stacked histograms in
make-plots.
2009-10-30 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* CDF_2009_S8233977: Updating data and axes labels to match final
paper. Normalise to cross-section instead of data.
2009-10-23 Andy Buckley <andy@insectnation.org>
* Fixing Cheese-3 plot in CDF 2004... at last!
2009-10-23 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fix muon veto in CDF_1994_S2952106, CDF_2005_S6217184,
CDF_2008_S7782535, and D0_2004_S5992206
2009-10-19 Andy Buckley <andy@insectnation.org>
* Adding analysis info files for MC SUSY and PHOTONJETUE analyses.
* Adding MC UE analysis in photon+jet events.
2009-10-19 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Adding new NeutralFinalState projection. Note that this final
state takes E_T instead of p_T as argument (makes more sense for
neutral particles). The compare() method does not yet work as
expected (E_T comparison still missing).
* Adding new MergedFinalState projection. This merges two final
states, removing duplicate particles. Duplicates are identified by
looking at the genParticle(), so users need to take care of any
manually added particles themselves.
* Fixing most open issues with the STAR_2009_UE_HELEN analysis.
There is only one question left, regarding the away region.
* Set the default split-merge value for SISCone in our FastJets
projection to the recommended (but not Fastjet-default!) value of
0.75.
2009-10-17 Andy Buckley <andy@insectnation.org>
* Adding parsing of units in cross-sections passed to the "-x"
flag, i.e. "-x 101 mb" is parsed internally into 1.01e11 pb.
2009-10-16 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Disabling DELPHI_2003_WUD_03_11 in the Makefiles, since I don't
trust the data.
* Getting STAR_2009_UE_HELEN to work.
2009-10-04 Andy Buckley <andy@insectnation.org>
* Adding triggers and other tidying to (still unvalidated)
UA1_1990 analysis.
* Fixing definition of UA5 trigger to not be intrinsically
different for pp and ppbar: this is corrected for (although it
takes some reading to work this out) in the 1982 paper, which I
think is the only one to compare the two modes.
* Moving projection setup and registration into init() method for
remaining analyses.
* Adding trigger implementations as projections for CDF Runs 0 &
1, and for UA5.
2009-10-01 Andy Buckley <andy.buckley@cern.ch>
* Moving projection setup and registration into init() method for
analyses from ALEPH, CDF and the MC_ group.
* Adding generic SUSY validation analysis, based on plots used in
ATLAS Herwig++ validation.
* Adding sorted particle accessors to FinalState (cf. JetAlg).
2009-09-29 Andy Buckley <andy@insectnation.org>
* Adding optional use of args as regex match expressions with
-l/--list-analyses.
2009-09-03 Andy Buckley <andy.buckley@cern.ch>
* Passing GSL include path to compiler, since its absence was
breaking builds on systems with no GSL installation in a standard
location (such as SLC5, for some mysterious reason!)
* Removing lib extension passing to compiler from the configure
script, because Macs and Linux now both use .so extensions for the
plugin analysis modules.
2009-09-02 Andy Buckley <andy@insectnation.org>
* Adding analysis info file path search with RIVET_DATA_PATH
variable (and using this to fix doc build.)
* Improvements to AnalysisLoader path search.
* Moving analysis sources back into single directory, after a
proletarian uprising ;)
2009-09-01 Andy Buckley <andy@insectnation.org>
* Adding WFinder and WAnalysis, based on Z proj and analysis, with
some tidying of the Z code.
* ClusteredPhotons now uses an IdentifiedFS to pick the photons to
be looped over, and only clusters photons around *charged* signal
particles.
2009-08-31 Andy Buckley <andy@insectnation.org>
* Splitting analyses by directory, to make it easier to disable
building of particular analysis group plugin libs.
* Removing/merging headers for all analyses except for the special
MC_JetAnalysis base class.
* Exit with an error message if addProjection is used twice from
the same parent with distinct projections.
2009-08-28 Andy Buckley <andy@insectnation.org>
* Changed naming convention for analysis plugin libraries, since
the loader has changed so much: they must now *start* with the
word "Rivet" (i.e. no lib prefix).
* Split standard plugin analyses into several plugin libraries:
these will eventually move into separate subdirs for extra build
convenience.
* Started merging analysis headers into the source files, now that
we can (the plugin hooks previously forbade this).
* Replacement of analysis loader system with a new one based on
ideas from ThePEG, which uses dlopen-time instantiation of
templated global variables to reduce boilerplate plugin hooks to
one line in analyses.
2009-07-14 Frank Siegert <frank.siegert@durham.ac.uk>
* Replacing in-source histo-booking metadata with .plot files.
2009-07-14 Andy Buckley <andy@insectnation.org>
* Making Python wrapper files copy into place based on bundled
versions for each active HepMC interface (2.3, 2.4 & 2.5), using a
new HepMC version detector test in configure.
* Adding YAML metadata files and parser, removing same metadata
from the analysis classes' source headers.
2009-07-07 Andy Buckley <andy@insectnation.org>
* Adding Jet::hadronicEnergy()
* Adding VisibleFinalState and automatically using it in JetAlg
projections.
* Adding YAML parser for new metadata (and eventually ref data)
files.
2009-07-02 Andy Buckley <andy@insectnation.org>
* Adding Jet::neutralEnergy() (and Jet::totalEnergy() for
convenience/symmetry).
2009-06-25 Andy Buckley <andy@insectnation.org>
* Tidying and small efficiency improvements in CDF_2008_S7541902
W+jets analysis (remove unneeded second stage of jet storing,
sorting the jets twice, using foreach, etc.).
2009-06-24 Andy Buckley <andy@insectnation.org>
* Fixing Jet's containsBottom and containsCharm methods, since B
hadrons are not necessarily to be found in the final
state. Discovered at the same time that HepMC::GenParticle defines
a massively unhelpful copy constructor that actually loses the
tree information; it would be better to hide it entirely!
* Adding RivetHepMC.hh, which defines container-type accessors to
HepMC particles and vertices, making it possible to use Boost
foreach and hence avoiding the usual huge boilerplate for-loops.
2009-06-11 Andy Buckley <andy@insectnation.org>
* Adding --disable-pdfmanual option, to make the bootstrap a bit
more robust.
* Re-enabling D0IL in FastJets: adding 10^-10 to the pTmin removes
the numerical instability!
* Fixing CDF_2004 min/max cone analysis to use calo jets for the
leading jet Et binning. Thanks to Markus Warsinsky
for (re)discovering this bug: I was sure it had been fixed. I'm
optimistic that this will fix the main distributions, although
Swiss Cheese "minus 3" is still likely to be broken. Early tests
look okay, but it'll take more stats before we can remove the "do
not trust" sign.
2009-06-10 Andy Buckley <andy@insectnation.org>
* Providing "calc" methods so that Thrust and Sphericity
projections can be used as calculators without having to use the
projecting/caching system.
2009-06-09 Andy Buckley <andy@insectnation.org>
* 1.1.3 release!
* More doc building and SWIG robustness tweaks.
2009-06-07 Andy Buckley <andy@insectnation.org>
* Make doc build from metadata work even before the library is
installed.
2009-06-07 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fix phi rotation in CDF_2008_LEADINGJETS.
2009-06-07 Andy Buckley <andy@insectnation.org>
* Disabling D0 IL midpoint cone (using CDF midpoint instead),
since there seems to be a crashing bug in FastJet's
implementation: we can't release that way, since ~no D0 analyses
will run.
2009-06-03 Andy Buckley <andy@insectnation.org>
* Putting SWIG-generated source files under SVN control to make
life easier for people who we advise to check out the SVN head
version, but who don't have a sufficiently modern copy of SWIG to
generate the interface themselves.
* Adding the --disable-analyses option, for people who just want
to use Rivet as a framework for their own analyses.
* Enabling HepMC cross-section reading, now that HepMC 2.5.0 has
been released.
2009-05-23 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Using gsl-config to locate libgsl
* Fix the paths for linking such that our own libraries are found
before any system libraries, e.g. for the case that there is an
outdated fastjet version installed on the system while we want to
use our own up-to-date version.
* Change dmerge to ymerge in the e+e- analyses using JADE or
DURHAM from fastjet. That's what it is called in fastjet-2.4 now.
2009-05-18 Andy Buckley <andy@insectnation.org>
* Adding use of gsl-config in configure script.
2009-05-16 Andy Buckley <andy@insectnation.org>
* Removing argument to vetoEvent macro, since no weight
subtraction is now needed. It's now just an annotated return, with
built-in debug log message.
* Adding an "open" FinalState, which is only calculated once per
event, then used by all other FSes, avoiding the loop over
non-status 1 particles.
2009-05-15 Andy Buckley <andy@insectnation.org>
* Removing incorrect setting of DPS x-errs in CDF_2008 jet shape
analysis: the DPS autobooking already gets this bit right.
* Using Jet rather than FastJet::PseudoJet where possible, as it
means that the phi ranges match up nicely between Particle and the
Jet object. The FastJet objects are only really needed if you want
to do detailed things like look at split/merge scales for
e.g. diff jet rates or "y-split" analyses.
* Tidying and debugging CDF jet shape analyses and jet shape
plugin... ongoing, but I think I've found at least one real bug,
plus a lot of stuff that can be done a lot more nicely.
* Fully removing deprecated math functions and updating affected
analyses.
2009-05-14 Andy Buckley <andy@insectnation.org>
* Removing redundant rotation in DISKinematics... this was a
legacy of Peter using theta rather than pi-theta in his rotation.
* Adding convenience phi, rho, eta, theta, and perp,perp2 methods
to the 3 and 4 vector classes.
2009-05-12 Andy Buckley <andy@insectnation.org>
* Adding event auto-rotation for events with one proton... more
complete approach?
2009-05-09 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Renaming CDF_2008_NOTE_9337 to CDF_2009_S8233977.
* Numerous small bug fixes in ALEPH_1996_S3486095.
* Adding data for one of the Rick-Field-style STAR UE analyses.
2009-05-08 Andy Buckley <andy@insectnation.org>
* Adding rivet-mkanalysis script, to make generating new analysis
source templates easier.
2009-05-07 Andy Buckley <andy@insectnation.org>
* Adding null vector check to Vector3::azimuthalAngle().
* Fixing definition of HCM/Breit frames in DISKinematics, and
adding asserts to check that the transformation is doing what it
should.
2009-05-05 Andy Buckley <andy@insectnation.org>
* Removing eta and Et cuts from CDF 2000 Z pT analysis, based on
our reading of the paper, and converting most of the analysis to a
call of the ZFinder projection.
2009-05-05 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Support non-default seed_threshold in CDF cone jet algorithms.
* New analyses STAR_2006_S6870392 and STAR_2008_S7993412. In
STAR_2008_S7993412 only the first distribution is filled at the
moment. STAR_2006_S6870392 is normalised to data instead of the
Monte Carlo cross-section, since we don't have that available in
the HepMC stream yet.
2009-05-05 Andy Buckley <andy@insectnation.org>
* Changing Event wrapper to copy whole GenEvents rather than
pointers, use std units if supported in HepMC, and run a
placeholder function for event auto-orientation.
2009-04-28 Andy Buckley <andy@insectnation.org>
* Removing inclusion of IsolationTools header by analyses that
aren't actually using the isolation tools... which is all of
them. Leaving the isolation tools in place for now, as there might
still be use cases for them and there's quite a lot of code there
that deserves a second chance to be used!
2009-04-24 Andy Buckley <andy@insectnation.org>
* Deleting Rivet implementations of TrackJet and D0ILConeJets: the
code from these has now been incorporated into FastJet 2.4.0.
* Removed all mentions of the FastJet JADE patch and the HAVE_JADE
preprocessor macro.
* Bug fix to D0_2008_S6879055 to ensure that cuts compare to both
electron and positron momenta (was just comparing against
electrons, twice, probably thanks to the miracle of cut and
paste).
* Converting all D0 IL Cone jets to use FastJets. Involved tidying
D0_2004 jet azimuthal decorrelation analysis and D0_2008_S6879055
as part of migration away from using the getLorentzJets method,
and removing the D0ILConeJets header from quite a few analyses
that weren't using it at all.
* Updating CDF 2001 to use FastJets in place of TrackJet, and
adding axis labels to its plots.
* Note that ZEUS_2001_S4815815 uses the wrong jet definition: it
should be a cone but currently uses kT.
* Fixing CDF_2005_S6217184 to use correct (midpoint, R=0.7) jet
definition. That this was using a kT definition with R=1.0 was
only made obvious when the default FastJets constructor was
removed.
* Removing FastJets default constructor: since there are now
several good (IRC safe) jet definitions available, there is no
obvious safe default and analyses should have to specify which
they use.
* Moving FastJets constructors into implementation file to reduce
recompilation dependencies, and adding new plugins.
* Ensuring that axis labels actually get written to the output
data file.
2009-04-22 Andy Buckley <andy@insectnation.org>
* Adding explicit FastJet CDF jet alg overlap_threshold
constructor param values, since the default value from 2.3.x is
now removed in version 2.4.0.
* Removing use of HepMC ThreeVector::mag method (in one place
only) since this has been removed in version 2.5.0b.
* Fix to hepmc.i (included by rivet.i) to ignore new HepMC 2.5.0b
GenEvent stream operator.
2009-04-21 Andy Buckley <andy@insectnation.org>
* Dependency on FastJet now requires version 2.4.0 or later. Jade
algorithm is now native.
* Moving all analysis constructors and Projection headers from the
analysis header files into their .cc implementation files, cutting
header dependencies.
* Removing AIDA headers: now using LWH headers only, with
enhancement to use axis labels. This facility is now used by the
histo booking routines, and calling the booking function versions
which don't specify axis labels will result in a runtime warning.
2009-04-07 Andy Buckley <andy@insectnation.org>
* Adding $(DESTDIR) prefix to call to Python module "setup.py
install"
* Moving HepMC SWIG mappings into Python Rivet module for now:
seems to work-around the SL type-mapping bug.
2009-04-03 Andy Buckley <andy@insectnation.org>
* Adding MC analysis for LHC UE: higher-pT replica of Tevatron
2008 leading jets study.
* Adding CDF_1990 pseudorapidity analysis.
* Moving CDF_2001 constructor into implementation file.
* Cleaning up CDF_2008_LEADINGJETS a bit, e.g. using foreach
loops.
* Adding function interface for specifying axis labels in histo
bookings. Currently has no effect, since AIDA doesn't seem to have
a mechanism for axis labels. It really is a piece of crap.
2009-03-18 Andy Buckley <andy@insectnation.org>
* Adding docs "make upload" and other tweaks to make the doc files
fit in with the Rivet website.
* Improving LaTex docs to show email addresses in printable form
and to group analyses by collider or other metadata.
* Adding doc script to include run info in LaTeX docs, and to make
HTML docs.
* Removing WZandh projection, which wasn't generator independent
and whose sole usage was already replaced by ZFinder.
* Improvements to constructors of ZFinder and InvMassFS.
* Changing ExampleTree to use real FS-based Z finding.
2009-03-16 Andy Buckley <andy@insectnation.org>
* Allow the -H histo file spec to give a full name if wanted. If
it doesn't end in the desired extension, it will be added.
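That filename rule — use the given name as-is if it already carries the desired extension, otherwise append it — is just (a sketch; the function name and the default extension are illustrative, the real logic lives in the rivet front-end):

```python
def ensure_extension(name, ext=".aida"):
    """Return name unchanged if it already ends with ext,
    else append ext. Sketch of the '-H' histo-file handling
    described above."""
    return name if name.endswith(ext) else name + ext
```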
* Adding --runname option (and API elements) to choose a run name
to be prepended as a "top level directory" in histo paths. An
empty value results in no extra TLD.
2009-03-06 Andy Buckley <andy@insectnation.org>
* Adding R=0.2 photon clustering to the electrons in the CDF 2000
Z pT analysis.
2009-03-04 Andy Buckley <andy@insectnation.org>
* Fixing use of fastjet-config to not use the user's PATH
variable.
* Fixing SWIG type table for HepMC object interchange.
2009-02-20 Andy Buckley <andy@insectnation.org>
* Adding use of new metadata in command line analysis querying
with the rivet command, and in building the PDF Rivet manual.
* Adding extended metadata methods to the Analysis interface and
the Python wrapper. All standard analyses comply with this new
interface.
2009-02-19 Andy Buckley <andy@insectnation.org>
* Adding usefully-scoped config headers, a Rivet::version()
function which uses them, and installing the generated headers to
fix "external" builds against an installed copy of Rivet. The
version() function has been added to the Python wrapper.
2009-02-05 Andy Buckley <andy@insectnation.org>
* Removing ROOT dependency and linking. Woo! There's no need for
this now, because the front-end accepts no histo format switch and
we'll just use aida2root for output conversions. Simpler this way,
and it avoids about half of our compilation bug reports from 32/64
bit ROOT build confusions.
2009-02-04 Andy Buckley <andy@insectnation.org>
* Adding automatic generation of LaTeX manual entries for the
standard analyses.
2009-01-20 Andy Buckley <andy@insectnation.org>
* Removing RivetGun and TCLAP source files!
2009-01-19 Andy Buckley <andy@insectnation.org>
* Added psyco Python optimiser to rivet, make-plots and
compare-histos.
* bin/aida2root: Added "-" -> "_" mangling, following requests.
2009-01-17 Andy Buckley <andy@insectnation.org>
* 1.1.2 release.
2009-01-15 Andy Buckley <andy@insectnation.org>
* Converting Python build system to bundle SWIG output in tarball.
2009-01-14 Andy Buckley <andy@insectnation.org>
* Converting AIDA/LWH divide function to return a DPS so that bin
width factors don't get all screwed up. Analyses adapted to use
the new division operation (a DPS/DPS divide would also be
nice... but can wait for YODA).
2009-01-06 Andy Buckley <andy@insectnation.org>
* bin/make-plots: Added --png option for making PNG output files,
using 'convert' (after making a PDF --- it's a bit messy)
* bin/make-plots: Added --eps option for output filtering through
ps2eps.
2009-01-05 Andy Buckley <andy@insectnation.org>
* Python: reworking Python extension build to use distutils and
newer m4 Python macros. Probably breaks distcheck but is otherwise
more robust and platform independent (i.e. it should now work on
Macs).
2008-12-19 Andy Buckley <andy@insectnation.org>
* make-plots: Multi-threaded make-plots and cleaned up the LaTeX
building a bit (necessary to remove the implicit single global
state).
2008-12-18 Andy Buckley <andy@insectnation.org>
* make-plots: Made LaTeX run in no-stop mode.
* compare-histos: Updated to use a nicer labelling syntax on the
command line and to successfully build MC-MC plots.
2008-12-16 Andy Buckley <andy@insectnation.org>
* Made LWH bin edge comparisons safe against numerical errors.
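Numerically safe bin-edge comparison means a relative-tolerance check rather than exact float equality; a minimal sketch (the tolerance value is an assumption for illustration, not LWH's actual choice):

```python
def fuzzy_equals(a, b, tolerance=1e-5):
    """Compare two bin edges with a relative tolerance, so edges
    differing only by floating-point rounding count as equal."""
    if a == 0 and b == 0:
        return True
    return abs(a - b) <= tolerance * max(abs(a), abs(b))
```

For example, `0.1 + 0.2` and `0.3` differ at the 1e-17 level and compare equal here, where `==` would not.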
* Added Particle comparison functions for sorting.
* Removing most bad things from ExampleTree and tidying up. Marked
WZandh projection for removal.
2008-12-03 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added the two missing observables to the CDF_2008_NOTE_9337 analysis,
i.e. track pT and sum(ET). There is a small difference between our MC
output and the MC plots of the analysis' author, we're still waiting
for the author's comments.
2008-12-02 Andy Buckley <andy@insectnation.org>
* Overloading use of a std::set in the interface, since the
version of SWIG on Sci Linux doesn't have a predefined mapping for
STL sets.
2008-12-02 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixed uemerge. The output was seriously broken by a single line
of debug information in fillAbove(). Also changed uemerge output
to exponential notation.
* Unified ref and mc histos in compare-histos. Histos with one
bin are plotted linearly. Option for disabling the ratio plot.
Several fixes for labels, legends, output directories, ...
* Changed rivetgun's fallback directory for parameter files to
$PREFIX/share/AGILe, since that's where the steering files now are.
* Running aida2flat in split mode now produces make-plots compatible
dat-files for direct plotting.
2008-11-28 Andy Buckley <andy@insectnation.org>
* Replaced binreloc with an upgraded and symbol-independent copy.
2008-11-25 Andy Buckley <andy@insectnation.org>
* Added searching of $RIVET_REF_PATH for AIDA reference data
files.
2008-11-24 Andy Buckley <andy@insectnation.org>
* Removing "get"s and other obfuscated syntax from
ProjectionApplier (Projection and Analysis) interfaces.
2008-11-21 Andy Buckley <andy@insectnation.org>
* Using new "global" Jet and V4 sorting functors in
TrackJet. Looks like there was a sorting direction problem before...
* Verbose mode with --list-analyses now shows descriptions as well
as analysis names.
* Moved data/Rivet to data/refdata and moved data/RivetGun
contents to AGILe (since generator steering is no longer a Rivet
thing)
* Added unchecked ratio plots to D0 Run II jet + photon analysis.
* Added D0 inclusive photon analysis.
* Added D0 Z rapidity analysis.
* Tidied up constructor interface and projection chain
implementation of InvMassFinalState.
* Added ~complete set of Jet and FourMomentum sorting functors.
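In Python terms such sorting functors are key functions; a sketch of pT-ordering with a hypothetical four-momentum tuple `(px, py, pz, E)` (names illustrative, not Rivet's actual cmp functors):

```python
import math

def pt(p):
    """Transverse momentum of a four-momentum tuple (px, py, pz, E)."""
    px, py, _, _ = p
    return math.hypot(px, py)

def sorted_by_pt(particles):
    """Return the particles sorted by decreasing pT, the ordering the
    sorting functors described above provide for Jet/FourMomentum."""
    return sorted(particles, key=pt, reverse=True)
```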
2008-11-20 Andy Buckley <andy@insectnation.org>
* Added IdentifiedFinalState.
* Moved a lot of TrackJet and Jet code into .cc files.
* Fixed a caching bug in Jet: cache flag resets should never be
conditional, since they are then sensitive to initialisation
errors.
* Added quark enum values to ParticleName.
* Rationalised JetAlg interfaces somewhat, with "size()" and
"jets()" methods in the interface.
* Added D0 W charge asymmetry and D0 inclusive jets analyses.
2008-11-18 Andy Buckley <andy@insectnation.org>
* Adding D0 inclusive Z pT shape analysis.
* Added D0 Z + jet pT and photon + jet pT spectrum analyses.
* Lots of tidying up of particle, event, particle name etc.
* Now the first event is used to detect the beam type and remove
incompatible analyses.
2008-11-17 Andy Buckley <andy@insectnation.org>
* Added bash completion for rivetgun.
* Starting to provide stand-alone call methods on projections so
they can be used without the caching infrastructure. This could
also be handy for unit testing.
* Adding functionality (sorting function and built-in sorting
schemes) to the JetAlg interface.
2008-11-10 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fix floating point number output format in aida2flat and flat2aida
* Added CDF_2002_S4796047: CDF Run-I charged multiplicity distribution
* Renamed CDF_2008_MINBIAS to CDF_2008_NOTE_9337, since the
note is publicly available now.
2008-11-10 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added DELPHI_2003_WUD_03_11: Delphi 4-jet angular distributions.
There is still a problem with the analysis, so don't use it yet.
But I need to commit the code because my laptop is broken ...
2008-11-06 Andy Buckley <andy@insectnation.org>
* Code review: lots of tidying up of projections and analyses.
* Fixes for compatibility with the LLVM C & C++ compiler.
* Change of Particle interface to remove "get"-prefixed method
names.
2008-11-05 Andy Buckley <andy@insectnation.org>
* Adding ability to query analysis metadata from the command line.
* Example of a plugin analysis now in plugindemo, with a make check
test to make sure that the plugin analysis is recognised by the
command line "rivet" tool.
* GCC 4.3 fix to mat-vec tests.
2008-11-04 Andy Buckley <andy@insectnation.org>
* Adding native logger control from Python interface.
2008-11-03 Andy Buckley <andy@insectnation.org>
* Adding bash_completion for rivet executable.
2008-10-31 Andy Buckley <andy@insectnation.org>
* Clean-up of histo titles and analysis code review.
* Added momentum construction functions from FastJet PseudoJets.
2008-10-28 Andy Buckley <andy@insectnation.org>
* Auto-booking of histograms with a name, rather than the HepData
ID 3-tuple, is now possible.
* Fix in CDF 2001 pT spectra to get the normalisations to depend
on the pT_lead cutoff.
2008-10-23 Andy Buckley <andy@insectnation.org>
* rivet handles signals neatly, as for rivetgun, so that premature
killing of the analysis process will still result in an analysis
file.
* rivet now accepts cross-section as a command line argument and,
if it is missing and is required, will prompt the user for it
interactively.
2008-10-22 Andy Buckley <andy@insectnation.org>
* rivet (Python interface) now can list analyses, check when
adding analyses that the given names are valid, specify histo file
name, and provide sensibly graded event number logging.
2008-10-20 Andy Buckley <andy@insectnation.org>
* Corrections to CDF 2004 analysis based on correspondance with
Joey Huston. Min bias distributions now use the whole event within |eta| < 0.7,
and Cheese plots aren't filled at all if there are insufficient
jets (and the correct ETlead is used).
2008-10-08 Andy Buckley <andy@insectnation.org>
* Added AnalysisHandler::commitData() method, to allow the Python
interface to write out a histo file without having to know
anything about the histogramming API.
* Reduced SWIG interface file to just map a subset of Analysis and
AnalysisHandler functionality. This will be the basis for a new
command line interface.
2008-10-06 Andy Buckley <andy@insectnation.org>
* Converted FastJets plugin to use a Boost shared_pointer to the
cached ClusterSequence. The nullness of the pointer is now used to
indicate an empty tracks (and hence jets) set. Once FastJet
natively supports empty CSeqs, we can rewrite this a bit more
neatly and ditch the shared_ptr.
2008-10-02 Andy Buckley <andy@insectnation.org>
* The CDF_2004 (Acosta) data file now includes the full range of
pT for the min bias data at both 630 and 1800 GeV. Previously,
only the small low-pT insert plot had been entered into HepData.
2008-09-30 Andy Buckley <andy@insectnation.org>
* Lots of updates to CDF_2004 (Acosta) UE analysis, including
sorting jets by E rather than Et, and factorising transverse cone
code into a function so that it can be called with a random
"leading jet" in min bias mode. Min bias histos are now being
trial-filled just with tracks in the transverse cones, since the
paper is very unclear on this.
* Discovered a serious caching problem in FastJets projection when
an empty tracks vector is passed from the
FinalState. Unfortunately, FastJet provides no API way to solve
the problem, so we'll have to report this upstream. For now, we're
solving this for CDF_2004 by explicitly vetoing events with no
tracks.
* Added Doxygen to the build with target "dox"
* Moved detection of whether cross-section information is needed
into the AnalysisHandler, with dynamic computation by scanning
contained analyses.
* Improved robustness of event reading to detect properly when the
input file is smaller than expected.
2008-09-29 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* New analysis: CDF_2000_S4155203
2008-09-23 Andy Buckley <andy@insectnation.org>
* rivetgun can now be built and run without AGILe. Based on a
patch by Frank Siegert.
2008-09-23 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Some preliminary numbers for the CDF_2008_LEADINGJETS analysis
(only transverse region and not all observables. But all we have now.)
2008-09-17 Andy Buckley <andy@insectnation.org>
* Breaking up the mammoth "generate" function, to make Python
mapping easier, among other reasons.
* Added if-zero-return-zero checking to angle mapping functions,
to avoid problems where 1e-19 gets mapped on to 2 pi and then
fails boundary asserts.
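The guard described above — return zero exactly for a zero input, and fold the 2*pi boundary back to zero so a rounding residue like -1e-19 can't land on 2*pi — can be sketched as (not Rivet's actual implementation):

```python
import math

TWOPI = 2 * math.pi

def map_angle_0_to_2pi(angle):
    """Map an angle into [0, 2*pi), returning 0.0 exactly for zero
    input and folding the upper boundary back to zero, so tiny
    negative residues cannot trip boundary assertions."""
    if angle == 0.0:
        return 0.0
    mapped = math.fmod(angle, TWOPI)
    if mapped < 0:
        mapped += TWOPI
    if mapped == TWOPI:  # rounding pushed us onto the boundary
        mapped = 0.0
    return mapped
```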
* Added HistoHandler singleton class, which will be a central
repository for holding analyses' histogram objects to be accessed
via a user-chosen name.
2008-08-26 Andy Buckley <andy@insectnation.org>
* Allowing rivet-config to return combined flags.
2008-08-14 Andy Buckley <andy@insectnation.org>
* Fixed some g++ 4.3 compilation bugs, including "vector" not
being a valid name for a method which returns a physics vector,
since it clashes with std::vector (which we globally import). Took
the opportunity to rationalise the Jet interface a bit, since
"particle" was used to mean "FourMomentum", and "Particle" types
required a call to "getFullParticle". I removed the "gets" at the
same time, as part of our gradual migration to a coherent naming
policy.
2008-08-11 Andy Buckley <andy@insectnation.org>
* Tidying of FastJets and added new data files from HepData.
2008-08-10 James Monk <jmonk@hep.ucl.ac.uk>
* FastJets now uses user_index property of fastjet::PseudoJet to
reconstruct PID information in jet contents.
2008-08-07 Andy Buckley <andy@insectnation.org>
* Reworking of param file and command line parsing. Tab characters
are now handled by the parser, in a way equivalent to spaces.
2008-08-06 Andy Buckley <andy@insectnation.org>
* Added extra histos and filling to Acosta analysis - all HepData
histos should now be filled, depending on sqrt{s}. Also trialling
use of LaTeX math mode in titles.
2008-08-05 Andy Buckley <andy@insectnation.org>
* More data files for CDF analyses (2 x 2008, 1 x 1994), and moved
the RivetGun AtlasPythia6.params file to more standard
fpythia-atlas.params (and added to the install list).
2008-08-04 Andy Buckley <andy@insectnation.org>
* Reduced size of available options blocks in RivetGun help text
by removing "~" negating variants (which are hardly ever used in
practice) and restricting beam particles to
PROTON, ANTIPROTON, ELECTRON and POSITRON.
* Fixed Et sorting in Acosta analysis.
2008-08-01 Andy Buckley <andy@insectnation.org>
* Added AIDA headers to the install list, since
external (plugin-type) analyses need them to be present for
compilation to succeed.
2008-07-29 Andy Buckley <andy@insectnation.org>
* Fixed missing ROOT compile flags for libRivet.
* Added command line repetition to logging.
2008-07-29 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Included the missing numbers and three more observables
in the CDF_2008_NOTE_9351 analysis.
2008-07-29 Andy Buckley <andy@insectnation.org>
* Fixed wrong flags on rivet-config
2008-07-28 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Renamed CDF_2008_DRELLYAN to CDF_2008_NOTE_9351. Updated
numbers and cuts to the final version of this CDF note.
2008-07-28 Andy Buckley <andy@insectnation.org>
* Fixed polar angle calculation to use atan2.
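atan2 matters here because it resolves the quadrant from the signs of both components, which a plain atan(y/x) cannot; a small illustration:

```python
import math

def azimuthal_angle(px, py):
    """Quadrant-correct angle in (-pi, pi] via atan2. A plain
    atan(py/px) would give the same value for (px, py) and
    (-px, -py), losing the quadrant, and divides by zero at px=0."""
    return math.atan2(py, px)
```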
* Added "mk" prefixes and x/setX convention to math classes.
2008-07-28 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Fixed definition of FourMomentum::pT (it had been returning pT2)
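The bug above is the classic missing square root; the two quantities relate as follows (a sketch, not the FourMomentum implementation):

```python
import math

def pt2(px, py):
    """Squared transverse momentum: cheap, no sqrt needed."""
    return px * px + py * py

def pt(px, py):
    """Transverse momentum: what FourMomentum::pT should return,
    i.e. sqrt(pT2) rather than pT2 itself."""
    return math.sqrt(pt2(px, py))
```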
2008-07-27 Andy Buckley <andy@insectnation.org>
* Added better tests for Boost headers.
* Added testing for -ansi, -pedantic and -Wall compiler flags.
2008-07-25 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Updated DELPHI_2002_069_CONF_603 according to information
from the author
2008-07-17 Andy Buckley <andy@insectnation.org>
* Improvements to aida2flat: now can produce one output file per
histo, and there is a -g "gnuplot mode" which comments out the
YODA/make_plot headers to make the format readable by gnuplot.
* Import boost::assign namespace contents into the Rivet namespace
--- provides very useful intuitive collection initialising
functions.
2008-07-15 Andy Buckley <andy.buckley@dur.ac.uk>
* Fixed missing namespace in vector/matrix testing.
* Removed Boost headers: now a system dependency.
* Fixed polarRadius infinite loop.
2008-07-09 Andy Buckley <andy@insectnation.org>
* Fixed definitions of mapAngleMPiToPi, etc. and used them to fix
the Jet::getPhi method.
* Trialling use of "foreach" loop in CDF_2004: it works! Very nice.
2008-07-08 Andy Buckley <andy@insectnation.org>
* Removed accidental reference to an "FS" projection in
FinalStateHCM's compare method. rivetgun -A now works again.
* Added TASSO, SLD and D0_2008 reference data. The TASSO and SLD
papers aren't installed or included in the tarball since there are
currently no plans to implement these analyses.
* Added Rivet namespacing to vector, matrix etc. classes. This
required some re-writing and the opportunity was taken to move
some canonical function definitions inside the classes and to
improve the header structure of the Math area.
2008-07-07 Andy Buckley <andy@insectnation.org>
* Added Rivet namespace to Units.hh and Constants.hh.
* Added Doxygen "@brief" flags to analyses.
* Added "RIVET_" namespacing to all header guards.
* Merged Giulio Lenzi's isolation/vetoing/invmass projections and
D0 2008 analysis.
2008-06-23 Jon Butterworth <J.Butterworth@ucl.ac.uk>
* Modified FastJet to fix ysplit and split and filter.
* Modified ExampleTree to show how to call them.
2008-06-19 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added first version of the CDF_2008_DRELLYAN analysis described on
http://www-cdf.fnal.gov/physics/new/qcd/abstracts/UEinDY_08.html
There is a small difference between the analysis and this
implementation, but it's too small to be visible.
The fpythia-cdfdrellyan.params parameter file is for this analysis.
* Added first version of the CDF_2008_MINBIAS analysis described on
http://www-cdf.fnal.gov/physics/new/qcd/abstracts/minbias_08.html
The .aida file is read from the plots on the web and will change.
I'm still discussing some open questions about the analysis with
the author.
2008-06-18 Jon Butterworth <J.Butterworth@ucl.ac.uk>
* Added First versions of splitJet and filterJet methods to
fastjet.cc. Not yet tested, buyer beware.
2008-06-18 Andy Buckley <andy@insectnation.org>
* Added extra sorted Jets and Pseudojets methods to FastJets, and
added ptmin argument to the JetAlg getJets() method, requiring a
change to TrackJet.
2008-06-13 Andy Buckley <andy@insectnation.org>
* Fixed processing of "RG" parameters to ensure that invalid
iterators are never used.
2008-06-10 Andy Buckley <andy@insectnation.org>
* Updated AIDA reference files, changing "/HepData" root path to
"/REF". Still missing a couple of reference files due to upstream
problems with the HepData records.
2008-06-09 Andy Buckley <andy@insectnation.org>
* rivetgun now handles termination signals (SIGTERM, SIGINT and
SIGHUP) gracefully, finishing the event loop and finalising
histograms. This means that histograms will always get written
out, even if not all the requested events have been generated.
2008-06-04 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added DELPHI_2002_069_CONF_603 analysis
2008-05-30 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added InitialQuarks projection
* Added OPAL_1998_S3780481 analysis
2008-05-29 Andy Buckley <andy@insectnation.org>
* distcheck compatibility fixes and autotools tweaks.
2008-05-28 Andy Buckley <andy@insectnation.org>
* Converted FastJet to use Boost smart_ptr for its plugin
handling, to solve double-delete errors stemming from the heap
cloning of projections.
* Added (a subset of) Boost headers, particularly the smart
pointers.
2008-05-24 Andy Buckley <andy@insectnation.org>
* Added autopackage spec files.
* Merged these changes into the trunk.
* Added a registerClonedProjection(...) method to
ProjectionHandler: this is needed so that cloned projections will
have valid pointer entries in the ProjectHandler repository.
	* Added clone() methods to all projections (we need to use this,
	since the templated "new PROJ(proj)" approach to cloning can't
	handle object polymorphism).
2008-05-19 Andy Buckley <andy@insectnation.org>
* Moved projection-applying functions into ProjectionApplier base
class (from which Projection and Analysis both derive).
* Added Rivet-specific exceptions in place of std::runtime_error.
* Removed unused HepML reference files.
* Added error handling for requested analyses with wrong case
convention / missing name.
2008-05-15 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* New analysis PDG_Hadron_Multiplicities
* flat2aida converter
2008-05-15 Andy Buckley <andy@insectnation.org>
* Removed unused mysterious Perl scripts!
* Added RivetGun.HepMC logging of HepMC event details.
2008-05-14 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* New analysis DELPHI_1995_S3137023. This analysis contains
the xp spectra of Xi+- and Sigma(1385)+-.
2008-05-13 Andy Buckley <andy@insectnation.org>
	* Improved logging interface: log levels are now integers (for
	cross-library compatibility), and level setting now also applies
	to existing loggers.
2008-05-09 Andy Buckley <andy@insectnation.org>
* Improvements to robustness of ROOT checks.
* Added --version flag on config scripts and rivetgun.
2008-05-06 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* New UnstableFinalState projection which selects all hadrons,
leptons and real photons including unstable particles.
	* In the DELPHI_1996_S3430090 analysis the multiplicities for
	pi+/pi- and pi0 are filled, using the UnstableFinalState projection.
2008-05-06 Andy Buckley <andy@insectnation.org>
* FastJets projection now protects against the case where no
particles exist in the final state (where FastJet throws an
exception).
* AIDA file writing is now separated from the
AnalysisHandler::finalize method... API users can choose what to
do with the histo objects, be that writing out or further
processing.
2008-04-29 Andy Buckley <andy@insectnation.org>
* Increased default tolerances in floating point comparisons as
they were overly stringent and valid f.p. precision errors were
being treated as significant.
* Implemented remainder of Acosta UE analysis.
* Added proper getEtSum() to Jet.
* Added Et2() member and function to FourMomentum.
* Added aida2flat conversion script.
* Fixed ambiguity in TrackJet algorithm as to how the iteration
continues when tracks are merged into jets in the inner loop.
2008-04-28 Andy Buckley <andy@insectnation.org>
* Merged in major "ProjectionHandler" branch. Projections are now
all stored centrally in the singleton ProjectionHandler container,
rather than as member pointers in projections and analyses. This
also affects the applyProjection mechanism, which is now available
as a templated method on Analysis and Projection. Still a few
wrinkles need to be worked out.
* The branch changes required a comprehensive review of all
existing projections and analyses: lots of tidying up of these
classes, as well as the auxiliary code like math utils, has taken
place. Too much to list and track, unfortunately!
2008-03-28 Andy Buckley <buckley@pc54.hep.ucl.ac.uk>
* Started second CDF UE analysis ("Acosta"): histograms defined.
* Fixed anomalous factor of 2 in LWH conversion from Profile1D
to DataPointSet.
* Added pT distribution histos to CDF 2001 UE analysis.
2008-03-26 Andy Buckley <andy@insectnation.org>
* Removed charged+neutral versions of histograms and projections
from DELPHI analysis since they just duplicate the more robust
charged-only measurements and aren't really of interest for
tuning.
2008-03-10 Andy Buckley <andy@insectnation.org>
* Profile histograms now use error computation with proper
weighting, as described here:
http://en.wikipedia.org/wiki/Weighted_average
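	As a rough sketch of the idea (in Python rather than Rivet's C++ code;
	the effective-entries formula is an assumption based on the linked
	reference, not taken from the Rivet sources), the per-bin mean and
	error computation looks like:

	```python
	from math import sqrt

	def weighted_profile_bin(values, weights):
	    """Weighted mean and its error for one profile-histogram bin.

	    Illustrative only: the bin stores (value, weight) pairs, and the
	    error uses the effective number of entries for weighted fills.
	    """
	    sumw = float(sum(weights))
	    sumw2 = float(sum(w * w for w in weights))
	    mean = sum(w * y for w, y in zip(weights, values)) / sumw
	    # Weighted variance of the y-values about the weighted mean
	    var = sum(w * (y - mean) ** 2 for w, y in zip(weights, values)) / sumw
	    # Effective number of entries, (sum w)^2 / sum w^2
	    neff = sumw ** 2 / sumw2
	    return mean, sqrt(var / neff)
	```
	
	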
2008-02-28 Andy Buckley <andy@insectnation.org>
* Added --enable-jade flag for Professor studies with patched
FastJet.
* Minor fixes to LCG tag generator and gfilt m4 macros.
* Fixed projection slicing issues with Field UE analysis.
* Added Analysis::vetoEvent(e) function, which keeps track of the
correction to the sum of weights due to event vetoing in analysis
classes.
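	The bookkeeping described above can be sketched as follows (a toy
	Python class with hypothetical names, not the Rivet C++ API):

	```python
	class VetoCountingAnalysis(object):
	    """Toy illustration of vetoEvent-style sum-of-weights correction."""

	    def __init__(self):
	        self.sum_of_weights = 0.0  # used later for normalisation

	    def analyze(self, weight, passes_cuts):
	        self.sum_of_weights += weight
	        if not passes_cuts:
	            # vetoEvent: subtract this event's weight again, so the
	            # normalisation reflects only the accepted events
	            self.sum_of_weights -= weight
	            return False
	        # ... fill histograms with 'weight' here ...
	        return True
	```
	
	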
2008-02-26 Andy Buckley <andy@insectnation.org>
* Vector<N> and derived classes now initialise to have zeroed
components when the no-arg constructor is used.
	* Added Analysis::scale() function to scale 1D
	histograms. Analysis::normalize() uses it internally, and the
	DELPHI (A)EEC, whose histo weights are not pure event weights, is
	normalised using scale(h, 1/sumEventWeights).
2008-02-21 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Added EEC and AEEC to the DELPHI_1996_S3430090 analysis. The
normalisation of these histograms is still broken (ticket #163).
2008-02-19 Hendrik Hoeth <hendrik.hoeth@cern.ch>
* Many fixes to the DELPHI_1996_S3430090 analysis: bugfix in the
	calculation of eigenvalues/eigenvectors in MatrixDiag.hh for the
sphericity, rewrite of Thrust/Major/Minor, fixed scaled momentum,
hemisphere masses, normalisation in single particle events,
final state slicing problems in the projections for Thrust,
Sphericity and Hemispheres.
2008-02-08 Andy Buckley <andy@insectnation.org>
* Applied fixes and extensions to DIS classes, based on
submissions by Dan Traynor.
2008-02-06 Andy Buckley <andy@insectnation.org>
* Made projection pointers used for cut combining into const
pointers. Required some redefinition of the Projection* comparison
operator.
* Temporarily added FinalState member to ChargedFinalState to stop
projection lifetime crash.
2008-02-01 Andy Buckley <andy@insectnation.org>
* Fixed another misplaced factor of bin width in the
Analysis::normalize() method.
2008-01-30 Andy Buckley <andy@insectnation.org>
* Fixed the conversion of IHistogram1D to DPS, both via the
explicit Analysis::normalize() method and the implicit
AnalysisHandler::treeNormalize() route. The root of the problem is
the AIDA choice of the word "height" to represent the sum of
weights in a bin: i.e. the bin width is not taken into account
either in computing bin height or error.
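	A minimal sketch of the width correction applied in that conversion
	(a hypothetical Python helper, not the actual LWH code):

	```python
	def heights_to_density(heights, edges):
	    """Divide each bin's sum of weights ('height', in the AIDA sense)
	    by its bin width to get a differential (density) value."""
	    return [h / (hi - lo)
	            for h, lo, hi in zip(heights, edges[:-1], edges[1:])]
	```
	
	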
2008-01-22 Andy Buckley <andy@insectnation.org>
* Beam projection now uses HepMC GenEvent::beam_particles() method
to get the beam particles. This is more portable and robust for
C++ generators, and equivalent to the existing "first two" method
for Fortran generators.
2008-01-17 Andy Buckley <andy@insectnation.org>
* Added angle range fix to pseudorapidity function (thanks to
Piergiulio Lenzi).
2008-01-10 Andy Buckley <andy@insectnation.org>
* Changed autobooking plot codes to use zero-padding (gets the
order right in JAS, file browser, ROOT etc.). Also changed the
'ds' part to 'd' for consistency. HepData's AIDA output has been
correspondingly updated, as have the bundled data files.
2008-01-04 Andy Buckley <andy@insectnation.org>
* Tidied up JetShape projection a bit, including making the
constructor params const references. This seems to have sorted the
runtime segfault in the CDF_2005 analysis.
* Added caching of the analysis bin edges from the AIDA file -
each analysis object will now only read its reference file once,
which massively speeds up the rivetgun startup time for analyses
	with large numbers of autobooked histos (e.g. the
DELPHI_1996_S3430090 analysis).
2008-01-02 Andy Buckley <andy@insectnation.org>
* CDF_2001_S4751469 now uses the LossyFinalState projection, with
an 8% loss rate.
* Added LossyFinalState and HadronicFinalState, and fixed a
"polarity" bug in the charged final state projection (it was
keeping only the *uncharged* particles).
* Now using isatty(1) to determine whether or not color escapes
can be used. Also removed --color argument, since it can't have an
effect (TCLAP doesn't do position-based flag toggling).
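	The same tty check is easy to reproduce in Python (an illustrative
	sketch, not rivetgun's actual code; the extra TERM guard is an
	assumption, not something the entry above describes):

	```python
	import os
	import sys

	def colour_allowed(stream=sys.stdout):
	    """Emit ANSI colour escapes only when writing to an interactive
	    terminal, mirroring the isatty(1) check described above."""
	    return stream.isatty() and os.environ.get("TERM", "dumb") != "dumb"
	```
	
	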
* Made Python extension build optional (and disabled by default).
2008-01-01 Andy Buckley <andy@insectnation.org>
* Removed some unwanted DEBUG statements, and lowered the level of
some infrastructure DEBUGs to TRACE level.
* Added bash color escapes to the logger system.
2007-12-21 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* include/LWH/ManagedObject.h: Fixed infinite loop in
encodeForXML cf. ticket #135.
2007-12-20 Andy Buckley <andy@insectnation.org>
* Removed HepPID, HepPDT and Boost dependencies.
* Fixed XML entity encoding in LWH. Updated CDF_2007_S7057202
analysis to not do its own XML encoding of titles.
2007-12-19 Andy Buckley <andy@insectnation.org>
* Changed units header to set GeV = 1 (HepMC convention) and using
units in CDF UE analysis.
2007-12-15 Andy Buckley <andy@insectnation.org>
* Introduced analysis metadata methods for all analyses (and made
them part of the Analysis interface).
2007-12-11 Andy Buckley <andy@insectnation.org>
* Added JetAlg base projection for TrackJet, FastJet etc.
2007-12-06 Andy Buckley <andy@insectnation.org>
* Added checking for Boost library, and the standard Boost test
program for shared_ptr.
* Got basic Python interface running - required some tweaking
since Python and Rivet's uses of dlopen collide (another
RTLD_GLOBAL issue - see
http://muttley.hates-software.com/2006/01/25/c37456e6.html )
2007-12-05 Andy Buckley <andy@insectnation.org>
* Replaced all use of KtJets projection with FastJets
projection. KtJets projection disabled but left undeleted for
now. CLHEP and KtJet libraries removed from configure searches and
Makefile flags.
2007-12-04 Andy Buckley <andy@insectnation.org>
* Param file loading now falls back to the share/RivetGun
directory if a local file can't be found and the provided name has
no directory separators in it.
* Converted TrackJet projection to update the jet centroid with
each particle added, using pT weighting in the eta and phi
averaging.
2007-12-03 Andy Buckley <andy@insectnation.org>
* Merged all command line handling functions into one large parse
function, since only one executable now needs them. This removes a
few awkward memory leaks.
* Removed rivet executable - HepMC file reading functionality will
move into rivetgun.
* Now using HepMC IO_GenEvent format (IO_Ascii and
IO_ExtendedAscii are deprecated). Now requires HepMC >= 2.3.0.
* Added forward declarations of GSL diagonalisation routines,
eliminating need for GSL headers to be installed on build machine.
2007-11-27 Andy Buckley <andy@insectnation.org>
* Removed charge differentiation from Multiplicity projection (use
CFS proj) and updated ExampleAnalysis to produce more useful numbers.
* Introduced binreloc for runtime path determination.
* Fixed several bugs in FinalState, ChargedFinalState, TrackJet
and Field analysis.
* Completed move to new analysis naming scheme.
2007-11-26 Andy Buckley <andy@insectnation.org>
* Removed conditional HAVE_FASTJET bits: FastJet is now compulsory.
* Merging appropriate RivetGun parts into Rivet. RivetGun currently broken.
2007-11-23 Andy Buckley <andy@insectnation.org>
* Renaming analyses to Spires-ID scheme: currently of form
S<SpiresID>, to become <Expt>_<YYYY>_<SpiresID>.
2007-11-20 Andy Buckley <andy@insectnation.org>
* Merged replacement vectors, matrices and boosts into trunk.
2007-11-15 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* src/Analysis.cc, include/Rivet/Analysis.hh: Introduced normalize
function. See ticket #126.
2007-10-31 Andy Buckley <andy@insectnation.org>
* Tagging as 1.0b2 for HERA-LHC meeting.
2007-10-25 Andy Buckley <andy@insectnation.org>
* Added AxesDefinition base interface to Sphericity and Thrust,
used by Hemispheres.
* Exposed BinaryCut class, improved its interface and fixed a few
bugs. It's now used by VetoedFinalState for momentum cuts.
* Removed extra output from autobooking AIDA reader.
* Added automatic DPS booking.
2007-10-12 Andy Buckley <andy@insectnation.org>
* Improved a few features of the build system
2007-10-09 James Monk
* Fixed dylib dlopen on Mac OS X.
2007-10-05 Andy Buckley <andy@insectnation.org>
* Added new reference files.
2007-10-03 Andy Buckley <andy@insectnation.org>
* Fixed bug in configure.ac which led to explicit CXX setting
being ignored.
* Including Logging.hh in Projection.hh, hence new transitive
dependency on Logging.hh being installed. Since this is the normal
behaviour, I don't think this is a problem.
* Fixed segfaulting bug due to use of addProjection() in
locally-scoped contained projections. This isn't a proper fix,
since the whole framework should be designed to avoid the
possibility of bugs like this.
* Added newly built HepML and AIDA reference files for current
analyses.
2007-10-02 Andy Buckley <andy@insectnation.org>
* Fixed possible null-pointer dereference in Particle copy
constructor and copy assignment: this removes one of two blocker
segfaults, the other of which is related to the copy-assignment of
the TotalVisMomentum projection in the ExampleTree analysis.
2007-10-01 Andy Buckley <andy@insectnation.org>
* Fixed portable path to Rivet share directory.
2007-09-28 Andy Buckley <andy@insectnation.org>
* Added more functionality to the rivet-config script: now has
libdir, includedir, cppflags, ldflags and ldlibs options.
2007-09-26 Andy Buckley <andy@insectnation.org>
* Added the analysis library closer function to the
AnalysisHandler finalize() method, and also moved the analysis
delete loop into AnalysisHandler::finalize() so as not to try
deleting objects whose libraries have already closed.
* Replaced the RivetPaths.cc.in method for portable paths with
something using -D defines - much simpler!
2007-09-21 Lars Sonnenschein <sonne@mail.cern.ch>
* Added HepEx0505013 analysis and JetShape projection (some fixes
by AB.)
* Added GetLorentzJets member function to D0 RunII cone jet projection
2007-09-21 Andy Buckley <andy@insectnation.org>
	* Fixed lots of bugs and bad practice in HepEx0505013 (to make it
	compilable!)
* Downclassed the log messages from the Test analysis to DEBUG
level.
* Added isEmpty() method to final state projection.
* Added testing for empty final state and useful debug log
messages to sphericity projection.
2007-09-20 Andy Buckley <andy@insectnation.org>
* Added Hemispheres projection, which calculates event hemisphere
masses and broadenings.
2007-09-19 Andy Buckley <andy@insectnation.org>
* Added an explicit copy assignment operator to Particle: the
absence of one of these was responsible for the double-delete
error.
* Added a "fuzzy equals" utility function for float/double types
to Utils.hh (which already contains a variety of handy little
functions).
* Removed deprecated Beam::operator().
* Added ChargedFinalState projection and de-pointered the
contained FinalState projection in VetoedFinalState.
2007-09-18 Andy Buckley <andy@insectnation.org>
* Major bug fixes to the regularised version of the sphericity
projection (and hence the Parisi tensor projection). Don't trust
C & D param results from any previous version!
	* Added extra methods to thrust and sphericity projections to get
	the oblateness and the sphericity basis (currently returns dummy
	axes, since I can't yet work out how to get the similarity
	transform eigenvectors from CLHEP).
2007-09-14 Andy Buckley <andy@insectnation.org>
* Merged in a branch of pluggable analysis mechanisms.
2007-06-25 Jon Butterworth <jmb@hep.ucl.ac.uk>
* Fixed some bugs in the root output for DataPoint.h
2007-06-25 Andy Buckley <andy@insectnation.org>
* include/Rivet/**/Makefile.am: No longer installing headers for
"internal" functionality.
* include/Rivet/Projections/*.hh: Removed the private restrictions
on copy-assignment operators.
2007-06-18 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* include/LWH/Tree.h: Fixed minor bug in listObjectNames.
* include/LWH/DataPointSet.h: Fixed setCoordinate functions so
that they resize the vector of DataPoints if it initially was
empty.
* include/LWH/DataPoint.h: Added constructor taking a vector of
	measurements.
2007-06-16 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* include/LWH/Tree.h: Implemented the listObjectNames and ls
functions.
* include/Rivet/Projections/FinalStateHCM.hh,
include/Rivet/Projections/VetoedFinalState.hh: removed
_theParticles and corresponding access function. Use base class
variable instead.
* include/Rivet/Projections/FinalState.hh: Made _theParticles
protected.
2007-06-13 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* src/Projections/FinalStateHCM.cc,
src/Projections/DISKinematics.cc: Equality checks using
GenParticle::operator== changed to check for pointer equality.
* include/Rivet/Analysis/HepEx9506012.hh: Uses modified DISLepton
projection.
* include/Rivet/Particle.hh: Added member function to check if a
GenParticle is associated.
* include/Rivet/Projections/DISLepton.hh,
src/Projections/DISLepton.cc: Fixed bug in projection. Introduced
final state projection to limit searching for scattered
lepton. Still not properly tested.
2007-06-08 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* include/Rivet/Projections/PVertex.hh,
src/Projections/PVertex.cc: Fixed the projection to simply get the
signal_process_vertex from the GenEvent. This is the way it should
work. If the GenEvent does not have a signal_process_vertex
properly set up in this way, the problem is with the class that
fills the GenEvent.
2007-06-06 Jon Butterworth <jmb@hep.ucl.ac.uk>
* Merged TotalVisibleMomentum and CalMET
* Added pT ranges to Vetoed final state projection
2007-05-27 Jon Butterworth <jmb@hep.ucl.ac.uk>
* Fixed initialization of VetoedFinalStateProjection in ExampleTree
2007-05-27 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* include/Rivet/Projections/KtJets.*: Make sure the KtEvent is
deleted properly.
2007-05-26 Jon Butterworth <jmb@hep.ucl.ac.uk>
* Added leptons to the ExampleTree.
* Added TotalVisibleEnergy projection, and added output to ExampleTree.
2007-05-25 Jon Butterworth <jmb@hep.ucl.ac.uk>
* Added a charged lepton projection
2007-05-23 Andy Buckley <andy@insectnation.org>
* src/Analysis/HepEx0409040.cc: Changed range of the histograms to
the "pi" range rather than the "128" range.
* src/Analysis/Analysis.cc: Fixed a bug in the AIDA path building.
Histogram auto-booking now works.
2007-05-23 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* src/Analysis/HepEx9506012.cc: Now uses the histogram booking
function in the Analysis class.
2007-05-23 Jon Butterworth <jmb@hep.ucl.ac.uk>
* Fixed bug in PRD65092002 (was failing on zero jets)
2007-05-23 Andy Buckley <andy@insectnation.org>
* Added (but haven't properly tested) a VetoedFinalState projection.
* Added normalize() method for AIDA 1D histograms.
* Added configure checking for Mac OS X version, and setting the
development target flag accordingly.
2007-05-22 Andy Buckley <andy@insectnation.org>
* Added an ostream method for AnalysisName enums.
* Converted Analyses and Projections to use projection lists, cuts
and beam constraints.
* Added beam pair combining to the BeamPair sets of Projections
by finding set meta-intersections.
* Added methods to Cuts, Analysis and Projection to make Cut
definition easier.
	* Fixed default fall-through in cut handling switch statement and
	now using -numeric_limits<double>::max() rather than min().
* Added more control of logging presentation via static flag
methods on Log.
2007-05-13 Andy Buckley <andy@insectnation.org>
* Added self-consistency checking mechanisms for Cuts and Beam
* Re-implemented the cut-handling part of RivetInfo as a Cuts class.
* Changed names of Analysis and Projection name() and handler()
methods to getName() and getHandler() to be more consistent with
the rest of the public method names in those classes.
2007-05-02 Andy Buckley <andy@insectnation.org>
* Added auto-booking of histogram bins from AIDA XML files. The
AIDA files are located via a C++ function which is generated from
RivetPaths.cc.in by running configure.
2007-04-18 Andy Buckley <andy@insectnation.org>
* Added a preliminary version of the Rick Field UE analysis, under
the name PRD65092002.
2007-04-19 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* src/Analysis/HepEx0409040.cc: The reason this did not compile
under gcc-4 is that some iterators into a vector were wrongly
	assumed to be pointers and were initialized to 0 and later compared
to 0. I've changed this to initialize to end() of the
corresponding vector and to compare with the same end() later.
2007-04-05 Andy Buckley <andy@insectnation.org>
* Lots of name changes in anticipation of the MCNet
school. RivetHandler is now AnalysisHandler (since that's what it
does!), BeamParticle has become ParticleName, and RivetInfo has
been split into Cut and BeamConstraint portions.
* Added BeamConstraint mechanism, which can be used to determine
if an analysis is compatible with the beams being used in the
generator. The ParticleName includes an "ANY" wildcard for this
purpose.
2006-03-19 Andy Buckley <andy@insectnation.org>
* Added "rivet" executable which can read in HepMC ASCII dump
files and apply Rivet analyses on the events.
2007-02-24 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* src/Projections/KtJets.cc: Added comparison of member variables
in compare() function
* all: Merged changes from polymorphic-projections branch into
trunk
2007-02-17 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
	* all: projections and analysis handlers: All projections which
	use other projections now have a pointer rather than a copy of
	those projections, to allow for polymorphism. The constructors
	have also been changed to require the used projections themselves,
	rather than the arguments needed to construct them.
2007-02-17 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* src/Projections/FinalState.cc,
include/Rivet/Projections/FinalState.icc (Rivet),
include/Rivet/Projections/FinalState.hh: Added cut in transverse
momentum on the particles to be included in the final state.
2007-02-06 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* include/LWH/HistogramFactory.h: Fixed divide-by-zero in divide
function. Also fixed bug in error calculation in divide
function. Introduced checkBin function to make sure two histograms
are equal even if they have variable bin widths.
* include/LWH/Histogram1D.h: In normalize(double), do not do anything
if the sum of the bins are zero to avoid dividing by zero.
2007-01-20 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* src/Test/testLWH.cc: Modified to output files using the Tree.
* configure.ac: Removed AC_CONFIG_AUX_DIR([include/Rivet/Config])
since the directory does not exist anymore.
2006-12-21 Andy Buckley <andy@insectnation.org>
* Rivet will now conditionally install the AIDA and LWH headers if
it can't find them when configure'ing.
* Started integrating Leif's LWH package to fulfill the AIDA
duties.
* Replaced multitude of CLHEP wrapper headers with a single
RivetCLHEP.h header.
2006-11-20 Andy Buckley <andy@insectnation.org>
* Introduced log4cpp logging.
* Added analysis enum, which can be used as input to an analysis
factory by Rivet users.
2006-11-02 Andy Buckley <andy@insectnation.org>
* Yet more, almost pointless, administrative moving around of
things with the intention of making the structure a bit
better-defined:
* The RivetInfo and RivetHandler classes have been
moved from src/Analysis into src as they are really the main Rivet
interface classes. The Rivet.h header has also been moved into the
"header root".
* The build of a single shared library in lib has been disabled,
with the library being built instead in src.
2006-10-14 Andy Buckley <andy@insectnation.org>
* Introduced a minimal subset of the Sherpa math tools, such as
Vector{3,4}D, Matrix, etc. The intention is to eventually cut the
dependency on CLHEP.
2006-07-28 Andy Buckley <andy@insectnation.org>
* Moving things around: all sources now in directories under src
2006-06-04 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* Analysis/Examples/HZ95108.*: Now uses CentralEtHCM. Also set GeV
units on the relevant histograms.
	* Projections/CentralEtHCM.*: Making a special class just to get
	out one number - the summed Et in the central rapidity bin - may
	seem like overkill, but someone else might need it...
2006-06-03 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* Analysis/Examples/HZ95108.*: Added the hz95108 energy flow
analysis from HZtool.
* Projections/DISLepton.*: Since many HERA measurements do not
care if we have electron or positron beam, it is now possible to
specify lepton or anti-lepton.
	* Projections/Event.*: Added member and access function for the
	weight of an event (taken from the GenEvent object's weights()[0]).
* Analysis/RivetHandler.*: Now depends explicitly on the AIDA
interface. An AIDA analysis factory must be specified in the
constructor, where a tree and histogram factory is automatically
created. Added access functions to the relevant AIDA objects.
* Analysis/AnalysisBase.*: Added access to the RivetHandler and
its AIDA factories.
2005-12-27 Leif Lönnblad <Leif.Lonnblad@thep.lu.se>
* configure.ac: Added -I$THEPEGPATH/include to AM_CPPFLAGS.
	* Config/Rivet.h: Added some std includes and using std::
declaration.
* Analysis/RivetInfo.*: Fixed some bugs. The RivetInfo facility
now works, although it has not been thoroughly tested.
* Analysis/Examples/TestMultiplicity.*: Re-introduced
FinalStateHCM for testing purposes but commented it away again.
* .: Made a number of changes to implement handling of RivetInfo
objects.
diff --git a/bin/make-plots b/bin/make-plots
--- a/bin/make-plots
+++ b/bin/make-plots
@@ -1,2598 +1,2598 @@
#! /usr/bin/env python

"""\
Usage: %prog [options] file.dat [file2.dat ...]

TODO
 * Optimise output for e.g. lots of same-height bins in a row
 * Add a RatioFullRange directive to show the full range of error bars + MC envelope in the ratio
 * Tidy LaTeX-writing code -- faster to compile one doc only, then split it?
 * Handle boolean values flexibly (yes, no, true, false, etc. as well as 1, 0)
"""

##
## This program is copyright by Hendrik Hoeth <hoeth@linta.de> and
## the Rivet team https://rivet.hepforge.org. It may be used
## for scientific and private purposes. Patches are welcome, but please don't
## redistribute changed versions yourself.
##

## Check the Python version
import sys
if sys.version_info[:3] < (2,6,0):
    print "make-plots requires Python version >= 2.6.0... exiting"
    sys.exit(1)

## Try to rename the process on Linux
try:
    import ctypes
    libc = ctypes.cdll.LoadLibrary('libc.so.6')
    libc.prctl(15, 'make-plots', 0, 0, 0)
except Exception, e:
    pass

import os, logging, re
import tempfile
import getopt
import string
from math import *


## Regex patterns
pat_begin_block = re.compile(r'^#+\s*BEGIN ([A-Z0-9_]+) ?(\S+)?')
pat_end_block = re.compile('^#+\s*END ([A-Z0-9_]+)')
pat_comment = re.compile('^#|^\s*$')
pat_property = re.compile('^(\w+?)=(.*)$')
pat_path_property = re.compile('^(\S+?)::(\w+?)=(.*)$')


def fuzzyeq(a, b, tolerance=1e-6):
    "Fuzzy equality comparison function for floats, with given fractional tolerance"
    # if type(a) is not float or type(a) is not float:
    #     print a, b
    if (a == 0 and abs(b) < 1e-12) or (b == 0 and abs(a) < 1e-12):
        return True
    return 2.0*abs(a-b)/abs(a+b) < tolerance


def is_end_marker(line, blockname):
    m = pat_end_block.match(line)
    return m and m.group(1) == blockname


def is_comment(line):
    return pat_comment.match(line) is not None


class Inputdata(object):

    def __init__(self, filename):
        self.filename = filename+".dat"
        self.histos = {}
        self.special = {}
        self.functions = {}
        self.description = {}
        self.pathdescriptions = []

        self.description['is2dim'] = False
        f = open(filename+'.dat')
        for line in f:
            m = pat_begin_block.match(line)
            if m:
                name, path = m.group(1,2)
                if path is None and name != 'PLOT':
                    raise Exception('BEGIN sections need a path name.')
                ## Pass the reading of the block to separate functions
                if name == 'PLOT':
                    self.read_input(f);
                elif name == 'SPECIAL':
                    self.special[path] = Special(f)
                elif name == 'HISTOGRAM' or name == 'HISTOGRAM2D':
                    self.histos[path] = Histogram(f)
                    self.histos[path].path = path
                    self.description['is2dim'] = self.histos[path].is2dim
                elif name == 'HISTO1D':
                    self.histos[path] = Histo1D(f)
                elif name == 'HISTO2D':
                    self.histos[path] = Histo2D(f)
                    self.description['is2dim'] = True
                elif name == 'FUNCTION':
                    self.functions[path] = Function(f)
            # elif is_comment(line):
            #     continue
            # else:
            #     self.read_path_based_input(line)
        f.close()

        self.apply_config_files(opts.CONFIGFILES)

        self.description['PlotSizeX'] = 10.
        if self.description['is2dim']:
            self.description['PlotSizeX'] -= 1.5
        self.description['PlotSizeY'] = 6.
        if self.description.has_key('PlotSize') and self.description['PlotSize'] != '':
            plotsizex, plotsizey = self.description['PlotSize'].split(',')
            self.description['PlotSizeX'] = float(plotsizex)
            self.description['PlotSizeY'] = float(plotsizey)
            del self.description['PlotSize']
        self.description['RatioPlotSizeY'] = 0.
        if self.description.has_key('MainPlot') and self.description['MainPlot'] == '0':
            self.description['RatioPlot'] = '1'
            self.description['PlotSizeY'] = 0.
        if self.description.has_key('RatioPlot') and self.description['RatioPlot'] == '1':
            if self.description.has_key('RatioPlotYSize') and self.description['RatioPlotYSize'] != '':
                self.description['RatioPlotSizeY'] = float(self.description['RatioPlotYSize'])
            else:
                if self.description.has_key('MainPlot') and self.description['MainPlot'] == '0':
                    self.description['RatioPlotSizeY'] = 6.
                else:
                    self.description['RatioPlotSizeY'] = 3.

        self.description['LogX'] = self.description.has_key('LogX') and self.description['LogX'] == '1'
        self.description['LogY'] = self.description.has_key('LogY') and self.description['LogY'] == '1'
        self.description['LogZ'] = self.description.has_key('LogZ') and self.description['LogZ'] == '1'
        if self.description.has_key('Rebin'):
            for i in self.histos:
                self.histos[i].description['Rebin'] = self.description['Rebin']

        histoordermap = {}
        histolist = self.histos.keys()
        if self.description.has_key('DrawOnly'):
            histolist = filter(self.histos.keys().count, self.description['DrawOnly'].strip().split())
        for histo in histolist:
            order = 0
            if self.histos[histo].description.has_key('PlotOrder'):
                order = int(self.histos[histo].description['PlotOrder'])
            if not order in histoordermap:
                histoordermap[order] = []
            histoordermap[order].append(histo)
        sortedhistolist = []
        for i in sorted(histoordermap.keys()):
            sortedhistolist.extend(histoordermap[i])
        self.description['DrawOnly'] = sortedhistolist

        # inherit various values from histograms if not explicitly set
        for k in ['LogX', 'LogY', 'LogZ', 'XLabel', 'YLabel', 'ZLabel']:
            self.inherit_from_histos(k)
        return

    def inherit_from_histos(self, k):
        # note: this will inherit the key from a random histogram:
        # only use if you're sure all histograms have this key!
        if not self.description.has_key(k):
            h = self.histos.itervalues().next()
            if h.description.has_key(k):
                self.description[k] = h.description[k]
        return

    def read_input(self, f):
        for line in f:
            if is_end_marker(line, 'PLOT'):
                break
            elif is_comment(line):
                continue
            m = pat_property.match(line)
            if m:
                prop, value = m.group(1,2)
                if prop in self.description:
                    logging.debug("Overwriting property %s = %s -> %s" % (prop, self.description[prop], value))
                ## Use strip here to deal with DOS newlines containing \r
                self.description[prop.strip()] = value.strip()

    def apply_config_files(self, conffiles):
        if conffiles is not None:
            for filename in conffiles:
                cf = open(filename, 'r')
                lines = cf.readlines()
                for i in range(0, len(lines)):
                    ## First evaluate PLOT sections
                    m = pat_begin_block.match(lines[i])
                    if m and m.group(1) == 'PLOT' and re.match(m.group(2), self.filename):
                        while i < len(lines)-1:
                            i = i+1
                            if is_end_marker(lines[i], 'PLOT'):
                                break
                            elif is_comment(lines[i]):
                                continue
                            m = pat_property.match(lines[i])
                            if m:
                                prop, value = m.group(1,2)
                                if prop in self.description:
                                    logging.debug("Overwriting from conffile property %s = %s -> %s" % (prop, self.description[prop], value))
                                ## Use strip here to deal with DOS newlines containing \r
                                self.description[prop.strip()] = value.strip()
                    elif is_comment(lines[i]):
                        continue
                    else:
                        ## Then evaluate path-based settings, e.g. for HISTOGRAMs
                        m = pat_path_property.match(lines[i])
                        if m:
                            regex, prop, value = m.group(1,2,3)
                            for obj_dict in [self.special, self.histos, self.functions]:
                                for path, obj in obj_dict.iteritems():
                                    if re.match(regex, path):
                                        ## Use strip here to deal with DOS newlines containing \r
                                        obj.description.update({prop.strip() : value.strip()})
                cf.close()


class Plot(object):

    def __init__(self, inputdata):
        pass

    def set_normalization(self, inputdata):
        for method in ['NormalizeToIntegral', 'NormalizeToSum']:
            if inputdata.description.has_key(method):
                for i in inputdata.description['DrawOnly']:
                    if not inputdata.histos[i].description.has_key(method):
                        inputdata.histos[i].description[method] = inputdata.description[method]
        if inputdata.description.has_key('Scale'):
            for i in inputdata.description['DrawOnly']:
                inputdata.histos[i].description['Scale'] = float(inputdata.description['Scale'])
        for i in inputdata.description['DrawOnly']:
            inputdata.histos[i].mangle_input()

    def stack_histograms(self, inputdata):
        if inputdata.description.has_key('Stack'):
            foo = []
            for i in inputdata.description['Stack'].strip().split():
                if i in inputdata.histos.keys():
                    foo.append(i)
            previous = ''
            for i in foo:
                if previous != '':
                    inputdata.histos[i].add(inputdata.histos[previous])
                previous = i

    def set_histo_options(self, inputdata):
        if inputdata.description.has_key('ConnectGaps'):
            for i in inputdata.histos.keys():
                if not inputdata.histos[i].description.has_key('ConnectGaps'):
                    inputdata.histos[i].description['ConnectGaps'] = inputdata.description['ConnectGaps']

    def set_borders(self, inputdata):
        self.set_xmax(inputdata)
        self.set_xmin(inputdata)
        self.set_ymax(inputdata)
        self.set_ymin(inputdata)
        self.set_zmax(inputdata)
        self.set_zmin(inputdata)
        inputdata.description['Borders'] = (self.xmin, self.xmax, self.ymin, self.ymax, self.zmin, self.zmax)

    def set_xmin(self, inputdata):
        if inputdata.description.has_key('XMin'):
self.xmin = float(inputdata.description['XMin'])
else:
self.xmin = min(inputdata.histos[i].getXMin() for i in inputdata.description['DrawOnly'])
def set_xmax(self,inputdata):
#print inputdata.description
if inputdata.description.has_key('XMax'):
self.xmax = float(inputdata.description['XMax'])
else:
#print inputdata.description['DrawOnly']
self.xmax = max(inputdata.histos[i].getXMax() for i in inputdata.description['DrawOnly'])
def set_ymin(self,inputdata):
if inputdata.description.has_key('YMin'):
self.ymin = float(inputdata.description['YMin'])
else:
foo=[]
for i in inputdata.description['DrawOnly']:
foo.append(inputdata.histos[i].getYMin(self.xmin, self.xmax, inputdata.description['LogY']))
if inputdata.description['is2dim']:
self.ymin=min(foo)
else:
showzero = True
if inputdata.description.has_key('ShowZero'):
if inputdata.description['ShowZero']=='0':
showzero = False
if showzero:
if min(foo) > -1e-4:
self.ymin = 0
else:
self.ymin = 1.1*min(foo)
else:
if min(foo) < -1e-4:
self.ymin = 1.1*min(foo)
elif min(foo) < 1e-4:
self.ymin = 0
else:
self.ymin = 0.9*min(foo)
if inputdata.description['LogY']:
foo=[item for item in foo if item>0.0]
if len(foo)==0:
if self.ymax==0:
self.ymax=1
foo.append(2e-7*self.ymax)
fullrange = opts.FULL_RANGE
if inputdata.description.has_key('FullRange'):
if inputdata.description['FullRange']=='1':
fullrange = True
else:
fullrange = False
if fullrange:
self.ymin = min(foo)/1.7
else:
self.ymin = max(min(foo)/1.7, 2e-7*self.ymax)
if self.ymin==self.ymax:
self.ymin-=1
self.ymax+=1
def set_ymax(self,inputdata):
if inputdata.description.has_key('YMax'):
self.ymax = float(inputdata.description['YMax'])
else:
foo=[]
for i in inputdata.description['DrawOnly']:
foo.append(inputdata.histos[i].getYMax(self.xmin, self.xmax))
if inputdata.description['is2dim']:
self.ymax=max(foo)
else:
if inputdata.description['LogY']:
self.ymax=1.7*max(foo)
else:
self.ymax=1.1*max(foo)
def set_zmin(self,inputdata):
if inputdata.description.has_key('ZMin'):
self.zmin = float(inputdata.description['ZMin'])
else:
foo=[]
for i in inputdata.description['DrawOnly']:
foo.append(inputdata.histos[i].getZMin(self.xmin, self.xmax, self.ymin, self.ymax))
if not foo:
self.zmin = 0  # no z-values in range: fall back to a default instead of calling min() on an empty list
else:
showzero = True
if inputdata.description.has_key('ShowZero'):
if inputdata.description['ShowZero']=='0':
showzero = False
if showzero:
if min(foo) > -1e-4:
self.zmin = 0
else:
self.zmin = 1.1*min(foo)
else:
if min(foo) < -1e-4:
self.zmin = 1.1*min(foo)
elif min(foo) < 1e-4:
self.zmin = 0
else:
self.zmin = 0.9*min(foo)
if inputdata.description['LogZ']:
foo=[item for item in foo if item>0.0]
if len(foo)==0:
if self.zmax==0:
self.zmax=1
foo.append(2e-7*self.zmax)
fullrange = opts.FULL_RANGE
if inputdata.description.has_key('FullRange'):
if inputdata.description['FullRange']=='1':
fullrange = True
else:
fullrange = False
if fullrange:
self.zmin = min(foo)/1.7
else:
self.zmin = max(min(foo)/1.7, 2e-7*self.zmax)
if self.zmin==self.zmax:
self.zmin-=1
self.zmax+=1
def set_zmax(self,inputdata):
if inputdata.description.has_key('ZMax'):
self.zmax = float(inputdata.description['ZMax'])
else:
foo=[]
for i in inputdata.description['DrawOnly']:
foo.append(inputdata.histos[i].getZMax(self.xmin, self.xmax, self.ymin, self.ymax))
if foo:
self.zmax = max(foo)
else:
self.zmax = 1
def draw(self):
pass
def write_header(self,inputdata):
if inputdata.description.has_key('LeftMargin') and inputdata.description['LeftMargin']!='':
inputdata.description['LeftMargin'] = float(inputdata.description['LeftMargin'])
else:
inputdata.description['LeftMargin'] = 1.4
if inputdata.description.has_key('RightMargin') and inputdata.description['RightMargin']!='':
inputdata.description['RightMargin'] = float(inputdata.description['RightMargin'])
else:
inputdata.description['RightMargin'] = 0.35
if inputdata.description.has_key('TopMargin') and inputdata.description['TopMargin']!='':
inputdata.description['TopMargin'] = float(inputdata.description['TopMargin'])
else:
inputdata.description['TopMargin'] = 0.65
if inputdata.description.has_key('BottomMargin') and inputdata.description['BottomMargin']!='':
inputdata.description['BottomMargin'] = float(inputdata.description['BottomMargin'])
else:
inputdata.description['BottomMargin'] = 0.95
if inputdata.description['is2dim']:
inputdata.description['RightMargin'] += 1.5
papersizex = inputdata.description['PlotSizeX'] + 0.1 + \
inputdata.description['LeftMargin'] + inputdata.description['RightMargin']
papersizey = inputdata.description['PlotSizeY'] + inputdata.description['RatioPlotSizeY'] + 0.1 + \
inputdata.description['TopMargin'] + inputdata.description['BottomMargin']
#
out = ""
out += '\\documentclass{article}\n'
if opts.OUTPUT_FONT == "MINION":
out += ('\\usepackage{minion}\n')
elif opts.OUTPUT_FONT == "PALATINO_OSF":
out += ('\\usepackage[osf,sc]{mathpazo}\n')
elif opts.OUTPUT_FONT == "PALATINO":
out += ('\\usepackage{mathpazo}\n')
elif opts.OUTPUT_FONT == "TIMES":
out += ('\\usepackage{mathptmx}\n')
elif opts.OUTPUT_FONT == "HELVETICA":
out += ('\\renewcommand{\\familydefault}{\\sfdefault}\n')
out += ('\\usepackage{sfmath}\n')
out += ('\\usepackage{helvet}\n')
out += ('\\usepackage[symbolgreek]{mathastext}\n')
for pkg in opts.LATEXPKGS:
out += ('\\usepackage{%s}\n' % pkg)
out += ('\\usepackage{pst-all}\n')
out += ('\\selectcolormodel{rgb}\n')
out += ('\\usepackage{amsmath}\n')
out += ('\\usepackage{amssymb}\n')
out += ('\\usepackage{relsize}\n')
out += ('\\usepackage[dvips,\n')
out += (' left=%4.3fcm, right=0cm,\n' %(inputdata.description['LeftMargin']-0.45,))
out += (' top=%4.3fcm, bottom=0cm,\n' %(inputdata.description['TopMargin']-0.30,))
out += (' paperwidth=%scm,paperheight=%scm\n' %(papersizex,papersizey))
out += (']{geometry}\n')
out += ('\\begin{document}\n')
out += ('\\pagestyle{empty}\n')
out += ('\\SpecialCoor\n')
out += ('\\begin{pspicture}(0,0)(0,0)\n')
out += ('\\psset{xunit=%scm}\n' %(inputdata.description['PlotSizeX']))
if inputdata.description['is2dim']:
if inputdata.description.has_key('ColorSeries') and inputdata.description['ColorSeries']!='':
colorseries = inputdata.description['ColorSeries']
else:
colorseries = '{hsb}{grad}[rgb]{0,0,1}{-.700,0,0}'
out += ('\\definecolorseries{gradientcolors}%s\n' %colorseries)
out += ('\\resetcolorseries[130]{gradientcolors}\n')
return out
def write_footer(self):
out = ""
out += ('\\end{pspicture}\n')
out += ('\\end{document}\n')
return out
class MainPlot(Plot):
def __init__(self, inputdata):
self.set_normalization(inputdata)
self.stack_histograms(inputdata)
if ((inputdata.description.has_key('GofLegend') and inputdata.description['GofLegend']=='1') or \
(inputdata.description.has_key('GofFrame') and inputdata.description['GofFrame']!='')) and not \
(inputdata.description.has_key('TaylorPlot') and inputdata.description['TaylorPlot']=='1'):
self.calculate_gof(inputdata)
self.set_histo_options(inputdata)
self.set_borders(inputdata)
self.yoffset = inputdata.description['PlotSizeY']
self.coors = Coordinates(inputdata)
def draw(self, inputdata):
out = ""
out += ('\n%\n% MainPlot\n%\n')
out += ('\\psset{yunit=%scm}\n' %(self.yoffset))
out += ('\\rput(0,-1){%\n')
out += ('\\psset{yunit=%scm}\n' %(inputdata.description['PlotSizeY']))
out += self._draw(inputdata)
out += ('}\n')
return out
def _draw(self, inputdata):
out = ""
if inputdata.description.has_key('DrawSpecialFirst') and inputdata.description['DrawSpecialFirst']=='1':
for i in inputdata.special.keys():
out += inputdata.special[i].draw(self.coors)
if inputdata.description.has_key('DrawFunctionFirst') and inputdata.description['DrawFunctionFirst']=='1':
for i in inputdata.functions.keys():
out += inputdata.functions[i].draw(self.coors)
for i in inputdata.description['DrawOnly']:
out += inputdata.histos[i].draw(self.coors)
else:
for i in inputdata.description['DrawOnly']:
out += inputdata.histos[i].draw(self.coors)
for i in inputdata.functions.keys():
out += inputdata.functions[i].draw(self.coors)
else:
if inputdata.description.has_key('DrawFunctionFirst') and inputdata.description['DrawFunctionFirst']=='1':
for i in inputdata.functions.keys():
out += inputdata.functions[i].draw(self.coors)
for i in inputdata.description['DrawOnly']:
out += inputdata.histos[i].draw(self.coors)
else:
for i in inputdata.description['DrawOnly']:
out += inputdata.histos[i].draw(self.coors)
for i in inputdata.functions.keys():
out += inputdata.functions[i].draw(self.coors)
for i in inputdata.special.keys():
out += inputdata.special[i].draw(self.coors)
if inputdata.description.has_key('Legend') and inputdata.description['Legend']=='1':
legend = Legend(inputdata.description,inputdata.histos,inputdata.functions)
out += legend.draw()
if inputdata.description['is2dim']:
colorscale = Colorscale(inputdata.description,self.coors)
out += colorscale.draw()
frame = Frame()
out += frame.draw(inputdata)
if inputdata.description.has_key('XMajorTickMarks') and inputdata.description['XMajorTickMarks']!='':
xcustommajortickmarks=int(inputdata.description['XMajorTickMarks'])
else:
xcustommajortickmarks=-1
if inputdata.description.has_key('XMinorTickMarks') and inputdata.description['XMinorTickMarks']!='':
xcustomminortickmarks=int(inputdata.description['XMinorTickMarks'])
else:
xcustomminortickmarks=-1
xcustommajorticks=[]
xcustomminorticks=[]
# # TODO: remove XCustomTicks after 2011-12-31:
# if inputdata.description.has_key('XCustomTicks') and inputdata.description['XCustomTicks']!='':
# logging.warning('Warning: XCustomTicks is deprecated. Use XCustomMajorTicks instead.')
# inputdata.description['XCustomMajorTicks']=inputdata.description['XCustomTicks']
if inputdata.description.has_key('XCustomMajorTicks') and inputdata.description['XCustomMajorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=inputdata.description['XCustomMajorTicks'].strip().split('\t')
if not len(FOO) % 2:
for i in range(0,len(FOO),2):
xcustommajorticks.append({'Value': float(FOO[i]), 'Label': FOO[i+1]})
if inputdata.description.has_key('XCustomMinorTicks') and inputdata.description['XCustomMinorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=inputdata.description['XCustomMinorTicks'].strip().split('\t')
for i in range(len(FOO)):
xcustomminorticks.append({'Value': float(FOO[i])})
xticks = XTicks(inputdata.description, self.coors)
if (inputdata.description.has_key('RatioPlot') and inputdata.description['RatioPlot']=='1') or (inputdata.description.has_key('PlotTickLabels') and inputdata.description['PlotTickLabels']=='0'):
drawlabels=False
else:
drawlabels=True
out += xticks.draw(custommajortickmarks=xcustommajortickmarks,\
customminortickmarks=xcustomminortickmarks,\
custommajorticks=xcustommajorticks,\
customminorticks=xcustomminorticks,\
drawlabels=drawlabels)
if inputdata.description.has_key('YMajorTickMarks') and inputdata.description['YMajorTickMarks']!='':
ycustommajortickmarks=int(inputdata.description['YMajorTickMarks'])
else:
ycustommajortickmarks=-1
if inputdata.description.has_key('YMinorTickMarks') and inputdata.description['YMinorTickMarks']!='':
ycustomminortickmarks=int(inputdata.description['YMinorTickMarks'])
else:
ycustomminortickmarks=-1
ycustommajorticks=[]
ycustomminorticks=[]
# # TODO: remove YCustomTicks after 2011-12-31:
# if inputdata.description.has_key('YCustomTicks') and inputdata.description['YCustomTicks']!='':
# logging.warning('Warning: YCustomTicks is deprecated. Use YCustomMajorTicks instead.')
# inputdata.description['YCustomMajorTicks']=inputdata.description['YCustomTicks']
if inputdata.description.has_key('YCustomMajorTicks') and inputdata.description['YCustomMajorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=inputdata.description['YCustomMajorTicks'].strip().split('\t')
if not len(FOO)%2:
for i in range(0,len(FOO),2):
ycustommajorticks.append({'Value': float(FOO[i]), 'Label': FOO[i+1]})
if inputdata.description.has_key('YCustomMinorTicks') and inputdata.description['YCustomMinorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=inputdata.description['YCustomMinorTicks'].strip().split('\t')
for i in range(len(FOO)):
ycustomminorticks.append({'Value': float(FOO[i])})
yticks = YTicks(inputdata.description, self.coors)
out += yticks.draw(custommajortickmarks=ycustommajortickmarks,\
customminortickmarks=ycustomminortickmarks,\
custommajorticks=ycustommajorticks,\
customminorticks=ycustomminorticks)
labels = Labels(inputdata.description)
if inputdata.description.has_key('RatioPlot') and inputdata.description['RatioPlot']=='1':
out += labels.draw(['Title','YLabel'])
else:
if not inputdata.description['is2dim']:
out += labels.draw(['Title','XLabel','YLabel'])
else:
out += labels.draw(['Title','XLabel','YLabel','ZLabel'])
return out
def calculate_gof(self, inputdata):
refdata = inputdata.description.get('GofReference')
if refdata is None:
refdata = inputdata.description.get('RatioPlotReference')
if refdata is None:
inputdata.description['GofLegend'] = '0'
inputdata.description['GofFrame'] = ''
return
def pickcolor(gof):
color=None
colordefs = {}
for i in inputdata.description.setdefault('GofFrameColor', '0:green 3:yellow 6:red!70').strip().split():
foo = i.split(':')
if len(foo)!=2: continue
colordefs[float(foo[0])] = foo[1]
for i in sorted(colordefs.keys()):
if gof>=i:
color=colordefs[i]
return color
inputdata.description.setdefault('GofLegend','0')
inputdata.description.setdefault('GofFrame','')
inputdata.description.setdefault('FrameColor',None)
for i in inputdata.description['DrawOnly']:
if i==refdata: continue
if inputdata.description['GofLegend']!='1' and i!=inputdata.description['GofFrame']: continue
if inputdata.description.has_key('GofType') and inputdata.description['GofType']!='chi2':
return
gof = inputdata.histos[i].getChi2(inputdata.histos[refdata])
if i==inputdata.description['GofFrame'] and inputdata.description['FrameColor'] is None:
inputdata.description['FrameColor']=pickcolor(gof)
if inputdata.histos[i].description.setdefault('Title', '')!='':
inputdata.histos[i].description['Title'] += ', '
inputdata.histos[i].description['Title'] += '$\\chi^2/n={}$%1.2f' %gof
class TaylorPlot(Plot):
def __init__(self, inputdata):
self.refdata = inputdata.description['TaylorPlotReference']
self.calculate_taylorcoordinates(inputdata)
def calculate_taylorcoordinates(self,inputdata):
foo=inputdata.description['DrawOnly'].pop(inputdata.description['DrawOnly'].index(self.refdata))
inputdata.description['DrawOnly'].append(foo)
for i in inputdata.description['DrawOnly']:
print i
print 'meanbinval = ', inputdata.histos[i].getMeanBinValue()
print 'sigmabinval = ', inputdata.histos[i].getSigmaBinValue()
print 'chi2/nbins = ', inputdata.histos[i].getChi2(inputdata.histos[self.refdata])
print 'correlation = ', inputdata.histos[i].getCorrelation(inputdata.histos[self.refdata])
print 'distance = ', inputdata.histos[i].getRMSdistance(inputdata.histos[self.refdata])
class RatioPlot(Plot):
def __init__(self, inputdata):
self.refdata = inputdata.description['RatioPlotReference']
self.yoffset = inputdata.description['PlotSizeY'] + inputdata.description['RatioPlotSizeY']
inputdata.description['RatioPlotStage'] = True
inputdata.description['PlotSizeY'] = inputdata.description['RatioPlotSizeY']
inputdata.description['LogY'] = False
if inputdata.description.has_key('RatioPlotMode') and inputdata.description['RatioPlotMode']=='deviation':
inputdata.description['YLabel']='$(\\text{MC}-\\text{data})$'
inputdata.description['YMin']=-3.5
inputdata.description['YMax']=3.5
elif inputdata.description.has_key('RatioPlotMode') and inputdata.description['RatioPlotMode']=='datamc':
inputdata.description['YLabel']='Data/MC'
inputdata.description['YMin']=0.5
inputdata.description['YMax']=1.5
else:
inputdata.description['YLabel']='MC/Data'
inputdata.description['YMin']=0.5
inputdata.description['YMax']=1.5
if inputdata.description.has_key('RatioPlotYLabel'):
inputdata.description['YLabel']=inputdata.description['RatioPlotYLabel']
inputdata.description['YLabel']='\\rput(-%s,0){%s}'%(0.5*inputdata.description['PlotSizeY']/inputdata.description['PlotSizeX'],inputdata.description['YLabel'])
if inputdata.description.has_key('RatioPlotYMin'):
inputdata.description['YMin']=inputdata.description['RatioPlotYMin']
if inputdata.description.has_key('RatioPlotYMax'):
inputdata.description['YMax']=inputdata.description['RatioPlotYMax']
if not inputdata.description.has_key('RatioPlotErrorBandColor'):
inputdata.description['RatioPlotErrorBandColor']='yellow'
if not inputdata.description.has_key('RatioPlotSameStyle') or inputdata.description['RatioPlotSameStyle']=='0':
inputdata.histos[self.refdata].description['ErrorBandColor']=inputdata.description['RatioPlotErrorBandColor']
inputdata.histos[self.refdata].description['ErrorBands']='1'
inputdata.histos[self.refdata].description['ErrorBars']='0'
inputdata.histos[self.refdata].description['LineStyle']='solid'
inputdata.histos[self.refdata].description['LineColor']='black'
inputdata.histos[self.refdata].description['LineWidth']='0.3pt'
inputdata.histos[self.refdata].description['PolyMarker']=''
inputdata.histos[self.refdata].description['ConnectGaps']='1'
self.calculate_ratios(inputdata)
self.set_borders(inputdata)
self.coors = Coordinates(inputdata)
def draw(self, inputdata):
out = ""
out += ('\n%\n% RatioPlot\n%\n')
out += ('\\psset{yunit=%scm}\n' %(self.yoffset))
out += ('\\rput(0,-1){%\n')
out += ('\\psset{yunit=%scm}\n' %(inputdata.description['PlotSizeY']))
out += self._draw(inputdata)
out += ('}\n')
return out
def calculate_ratios(self,inputdata):
foo=inputdata.description['DrawOnly'].pop(inputdata.description['DrawOnly'].index(self.refdata))
if inputdata.histos[self.refdata].description.has_key('ErrorBands') and inputdata.histos[self.refdata].description['ErrorBands']=='1':
inputdata.description['DrawOnly'].insert(0,foo)
else:
inputdata.description['DrawOnly'].append(foo)
for i in inputdata.description['DrawOnly']:
if i!=self.refdata:
if inputdata.description.has_key('RatioPlotMode') and inputdata.description['RatioPlotMode']=='deviation':
inputdata.histos[i].deviation(inputdata.histos[self.refdata])
elif inputdata.description.has_key('RatioPlotMode') and inputdata.description['RatioPlotMode']=='datamc':
inputdata.histos[i].dividereverse(inputdata.histos[self.refdata])
inputdata.histos[i].description['ErrorBars']='1'
else:
inputdata.histos[i].divide(inputdata.histos[self.refdata])
if inputdata.description.has_key('RatioPlotMode') and inputdata.description['RatioPlotMode']=='deviation':
inputdata.histos[self.refdata].deviation(inputdata.histos[self.refdata])
elif inputdata.description.has_key('RatioPlotMode') and inputdata.description['RatioPlotMode']=='datamc':
inputdata.histos[self.refdata].dividereverse(inputdata.histos[self.refdata])
else:
inputdata.histos[self.refdata].divide(inputdata.histos[self.refdata])
def _draw(self, inputdata):
out = ""
for i in inputdata.description['DrawOnly']:
if inputdata.description.has_key('RatioPlotMode') and inputdata.description['RatioPlotMode']=='datamc':
if i!=self.refdata:
out += inputdata.histos[i].draw(self.coors)
else:
out += inputdata.histos[i].draw(self.coors)
frame = Frame()
out += frame.draw(inputdata)
# TODO: so much duplication with MainPlot... yuck!
if inputdata.description.has_key('XMajorTickMarks') and inputdata.description['XMajorTickMarks']!='':
xcustommajortickmarks=int(inputdata.description['XMajorTickMarks'])
else:
xcustommajortickmarks=-1
if inputdata.description.has_key('XMinorTickMarks') and inputdata.description['XMinorTickMarks']!='':
xcustomminortickmarks=int(inputdata.description['XMinorTickMarks'])
else:
xcustomminortickmarks=-1
xcustommajorticks=[]
xcustomminorticks=[]
# # TODO: remove XCustomTicks after 2011-12-31:
# if inputdata.description.has_key('XCustomTicks') and inputdata.description['XCustomTicks']!='':
# logging.warning('Warning: XCustomTicks is deprecated. Use XCustomMajorTicks instead.')
# inputdata.description['XCustomMajorTicks']=inputdata.description['XCustomTicks']
if inputdata.description.has_key('XCustomMajorTicks') and inputdata.description['XCustomMajorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=inputdata.description['XCustomMajorTicks'].strip().split('\t')
if not len(FOO)%2:
for i in range(0,len(FOO),2):
xcustommajorticks.append({'Value': float(FOO[i]), 'Label': FOO[i+1]})
if inputdata.description.has_key('XCustomMinorTicks') and inputdata.description['XCustomMinorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=inputdata.description['XCustomMinorTicks'].strip().split('\t')
for i in range(len(FOO)):
xcustomminorticks.append({'Value': float(FOO[i])})
xticks = XTicks(inputdata.description, self.coors)
if inputdata.description.has_key('RatioPlotTickLabels') and inputdata.description['RatioPlotTickLabels']=='0':
drawlabels=False
else:
drawlabels=True
out += xticks.draw(custommajortickmarks=xcustommajortickmarks,\
customminortickmarks=xcustomminortickmarks,\
custommajorticks=xcustommajorticks,\
customminorticks=xcustomminorticks,
drawlabels=drawlabels)
if inputdata.description.has_key('YMajorTickMarks') and inputdata.description['YMajorTickMarks']!='':
ycustommajortickmarks=int(inputdata.description['YMajorTickMarks'])
else:
ycustommajortickmarks=-1
if inputdata.description.has_key('YMinorTickMarks') and inputdata.description['YMinorTickMarks']!='':
ycustomminortickmarks=int(inputdata.description['YMinorTickMarks'])
else:
ycustomminortickmarks=-1
ycustommajorticks=[]
ycustomminorticks=[]
# # TODO: remove YCustomTicks after 2011-12-31:
# if inputdata.description.has_key('YCustomTicks') and inputdata.description['YCustomTicks']!='':
# logging.warning('Warning: YCustomTicks is deprecated. Use YCustomMajorTicks instead.')
# inputdata.description['YCustomMajorTicks']=inputdata.description['YCustomTicks']
if inputdata.description.has_key('YCustomMajorTicks') and inputdata.description['YCustomMajorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=inputdata.description['YCustomMajorTicks'].strip().split('\t')
if not len(FOO)%2:
for i in range(0,len(FOO),2):
ycustommajorticks.append({'Value': float(FOO[i]), 'Label': FOO[i+1]})
if inputdata.description.has_key('YCustomMinorTicks') and inputdata.description['YCustomMinorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=inputdata.description['YCustomMinorTicks'].strip().split('\t')
for i in range(len(FOO)):
ycustomminorticks.append({'Value': float(FOO[i])})
yticks = YTicks(inputdata.description, self.coors)
out += yticks.draw(custommajortickmarks=ycustommajortickmarks,\
customminortickmarks=ycustomminortickmarks,\
custommajorticks=ycustommajorticks,\
customminorticks=ycustomminorticks)
if inputdata.description.has_key('MainPlot') and inputdata.description['MainPlot']=='0':
if inputdata.description.has_key('Legend') and inputdata.description['Legend']=='1':
legend = Legend(inputdata.description,inputdata.histos,inputdata.functions)
out += legend.draw()
labels = Labels(inputdata.description)
if inputdata.description.has_key('MainPlot') and inputdata.description['MainPlot']=='0':
out += labels.draw(['Title','XLabel','YLabel'])
else:
out += labels.draw(['XLabel','YLabel'])
return out
class Legend(object):
def __init__(self, description, histos, functions):
self.histos = histos
self.functions = functions
self.description = description
def draw(self):
out = ""
out += '\n%\n% Legend\n%\n'
out += '\\rput[tr](%s,%s){%%\n' % (self.getLegendXPos(), self.getLegendYPos())
ypos = -0.05*6/self.description['PlotSizeY']
legendordermap = {}
legendlist = self.description['DrawOnly']+self.functions.keys()
if self.description.has_key('LegendOnly'):
legendlist = []
for legend in self.description['LegendOnly'].strip().split():
if legend in self.histos.keys() or legend in self.functions.keys():
legendlist.append(legend)
for legend in legendlist:
order = 0
if self.histos.has_key(legend) and self.histos[legend].description.has_key('LegendOrder'):
order = int(self.histos[legend].description['LegendOrder'])
if self.functions.has_key(legend) and self.functions[legend].description.has_key('LegendOrder'):
order = int(self.functions[legend].description['LegendOrder'])
if order not in legendordermap:
legendordermap[order] = []
legendordermap[order].append(legend)
foo=[]
for i in sorted(legendordermap.keys()):
foo.extend(legendordermap[i])
rel_xpos_sign = 1.0
if self.getLegendAlign()=='r':
rel_xpos_sign = -1.0
xpos1 = -0.10*rel_xpos_sign
xpos2 = -0.02*rel_xpos_sign
for i in foo:
if self.histos.has_key(i):
drawobject=self.histos[i]
elif self.functions.has_key(i):
drawobject=self.functions[i]
else:
continue
title = drawobject.getTitle()
if title == '':
continue
else:
out += ('\\rput[B%s](%s,%s){%s}\n' %(self.getLegendAlign(),rel_xpos_sign*0.1,ypos,title))
out += ('\\rput[B%s](%s,%s){%s\n' %(self.getLegendAlign(),rel_xpos_sign*0.1,ypos,'%'))
if drawobject.getErrorBands():
out += ('\\psframe[linewidth=0pt,linestyle=none,fillstyle=solid,fillcolor=%s,opacity=%s]' %(drawobject.getErrorBandColor(),drawobject.getErrorBandOpacity()))
out += ('(%s, 0.033)(%s, 0.001)\n' %(xpos1, xpos2))
out += ('\\psline[linestyle=' + drawobject.getLineStyle() \
+ ', linecolor=' + drawobject.getLineColor() \
+ ', linewidth=' + drawobject.getLineWidth() \
+ ', strokeopacity=' + drawobject.getLineOpacity() \
+ ', opacity=' + drawobject.getFillOpacity())
if drawobject.getLineDash()!='':
out += (', dash=' + drawobject.getLineDash())
if drawobject.getFillStyle()!='none':
out += (', fillstyle=' + drawobject.getFillStyle() \
+ ', fillcolor=' + drawobject.getFillColor() \
+ ', hatchcolor=' + drawobject.getHatchColor() \
+ ']{C-C}(%s, 0.030)(%s, 0.030)(%s, 0.004)(%s, 0.004)(%s, 0.030)\n' \
%(xpos1, xpos2, xpos2, xpos1, xpos1))
else:
out += ('](%s, 0.016)(%s, 0.016)\n' %(xpos1, xpos2))
if drawobject.getPolyMarker() != '':
out += (' \\psdot[dotstyle=' + drawobject.getPolyMarker() \
+ ', dotsize=' + drawobject.getDotSize() \
+ ', dotscale=' + drawobject.getDotScale() \
+ ', linecolor=' + drawobject.getLineColor() \
+ ', linewidth=' + drawobject.getLineWidth() \
+ ', linestyle=' + drawobject.getLineStyle() \
+ ', fillstyle=' + drawobject.getFillStyle() \
+ ', fillcolor=' + drawobject.getFillColor() \
+ ', strokeopacity=' + drawobject.getLineOpacity() \
+ ', opacity=' + drawobject.getFillOpacity() \
+ ', hatchcolor=' + drawobject.getHatchColor())
if drawobject.getFillStyle()!='none':
out += ('](%s, 0.028)\n' % (rel_xpos_sign*-0.06))
else:
out += ('](%s, 0.016)\n' % (rel_xpos_sign*-0.06))
out += ('}\n')
ypos -= 0.075*6/self.description['PlotSizeY']
if self.description.has_key('CustomLegend'):
for i in self.description['CustomLegend'].strip().split('\\\\'):
out += ('\\rput[B%s](%s,%s){%s}\n' %(self.getLegendAlign(),rel_xpos_sign*0.1,ypos,i))
ypos -= 0.075*6/self.description['PlotSizeY']
out += ('}\n')
return out
def getLegendXPos(self):
if self.description.has_key('LegendXPos'):
return self.description['LegendXPos']
else:
if self.getLegendAlign()=='r':
return '0.95'
else:
return '0.53'
def getLegendYPos(self):
if self.description.has_key('LegendYPos'):
return self.description['LegendYPos']
else:
return '0.93'
def getLegendAlign(self):
if self.description.has_key('LegendAlign'):
return self.description['LegendAlign']
else:
return 'l'
class Colorscale(object):
def __init__(self, description, coors):
self.description = description
self.coors = coors
def draw(self):
out = ''
out += '\n%\n% Colorscale\n%\n'
out += '\\rput(1,0){\n'
out += ' \\psset{xunit=4mm}\n'
out += ' \\rput(0.5,0){\n'
out += ' \\psset{yunit=0.0076923, linestyle=none, fillstyle=solid}\n'
out += ' \\multido{\\ic=0+1,\\id=1+1}{130}{\n'
out += ' \\psframe[fillcolor={gradientcolors!![\\ic]},dimen=inner,linewidth=0.1pt](0, \\ic)(1, \\id)\n'
out += ' }\n'
out += ' }\n'
out += ' \\rput(0.5,0){\n'
out += ' \\psframe[linewidth=0.3pt,dimen=middle](0,0)(1,1)\n'
# TODO: so much parsing duplication with MainPlot... yuck!
if self.description.has_key('ZMajorTickMarks') and self.description['ZMajorTickMarks']!='':
zcustommajortickmarks=int(self.description['ZMajorTickMarks'])
else:
zcustommajortickmarks=-1
if self.description.has_key('ZMinorTickMarks') and self.description['ZMinorTickMarks']!='':
zcustomminortickmarks=int(self.description['ZMinorTickMarks'])
else:
zcustomminortickmarks=-1
zcustommajorticks=[]
zcustomminorticks=[]
# # TODO: remove ZCustomTicks after 2011-12-31:
# if self.description.has_key('ZCustomTicks') and self.description['ZCustomTicks']!='':
# logging.warning('Warning: ZCustomTicks is deprecated. Use ZCustomMajorTicks instead.')
# self.description['ZCustomMajorTicks']=self.description['ZCustomTicks']
if self.description.has_key('ZCustomMajorTicks') and self.description['ZCustomMajorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=self.description['ZCustomMajorTicks'].strip().split('\t')
if not len(FOO)%2:
for i in range(0,len(FOO),2):
zcustommajorticks.append({'Value': float(FOO[i]), 'Label': FOO[i+1]})
if self.description.has_key('ZCustomMinorTicks') and self.description['ZCustomMinorTicks']!='':
# TODO: Would be nice to have less invisible separation of the custom ticks than split on tabs
FOO=self.description['ZCustomMinorTicks'].strip().split('\t')
for i in range(len(FOO)):
zcustomminorticks.append({'Value': float(FOO[i])})
zticks = ZTicks(self.description, self.coors)
out += zticks.draw(custommajortickmarks=zcustommajortickmarks,\
customminortickmarks=zcustomminortickmarks,\
custommajorticks=zcustommajorticks,\
customminorticks=zcustomminorticks)
out += ' }\n'
out += '}\n'
return out
class Labels(object):
def __init__(self, description):
self.description = description
def draw(self, axis=[]):
out = ""
out += ('\n%\n% Labels\n%\n')
if self.description.has_key('Title') and (axis.count('Title') or axis==[]):
out += ('\\rput(0,1){\\rput[lB](0, 1.7\\labelsep){\\normalsize '+self.description['Title']+'}}\n')
if self.description.has_key('XLabel') and (axis.count('XLabel') or axis==[]):
xlabelsep=4.7
if self.description.has_key('XLabelSep'):
xlabelsep=float(self.description['XLabelSep'])
out += ('\\rput(1,0){\\rput[rB](0,-%4.3f\\labelsep){\\normalsize '%(xlabelsep) +self.description['XLabel']+'}}\n')
if self.description.has_key('YLabel') and (axis.count('YLabel') or axis==[]):
ylabelsep=6.5
if self.description.has_key('YLabelSep'):
ylabelsep=float(self.description['YLabelSep'])
out += ('\\rput(0,1){\\rput[rB]{90}(-%4.3f\\labelsep,0){\\normalsize '%(ylabelsep) +self.description['YLabel']+'}}\n')
if self.description.has_key('ZLabel') and (axis.count('ZLabel') or axis==[]):
zlabelsep=5.3
if self.description.has_key('ZLabelSep'):
zlabelsep=float(self.description['ZLabelSep'])
out += ('\\rput(1,1){\\rput(%4.3f\\labelsep,0){\\psset{xunit=4mm}\\rput[lB]{270}(1.5,0){\\normalsize '%(zlabelsep) +self.description['ZLabel']+'}}}\n')
return out
class Special(object):
def __init__(self, f):
self.description = {}
self.data = []
self.read_input(f)
def read_input(self, f):
for line in f:
if is_end_marker(line, 'SPECIAL'):
break
elif is_comment(line):
continue
else:
self.data.append(line)
def draw(self,coors):
out = ""
out += ('\n%\n% Special\n%\n')
import re
regex = re.compile(r'^(.*?)(\\physics[xy]?coor)\(\s?([0-9\.eE+-]+)\s?,\s?([0-9\.eE+-]+)\s?\)(.*)')
# TODO: More precise number string matching, something like this:
# num = r"-?[0-9]*(?:\.[0-9]*)(?:[eE][+-]?\d+]"
# regex = re.compile(r'^(.*?)(\\physics[xy]?coor)\(\s?(' + num + ')\s?,\s?(' + num + ')\s?\)(.*)')
for i in xrange(len(self.data)):
while regex.search(self.data[i]):
match = regex.search(self.data[i])
xcoor, ycoor = float(match.group(3)), float(match.group(4))
if match.group(2)[1:] in ["physicscoor", "physicsxcoor"]:
xcoor = coors.phys2frameX(xcoor)
if match.group(2)[1:] in ["physicscoor", "physicsycoor"]:
ycoor = coors.phys2frameY(ycoor)
line = "%s(%f, %f)%s" % (match.group(1), xcoor, ycoor, match.group(5))
self.data[i] = line
out += self.data[i]+'\n'
return out
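    # Illustrative example (mine, not from a real input file) of a SPECIAL
    # payload line using the coordinate macros substituted above:
    #
    #   \rput\physicscoor(20.0, 1.5e-3){\small $\tau$ threshold}
    #
    # \physicscoor converts both coordinates from physics to frame units via
    # coors.phys2frameX/phys2frameY; \physicsxcoor and \physicsycoor convert
    # only the x or y coordinate, leaving the other already in frame units.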
class DrawableObject(object):
def __init__(self, f):
pass
def getTitle(self):
if self.description.has_key('Title'):
return self.description['Title']
else:
return ''
def getLineStyle(self):
if self.description.has_key('LineStyle'):
## I normally like there to be "only one way to do it", but providing
## this dashdotted/dotdashed synonym just seems humane ;-)
if self.description['LineStyle'] in ('dashdotted', 'dotdashed'):
self.description['LineStyle']='dashed'
self.description['LineDash']='3pt 3pt .8pt 3pt'
return self.description['LineStyle']
else:
return 'solid'
def getLineDash(self):
if self.description.has_key('LineDash'):
# Check if LineStyle=='dashdotted' before returning something
self.getLineStyle()
return self.description['LineDash']
else:
return ''
def getLineWidth(self):
if self.description.has_key('LineWidth'):
return self.description['LineWidth']
else:
return '0.8pt'
def getLineColor(self):
if self.description.has_key('LineColor'):
return self.description['LineColor']
else:
return 'black'
def getLineOpacity(self):
if self.description.has_key('LineOpacity'):
return self.description['LineOpacity']
else:
return '1.0'
def getFillColor(self):
if self.description.has_key('FillColor'):
return self.description['FillColor']
else:
return 'white'
def getFillOpacity(self):
if self.description.has_key('FillOpacity'):
return self.description['FillOpacity']
else:
return '1.0'
def getHatchColor(self):
if self.description.has_key('HatchColor'):
return self.description['HatchColor']
else:
return 'black'
def getFillStyle(self):
if self.description.has_key('FillStyle'):
return self.description['FillStyle']
else:
return 'none'
def getPolyMarker(self):
if self.description.has_key('PolyMarker'):
return self.description['PolyMarker']
else:
return ''
def getDotSize(self):
if self.description.has_key('DotSize'):
return self.description['DotSize']
else:
return '2pt 2'
def getDotScale(self):
if self.description.has_key('DotScale'):
return self.description['DotScale']
else:
return '1'
def getErrorBars(self):
if self.description.has_key('ErrorBars'):
return bool(int(self.description['ErrorBars']))
else:
return False
def getErrorBands(self):
if self.description.has_key('ErrorBands'):
return bool(int(self.description['ErrorBands']))
else:
return False
def getErrorBandColor(self):
if self.description.has_key('ErrorBandColor'):
return self.description['ErrorBandColor']
else:
return 'yellow'
def getErrorBandOpacity(self):
if self.description.has_key('ErrorBandOpacity'):
return self.description['ErrorBandOpacity']
else:
return '1.0'
def getSmoothLine(self):
if self.description.has_key('SmoothLine'):
return bool(int(self.description['SmoothLine']))
else:
return False
def startclip(self):
return '\\psclip{\\psframe[linewidth=0, linestyle=none](0,0)(1,1)}\n'
def stopclip(self):
return '\\endpsclip\n'
def startpsset(self):
out = ""
out += ('\\psset{linecolor='+self.getLineColor()+'}\n')
out += ('\\psset{linewidth='+self.getLineWidth()+'}\n')
out += ('\\psset{linestyle='+self.getLineStyle()+'}\n')
out += ('\\psset{fillstyle='+self.getFillStyle()+'}\n')
out += ('\\psset{fillcolor='+self.getFillColor()+'}\n')
out += ('\\psset{hatchcolor='+self.getHatchColor()+'}\n')
out += ('\\psset{strokeopacity='+self.getLineOpacity()+'}\n')
out += ('\\psset{opacity='+self.getFillOpacity()+'}\n')
if self.getLineDash()!='':
out += ('\\psset{dash='+self.getLineDash()+'}\n')
return out
def stoppsset(self):
out = ""
out += ('\\psset{linecolor=black}\n')
out += ('\\psset{linewidth=0.8pt}\n')
out += ('\\psset{linestyle=solid}\n')
out += ('\\psset{fillstyle=none}\n')
out += ('\\psset{fillcolor=white}\n')
out += ('\\psset{hatchcolor=black}\n')
out += ('\\psset{strokeopacity=1.0}\n')
out += ('\\psset{opacity=1.0}\n')
return out
class Function(DrawableObject):
def __init__(self, f):
self.description = {}
self.read_input(f)
def read_input(self, f):
self.code='def plotfunction(x):\n'
iscode=False
for line in f:
if is_end_marker(line, 'FUNCTION'):
break
elif is_comment(line):
continue
else:
m = pat_property.match(line)
if iscode:
self.code+=' '+line
elif m:
prop, value = m.group(1,2)
if prop=='Code':
iscode=True
else:
self.description[prop] = value
if not iscode:
print '++++++++++ ERROR: No code in function'
else:
foo = compile(self.code, '<string>', 'exec')
exec(foo)
self.plotfunction = plotfunction
def draw(self,coors):
out = ""
out += self.startclip()
out += self.startpsset()
xmin = coors.xmin()
if self.description.has_key('XMin') and self.description['XMin']:
xmin = float(self.description['XMin'])
xmax=coors.xmax()
if self.description.has_key('XMax') and self.description['XMax']:
xmax=float(self.description['XMax'])
# TODO: Space sample points logarithmically if LogX=1
dx = (xmax-xmin)/500.
x = xmin-dx
out += '\\pscurve'
if self.description.has_key('FillStyle') and self.description['FillStyle']!='none':
out += '(%s,%s)\n' % (coors.strphys2frameX(xmin),coors.strphys2frameY(coors.ymin()))
while x < (xmax+2*dx):
y = self.plotfunction(x)
out += ('(%s,%s)\n' % (coors.strphys2frameX(x), coors.strphys2frameY(y)))
x += dx
if self.description.has_key('FillStyle') and self.description['FillStyle']!='none':
out += '(%s,%s)\n' % (coors.strphys2frameX(xmax),coors.strphys2frameY(coors.ymin()))
out += self.stoppsset()
out += self.stopclip()
return out
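    # Illustrative sketch (mine, not from a real input file) of a FUNCTION
    # block as read_input() above expects it -- everything after 'Code=' is
    # indented into the body of plotfunction(x), compiled and exec'd:
    #
    #   # BEGIN FUNCTION
    #   LineColor=blue
    #   Code=
    #   from math import exp
    #   return exp(-x**2/2.)
    #   # END FUNCTION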
class Histogram(DrawableObject):
def __init__(self, f, p=None):
self.description = {}
self.is2dim = False
self.data = []
self.read_input_data(f)
self.sigmabinvalue = None
self.meanbinvalue = None
self.path = p
def read_input_data(self, f):
for line in f:
if is_end_marker(line, 'HISTOGRAM'):
break
elif is_comment(line):
continue
else:
line = line.rstrip()
m = pat_property.match(line)
if m:
prop, value = m.group(1,2)
self.description[prop] = value
else:
## Detect symm errs
linearray = line.split()
if len(linearray) == 4:
self.data.append({'LowEdge': float(linearray[0]),
'UpEdge': float(linearray[1]),
'Content': float(linearray[2]),
'Error': [float(linearray[3]),float(linearray[3])]})
## Detect asymm errs
elif len(linearray) == 5:
self.data.append({'LowEdge': float(linearray[0]),
'UpEdge': float(linearray[1]),
'Content': float(linearray[2]),
'Error': [float(linearray[3]),float(linearray[4])]})
## Detect two-dimensionality
elif len(linearray) in [6,7]:
self.is2dim = True
# If asymm z error, use the max or average of +- error
err = float(linearray[5])
if len(linearray) == 7:
if self.description.get("ShowMaxZErr", 1):
err = max(err, float(linearray[6]))
else:
err = 0.5 * (err + float(linearray[6]))
self.data.append({'LowEdge': [float(linearray[0]), float(linearray[2])],
'UpEdge': [float(linearray[1]), float(linearray[3])],
'Content': float(linearray[4]),
'Error': err})
## Unknown histo format
else:
raise RuntimeError("Unknown HISTOGRAM data line format with %d entries" % len(linearray))
def mangle_input(self):
if (self.description.has_key('NormalizeToIntegral') and self.description['NormalizeToIntegral']=='1') or \
(self.description.has_key('NormalizeToSum') and self.description['NormalizeToSum']=='1'):
if (self.description.has_key('NormalizeToIntegral') and self.description['NormalizeToIntegral']=='1') and \
(self.description.has_key('NormalizeToSum') and self.description['NormalizeToSum']=='1'):
print 'Can\'t normalize to Integral and to Sum at the same time. Will normalize to the Sum.'
foo = 0
for i in range(len(self.data)):
if self.description.has_key('NormalizeToSum') and self.description['NormalizeToSum']=='1':
foo += self.data[i]['Content']
else:
foo += self.data[i]['Content']*(self.data[i]['UpEdge']-self.data[i]['LowEdge'])
for i in range(len(self.data)):
self.data[i]['Content'] /= foo
self.data[i]['Error'][0] /= foo
self.data[i]['Error'][1] /= foo
if self.description.has_key('Scale') and self.description['Scale']!='':
scale = float(self.description['Scale'])
for i in range(len(self.data)):
self.data[i]['Content'] *= scale
self.data[i]['Error'][0] *= scale
self.data[i]['Error'][1] *= scale
if self.description.has_key('Rebin') and self.description['Rebin']!='':
rebin=int(self.description['Rebin'])
errortype = "stat"
if self.description.has_key('ErrorType') and self.description['ErrorType']!='':
errortype = self.description['ErrorType']
newdata=[]
if rebin>=2:
for i in range(0,(len(self.data)/rebin)*rebin,rebin):
foo=0.
barl=0.
baru=0.
for j in range(rebin):
binwidth=self.data[i+j]['UpEdge']-self.data[i+j]['LowEdge']
foo +=self.data[i+j]['Content']*binwidth
if errortype=="stat":
barl+=(binwidth*self.data[i+j]['Error'][0])**2
baru+=(binwidth*self.data[i+j]['Error'][1])**2
elif errortype=="env":
barl+=(self.data[i+j]['Content']-self.data[i+j]['Error'][0])*binwidth
baru+=(self.data[i+j]['Content']+self.data[i+j]['Error'][1])*binwidth
else:
logging.error("Rebinning for ErrorType not implemented.")
sys.exit(1)
newbinwidth=self.data[i+rebin-1]['UpEdge']-self.data[i]['LowEdge']
newcentral=foo/newbinwidth
if errortype=="stat":
newerror=[sqrt(barl)/newbinwidth,sqrt(baru)/newbinwidth]
elif errortype=="env":
newerror=[(foo-barl)/newbinwidth,(baru-foo)/newbinwidth]
newdata.append({'LowEdge': self.data[i]['LowEdge'],
'UpEdge': self.data[i+rebin-1]['UpEdge'],
'Content': newcentral,
'Error': newerror})
self.data=newdata
def add(self, name):
if len(self.data) != len(name.data):
print '+++ Error in Histogram.add() for %s: different numbers of bins' % self.path
for i in range(len(self.data)):
if fuzzyeq(self.data[i]['LowEdge'], name.data[i]['LowEdge']) and \
fuzzyeq(self.data[i]['UpEdge'], name.data[i]['UpEdge']):
self.data[i]['Content'] += name.data[i]['Content']
self.data[i]['Error'][0] = sqrt(self.data[i]['Error'][0]**2 + name.data[i]['Error'][0]**2)
self.data[i]['Error'][1] = sqrt(self.data[i]['Error'][1]**2 + name.data[i]['Error'][1]**2)
else:
print '+++ Error in Histogram.add() for %s: binning of histograms differs' % self.path
def divide(self, name):
if len(self.data) != len(name.data):
print '+++ Error in Histogram.divide() for %s: different numbers of bins' % self.path
for i in range(len(self.data)):
if fuzzyeq(self.data[i]['LowEdge'], name.data[i]['LowEdge']) and \
fuzzyeq(self.data[i]['UpEdge'], name.data[i]['UpEdge']):
try:
self.data[i]['Error'][0] /= name.data[i]['Content']
except ZeroDivisionError:
self.data[i]['Error'][0]=0.
try:
self.data[i]['Error'][1] /= name.data[i]['Content']
except ZeroDivisionError:
self.data[i]['Error'][1]=0.
try:
self.data[i]['Content'] /= name.data[i]['Content']
except ZeroDivisionError:
self.data[i]['Content']=1.
# self.data[i]['Error'][0] = sqrt(self.data[i]['Error'][0]**2 + name.data[i]['Error'][0]**2)
# self.data[i]['Error'][1] = sqrt(self.data[i]['Error'][1]**2 + name.data[i]['Error'][1]**2)
else:
print '+++ Error in Histogram.divide() for %s: binning of histograms differs' % self.path
def dividereverse(self, name):
if len(self.data) != len(name.data):
print '+++ Error in Histogram.dividereverse() for %s: different numbers of bins' % self.path
for i in range(len(self.data)):
if fuzzyeq(self.data[i]['LowEdge'], name.data[i]['LowEdge']) and \
fuzzyeq(self.data[i]['UpEdge'], name.data[i]['UpEdge']):
try:
self.data[i]['Error'][0] = name.data[i]['Error'][0]/self.data[i]['Content']
except ZeroDivisionError:
self.data[i]['Error'][0]=0.
try:
self.data[i]['Error'][1] = name.data[i]['Error'][1]/self.data[i]['Content']
except ZeroDivisionError:
self.data[i]['Error'][1]=0.
try:
self.data[i]['Content'] = name.data[i]['Content']/self.data[i]['Content']
except ZeroDivisionError:
self.data[i]['Content']=1.
else:
                print '+++ Error in Histogram.dividereverse() for %s: binning of histograms differs' % self.path
def deviation(self, name):
if len(self.data) != len(name.data):
print '+++ Error in Histogram.deviation() for %s: different numbers of bins' % self.path
for i in range(len(self.data)):
if fuzzyeq(self.data[i]['LowEdge'], name.data[i]['LowEdge']) and \
fuzzyeq(self.data[i]['UpEdge'], name.data[i]['UpEdge']):
self.data[i]['Content'] -= name.data[i]['Content']
try:
self.data[i]['Content'] /= 0.5*sqrt((name.data[i]['Error'][0] + name.data[i]['Error'][1])**2 + \
(self.data[i]['Error'][0] + self.data[i]['Error'][1])**2)
except ZeroDivisionError:
self.data[i]['Content'] = 0.0
try:
self.data[i]['Error'][0] /= name.data[i]['Error'][0]
except ZeroDivisionError:
self.data[i]['Error'][0] = 0.0
try:
self.data[i]['Error'][1] /= name.data[i]['Error'][1]
except ZeroDivisionError:
self.data[i]['Error'][1] = 0.0
else:
print '+++ Error in Histogram.deviation() for %s: binning of histograms differs' % self.path
def getChi2(self, name):
chi2 = 0.
for i in range(len(self.data)):
if fuzzyeq(self.data[i]['LowEdge'], name.data[i]['LowEdge']) and \
fuzzyeq(self.data[i]['UpEdge'], name.data[i]['UpEdge']):
try:
chi2 += (self.data[i]['Content']-name.data[i]['Content'])**2/((0.5*self.data[i]['Error'][0]+0.5*self.data[i]['Error'][1])**2 + (0.5*name.data[i]['Error'][0]+0.5*name.data[i]['Error'][1])**2)
except ZeroDivisionError:
pass
else:
print '+++ Error in Histogram.getChi2() for %s: binning of histograms differs' % self.path
return chi2/len(self.data)
def getSigmaBinValue(self):
        if self.sigmabinvalue is None:
self.sigmabinvalue = 0.
sumofweights = 0.
for i in range(len(self.data)):
if self.is2dim:
binwidth = abs( (self.data[i]['UpEdge'][0] - self.data[i]['LowEdge'][0])
*(self.data[i]['UpEdge'][1] - self.data[i]['LowEdge'][1]))
else:
binwidth = abs(self.data[i]['UpEdge'] - self.data[i]['LowEdge'])
self.sigmabinvalue += binwidth*(self.data[i]['Content']-self.getMeanBinValue())**2
sumofweights += binwidth
self.sigmabinvalue = sqrt(self.sigmabinvalue/sumofweights)
return self.sigmabinvalue
def getMeanBinValue(self):
        if self.meanbinvalue is None:
self.meanbinvalue = 0.
sumofweights = 0.
for i in range(len(self.data)):
if self.is2dim:
binwidth = abs( (self.data[i]['UpEdge'][0] - self.data[i]['LowEdge'][0])
*(self.data[i]['UpEdge'][1] - self.data[i]['LowEdge'][1]))
else:
binwidth = abs(self.data[i]['UpEdge'] - self.data[i]['LowEdge'])
self.meanbinvalue += binwidth*self.data[i]['Content']
sumofweights += binwidth
self.meanbinvalue /= sumofweights
return self.meanbinvalue
def getCorrelation(self, name):
correlation = 0.
sumofweights = 0.
for i in range(len(self.data)):
if fuzzyeq(self.data[i]['LowEdge'], name.data[i]['LowEdge']) and \
fuzzyeq(self.data[i]['UpEdge'], name.data[i]['UpEdge']):
if self.is2dim:
binwidth = abs( (self.data[i]['UpEdge'][0] - self.data[i]['LowEdge'][0])
* (self.data[i]['UpEdge'][1] - self.data[i]['LowEdge'][1]) )
else:
binwidth = abs(self.data[i]['UpEdge'] - self.data[i]['LowEdge'])
correlation += binwidth * ( self.data[i]['Content'] - self.getMeanBinValue() ) \
* ( name.data[i]['Content'] - name.getMeanBinValue() )
sumofweights += binwidth
else:
                print '+++ Error in Histogram.getCorrelation() for %s: binning of histograms differs' % self.path
correlation /= sumofweights
try:
correlation /= self.getSigmaBinValue()*name.getSigmaBinValue()
except ZeroDivisionError:
correlation = 0
return correlation
def getRMSdistance(self,name):
distance = 0.
sumofweights = 0.
for i in range(len(self.data)):
if fuzzyeq(self.data[i]['LowEdge'], name.data[i]['LowEdge']) and \
fuzzyeq(self.data[i]['UpEdge'], name.data[i]['UpEdge']):
if self.is2dim:
binwidth = abs( (self.data[i]['UpEdge'][0] - self.data[i]['LowEdge'][0])
* (self.data[i]['UpEdge'][1] - self.data[i]['LowEdge'][1]) )
else:
binwidth = abs(self.data[i]['UpEdge'] - self.data[i]['LowEdge'])
distance += binwidth * ( (self.data[i]['Content'] - self.getMeanBinValue())
-(name.data[i]['Content'] - name.getMeanBinValue()))**2
sumofweights += binwidth
else:
print '+++ Error in Histogram.getRMSdistance() for %s: binning of histograms differs' % self.path
distance = sqrt(distance/sumofweights)
return distance
def draw(self,coors):
seen_nan = False
out = ""
out += self.startclip()
out += self.startpsset()
#
if self.is2dim:
for i in range(len(self.data)):
out += ('\\psframe')
color=int(129*coors.phys2frameZ(self.data[i]['Content']))
if self.data[i]['Content']>coors.zmax():
color=129
if self.data[i]['Content']<coors.zmin():
color=0
out += ('[linewidth=0pt, linestyle=none, fillstyle=solid, fillcolor={gradientcolors!!['+str(color)+']}]')
out += ('(' + coors.strphys2frameX(self.data[i]['LowEdge'][0]) + ', ' \
+ coors.strphys2frameY(self.data[i]['LowEdge'][1]) + ')(' \
+ coors.strphys2frameX(self.data[i]['UpEdge'][0]) + ', ' \
+ coors.strphys2frameY(self.data[i]['UpEdge'][1]) + ')\n')
else:
if self.getErrorBands():
self.description['SmoothLine']=0
for i in range(len(self.data)):
out += ('\\psframe[dimen=inner,linewidth=0pt,linestyle=none,fillstyle=solid,fillcolor=%s,opacity=%s]' %(self.getErrorBandColor(),self.getErrorBandOpacity()))
out += ('(' + coors.strphys2frameX(self.data[i]['LowEdge']) + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']-self.data[i]['Error'][0]) + ')(' \
+ coors.strphys2frameX(self.data[i]['UpEdge']) + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']+self.data[i]['Error'][1]) + ')\n')
if self.getErrorBars():
for i in range(len(self.data)):
if isnan(self.data[i]['Content']) or isnan(self.data[i]['Error'][0]) or isnan(self.data[i]['Error'][1]):
seen_nan = True
continue
if self.data[i]['Content']==0. and self.data[i]['Error']==[0.,0.]:
continue
                    out += ('\\psline')
out += ('(' + coors.strphys2frameX(self.data[i]['LowEdge']) + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']) + ')(' \
+ coors.strphys2frameX(self.data[i]['UpEdge']) + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']) + ')\n')
                    out += ('\\psline')
bincenter = coors.strphys2frameX(.5*(self.data[i]['LowEdge']+self.data[i]['UpEdge']))
out += ('(' + bincenter + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']-self.data[i]['Error'][0]) + ')(' \
+ bincenter + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']+self.data[i]['Error'][1]) + ')\n')
            if self.getSmoothLine():
                out += ('\\psbezier')
            else:
                out += ('\\psline')
if (self.getFillStyle() != 'none'): # make sure that filled areas go all the way down to the x-axis
if (coors.phys2frameX(self.data[0]['LowEdge']) > 1e-4):
out += '(' + coors.strphys2frameX(self.data[0]['LowEdge']) + ', -0.1)\n'
else:
out += '(-0.1, -0.1)\n'
for i in range(len(self.data)):
if isnan(self.data[i]['Content']):
seen_nan = True
continue
if self.getSmoothLine():
out += ('(' + coors.strphys2frameX(0.5*(self.data[i]['LowEdge']+self.data[i]['UpEdge'])) + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']) + ')\n')
else:
out += ('(' + coors.strphys2frameX(self.data[i]['LowEdge']) + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']) + ')(' \
+ coors.strphys2frameX(self.data[i]['UpEdge']) + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']) + ')\n')
## Join/separate data points, with vertical/diagonal lines
if (i+1 < len(self.data)): #< If this is not the last point
if self.description.get('ConnectBins', '1') != '1':
out += ('\\psline')
else:
## If bins are joined, but there is a gap in binning, choose whether to fill the gap
if (abs(coors.phys2frameX(self.data[i]['UpEdge']) - coors.phys2frameX(self.data[i+1]['LowEdge'])) > 1e-4):
if self.description.get('ConnectGaps', '0') != '1':
out += ('\\psline')
# TODO: Perhaps use a new dashed line to fill the gap?
if (self.getFillStyle() != 'none'): # make sure that filled areas go all the way down to the x-axis
if (coors.phys2frameX(self.data[-1]['UpEdge']) < 1-1e-4):
out += '(' + coors.strphys2frameX(self.data[-1]['UpEdge']) + ', -0.1)\n'
else:
out += '(1.1, -0.1)\n'
#
if self.getPolyMarker() != '':
for i in range(len(self.data)):
if isnan(self.data[i]['Content']):
seen_nan = True
continue
if self.data[i]['Content']==0. and self.data[i]['Error']==[0.,0.]:
continue
out += ('\\psdot[dotstyle=%s,dotsize=%s,dotscale=%s](' %(self.getPolyMarker(),self.getDotSize(),self.getDotScale()) \
+ coors.strphys2frameX(.5*(self.data[i]['LowEdge']+self.data[i]['UpEdge'])) + ', ' \
+ coors.strphys2frameY(self.data[i]['Content']) + ')\n')
out += self.stoppsset()
out += self.stopclip()
if seen_nan:
print "WARNING: NaN-valued value or error bar!"
return out
# def is2dimensional(self):
# return self.is2dim
def getXMin(self):
if self.is2dim:
return min([self.data[i]['LowEdge'][0] for i in range(len(self.data))])
else:
return min([self.data[i]['LowEdge'] for i in range(len(self.data))])
def getXMax(self):
if self.is2dim:
return max([self.data[i]['UpEdge'][0] for i in range(len(self.data))])
else:
return max([self.data[i]['UpEdge'] for i in range(len(self.data))])
def getYMin(self, xmin, xmax, logy):
if self.is2dim:
return min([self.data[i]['LowEdge'][1] for i in range(len(self.data))])
else:
yvalues = []
for i in range(len(self.data)):
if ((self.data[i]['UpEdge'] > xmin or self.data[i]['LowEdge'] >= xmin) and \
(self.data[i]['LowEdge'] < xmax or self.data[i]['UpEdge'] <= xmax)):
foo = 0
if self.getErrorBars() or self.getErrorBands():
foo = self.data[i]['Content']-self.data[i]['Error'][0]
else:
foo = self.data[i]['Content']
if logy:
if foo>0: yvalues.append(foo)
else:
yvalues.append(foo)
if len(yvalues) > 0:
return min(yvalues)
else:
return self.data[0]['Content']
def getYMax(self, xmin, xmax):
if self.is2dim:
return max([self.data[i]['UpEdge'][1] for i in range(len(self.data))])
else:
yvalues = []
for i in range(len(self.data)):
if ((self.data[i]['UpEdge'] > xmin or self.data[i]['LowEdge'] >= xmin) and \
(self.data[i]['LowEdge'] < xmax or self.data[i]['UpEdge'] <= xmax)):
if self.getErrorBars() or self.getErrorBands():
yvalues.append(self.data[i]['Content']+self.data[i]['Error'][1])
else:
yvalues.append(self.data[i]['Content'])
if len(yvalues) > 0:
return max(yvalues)
else:
return self.data[0]['Content']
def getZMin(self, xmin, xmax, ymin, ymax):
if not self.is2dim:
return 0
zvalues = []
for i in range(len(self.data)):
if (self.data[i]['UpEdge'][0] > xmin and self.data[i]['LowEdge'][0] < xmax) and \
(self.data[i]['UpEdge'][1] > ymin and self.data[i]['LowEdge'][1] < ymax):
zvalues.append(self.data[i]['Content'])
return min(zvalues)
def getZMax(self, xmin, xmax, ymin, ymax):
if not self.is2dim:
return 0
zvalues = []
for i in range(len(self.data)):
if (self.data[i]['UpEdge'][0] > xmin and self.data[i]['LowEdge'][0] < xmax) and \
(self.data[i]['UpEdge'][1] > ymin and self.data[i]['LowEdge'][1] < ymax):
zvalues.append(self.data[i]['Content'])
return max(zvalues)
class Histo1D(Histogram):
def read_input_data(self, f):
for line in f:
if is_end_marker(line, 'HISTO1D'):
break
elif is_comment(line):
continue
else:
line = line.rstrip()
m = pat_property.match(line)
if m:
prop, value = m.group(1,2)
self.description[prop] = value
else:
linearray = line.split()
## Detect symm errs
if len(linearray) == 4:
self.data.append({'LowEdge': float(linearray[0]),
'UpEdge': float(linearray[1]),
'Content': float(linearray[2]),
'Error': [float(linearray[3]),float(linearray[3])]})
## Detect asymm errs
elif len(linearray) == 5:
self.data.append({'LowEdge': float(linearray[0]),
'UpEdge': float(linearray[1]),
'Content': float(linearray[2]),
'Error': [float(linearray[3]),float(linearray[4])]})
## Not sure what this is for... auto-compatibility with YODA format? Urgh
elif len(linearray) == 8:
self.data.append({'LowEdge': float(linearray[0]),
'UpEdge': float(linearray[1]),
'Content': float(linearray[2]),
'Error': [float(linearray[3]),float(linearray[3])]})
else:
                        raise Exception('Histo1D data line does not have 4, 5 or 8 columns: ' + line)
class Histo2D(Histogram):
def read_input_data(self, f):
self.is2dim = True #< Should really be done in a constructor, but this is easier for now...
for line in f:
if is_end_marker(line, 'HISTO2D'):
break
elif is_comment(line):
continue
else:
line = line.rstrip()
m = pat_property.match(line)
if m:
prop, value = m.group(1,2)
self.description[prop] = value
else:
linearray = line.split()
if len(linearray) in [6,7]:
# If asymm z error, use the max or average of +- error
err = float(linearray[5])
if len(linearray) == 7:
if self.description.get("ShowMaxZErr", 1):
err = max(err, float(linearray[6]))
else:
err = 0.5 * (err + float(linearray[6]))
self.data.append({'LowEdge': [float(linearray[0]), float(linearray[2])],
'UpEdge': [float(linearray[1]), float(linearray[3])],
'Content': float(linearray[4]),
'Error': err})
else:
                        raise Exception('Histo2D data line does not have 6 or 7 columns: ' + line)
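    # Data-line layouts accepted by the Histo1D/Histo2D readers above,
    # inferred from the column counts they handle (whitespace-separated):
    #
    #   HISTO1D: xlow xhigh val err               (symmetric error)
    #            xlow xhigh val errminus errplus  (asymmetric error)
    #            8-column YODA-style lines: first four columns used as above
    #   HISTO2D: xlow xhigh ylow yhigh val zerr [zerr2]
    #
    # When a 7th HISTO2D column is present, the two z errors are combined
    # using the maximum (default) or their average, per the ShowMaxZErr key.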
class Frame(object):
def __init__(self):
self.framelinewidth = '0.3pt'
def draw(self,inputdata):
out = ('\n%\n% Frame\n%\n')
if inputdata.description.has_key('FrameColor') and inputdata.description['FrameColor']!=None:
color = inputdata.description['FrameColor']
# We want to draw this frame only once, so set it to False for next time:
inputdata.description['FrameColor']=None
# Calculate how high and wide the overall plot is
height = [0,0]
width = inputdata.description['PlotSizeX']
if inputdata.description.has_key('RatioPlot') and inputdata.description['RatioPlot']=='1':
height[1] = -inputdata.description['RatioPlotSizeY']
if not (inputdata.description.has_key('MainPlot') and inputdata.description['MainPlot']=='0'):
height[0] = inputdata.description['PlotSizeY']
else:
height[0] = -height[1]
height[1] = 0
# Get the margin widths
left = inputdata.description['LeftMargin']+0.1
right = inputdata.description['RightMargin']+0.1
top = inputdata.description['TopMargin']+0.1
bottom = inputdata.description['BottomMargin']+0.1
#
out += ('\\rput(0,1){\\psline[linewidth=%scm,linecolor=%s](%scm,%scm)(%scm,%scm)}\n' %(top, color, -left, top/2, width+right, top/2))
out += ('\\rput(0,%scm){\\psline[linewidth=%scm,linecolor=%s](%scm,%scm)(%scm,%scm)}\n' %(height[1], bottom, color, -left, -bottom/2, width+right, -bottom/2))
out += ('\\rput(0,0){\\psline[linewidth=%scm,linecolor=%s](%scm,%scm)(%scm,%scm)}\n' %(left, color, -left/2, height[1]-0.05, -left/2, height[0]+0.05))
out += ('\\rput(1,0){\\psline[linewidth=%scm,linecolor=%s](%scm,%scm)(%scm,%scm)}\n' %(right, color, right/2, height[1]-0.05, right/2, height[0]+0.05))
out += ('\\psframe[linewidth='+self.framelinewidth+',dimen=middle](0,0)(1,1)\n')
return out
class Ticks(object):
def __init__(self, description, coors):
self.majorticklinewidth = '0.3pt'
self.minorticklinewidth = '0.3pt'
self.majorticklength = '9pt'
self.minorticklength = '4pt'
self.description = description
self.coors = coors
def draw_ticks(self, min, max, plotlog=False, custommajorticks=[], customminorticks=[], custommajortickmarks=-1, customminortickmarks=-1, drawlabels=True, twosided=False):
out = ""
if plotlog:
if min <= 0 or max <= 0:
raise Exception("Cannot place log axis min or max tick <= 0")
if len(custommajorticks) ==0:
x=int(log10(min))
n_labels=0
while (x<log10(max)+1):
if 10**x>=min:
ticklabel=10**x
if ticklabel>min and ticklabel<max:
out += self.draw_majortick(ticklabel,twosided)
if drawlabels:
out += self.draw_majorticklabel(ticklabel)
n_labels+=1
if ticklabel==min or ticklabel==max:
if drawlabels:
out += self.draw_majorticklabel(ticklabel)
n_labels+=1
for i in range(2,10):
ticklabel=i*10**(x-1)
if ticklabel>min and ticklabel<max:
out += self.draw_minortick(ticklabel,twosided)
if drawlabels and n_labels==0:
if (i+1)*10**(x-1)<max: # some special care for the last minor tick
out += self.draw_minorticklabel(ticklabel)
else:
out += self.draw_minorticklabel(ticklabel, last=True)
x+=1
elif (custommajorticks!=[] or customminorticks!=[]):
for i in range(len(custommajorticks)):
value=custommajorticks[i]['Value']
label=custommajorticks[i]['Label']
if value>=min and value<=max:
out += self.draw_majortick(value,twosided)
if drawlabels:
out += self.draw_majorticklabel(value, label=label)
for i in range(len(customminorticks)):
value=customminorticks[i]['Value']
if value>=min and value<=max:
out += self.draw_minortick(value,twosided)
else:
xrange = max-min
digits = int(log10(xrange))+1
if (xrange < 1):
digits -= 1
foo = int(xrange/(10**(digits-1)))
if (foo/9. > 0.5):
tickmarks = 10
elif (foo/9. > 0.2):
tickmarks = 5
elif (foo/9. > 0.1):
tickmarks = 2
if (custommajortickmarks>-1):
if custommajortickmarks not in [1, 2, 5, 10, 20]:
print '+++ Error in Ticks.draw_ticks(): MajorTickMarks must be in [1, 2, 5, 10, 20]'
else:
#if custommajortickmarks==1: custommajortickmarks=10
tickmarks = custommajortickmarks
if (tickmarks == 2 or tickmarks == 20):
minortickmarks = 3
else:
minortickmarks = 4
if (customminortickmarks>-1):
minortickmarks = customminortickmarks
#
x = 0
while (x > min*10**digits):
x -= tickmarks*100**(digits-1)
while (x <= max*10**digits):
if (x >= min*10**digits-tickmarks*100**(digits-1)):
ticklabel = 1.*x/10**digits
if (int(ticklabel) == ticklabel):
ticklabel = int(ticklabel)
if (float(ticklabel-min)/xrange >= -1e-5):
if (fabs(ticklabel-min)/xrange > 1e-5 and fabs(ticklabel-max)/xrange > 1e-5):
out += self.draw_majortick(ticklabel,twosided)
if drawlabels:
out += self.draw_majorticklabel(ticklabel)
xminor = x
for i in range(minortickmarks):
xminor += 1.*tickmarks*100**(digits-1)/(minortickmarks+1)
ticklabel = 1.*xminor/10**digits
if (ticklabel > min and ticklabel < max):
if (fabs(ticklabel-min)/xrange > 1e-5 and fabs(ticklabel-max)/xrange > 1e-5):
out += self.draw_minortick(ticklabel,twosided)
x += tickmarks*100**(digits-1)
return out
def draw(self):
pass
def draw_minortick(self, ticklabel, twosided):
pass
def draw_majortick(self, ticklabel, twosided):
pass
def draw_majorticklabel(self, ticklabel):
pass
def draw_minorticklabel(self, value, label='', last=False):
return ''
def get_ticklabel(self, value, plotlog=False, minor=False, lastminor=False):
label=''
prefix = ""
if plotlog:
bar = int(log10(value))
if bar<0:
sign='-'
else:
sign='\\,'
if minor: # The power of ten is only to be added to the last minor tick label
if lastminor:
label = str(int(value/(10**bar))) + "\cdot" + '10$^{'+sign+'\\text{'+str(abs(bar))+'}}$'
else:
label = str(int(value/(10**bar))) # The naked prefactor
else:
if bar==0:
label = '1'
else:
label = '10$^{'+sign+'\\text{'+str(abs(bar))+'}}$'
else:
if fabs(value) < 1e-10: value=0
label=str(value)
return label
class XTicks(Ticks):
def draw(self, custommajorticks=[], customminorticks=[], custommajortickmarks=-1, customminortickmarks=-1,drawlabels=True):
twosided = False
if self.description.has_key('XTwosidedTicks') and self.description['XTwosidedTicks']=='1':
twosided = True
out = ""
out += ('\n%\n% X-Ticks\n%\n')
out += ('\\def\\majortickmarkx{\\psline[linewidth='+self.majorticklinewidth+'](0,0)(0,'+self.majorticklength+')}%\n')
out += ('\\def\\minortickmarkx{\\psline[linewidth='+self.minorticklinewidth+'](0,0)(0,'+self.minorticklength+')}%\n')
uselog = self.description['LogX'] and (self.coors.xmin() > 0 and self.coors.xmax() > 0)
out += self.draw_ticks(self.coors.xmin(), self.coors.xmax(),\
plotlog=uselog,\
custommajorticks=custommajorticks,\
customminorticks=customminorticks,\
custommajortickmarks=custommajortickmarks,\
customminortickmarks=customminortickmarks,\
drawlabels=drawlabels,\
twosided=twosided)
return out
def draw_minortick(self, ticklabel, twosided):
out = ''
out += '\\rput('+self.coors.strphys2frameX(ticklabel)+', 0){\\minortickmarkx}\n'
if twosided:
out += '\\rput{180}('+self.coors.strphys2frameX(ticklabel)+', 1){\\minortickmarkx}\n'
return out
def draw_minorticklabel(self, value, label='', last=False):
if label=='':
label=self.get_ticklabel(value,self.description['LogX'], minor=True, lastminor=last)
if last: # Some more indentation for the last minor label
return ('\\rput('+self.coors.strphys2frameX(value)+', 0){\\rput[B](1.9\\labelsep,-2.3\\labelsep){\\strut{}'+label+'}}\n')
else:
return ('\\rput('+self.coors.strphys2frameX(value)+', 0){\\rput[B](0,-2.3\\labelsep){\\strut{}'+label+'}}\n')
def draw_majortick(self, ticklabel, twosided):
out = ''
out += '\\rput('+self.coors.strphys2frameX(ticklabel)+', 0){\\majortickmarkx}\n'
if twosided:
out += '\\rput{180}('+self.coors.strphys2frameX(ticklabel)+', 1){\\majortickmarkx}\n'
return out
def draw_majorticklabel(self, value, label=''):
if label=='':
label=self.get_ticklabel(value, self.description['LogX'] and (self.coors.xmin() > 0 and self.coors.xmax() > 0))
return ('\\rput('+self.coors.strphys2frameX(value)+', 0){\\rput[B](0,-2.3\\labelsep){\\strut{}'+label+'}}\n')
class YTicks(Ticks):
def draw(self, custommajorticks=[], customminorticks=[], custommajortickmarks=-1, customminortickmarks=-1):
twosided = False
if self.description.has_key('YTwosidedTicks') and self.description['YTwosidedTicks']=='1':
twosided = True
out = ""
out += ('\n%\n% Y-Ticks\n%\n')
out += ('\\def\\majortickmarky{\\psline[linewidth='+self.majorticklinewidth+'](0,0)('+self.majorticklength+',0)}%\n')
out += ('\\def\\minortickmarky{\\psline[linewidth='+self.minorticklinewidth+'](0,0)('+self.minorticklength+',0)}%\n')
uselog = self.description['LogY'] and (self.coors.ymin() > 0 and self.coors.ymax() > 0)
out += self.draw_ticks(self.coors.ymin(), self.coors.ymax(),\
plotlog=uselog,\
custommajorticks=custommajorticks,\
customminorticks=customminorticks,\
custommajortickmarks=custommajortickmarks,\
customminortickmarks=customminortickmarks,\
twosided=twosided)
return out
def draw_minortick(self, ticklabel, twosided):
out = ''
out += '\\rput(0, '+self.coors.strphys2frameY(ticklabel)+'){\\minortickmarky}\n'
if twosided:
out += '\\rput{180}(1, '+self.coors.strphys2frameY(ticklabel)+'){\\minortickmarky}\n'
return out
def draw_majortick(self, ticklabel, twosided):
out = ''
out += '\\rput(0, '+self.coors.strphys2frameY(ticklabel)+'){\\majortickmarky}\n'
if twosided:
out += '\\rput{180}(1, '+self.coors.strphys2frameY(ticklabel)+'){\\majortickmarky}\n'
return out
def draw_majorticklabel(self, value, label=''):
if label=='':
label=self.get_ticklabel(value, self.description['LogY'] and (self.coors.ymin() > 0 and self.coors.ymax() > 0))
if self.description.has_key('RatioPlotMode') and self.description['RatioPlotMode']=='deviation' \
and self.description.has_key('RatioPlotStage') and self.description['RatioPlotStage']:
return ('\\uput[180]{0}(0, '+self.coors.strphys2frameY(value)+'){\\strut{}'+label+'\\,$\\sigma$}\n')
else:
return ('\\uput[180]{0}(0, '+self.coors.strphys2frameY(value)+'){\\strut{}'+label+'}\n')
class ZTicks(Ticks):
def __init__(self, description, coors):
self.majorticklinewidth = '0.3pt'
self.minorticklinewidth = '0.3pt'
self.majorticklength = '6pt'
self.minorticklength = '2.6pt'
self.description = description
self.coors = coors
def draw(self, custommajorticks=[], customminorticks=[], custommajortickmarks=-1, customminortickmarks=-1):
out = ""
out += ('\n%\n% Z-Ticks\n%\n')
out += ('\\def\\majortickmarkz{\\psline[linewidth='+self.majorticklinewidth+'](0,0)('+self.majorticklength+',0)}%\n')
out += ('\\def\\minortickmarkz{\\psline[linewidth='+self.minorticklinewidth+'](0,0)('+self.minorticklength+',0)}%\n')
out += self.draw_ticks(self.coors.zmin(), self.coors.zmax(),\
plotlog=self.description['LogZ'],\
custommajorticks=custommajorticks,\
customminorticks=customminorticks,\
custommajortickmarks=custommajortickmarks,\
customminortickmarks=customminortickmarks,\
twosided=False)
return out
def draw_minortick(self, ticklabel, twosided):
return '\\rput{180}(1, '+self.coors.strphys2frameZ(ticklabel)+'){\\minortickmarkz}\n'
def draw_majortick(self, ticklabel, twosided):
return '\\rput{180}(1, '+self.coors.strphys2frameZ(ticklabel)+'){\\majortickmarkz}\n'
def draw_majorticklabel(self, value, label=''):
if label=='':
label=self.get_ticklabel(value,self.description['LogZ'])
if self.description.has_key('RatioPlotMode') and self.description['RatioPlotMode']=='deviation' \
and self.description.has_key('RatioPlotStage') and self.description['RatioPlotStage']:
return ('\\uput[0]{0}(1, '+self.coors.strphys2frameZ(value)+'){\\strut{}'+label+'\\,$\\sigma$}\n')
else:
return ('\\uput[0]{0}(1, '+self.coors.strphys2frameZ(value)+'){\\strut{}'+label+'}\n')
class Coordinates(object):
def __init__(self, inputdata):
self.description = inputdata.description
def phys2frameX(self, x):
if self.description['LogX']:
if x>0:
result = 1.*(log10(x)-log10(self.xmin()))/(log10(self.xmax())-log10(self.xmin()))
else:
return -10
else:
result = 1.*(x-self.xmin())/(self.xmax()-self.xmin())
if (fabs(result) < 1e-4):
return 0
else:
return min(max(result,-10),10)
def phys2frameY(self, y):
if self.description['LogY']:
if y > 0 and self.ymin() > 0 and self.ymax() > 0:
result = 1.*(log10(y)-log10(self.ymin()))/(log10(self.ymax())-log10(self.ymin()))
else:
return -10
else:
result = 1.*(y-self.ymin())/(self.ymax()-self.ymin())
if (fabs(result) < 1e-4):
return 0
else:
return min(max(result,-10),10)
def phys2frameZ(self, z):
if self.description['LogZ']:
if z>0:
result = 1.*(log10(z)-log10(self.zmin()))/(log10(self.zmax())-log10(self.zmin()))
else:
return -10
else:
result = 1.*(z-self.zmin())/(self.zmax()-self.zmin())
if (fabs(result) < 1e-4):
return 0
else:
return min(max(result,-10),10)
# TODO: Add frame2phys functions (to allow linear function sampling in the frame space rather than the physical space)
def strphys2frameX(self, x):
return str(self.phys2frameX(x))
def strphys2frameY(self, y):
return str(self.phys2frameY(y))
def strphys2frameZ(self, z):
return str(self.phys2frameZ(z))
def xmin(self):
return self.description['Borders'][0]
def xmax(self):
return self.description['Borders'][1]
def ymin(self):
return self.description['Borders'][2]
def ymax(self):
return self.description['Borders'][3]
def zmin(self):
return self.description['Borders'][4]
def zmax(self):
return self.description['Borders'][5]
####################
def try_cmd(args):
    "Run the given command + args, returning True if it succeeds and False otherwise"
    import subprocess
    try:
        subprocess.check_output(args, stderr=subprocess.STDOUT)
        return True
    except AttributeError:
        ## subprocess.check_output is not available in Python <= 2.6:
        ## optimistically assume the command would have succeeded
        return True
    except:
        return False
def have_cmd(cmd):
return try_cmd(["which", cmd])
import shutil, subprocess
def process_datfile(datfile):
global opts
if not os.access(datfile, os.R_OK):
raise Exception("Could not read data file '%s'" % datfile)
dirname = os.path.dirname(datfile)
datfile = os.path.basename(datfile)
filename = datfile.replace('.dat','')
## Create a temporary directory
cwd = os.getcwd()
datpath = os.path.join(cwd, dirname, datfile)
tempdir = tempfile.mkdtemp('.make-plots')
tempdatpath = os.path.join(tempdir, datfile)
shutil.copy(datpath, tempdir)
## Make TeX file
inputdata = Inputdata(os.path.join(dirname,filename))
texpath = os.path.join(tempdir, '%s.tex' % filename)
texfile = open(texpath, 'w')
p = Plot(inputdata)
texfile.write(p.write_header(inputdata))
if inputdata.description.get('MainPlot', '1') == '1':
mp = MainPlot(inputdata)
texfile.write(mp.draw(inputdata))
- if not inputdata.description.get('is2dim', False) and inputdata.description.get('RatioPlot', '1') == '1':
+ if not inputdata.description.get('is2dim', False) and \
+ inputdata.description.get('RatioPlot', '1') == '1' and \
+ inputdata.description.get('RatioPlotReference') is not None:
rp = RatioPlot(inputdata)
texfile.write(rp.draw(inputdata))
texfile.write(p.write_footer())
texfile.close()
if opts.OUTPUT_FORMAT != "TEX":
## Check for the required programs
latexavailable = have_cmd("latex")
dvipsavailable = have_cmd("dvips")
convertavailable = have_cmd("convert")
ps2pnmavailable = have_cmd("ps2pnm")
pnm2pngavailable = have_cmd("pnm2png")
# TODO: It'd be nice to be able to control the size of the PNG between thumb and full-size...
# currently defaults (and is used below) to a size suitable for thumbnails
def mkpng(infile, outfile, density=100):
if convertavailable:
pngcmd = ["convert", "-flatten", "-density", str(density), infile, "-quality", "100", "-sharpen", "0x1.0", outfile]
logging.debug(" ".join(pngcmd))
pngproc = subprocess.Popen(pngcmd, stdout=subprocess.PIPE, cwd=tempdir)
pngproc.wait()
else:
raise Exception("Required PNG maker program (convert) not found")
# elif ps2pnmavailable and pnm2pngavailable:
# pstopnm = "pstopnm -stdout -xsize=461 -ysize=422 -xborder=0.01 -yborder=0.01 -portrait " + infile
# p1 = subprocess.Popen(pstopnm.split(), stdout=subprocess.PIPE, stderr=open("/dev/null", "w"), cwd=tempdir)
# p2 = subprocess.Popen(["pnmtopng"], stdin=p1.stdout, stdout=open("%s/%s.png" % (tempdir, outfile), "w"), stderr=open("/dev/null", "w"), cwd=tempdir)
# p2.wait()
# else:
# raise Exception("Required PNG maker programs (convert, or ps2pnm and pnm2png) not found")
## Run LaTeX (in no-stop mode)
logging.debug(os.listdir(tempdir))
        texcmd = ["latex", "\\scrollmode\\input", texpath]
logging.debug("TeX command: " + " ".join(texcmd))
texproc = subprocess.Popen(texcmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=tempdir)
logging.debug(texproc.communicate()[0])
logging.debug(os.listdir(tempdir))
## Run dvips
dvcmd = ["dvips", filename]
if not logging.getLogger().isEnabledFor(logging.DEBUG):
dvcmd.append("-q")
## Handle Minion Font
if opts.OUTPUT_FONT == "MINION":
dvcmd.append('-Pminion')
## Choose format
# TODO: Rationalise... this is a mess! Maybe we can use tex2pix?
if opts.OUTPUT_FORMAT == "PS":
dvcmd += ["-o", "%s.ps" % filename]
logging.debug(" ".join(dvcmd))
dvproc = subprocess.Popen(dvcmd, stdout=subprocess.PIPE, cwd=tempdir)
dvproc.wait()
elif opts.OUTPUT_FORMAT == "PDF":
dvcmd.append("-f")
logging.debug(" ".join(dvcmd))
dvproc = subprocess.Popen(dvcmd, stdout=subprocess.PIPE, cwd=tempdir)
cnvproc = subprocess.Popen(["ps2pdf", "-"], stdin=dvproc.stdout, stdout=subprocess.PIPE, cwd=tempdir)
            f = open(os.path.join(tempdir, "%s.pdf" % filename), "wb")
f.write(cnvproc.communicate()[0])
f.close()
elif opts.OUTPUT_FORMAT == "EPS":
dvcmd.append("-f")
logging.debug(" ".join(dvcmd))
dvproc = subprocess.Popen(dvcmd, stdout=subprocess.PIPE, cwd=tempdir)
cnvproc = subprocess.Popen(["ps2eps"], stdin=dvproc.stdout, stderr=subprocess.PIPE, stdout=subprocess.PIPE, cwd=tempdir)
            f = open(os.path.join(tempdir, "%s.eps" % filename), "wb")
f.write(cnvproc.communicate()[0])
f.close()
elif opts.OUTPUT_FORMAT == "PNG":
dvcmd.append("-f")
logging.debug(" ".join(dvcmd))
dvproc = subprocess.Popen(dvcmd, stdout=subprocess.PIPE, cwd=tempdir)
pngcmd = ["convert", "-flatten", "-density", "100", "-", "-quality", "100", "-sharpen", "0x1.0", "%s.png" % filename]
logging.debug(" ".join(pngcmd))
pngproc = subprocess.Popen(pngcmd, stdin=dvproc.stdout, stdout=subprocess.PIPE, cwd=tempdir)
pngproc.wait()
elif opts.OUTPUT_FORMAT == "PSPNG":
dvcmd += ["-o", "%s.ps" % filename]
logging.debug(" ".join(dvcmd))
dvproc = subprocess.Popen(dvcmd, stdout=subprocess.PIPE, cwd=tempdir)
dvproc.wait()
mkpng("%s.ps" % filename, "%s.png" % filename)
elif opts.OUTPUT_FORMAT == "PDFPNG":
dvcmd.append("-f")
logging.debug(" ".join(dvcmd))
dvproc = subprocess.Popen(dvcmd, stdout=subprocess.PIPE, cwd=tempdir)
cnvproc = subprocess.Popen(["ps2pdf", "-"], stdin=dvproc.stdout, stdout=subprocess.PIPE, cwd=tempdir)
            f = open(os.path.join(tempdir, "%s.pdf" % filename), "wb")
f.write(cnvproc.communicate()[0])
f.close()
logging.debug(os.listdir(tempdir))
mkpng("%s.pdf" % filename, "%s.png" % filename)
elif opts.OUTPUT_FORMAT == "EPSPNG":
dvcmd.append("-f")
logging.debug(" ".join(dvcmd))
dvproc = subprocess.Popen(dvcmd, stdout=subprocess.PIPE, cwd=tempdir)
cnvproc = subprocess.Popen(["ps2eps"], stdin=dvproc.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=tempdir)
            f = open(os.path.join(tempdir, "%s.eps" % filename), "wb")
f.write(cnvproc.communicate()[0])
f.close()
mkpng("%s.eps" % filename, "%s.png" % filename)
else:
logging.error("Unknown format: %s" % opts.OUTPUT_FORMAT)
sys.exit(1)
logging.debug(os.listdir(tempdir))
## Copy results back to main dir
outbasename = filename
outname = outbasename + "." + opts.OUTPUT_FORMAT.lower()
## TODO: Make this neater: if "PNG" in opts.OUTPUT_FORMAT: ...
if opts.OUTPUT_FORMAT == "PSPNG":
outpath = os.path.join(tempdir, outbasename+".ps")
shutil.copy(outpath, os.path.join(cwd,dirname))
outpath = os.path.join(tempdir, outbasename+".png")
shutil.copy(outpath, os.path.join(cwd,dirname))
elif opts.OUTPUT_FORMAT == "PDFPNG":
outpath = os.path.join(tempdir, outbasename+".pdf")
shutil.copy(outpath, os.path.join(cwd,dirname))
outpath = os.path.join(tempdir, outbasename+".png")
shutil.copy(outpath, os.path.join(cwd,dirname))
elif opts.OUTPUT_FORMAT == "EPSPNG":
outpath = os.path.join(tempdir, outbasename+".eps")
shutil.copy(outpath, os.path.join(cwd,dirname))
outpath = os.path.join(tempdir, outbasename+".png")
shutil.copy(outpath, os.path.join(cwd,dirname))
else:
outpath = os.path.join(tempdir, outname)
if os.path.exists(outpath):
shutil.copy(outpath, os.path.join(cwd,dirname))
else:
logging.error("No output file '%s' from processing %s" % (outname, datfile))
## Clean up
if opts.NO_CLEANUP:
logging.info('Keeping temp-files in %s' % tempdir)
else:
shutil.rmtree(tempdir, ignore_errors=True)
## Wrapper for a process thread which attempts to process datfiles until empty
import threading, Queue
class MkPlotThread( threading.Thread ):
def run(self):
global opts
global datfiles
global RECVD_KILL_SIGNAL
while True:
if RECVD_KILL_SIGNAL is not None:
## Empty the queue
while not datfiles.empty():
dummy = datfiles.get_nowait()
break
try:
datfile = datfiles.get_nowait()
rem = datfiles.qsize()
logging.info("Plotting %s (%d remaining)" % (datfile, rem))
process_datfile(datfile)
except Queue.Empty, e:
#print "%s ending." % self.getName()
break
- except Exception, e:
- print "Error: %s" % str(e)
- import traceback
- logging.debug(traceback.format_exc())
- #exit(1)
+ # except Exception, e:
+ # print "Error: %s" % str(e)
+ # import traceback
+ # logging.debug(traceback.format_exc())
+ # #exit(1)
####################
if __name__ == '__main__':
## Try to rename the process on Linux
try:
import ctypes
libc = ctypes.cdll.LoadLibrary('libc.so.6')
libc.prctl(15, 'make-plots', 0, 0, 0)
except Exception:
pass
## Try to use Psyco optimiser
try:
import psyco
psyco.full()
except ImportError:
pass
## Find number of (virtual) processing units
    try:
        numcores = os.sysconf('SC_NPROCESSORS_ONLN')
    except (AttributeError, ValueError, OSError):
        ## os.sysconf raises rather than returning None on unsupported platforms
        numcores = None
    if numcores is None or numcores < 1:
        numcores = 1
## Parse command line options
from optparse import OptionParser, OptionGroup
parser = OptionParser(usage=__doc__)
parser.add_option("-n", "-j", "--num-threads", dest="NUM_THREADS", type="int",
default=numcores, help="max number of threads to be used [%s]" % numcores)
parser.add_option("--palatino", dest="OUTPUT_FONT", action="store_const", const="PALATINO", default="PALATINO",
help="Use Palatino as font (default).")
parser.add_option("--cm", dest="OUTPUT_FONT", action="store_const", const="CM", default="PALATINO",
help="Use Computer Modern as font.")
parser.add_option("--times", dest="OUTPUT_FONT", action="store_const", const="TIMES", default="PALATINO",
help="Use Times as font.")
parser.add_option("--minion", dest="OUTPUT_FONT", action="store_const", const="MINION", default="PALATINO",
help="Use Adobe Minion Pro as font. Note: You need to set TEXMFHOME first.")
parser.add_option("--helvetica", dest="OUTPUT_FONT", action="store_const", const="HELVETICA", default="PALATINO",
help="Use Helvetica as font.")
parser.add_option("--ps", dest="OUTPUT_FORMAT", action="store_const", const="PS", default="PDF",
help="Create PostScript output (default).")
parser.add_option("--pdf", dest="OUTPUT_FORMAT", action="store_const", const="PDF", default="PDF",
help="Create PDF output.")
parser.add_option("--eps", dest="OUTPUT_FORMAT", action="store_const", const="EPS", default="PDF",
help="Create Encapsulated PostScript output.")
parser.add_option("--png", dest="OUTPUT_FORMAT", action="store_const", const="PNG", default="PDF",
help="Create PNG output.")
parser.add_option("--pspng", dest="OUTPUT_FORMAT", action="store_const", const="PSPNG", default="PDF",
help="Create PS and PNG output.")
parser.add_option("--pdfpng", dest="OUTPUT_FORMAT", action="store_const", const="PDFPNG", default="PDF",
help="Create PDF and PNG output.")
parser.add_option("--epspng", dest="OUTPUT_FORMAT", action="store_const", const="EPSPNG", default="PDF",
help="Create EPS and PNG output.")
parser.add_option("--tex", dest="OUTPUT_FORMAT", action="store_const", const="TEX", default="PDF",
help="Create TeX/LaTeX output.")
parser.add_option("--no-cleanup", dest="NO_CLEANUP", action="store_true", default=False,
help="Keep temporary directory and print its filename.")
parser.add_option("--full-range", dest="FULL_RANGE", action="store_true", default=False,
help="Plot full y range in LogY plots.")
parser.add_option("-c", "--config", dest="CONFIGFILES", action="append", default=None,
help="Plot config file to be used. Overrides internal config blocks.")
verbgroup = OptionGroup(parser, "Verbosity control")
verbgroup.add_option("-v", "--verbose", action="store_const", const=logging.DEBUG, dest="LOGLEVEL",
default=logging.INFO, help="print debug (very verbose) messages")
verbgroup.add_option("-q", "--quiet", action="store_const", const=logging.WARNING, dest="LOGLEVEL",
default=logging.INFO, help="be very quiet")
parser.add_option_group(verbgroup)
opts, args = parser.parse_args()
logging.basicConfig(level=opts.LOGLEVEL, format="%(message)s")
## Check for no args
if len(args) == 0:
logging.error(parser.get_usage())
sys.exit(2)
## Test for external programs (kpsewhich, latex, dvips, ps2pdf/ps2eps, and convert)
opts.LATEXPKGS = []
if opts.OUTPUT_FORMAT != "TEX":
try:
## latex
if not have_cmd("latex"):
logging.error("ERROR: required program 'latex' could not be found. Exiting...")
sys.exit(1)
## dvips
if not have_cmd("dvips"):
logging.error("ERROR: required program 'dvips' could not be found. Exiting...")
sys.exit(1)
## ps2pdf / ps2eps
if "PDF" in opts.OUTPUT_FORMAT:
if not have_cmd("ps2pdf"):
logging.error("ERROR: required program 'ps2pdf' (for PDF output) could not be found. Exiting...")
sys.exit(1)
elif "EPS" in opts.OUTPUT_FORMAT:
if not have_cmd("ps2eps"):
logging.error("ERROR: required program 'ps2eps' (for EPS output) could not be found. Exiting...")
sys.exit(1)
## PNG output converter
if "PNG" in opts.OUTPUT_FORMAT:
if not have_cmd("convert"):
logging.error("ERROR: required program 'convert' (for PNG output) could not be found. Exiting...")
sys.exit(1)
## kpsewhich: required for LaTeX package testing
if not have_cmd("kpsewhich"):
logging.warning("WARNING: required program 'kpsewhich' (for LaTeX package checks) could not be found")
else:
## Check minion font
if opts.OUTPUT_FONT == "MINION":
p = subprocess.Popen(["kpsewhich", "minion.sty"], stdout=subprocess.PIPE)
p.wait()
if p.returncode != 0:
logging.warning('Warning: Using "--minion" requires minion.sty to be installed. Ignoring it.')
opts.OUTPUT_FONT = "PALATINO"
## Check for HEP LaTeX packages
# TODO: remove HEP-specifics/non-standards?
for pkg in ["hepnicenames", "hepunits", "underscore"]:
p = subprocess.Popen(["kpsewhich", "%s.sty" % pkg], stdout=subprocess.PIPE)
p.wait()
if p.returncode == 0:
opts.LATEXPKGS.append(pkg)
## Check for Palatino old style figures and small caps
if opts.OUTPUT_FONT == "PALATINO":
p = subprocess.Popen(["kpsewhich", "ot1pplx.fd"], stdout=subprocess.PIPE)
p.wait()
if p.returncode == 0:
opts.OUTPUT_FONT = "PALATINO_OSF"
except Exception, e:
logging.warning("Problem while testing for external packages. I'm going to try and continue without testing, but don't hold your breath...")
## Fill queue
datfiles = Queue.Queue(maxsize=-1)
plotword = "plot"
if len(args) > 1:
plotword = "plots"
logging.info("Making %d %s" % (len(args), plotword))
for d in args:
datfiles.put(d)
## Set up signal handling
import signal
RECVD_KILL_SIGNAL = None
def handleKillSignal(signum, frame):
"Declare us as having been signalled, and return to default handling behaviour"
global RECVD_KILL_SIGNAL
logging.critical("Signal handler called with signal " + str(signum))
RECVD_KILL_SIGNAL = signum
signal.signal(signum, signal.SIG_DFL)
## Signals to handle
signal.signal(signal.SIGINT, handleKillSignal)
signal.signal(signal.SIGTERM, handleKillSignal)
signal.signal(signal.SIGHUP, handleKillSignal)
signal.signal(signal.SIGUSR2, handleKillSignal)
## Run threads
for threadnum in range(opts.NUM_THREADS):
procthread = MkPlotThread()
#procthread.daemon = True
procthread.start()
import time
while not datfiles.empty() and not RECVD_KILL_SIGNAL:
time.sleep(0.25)
diff --git a/bin/rivet b/bin/rivet
--- a/bin/rivet
+++ b/bin/rivet
@@ -1,499 +1,506 @@
#! /usr/bin/env python
"""\
Run Rivet analyses on input events from a file or Unix pipe
Examples:
%prog [options] <hepmcfile> [<hepmcfile2> ...]
  my_generator -o myfifo & \
  %prog [options] myfifo
agile-runmc <genname> -n 100k -o- | %prog [options]
ENVIRONMENT:
* RIVET_ANALYSIS_PATH: list of paths to be searched for plugin
analysis libraries at runtime
* RIVET_REF_PATH: list of paths to be searched for reference
data files
* RIVET_INFO_PATH: list of paths to be searched for analysis
metadata files
"""
import os, sys
## Load the rivet module
try:
import rivet
except:
## If rivet loading failed, try to bootstrap the Python path!
try:
# TODO: Is this a good idea? Maybe just notify the user that their PYTHONPATH is wrong?
import commands
modname = sys.modules[__name__].__file__
binpath = os.path.dirname(modname)
rivetconfigpath = os.path.join(binpath, "rivet-config")
rivetpypath = commands.getoutput(rivetconfigpath + " --pythonpath")
sys.path.append(rivetpypath)
import rivet
except:
sys.stderr.write("The rivet Python module could not be loaded: is your PYTHONPATH set correctly?\n")
sys.exit(1)
rivet.util.check_python_version()
rivet.util.set_process_name("rivet")
-import time, logging, signal
+import time, datetime, logging, signal
## Parse command line options
from optparse import OptionParser, OptionGroup
parser = OptionParser(usage=__doc__, version="rivet v%s" % rivet.version())
anagroup = OptionGroup(parser, "Analysis handling")
anagroup.add_option("-a", "--analysis", dest="ANALYSES", action="append",
default=[], metavar="ANA",
help="add an analysis to the processing list.")
anagroup.add_option("--list-analyses", "--list", dest="LIST_ANALYSES", action="store_true",
default=False, help="show the list of available analyses' names. With -v, it shows the descriptions, too")
anagroup.add_option("--list-used-analyses", action="store_true", dest="LIST_USED_ANALYSES",
default=False, help="list the analyses used by this command (after subtraction of inappropriate ones)")
anagroup.add_option("--show-analysis", "--show-analyses", "--show", dest="SHOW_ANALYSES", action="append",
default=[], help="show the details of an analysis")
anagroup.add_option("--analysis-path", dest="ANALYSIS_PATH", metavar="PATH", default=None,
help="specify the analysis search path (cf. $RIVET_ANALYSIS_PATH).")
# TODO: remove/deprecate the append?
anagroup.add_option("--analysis-path-append", dest="ANALYSIS_PATH_APPEND", metavar="PATH", default=None,
help="append to the analysis search path (cf. $RIVET_ANALYSIS_PATH).")
anagroup.add_option("--pwd", dest="ANALYSIS_PATH_PWD", action="store_true", default=False,
help="append the current directory (pwd) to the analysis search path (cf. $RIVET_ANALYSIS_PATH).")
# TODO: add control for more paths?
parser.add_option_group(anagroup)
extragroup = OptionGroup(parser, "Extra run settings")
extragroup.add_option("-H", "--histo-file", dest="HISTOFILE",
default="Rivet.yoda", help="specify the output histo file path (default = %default)")
extragroup.add_option("-x", "--cross-section", dest="CROSS_SECTION",
default=None, metavar="XS",
help="specify the signal process cross-section in pb")
extragroup.add_option("-n", "--nevts", dest="MAXEVTNUM", type="int",
default=None, metavar="NUM",
help="restrict the max number of events to read.")
extragroup.add_option("--runname", dest="RUN_NAME", default=None, metavar="NAME",
help="give an optional run name, to be prepended as a 'top level directory' in histo paths")
extragroup.add_option("--ignore-beams", dest="IGNORE_BEAMS", action="store_true", default=False,
help="Ignore input event beams when checking analysis compatibility. "
"Warning: some analyses may not work with incorrect beams")
parser.add_option_group(extragroup)
timinggroup = OptionGroup(parser, "Timeouts and periodic operations")
timinggroup.add_option("--event-timeout", dest="EVENT_TIMEOUT", type="int",
default=21600, metavar="NSECS",
help="max time in whole seconds to wait for an event to be generated from the specified source (default = %default)")
timinggroup.add_option("--run-timeout", dest="RUN_TIMEOUT", type="int",
default=None, metavar="NSECS",
help="max time in whole seconds to wait for the run to finish. This can be useful on batch systems such "
"as the LCG Grid where tokens expire on a fixed wall-clock and can render long Rivet runs unable to write "
"out the final histogram file (default = unlimited)")
timinggroup.add_option("--histo-interval", dest="HISTO_WRITE_INTERVAL", type=int,
default=None, help="[experimental!] specify the number of events between histogram file updates. "
"Default is to only write out at the end of the run. Note that intermediate histograms will be those "
"from the analyze step only: analysis finalizing is currently not executed until the end of the run.")
parser.add_option_group(timinggroup)
verbgroup = OptionGroup(parser, "Verbosity control")
parser.add_option("-l", dest="NATIVE_LOG_STRS", action="append",
default=[], help="set a log level in the Rivet library")
verbgroup.add_option("-v", "--verbose", action="store_const", const=logging.DEBUG, dest="LOGLEVEL",
default=logging.INFO, help="print debug (very verbose) messages")
verbgroup.add_option("-q", "--quiet", action="store_const", const=logging.WARNING, dest="LOGLEVEL",
default=logging.INFO, help="be very quiet")
parser.add_option_group(verbgroup)
opts, args = parser.parse_args()
## Configure logging
logging.basicConfig(level=opts.LOGLEVEL, format="%(message)s")
## Control native Rivet library logger
for l in opts.NATIVE_LOG_STRS:
name, level = None, None
try:
name, level = l.split("=")
except:
name = "Rivet"
level = l
## Fix name
if name != "Rivet" and not name.startswith("Rivet."):
name = "Rivet." + name
try:
## Get right error type
level = rivet.LEVELS.get(level.upper(), None)
logging.debug("Setting log level: %s %d" % (name, level))
rivet.setLogLevel(name, level)
except:
logging.warning("Couldn't process logging string '%s'" % l)
## Parse supplied cross-section
if opts.CROSS_SECTION is not None:
xsstr = opts.CROSS_SECTION
try:
opts.CROSS_SECTION = float(xsstr)
except:
import re
suffmatch = re.search(r"[^\d.]", xsstr)
if not suffmatch:
raise ValueError("Bad cross-section string: %s" % xsstr)
factor = base = None
suffstart = suffmatch.start()
if suffstart != -1:
base = xsstr[:suffstart]
suffix = xsstr[suffstart:].lower()
if suffix == "mb":
factor = 1e+9
elif suffix == "mub":
factor = 1e+6
elif suffix == "nb":
factor = 1e+3
elif suffix == "pb":
factor = 1
elif suffix == "fb":
factor = 1e-3
elif suffix == "ab":
factor = 1e-6
if factor is None or base is None:
raise ValueError("Bad cross-section string: %s" % xsstr)
xs = float(base) * factor
opts.CROSS_SECTION = xs
## Print the available CLI options!
#if opts.LIST_OPTIONS:
# for o in parser.option_list:
# print o.get_opt_string()
# sys.exit(0)
## Set up signal handling
RECVD_KILL_SIGNAL = None
def handleKillSignal(signum, frame):
"Declare us as having been signalled, and return to default handling behaviour"
global RECVD_KILL_SIGNAL
logging.critical("Signal handler called with signal " + str(signum))
RECVD_KILL_SIGNAL = signum
signal.signal(signum, signal.SIG_DFL)
## Signals to handle
signal.signal(signal.SIGTERM, handleKillSignal)
signal.signal(signal.SIGHUP, handleKillSignal)
signal.signal(signal.SIGINT, handleKillSignal)
signal.signal(signal.SIGUSR1, handleKillSignal)
signal.signal(signal.SIGUSR2, handleKillSignal)
try:
    signal.signal(signal.SIGXCPU, handleKillSignal)
except:
    pass
## Override/modify analysis search path
if opts.ANALYSIS_PATH:
rivet.setAnalysisLibPaths(opts.ANALYSIS_PATH.split(":"))
if opts.ANALYSIS_PATH_APPEND:
for ap in opts.ANALYSIS_PATH_APPEND.split(":"):
rivet.addAnalysisLibPath(ap)
if opts.ANALYSIS_PATH_PWD:
rivet.addAnalysisLibPath(".")
## List of analyses
all_analyses = rivet.AnalysisLoader.analysisNames()
if opts.LIST_ANALYSES:
## Treat args as case-insensitive regexes if present
regexes = None
if args:
import re
regexes = [re.compile(arg, re.I) for arg in args]
for aname in all_analyses:
if not regexes:
toshow = True
else:
toshow = False
for regex in regexes:
if regex.search(aname):
toshow = True
break
if toshow:
msg = aname
if opts.LOGLEVEL <= logging.INFO:
a = rivet.AnalysisLoader.getAnalysis(aname)
st = "" if a.status() == "VALIDATED" else ("[" + a.status() + "] ")
msg = "%-25s %s" % (aname, st + rivet.util.detex(a.summary()))
print msg
sys.exit(0)
## Show analyses' details
if len(opts.SHOW_ANALYSES) > 0:
toshow = []
for i, a in enumerate(opts.SHOW_ANALYSES):
a_up = a.upper()
if a_up in all_analyses and a_up not in toshow:
toshow.append(a_up)
else:
## Treat as a case-insensitive regex
import re
regex = re.compile(a, re.I)
for ana in all_analyses:
if regex.search(ana) and a_up not in toshow:
toshow.append(ana)
## Show the matching analyses' details
import textwrap
for i, name in enumerate(sorted(toshow)):
ana = rivet.AnalysisLoader.getAnalysis(name)
print ""
print name
print len(name) * "="
print ""
print rivet.util.detex(ana.summary())
print ""
print "Status: %s" % ana.status()
print ""
if ana.inspireId():
print "Inspire ID: %s" % ana.inspireId()
print "Inspire URL: http://inspire-hep.net/record/%s" % ana.inspireId()
print "HepData URL: http://hepdata.cedar.ac.uk/view/ins%s" % ana.inspireId()
elif ana.spiresId():
print "Spires ID: %s" % ana.spiresId()
print "Inspire URL: http://inspire-hep.net/search?p=find+key+%s" % ana.spiresId()
print "HepData URL: http://hepdata.cedar.ac.uk/view/irn%s" % ana.spiresId()
if ana.experiment():
print "Experiment: %s" % ana.experiment(),
if ana.collider():
print "(%s)" % ana.collider()
if ana.year():
print "Year of publication: %s" % ana.year()
print "Authors:"
for a in ana.authors():
print " " + a
print ""
print "Description:"
twrap = textwrap.TextWrapper(width=75, initial_indent=2*" ", subsequent_indent=2*" ")
print twrap.fill(rivet.util.detex(ana.description()))
print ""
if ana.requiredBeams():
def pid_to_str(pid):
if pid == 11:
return "e-"
elif pid == -11:
return "e+"
elif pid == 2212:
return "p+"
elif pid == -2212:
return "p-"
elif pid == 10000:
return "*"
else:
return str(pid)
beamstrs = []
for bp in ana.requiredBeams():
beamstrs.append(pid_to_str(bp[0]) + " " + pid_to_str(bp[1]))
print "Beams:", ", ".join(beamstrs)
if ana.requiredEnergies():
print "Beam energies:", "; ".join(["(%0.1f, %0.1f)" % (epair[0], epair[1]) for epair in ana.requiredEnergies()]), "GeV"
else:
print "Beam energies: ANY"
if ana.runInfo():
print "Run details:"
twrap = textwrap.TextWrapper(width=75, initial_indent=2*" ", subsequent_indent=4*" ")
for l in ana.runInfo().split("\n"):
print twrap.fill(l)
if ana.references():
print ""
print "References:"
for r in ana.references():
url = None
if r.startswith("arXiv:"):
code = r.split()[0].replace("arXiv:", "")
url = "http://arxiv.org/abs/" + code
elif r.startswith("doi:"):
code = r.replace("doi:", "")
url = "http://dx.doi.org/" + code
if url is not None:
r += " - " + url
print " %s" % r
if i+1 < len(toshow):
print "\n"
sys.exit(0)
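The reference-to-URL mapping in the listing above can be isolated as a small helper. A sketch (the example reference strings are illustrative, not taken from a specific analysis):

```python
def ref_url(r):
    """Turn an 'arXiv:...' or 'doi:...' reference string into a URL,
    following the two branches in the listing above (sketch)."""
    if r.startswith("arXiv:"):
        # drop any trailing category tag like '[hep-ph]'
        code = r.split()[0].replace("arXiv:", "")
        return "http://arxiv.org/abs/" + code
    if r.startswith("doi:"):
        return "http://dx.doi.org/" + r.replace("doi:", "")
    return None  # unrecognised reference style: print it as-is

print(ref_url("arXiv:1003.0694 [hep-ph]"))
print(ref_url("doi:10.1016/j.physletb.2006.04.048"))
```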
## Identify HepMC files/streams
## TODO: check readability, deal with stdin
if len(args) > 0:
HEPMCFILES = args
else:
HEPMCFILES = ["-"]
## Event number logging
def logNEvt(n, starttime, maxevtnum):
if n % 10000 == 0:
nevtloglevel = logging.CRITICAL
elif n % 1000 == 0:
nevtloglevel = logging.WARNING
elif n % 100 == 0:
nevtloglevel = logging.INFO
else:
nevtloglevel = logging.DEBUG
- timecurrent = time.time()
- timeelapsed = timecurrent - starttime
- if maxevtnum is None:
- logging.log(nevtloglevel, "Event %d (%d s elapsed)" % (n, timeelapsed))
- else:
- timeleft = (maxevtnum-n)*timeelapsed/n
- eta = time.strftime("%a %b %d %H:%M", time.localtime(timecurrent + timeleft))
- logging.log(nevtloglevel, "Event %d (%d s elapsed / %d s left) -> ETA: %s" %
- (n, timeelapsed, timeleft, eta))
+ currenttime = datetime.datetime.now().replace(microsecond=0)
+ elapsedtime = currenttime - starttime
+ logging.log(nevtloglevel, "Event %d (%s elapsed)" % (n, str(elapsedtime)))
+ # if maxevtnum is None:
+ # logging.log(nevtloglevel, "Event %d (%s elapsed)" % (n, str(elapsedtime)))
+ # else:
+ # remainingtime = (maxevtnum-n) * elapsedtime.total_seconds() / float(n)
+ # eta = time.strftime("%a %b %d %H:%M", datetime.localtime(currenttime + remainingtime))
+ # logging.log(nevtloglevel, "Event %d (%d s elapsed / %d s left) -> ETA: %s" %
+ # (n, elapsedtime, remainingtime, eta))
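The commented-out ETA estimate above mixes the `time` and `datetime` APIs (`datetime.localtime` does not exist). A minimal sketch of the same idea using only `datetime` — a hypothetical helper, not part of the patch:

```python
import datetime

def eta_string(n, maxevtnum, starttime, now=None):
    """Estimate a finish time from the events processed so far:
    scale the elapsed time by the fraction of events remaining."""
    now = now or datetime.datetime.now()
    elapsed = now - starttime
    secs_left = (maxevtnum - n) * elapsed.total_seconds() / n
    return (now + datetime.timedelta(seconds=secs_left)).strftime("%a %b %d %H:%M")

# 500 of 1000 events in 10 minutes => about 10 minutes left
start = datetime.datetime(2015, 3, 16, 12, 0)
now = datetime.datetime(2015, 3, 16, 12, 10)
print(eta_string(500, 1000, start, now))
```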
## Set up analysis handler
RUNNAME = opts.RUN_NAME or ""
ah = rivet.AnalysisHandler(RUNNAME)
ah.setIgnoreBeams(opts.IGNORE_BEAMS)
for a in opts.ANALYSES:
#a_up = a.upper()
## Print warning message and exit if not a valid analysis name
if not a in all_analyses:
logging.warning("'%s' is not a known Rivet analysis! Do you need to set RIVET_ANALYSIS_PATH or use the --pwd switch?\n" % a)
# TODO: lay out more neatly, or even try for a "did you mean XXXX?" heuristic?
logging.warning("There are %d currently available analyses:\n" % len(all_analyses) + ", ".join(all_analyses))
sys.exit(1)
logging.debug("Adding analysis '%s'" % a)
ah.addAnalysis(a)
## Read and process events
run = rivet.Run(ah)
if opts.CROSS_SECTION is not None:
logging.info("User-supplied cross-section = %e pb" % opts.CROSS_SECTION)
run.setCrossSection(opts.CROSS_SECTION)
if opts.LIST_USED_ANALYSES is not None:
run.setListAnalyses(opts.LIST_USED_ANALYSES)
## Print platform type
import platform
-logging.info("Rivet %s running on machine %s (%s)" % (rivet.version(), platform.node(), platform.machine()))
+starttime = datetime.datetime.now().replace(microsecond=0)
+logging.info("Rivet %s running on machine %s (%s) at %s" % \
+ (rivet.version(), platform.node(), platform.machine(), str(starttime)))
def min_nonnull(a, b):
"A version of min which considers None to always be greater than a real number"
rtn = min(a, b)
if rtn is not None:
return rtn
if a is not None:
return a
return b
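Note that `min_nonnull` relies on Python 2 ordering, where `None` compares less than any number, so `min()` may return `None` and the fallbacks then undo that. Under Python 3 that comparison raises `TypeError`; an equivalent, comparison-free rewrite would look like this (a sketch, not part of the script):

```python
def min_nonnull3(a, b):
    """Return the smaller of a and b, treating None as +infinity;
    returns None only if both arguments are None.
    Python-3-safe rewrite of the script's min_nonnull."""
    if a is None:
        return b
    if b is None:
        return a
    return min(a, b)

print(min_nonnull3(5, None))     # 5
print(min_nonnull3(None, 3))     # 3
print(min_nonnull3(2, 7))        # 2
print(min_nonnull3(None, None))  # None
```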
## Set up an event timeout handler
class TimeoutException(Exception):
pass
if opts.EVENT_TIMEOUT or opts.RUN_TIMEOUT:
def evttimeouthandler(signum, frame):
logging.warn("It has taken more than %d secs to get an event! Is the input event stream working?" %
min_nonnull(opts.EVENT_TIMEOUT, opts.RUN_TIMEOUT))
raise TimeoutException("Event timeout")
signal.signal(signal.SIGALRM, evttimeouthandler)
## Init run based on one event
hepmcfile = HEPMCFILES[0]
## Apply a file-level weight derived from the filename
hepmcfileweight = 1.0
if ":" in hepmcfile:
hepmcfile, hepmcfileweight = hepmcfile.rsplit(":", 1)
hepmcfileweight = float(hepmcfileweight)
try:
if opts.EVENT_TIMEOUT or opts.RUN_TIMEOUT:
signal.alarm(min_nonnull(opts.EVENT_TIMEOUT, opts.RUN_TIMEOUT))
init_ok = run.init(hepmcfile, hepmcfileweight)
signal.alarm(0)
if not init_ok:
logging.error("Failed to initialise using event file '%s'... exiting" % hepmcfile)
sys.exit(2)
except TimeoutException, te:
logging.error("Timeout in initialisation from event file '%s'... exiting" % hepmcfile)
sys.exit(3)
## Event loop
evtnum = 0
-starttime = time.time()
for fileidx, hepmcfile in enumerate(HEPMCFILES):
## Apply a file-level weight derived from the filename
hepmcfileweight = 1.0
if ":" in hepmcfile:
hepmcfile, hepmcfileweight = hepmcfile.rsplit(":", 1)
hepmcfileweight = float(hepmcfileweight)
## Open next HepMC file (NB. this doesn't apply to the first file: it was already used for the run init)
if fileidx > 0:
run.openFile(hepmcfile, hepmcfileweight)
if not run.readEvent():
logging.warning("Could not read events from '%s'" % hepmcfile)
continue
msg = "Reading events from '%s'" % hepmcfile
if hepmcfileweight != 1.0:
msg += " (file weight = %e)" % hepmcfileweight
logging.info(msg)
while opts.MAXEVTNUM is None or evtnum < opts.MAXEVTNUM:
evtnum += 1
logNEvt(evtnum, starttime, opts.MAXEVTNUM)
## Process this event
processed_ok = run.processEvent()
if not processed_ok:
logging.warn("Event processing failed for evt #%i!" % evtnum)
break
## Set flag to exit event loop if run timeout exceeded
if opts.RUN_TIMEOUT and (time.time() - starttime) > opts.RUN_TIMEOUT:
logging.warning("Run timeout of %d secs exceeded... exiting gracefully" % opts.RUN_TIMEOUT)
RECVD_KILL_SIGNAL = True
## Exit the loop if signalled
if RECVD_KILL_SIGNAL is not None:
break
## Read next event (with timeout handling if requested)
try:
if opts.EVENT_TIMEOUT:
signal.alarm(opts.EVENT_TIMEOUT)
read_ok = run.readEvent()
signal.alarm(0)
if not read_ok:
break
except TimeoutException, te:
logging.error("Timeout in reading event from '%s'... exiting" % hepmcfile)
sys.exit(3)
## Write a histo file snapshot if appropriate
if opts.HISTO_WRITE_INTERVAL is not None:
if evtnum % opts.HISTO_WRITE_INTERVAL == 0:
ah.writeData(opts.HISTOFILE)
-logging.info("Finished event loop")
-run.finalize()
+loopendtime = datetime.datetime.now().replace(microsecond=0)
+logging.info("Finished event loop at %s" % str(loopendtime))
+logging.info("Cross-section = %e pb" % ah.crossSection())
+print
## Finalize and write out data file
-print "Cross-section = %e pb" % ah.crossSection()
+run.finalize()
ah.writeData(opts.HISTOFILE)
+print
+endtime = datetime.datetime.now().replace(microsecond=0)
+logging.info("Rivet run completed at %s, time elapsed = %s" % (str(endtime), str(endtime-starttime)))
diff --git a/bin/rivet-cmphistos b/bin/rivet-cmphistos
--- a/bin/rivet-cmphistos
+++ b/bin/rivet-cmphistos
@@ -1,338 +1,347 @@
#! /usr/bin/env python
"""\
%prog - generate histogram comparison plots
USAGE:
- %prog [options] yodafile1[:'PlotOption1=Value':'PlotOption2=Value':...] [path/to/yodafile2 ...]
+ %prog [options] yodafile1[:'PlotOption1=Value':'PlotOption2=Value':...] [path/to/yodafile2 ...] [PLOT:Key1=Val1:...]
where the plot options are described in the make-plots manual in the HISTOGRAM
section.
"""
import rivet, yoda, sys, os
rivet.util.check_python_version()
rivet.util.set_process_name(os.path.basename(__file__))
class Plot(dict):
"A tiny Plot object to help writing out the head in the .dat file"
def __repr__(self):
return "# BEGIN PLOT\n" + "\n".join("%s=%s" % (k,v) for k,v in self.iteritems()) + "\n# END PLOT\n\n"
def sanitiseString(s):
#s = s.replace('_','\\_')
#s = s.replace('^','\\^{}')
#s = s.replace('$','\\$')
s = s.replace('#','\\#')
s = s.replace('%','\\%')
return s
def getCommandLineOptions():
"Parse command line options"
from optparse import OptionParser, OptionGroup
parser = OptionParser(usage=__doc__)
parser.add_option("--no-rivet-refs", dest="RIVETREFS", action="store_false",
default=True, help="don't use Rivet reference data files")
parser.add_option('-o', '--outdir', dest='OUTDIR',
default='.', help='write data files into this directory')
parser.add_option("--hier-out", action="store_true", dest="HIER_OUTPUT", default=False,
help="write output dat files into a directory hierarchy which matches the analysis paths")
parser.add_option('--plotinfodir', dest='PLOTINFODIRS', action='append',
default=['.'], help='directory which may contain plot header information (in addition '
'to standard Rivet search paths)')
stygroup = OptionGroup(parser, "Plot style")
# stygroup.add_option("--refid", dest="REF_ID",
# default="REF", help="ID of reference data set (file path for non-REF data)")
stygroup.add_option("--linear", action="store_true", dest="LINEAR",
default=False, help="plot with linear scale")
stygroup.add_option("--mc-errs", action="store_true", dest="MC_ERRS",
default=False, help="show vertical error bars on the MC lines")
stygroup.add_option("--no-ratio", action="store_false", dest="RATIO",
default=True, help="disable the ratio plot")
stygroup.add_option("--rel-ratio", action="store_true", dest="RATIO_DEVIATION",
default=False, help="show the ratio plots scaled to the ref error")
stygroup.add_option("--no-plottitle", action="store_true", dest="NOPLOTTITLE",
default=False, help="don't show the plot title on the plot "
"(useful when the plot description should only be given in a caption)")
stygroup.add_option("--style", dest="STYLE", default="default",
help="change plotting style: default|bw|talk")
stygroup.add_option("-c", "--config", dest="CONFIGFILES", action="append", default=["~/.make-plots"],
help="additional plot config file(s). Settings will be included in the output configuration.")
parser.add_option_group(stygroup)
# TODO: re-enable the pattern matching, and _maybe_ the variants on ref-only plotting if we don't have a plotting system via YODA soon
#
# selgroup = OptionGroup(parser, "Selective plotting")
# selgroup.add_option("--show-single", dest="SHOW_SINGLE", choices=("no", "ref", "mc", "all"),
# default="mc", help="control if a plot file is made if there is only one dataset to be plotted "
# "[default=%default]. If the value is 'no', single plots are always skipped, for 'ref' and 'mc', "
# "the plot will be written only if the single plot is a reference plot or an MC "
# "plot respectively, and 'all' will always create single plot files.\n The 'ref' and 'all' values "
# "should be used with great care, as they will also write out plot files for all reference "
# "histograms without MC traces: combined with the -R/--rivet-refs flag, this is a great way to "
# "write out several thousand irrelevant reference data histograms!")
# selgroup.add_option("--show-mc-only", "--all", action="store_true", dest="SHOW_IF_MC_ONLY",
# default=False, help="make a plot file even if there is only one dataset to be plotted and "
# "it is an MC one. Deprecated and will be removed: use --show-single instead, which overrides this.")
# # selgroup.add_option("-l", "--histogram-list", dest="HISTOGRAMLIST",
# # default=None, help="specify a file containing a list of histograms to plot, in the format "
# # "/ANALYSIS_ID/histoname, one per line, e.g. '/DELPHI_1996_S3430090/d01-x01-y01'.")
# selgroup.add_option("-m", "--match", action="append",
# help="Only write out histograms whose $path/$name string matches these regexes. The argument "
# "may also be a text file.",
# dest="PATHPATTERNS")
# selgroup.add_option("-M", "--unmatch", action="append",
# help="Exclude histograms whose $path/$name string matches these regexes",
# dest="PATHUNPATTERNS")
# parser.add_option_group(selgroup)
return parser
def getHistos(filelist):
"""Loop over all input files. Only use the first occurrence of any REF-histogram
and the first occurrence in each MC file for every MC-histogram."""
refhistos = {}
mchistos = {}
for infile in filelist:
mchistos.setdefault(infile, {})
analysisobjects = yoda.read(infile)
for path, ao in analysisobjects.iteritems():
if path.startswith('/REF/'):
if not refhistos.has_key(path):
refhistos[path] = ao
else:
if not mchistos[infile].has_key(path):
mchistos[infile][path] = ao
return refhistos, mchistos
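The "first occurrence wins" behaviour of `getHistos` reduces to a simple guarded dict insert. A simplified sketch, with the YODA objects replaced by placeholder values:

```python
def first_seen(pairs):
    """Keep only the first object seen at each path, as getHistos does
    for /REF/ histograms across files (simplified sketch)."""
    out = {}
    for path, ao in pairs:
        if path not in out:  # modern spelling of dict.has_key()
            out[path] = ao
    return out

hists = first_seen([("/REF/a", "fileA"), ("/REF/a", "fileB"), ("/REF/b", "fileB")])
print(hists)  # the second /REF/a is ignored
```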
def getRivetRefData(refhistos, anas=None):
"Find all Rivet reference data files"
rivet_data_dirs = rivet.getAnalysisRefPaths()
dirlist = []
for d in rivet_data_dirs:
if anas is None:
import glob
dirlist.append(glob.glob(os.path.join(d, '*.yoda')))
else:
dirlist.append([os.path.join(d, a+'.yoda') for a in anas])
for filelist in dirlist:
for infile in filelist:
analysisobjects = yoda.read(infile)
for path, ao in analysisobjects.iteritems():
if path.startswith('/REF/'):
if not refhistos.has_key(path):
refhistos[path] = ao
def parseArgs(args):
"""Look at the argument list and split it at colons, in order to separate
the file names from the plotting options. Store the file names and
file-specific plotting options."""
filelist = []
plotoptions = {}
for a in args:
asplit = a.split(':')
path = asplit[0]
filelist.append(path)
plotoptions[path] = []
has_title = False
for i in xrange(1, len(asplit)):
## Add 'Title' if there is no = sign before math mode
- if not '=' in asplit[i] or ('$' in asplit[i] and asplit[i].index('$') < asplit[i].index('=')):
+ if '=' not in asplit[i] or ('$' in asplit[i] and asplit[i].index('$') < asplit[i].index('=')):
asplit[i] = 'Title=%s' % asplit[i]
if asplit[i].startswith('Title='):
has_title = True
plotoptions[path].append(asplit[i])
- if not has_title:
+ if path != "PLOT" and not has_title:
plotoptions[path].append('Title=%s' % sanitiseString(os.path.basename( os.path.splitext(path)[0] )) )
return filelist, plotoptions
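The colon-splitting and implicit-`Title` logic of `parseArgs` can be exercised in isolation. A simplified sketch for a single argument (the default-`Title` fallback and the `PLOT` special case are omitted):

```python
def parse_arg(a):
    """Split one command-line argument at colons into (path, options),
    following the parseArgs logic above (simplified sketch)."""
    parts = a.split(':')
    path, options = parts[0], []
    for p in parts[1:]:
        # a segment with no '=', or with a '$' before the first '=',
        # is treated as a bare title
        if '=' not in p or ('$' in p and p.index('$') < p.index('=')):
            p = 'Title=%s' % p
        options.append(p)
    return path, options

print(parse_arg("run1.yoda:LineColor=red:My first run"))
```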
def setStyle(ao, style):
"""Set default plot styles (color and line width) colors borrowed from Google Ngrams"""
LINECOLORS = ['{[HTML]{EE3311}}', # red (Google uses 'DC3912')
'{[HTML]{3366FF}}', # blue
'{[HTML]{109618}}', # green
'{[HTML]{FF9900}}', # orange... weirdly this screws up if the F is lower-case!
'{[HTML]{990099}}'] # lilac
LINESTYLES = ['solid',
'dashed',
'dashdotted',
'dotted']
if opts.STYLE == 'talk':
ao.setAnnotation('LineWidth', '1pt')
if opts.STYLE == 'bw':
LINECOLORS = ['black!90',
'black!50',
'black!30']
c = style % len(LINECOLORS)
- s = style / len(LINECOLORS)
+ s = (style / len(LINECOLORS)) % len(LINESTYLES)
ao.setAnnotation('LineStyle', '%s' % LINESTYLES[s])
ao.setAnnotation('LineColor', '%s' % LINECOLORS[c])
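The index arithmetic in `setStyle` — and the point of the modulo fix in this hunk — is easier to see in isolation. Colors cycle fastest; without the added `% len(LINESTYLES)`, a high style index would overflow the style list. A sketch with placeholder names:

```python
LINECOLORS = ['red', 'blue', 'green', 'orange', 'lilac']  # 5 colors
LINESTYLES = ['solid', 'dashed', 'dashdotted', 'dotted']  # 4 styles

def style_for(idx):
    """Map a running curve index to (color, style): colors cycle fastest,
    and the style index wraps too (the fix applied above)."""
    c = idx % len(LINECOLORS)
    s = (idx // len(LINECOLORS)) % len(LINESTYLES)
    return LINECOLORS[c], LINESTYLES[s]

print(style_for(0))   # first curve
print(style_for(5))   # wraps to the first color, second style
print(style_for(20))  # without the modulo, 20 // 5 = 4 would overflow LINESTYLES
```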
def setOptions(ao, options):
"Set arbitrary annotations"
for opt in options:
key, val = opt.split('=', 1)
ao.setAnnotation(key, val)
def mkoutdir(outdir):
"Function to make output directories"
if not os.path.exists(outdir):
try:
os.makedirs(outdir)
except:
msg = "Can't make output directory '%s'" % outdir
raise Exception(msg)
if not os.access(outdir, os.W_OK):
msg = "Can't write to output directory '%s'" % outdir
raise Exception(msg)
def writeOutput(output, h):
"Choose output file name and dir"
hparts = h.strip("/").split("/")
if opts.HIER_OUTPUT:
ana = "_".join(hparts[:-1]) if len(hparts) > 1 else "ANALYSIS"
outdir = os.path.join(opts.OUTDIR, ana)
outfile = '%s.dat' % hparts[-1]
else:
outdir = opts.OUTDIR
outfile = '%s.dat' % "_".join(hparts)
mkoutdir(outdir)
outfilepath = os.path.join(outdir, outfile)
f = open(outfilepath, 'w')
f.write(output)
f.close()
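The path mangling done by `writeOutput` — flat names by default, a per-analysis directory with `--hier-out` — can be sketched without the file I/O:

```python
import os

def outpath(h, outdir=".", hier=False):
    """Map a histogram path like '/ANA/d01-x01-y01' to the .dat file
    path chosen by writeOutput above (sketch: no directories created)."""
    hparts = h.strip("/").split("/")
    if hier:
        ana = "_".join(hparts[:-1]) if len(hparts) > 1 else "ANALYSIS"
        return os.path.join(outdir, ana, '%s.dat' % hparts[-1])
    return os.path.join(outdir, '%s.dat' % "_".join(hparts))

print(outpath("/ATLAS_2011_S9128077/d01-x01-y01"))
print(outpath("/ATLAS_2011_S9128077/d01-x01-y01", hier=True))
```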
#--------------------------------------------------------------------------------------------
if __name__ == '__main__':
## Command line parsing
parser = getCommandLineOptions()
opts, args = parser.parse_args()
## Split the input file names and the associated plotting options
## given on the command line into two separate lists
filelist, plotoptions = parseArgs(args)
+ ## Remove the PLOT dummy file from the file list
+ if "PLOT" in filelist:
+ filelist.remove("PLOT")
## Read the .plot files
plotdirs = opts.PLOTINFODIRS + [os.path.abspath(os.path.dirname(f)) for f in filelist]
plotparser = rivet.mkStdPlotParser(plotdirs, opts.CONFIGFILES)
## Create a list of all histograms to be plotted
refhistos, mchistos = getHistos(filelist)
hpaths = []
for aos in mchistos.values():
for p in aos.keys():
if p and p not in hpaths:
hpaths.append(p)
## Read the reference data from the Rivet search paths and add them
## to the list of reference histograms
if opts.RIVETREFS:
getRivetRefData(refhistos)
## Now loop over all MC histograms and plot them
for h in hpaths:
#print 'Currently looking at', h
## A list of all analysis objects to be plotted
anaobjects = []
## Plot object for the PLOT section in the .dat file
plot = Plot()
plot['Legend'] = '1'
plot['LogY'] = '1'
for key, val in plotparser.getHeaders(h).iteritems():
plot[key] = val
+ if plotoptions.has_key("PLOT"):
+ for key_val in plotoptions["PLOT"]:
+ key, val = [s.strip() for s in key_val.split("=")]
+ plot[key] = val
if opts.LINEAR:
plot['LogY'] = '0'
if opts.NOPLOTTITLE:
plot['Title'] = ''
if opts.STYLE == 'talk':
plot['PlotSize'] = '8,6'
elif opts.STYLE == 'bw':
if opts.RATIO:
plot['RatioPlotErrorBandColor'] = 'black!10'
## DrawOnly is needed to keep the order in the Legend equal to the
## order of the files on the command line
drawonly = []
## Check if we have reference data for the histogram
ratioreference = None
if refhistos.has_key('/REF' + h):
refdata = refhistos['/REF' + h]
refdata.setAnnotation('ErrorBars', '1')
refdata.setAnnotation('PolyMarker', '*')
refdata.setAnnotation('ConnectBins', '0')
refdata.setAnnotation('Title', 'Data')
if opts.RATIO:
ratioreference = '/REF'+h
anaobjects.append(refdata)
drawonly.append('/REF' + h)
if opts.RATIO and opts.RATIO_DEVIATION:
plot['RatioPlotMode'] = 'deviation'
## Loop over the MC files to plot all instances of the histogram
styleidx = 0
for infile in filelist:
+ # if infile == "PLOT":
+ # continue ##< This isn't a real file!
if mchistos.has_key(infile) and mchistos[infile].has_key(h):
## Default linecolor, linestyle
setStyle(mchistos[infile][h], styleidx)
styleidx += 1
if opts.MC_ERRS:
mchistos[infile][h].setAnnotation('ErrorBars', '1')
## Plot defaults from .plot files
for key, val in plotparser.getHistogramOptions(h).iteritems():
mchistos[infile][h].setAnnotation(key, val)
## Command line plot options
setOptions(mchistos[infile][h], plotoptions[infile])
mchistos[infile][h].setAnnotation('Path', infile + h)
anaobjects.append(mchistos[infile][h])
drawonly.append(infile + h)
if opts.RATIO and ratioreference is None:
ratioreference = infile + h
plot['DrawOnly'] = ' '.join(drawonly).strip()
if opts.RATIO and len(drawonly) > 1:
plot['RatioPlot'] = '1'
plot['RatioPlotReference'] = ratioreference
## Now create the output. We can't use "yoda.writeFLAT(anaobjects, 'foobar.dat')" because
## the PLOT and SPECIAL blocks don't have a corresponding analysis object.
output = ''
output += str(plot)
## Special
special = plotparser.getSpecial(h)
if special:
output += "\n"
output += "# BEGIN SPECIAL %s\n" % h
output += special
output += "# END SPECIAL\n\n"
from cStringIO import StringIO
sio = StringIO()
yoda.writeFLAT(anaobjects, sio)
output += sio.getvalue()
## Write everything into a file
writeOutput(output, h)
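The new `PLOT` pseudo-file introduced in this patch lets global header keys be overridden from the command line: its parsed options are applied to the `Plot` dict after the `.plot`-file headers, so they take precedence. A minimal sketch of that flow (the `PLOT:LogY=0:Title=Comparison` argument is hypothetical; keys are sorted here only for stable output):

```python
class Plot(dict):
    "Tiny Plot object, as in rivet-cmphistos"
    def __repr__(self):
        body = "\n".join("%s=%s" % (k, v) for k, v in sorted(self.items()))
        return "# BEGIN PLOT\n" + body + "\n# END PLOT\n\n"

plot = Plot(Legend='1', LogY='1')  # defaults set in the main loop
# options as parsed from a hypothetical 'PLOT:LogY=0:Title=Comparison' argument
for key_val in ["LogY=0", "Title=Comparison"]:
    key, val = [s.strip() for s in key_val.split("=")]
    plot[key] = val  # command-line value overrides the default
print(repr(plot))
```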
diff --git a/bin/rivet-mkanalysis b/bin/rivet-mkanalysis
--- a/bin/rivet-mkanalysis
+++ b/bin/rivet-mkanalysis
@@ -1,322 +1,322 @@
#! /usr/bin/env python
"""\
%prog: make templates of analysis source files for Rivet
Usage: %prog [--help|-h] [--srcroot=<srcrootdir>] <analysisname>
Without the --srcroot flag, the analysis files will be created in the current
directory.
"""
import rivet, sys, os
rivet.util.check_python_version()
rivet.util.set_process_name(os.path.basename(__file__))
import logging
## Handle command line
from optparse import OptionParser
parser = OptionParser(usage=__doc__)
parser.add_option("--srcroot", metavar="DIR", dest="SRCROOT", default=None,
help="install the templates into the Rivet source tree (rooted " +
"at directory DIR) rather than just creating all in the current dir")
parser.add_option("-q", "--quiet", dest="LOGLEVEL", default=logging.INFO,
action="store_const", const=logging.WARNING, help="only write out warning and error messages")
parser.add_option("-v", "--verbose", dest="LOGLEVEL", default=logging.INFO,
action="store_const", const=logging.DEBUG, help="provide extra debugging messages")
parser.add_option("-i", "--inline-info", dest="INLINE", action="store_true",
default=False, help="Put analysis info into source file instead of separate data file.")
opts, args = parser.parse_args()
logging.basicConfig(format="%(msg)s", level=opts.LOGLEVEL)
ANANAMES = args
## Work out installation paths
ANAROOT = os.path.abspath(opts.SRCROOT or os.getcwd())
if not os.access(ANAROOT, os.W_OK):
logging.error("Can't write to source root directory %s" % ANAROOT)
sys.exit(1)
ANASRCDIR = os.getcwd()
ANAINFODIR = os.getcwd()
ANAPLOTDIR = os.getcwd()
if opts.SRCROOT:
ANASRCDIR = os.path.join(ANAROOT, "src/Analyses")
ANAINFODIR = os.path.join(ANAROOT, "data/anainfo")
ANAPLOTDIR = os.path.join(ANAROOT, "data/plotinfo")
if not (os.path.exists(ANASRCDIR) and os.path.exists(ANAINFODIR) and os.path.exists(ANAPLOTDIR)):
logging.error("Rivet analysis dirs do not exist under %s" % ANAROOT)
sys.exit(1)
if not (os.access(ANASRCDIR, os.W_OK) and os.access(ANAINFODIR, os.W_OK) and os.access(ANAPLOTDIR, os.W_OK)):
logging.error("Can't write to Rivet analysis dirs under %s" % ANAROOT)
sys.exit(1)
## Check for disallowed characters in analysis names
import string
allowedchars = string.letters + string.digits + "_"
all_ok = True
for ananame in ANANAMES:
for c in ananame:
if c not in allowedchars:
logging.error("Analysis name '%s' contains disallowed character '%s'!" % (ananame, c))
all_ok = False
break
if not all_ok:
logging.error("Exiting... please ensure that all analysis names are valid")
sys.exit(1)
## Now make each analysis
for ANANAME in ANANAMES:
logging.info("Writing templates for %s to %s" % (ANANAME, ANAROOT))
## Extract some metadata from the name if it matches the standard pattern
import re
re_stdana = re.compile(r"^(\w+)_(\d{4})_(I|S)(\d+)$")
match = re_stdana.match(ANANAME)
STDANA = False
ANAEXPT = "<Insert the experiment name>"
ANACOLLIDER = "<Insert the collider name>"
ANAYEAR = "<Insert year of publication>"
INSPIRE_SPIRES = None
ANAINSPIREID = "<Insert the Inspire ID>"
if match:
STDANA = True
ANAEXPT = match.group(1)
if ANAEXPT.upper() in ("ALICE", "ATLAS", "CMS", "LHCB"):
ANACOLLIDER = "LHC"
elif ANAEXPT.upper() in ("CDF", "D0"):
ANACOLLIDER = "Tevatron"
elif ANAEXPT.upper() == "BABAR":
ANACOLLIDER = "PEP-II"
elif ANAEXPT.upper() == "BELLE":
ANACOLLIDER = "KEKB"
ANAYEAR = match.group(2)
INSPIRE_SPIRES = match.group(3)
ANAINSPIREID = match.group(4)
if INSPIRE_SPIRES == "I":
ANAREFREPO = "Inspire"
else:
ANAREFREPO = "Spires"
KEYWORDS = {
"ANANAME" : ANANAME,
"ANAEXPT" : ANAEXPT,
"ANACOLLIDER" : ANACOLLIDER,
"ANAYEAR" : ANAYEAR,
"ANAREFREPO" : ANAREFREPO,
"ANAINSPIREID" : ANAINSPIREID
}
## Try to get bib info from SPIRES
ANABIBKEY = ""
ANABIBTEX = ""
bibkey, bibtex = None, None
if STDANA:
try:
logging.debug("Getting Inspire/SPIRES biblio data for '%s'" % ANANAME)
bibkey, bibtex = rivet.spiresbib.get_bibtex_from_repo(INSPIRE_SPIRES, ANAINSPIREID)
except Exception, e:
logging.error("Inspire/SPIRES oops: %s" % e)
if bibkey and bibtex:
ANABIBKEY = bibkey
ANABIBTEX = bibtex
KEYWORDS["ANABIBKEY"] = ANABIBKEY
KEYWORDS["ANABIBTEX"] = ANABIBTEX
## Try to download YODA data file from HepData
if STDANA:
try:
import urllib
hdurl = None
if INSPIRE_SPIRES == "I":
hdurl = "http://hepdata.cedar.ac.uk/view/ins%s/yoda" % ANAINSPIREID
elif INSPIRE_SPIRES == "S":
hdurl = "http://hepdata.cedar.ac.uk/view/irn%s/yoda" % ANAINSPIREID
if hdurl:
logging.debug("Getting data file from HepData at %s" % hdurl)
httpstream = urllib.urlopen(hdurl)
yodastr = httpstream.read()
if not yodastr or "<html>" in yodastr:
logging.warning("Problem encountered when getting data from HepData (%s). No reference data file written." % hdurl)
else:
f = open("%s.yoda" % ANANAME, "w")
f.write(yodastr)
f.close()
httpstream.close()
else:
logging.warning("Could not identify a URL for getting reference data from HepData. No reference data file written.")
except Exception, e:
logging.error("HepData oops: %s" % e)
if opts.INLINE:
KEYWORDS["ANAREFREPO_LOWER"] = KEYWORDS["ANAREFREPO"].lower()
INLINEMETHODS="""
public:
string experiment() const { return "%(ANAEXPT)s"; }
string year() const { return "%(ANAYEAR)s"; }
string %(ANAREFREPO_LOWER)sId() const { return "%(ANAINSPIREID)s"; }
string collider() const { return ""; }
string summary() const { return ""; }
string description() const { return ""; }
string runInfo() const { return ""; }
string bibKey() const { return "%(ANABIBKEY)s"; }
string bibTeX() const { return "%(ANABIBTEX)s"; }
string status() const { return "UNVALIDATED"; }
vector<string> authors() const { return vector<string>(); }
vector<string> references() const { return vector<string>(); }
vector<std::string> todos() const { return vector<string>(); }
""" % KEYWORDS
del KEYWORDS["ANAREFREPO_LOWER"]
else:
INLINEMETHODS=""
KEYWORDS["INLINEMETHODS"] = INLINEMETHODS
ANASRCFILE = os.path.join(ANASRCDIR, ANANAME+".cc")
logging.debug("Writing implementation template to %s" % ANASRCFILE)
f = open(ANASRCFILE, "w")
src = '''\
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FinalState.hh"
/// @todo Include more projections as required, e.g. ChargedFinalState, FastJets, ZFinder...
namespace Rivet {
- using namespace Cuts;
class %(ANANAME)s : public Analysis {
public:
/// Constructor
%(ANANAME)s()
: Analysis("%(ANANAME)s")
{ }
/// @name Analysis methods
//@{
/// Book histograms and initialise projections before the run
void init() {
/// @todo Initialise and register projections here
/// @todo Book histograms here, e.g.:
// _h_XXXX = bookProfile1D(1, 1, 1);
// _h_YYYY = bookHisto1D(2, 1, 1);
}
/// Perform the per-event analysis
void analyze(const Event& event) {
const double weight = event.weight();
/// @todo Do the event by event analysis here
}
/// Normalise histograms etc., after the run
void finalize() {
/// @todo Normalise, scale and otherwise manipulate histograms here
// scale(_h_YYYY, crossSection()/sumOfWeights()); // norm to cross section
// normalize(_h_YYYY); // normalize to unity
}
//@}
private:
// Data members like post-cuts event weight counters go here
/// @name Histograms
//@{
Profile1DPtr _h_XXXX;
Histo1DPtr _h_YYYY;
//@}
%(INLINEMETHODS)s
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(%(ANANAME)s);
+
}
''' % KEYWORDS
f.write(src)
f.close()
ANAPLOTFILE = os.path.join(ANAPLOTDIR, ANANAME+".plot")
logging.debug("Writing plot template to %s" % ANAPLOTFILE)
f = open(ANAPLOTFILE, "w")
src = '''\
# BEGIN PLOT /%(ANANAME)s/d01-x01-y01
#Title=[Uncomment and insert title for histogram d01-x01-y01 here]
#XLabel=[Uncomment and insert x-axis label for histogram d01-x01-y01 here]
#YLabel=[Uncomment and insert y-axis label for histogram d01-x01-y01 here]
# + any additional plot settings you might like, see make-plots documentation
# END PLOT
# ... add more histograms as you need them ...
''' % KEYWORDS
f.write(src)
f.close()
if opts.INLINE:
sys.exit(0)
ANAINFOFILE = os.path.join(ANAINFODIR, ANANAME+".info")
logging.debug("Writing info template to %s" % ANAINFOFILE)
f = open(ANAINFOFILE, "w")
src = """\
Name: %(ANANAME)s
Year: %(ANAYEAR)s
Summary: <Insert short %(ANANAME)s description>
Experiment: %(ANAEXPT)s
Collider: %(ANACOLLIDER)s
%(ANAREFREPO)sID: %(ANAINSPIREID)s
Status: UNVALIDATED
Authors:
- Your Name <your@email.address>
#References:
# - <Example: Phys.Lett.B639:151-158,2006, Erratum-ibid.B658:285-289,2008>
# - <Example: doi:10.1016/j.physletb.2006.04.048>
# - <Example: arXiv:hep-ex/0511054 (plus erratum)>
RunInfo:
<Insert event types (not gen-specific), energy, any kinematic
efficiency cut(s) that may be needed; essentially any details needed to set
up a generator to reproduce the data.>
NumEvents: 1000000
NeedCrossSection: no
#Beams: <Insert beam pair(s), e.g. [p-, p+] or [[p-, e-], [p-, e+]]>
#Energies: <Insert list of run energies or beam energy pairs in GeV,
# e.g. [1960] or [[8.0, 3.5]] or [630, 1800]. Order pairs to match "Beams">
#PtCuts: <Insert list of kinematic pT cuts in GeV, e.g. [0, 20]>
#NeedCrossSection: True
Description:
'<Insert a fairly long description, including what is measured
and if possible what it is useful for in terms of MC validation
and tuning. Use LaTeX for maths like $\pT > 50\;\GeV$.
Use single quotes around the block if required (see http://www.yaml.org)>'
BibKey: %(ANABIBKEY)s
BibTeX: '%(ANABIBTEX)s'
ToDo:
- Implement the analysis, test it, remove this ToDo, and mark as VALIDATED :-)
""" % KEYWORDS
f.write(src)
f.close()
logging.info("Use e.g. 'rivet-buildplugin Rivet%s.so %s.cc' to compile the plugin" % (ANANAME, ANANAME))
diff --git a/bin/rivet-mkhtml b/bin/rivet-mkhtml
--- a/bin/rivet-mkhtml
+++ b/bin/rivet-mkhtml
@@ -1,365 +1,367 @@
#! /usr/bin/env python
"""\
%prog [options] <yodafile1> [<yodafile2> <yodafile3>...]
Make web pages from histogram files written out by Rivet. You can specify
multiple Monte Carlo YODA files to be compared in the same syntax as for
rivet-cmphistos, i.e. including plotting options.
Reference data, analysis metadata, and plot style information should be found
automatically (if not, set the RIVET_ANALYSIS_PATH or similar variables
appropriately).
You can overwrite an existing output directory.
"""
import rivet, sys, os
rivet.util.check_python_version()
rivet.util.set_process_name(os.path.basename(__file__))
import glob, shutil
from subprocess import Popen,PIPE
from optparse import OptionParser
parser = OptionParser(usage=__doc__)
parser.add_option("-o", "--outputdir", dest="OUTPUTDIR",
default="./plots", help="directory for webpage output")
parser.add_option("-t", "--title", dest="TITLE",
default="Plots from Rivet analyses", help="title to be displayed on the main web page")
parser.add_option("-c", "--config", dest="CONFIGFILES", action="append", default=["~/.make-plots"],
help="plot config file(s) to be used with rivet-cmphistos.")
parser.add_option("-s", "--single", dest="SINGLE", action="store_true",
default=False, help="display plots on single webpage.")
parser.add_option("--no-ratio", dest="SHOW_RATIO", action="store_false",
default=True, help="don't draw a ratio plot under each main plot.")
parser.add_option("--mc-errs", dest="MC_ERRS", action="store_true",
default=False, help="plot error bars.")
parser.add_option("--refid", dest="REF_ID",
default=None, help="ID of reference data set (file path for non-REF data)")
parser.add_option("-n", "--num-threads", metavar="NUMTHREADS", dest="NUMTHREADS", type=int,
default=None, help="request make-plots to use a specific number of threads.")
parser.add_option("--pdf", dest="VECTORFORMAT", action="store_const", const="PDF",
default="PDF", help="use PDF as the vector plot format.")
parser.add_option("--ps", dest="VECTORFORMAT", action="store_const", const="PS",
default="PDF", help="use PostScript as the vector plot format.")
parser.add_option("--booklet", dest="BOOKLET", action="store_true",
default=False, help="create booklet (currently only available for PDF with pdftk).")
parser.add_option("-i", "--ignore-unvalidated", dest="IGNORE_UNVALIDATED", action="store_true",
default=False, help="ignore unvalidated analyses.")
parser.add_option("--ignore-missing", dest="IGNORE_MISSING", action="store_true",
default=False, help="ignore missing YODA files.")
parser.add_option("-m", "--match", action="append", dest="PATHPATTERNS",
help="only write out histograms from analyses whose name matches any of these regexes")
parser.add_option("-M", "--unmatch", action="append", dest="PATHUNPATTERNS",
help="Exclude histograms whose $path/$name string matches these regexes")
parser.add_option("--palatino", dest="OUTPUT_FONT", action="store_const", const="PALATINO", default="PALATINO",
help="Use Palatino as font (default).")
parser.add_option("--cm", dest="OUTPUT_FONT", action="store_const", const="CM", default="PALATINO",
help="Use Computer Modern as font.")
parser.add_option("--times", dest="OUTPUT_FONT", action="store_const", const="TIMES", default="PALATINO",
help="Use Times as font.")
parser.add_option("--minion", dest="OUTPUT_FONT", action="store_const", const="MINION", default="PALATINO",
help="Use Adobe Minion Pro as font. Note: You need to set TEXMFHOME first.")
parser.add_option("-v", "--verbose", help="Add extra debug messages", dest="VERBOSE",
action="store_true", default=False)
opts, yodafiles = parser.parse_args()
## Check that there are some arguments!
if not yodafiles:
print "Error: You need to specify some YODA files to be plotted!"
sys.exit(1)
## Make output directory
if os.path.exists(opts.OUTPUTDIR) and not os.path.realpath(opts.OUTPUTDIR) == os.getcwd():
shutil.rmtree(opts.OUTPUTDIR)
try:
os.makedirs(opts.OUTPUTDIR)
except OSError:
print "Error: failed to make new directory '%s'" % opts.OUTPUTDIR
sys.exit(1)
## Get set of analyses involved in the runs
analyses = set()
blocked_analyses = set()
import yoda
for yodafile in yodafiles:
yodafilepath = os.path.abspath(yodafile.split(":")[0])
if not os.access(yodafilepath, os.R_OK):
print "Error: cannot read from %s" % yodafilepath
if opts.IGNORE_MISSING:
continue
else:
sys.exit(2)
analysisobjects = yoda.read(yodafilepath)
+
for path, ao in analysisobjects.iteritems():
## If regexes have been provided, only add analyses which match and don't unmatch
pathparts = path.strip('/').split('/')
if pathparts[0] == "REF":
del pathparts[0]
# TODO: What if there are more than two parts (other than REF)?
analysis = "_".join(pathparts[:-1]) if len(pathparts) > 1 else "ANALYSIS"
if analysis in analyses.union(blocked_analyses):
continue
+ import re
+ matched = True
if opts.PATHPATTERNS:
- import re
matched = False
for patt in opts.PATHPATTERNS:
if re.search(patt, analysis) is not None:
matched = True
break
- if matched and opts.PATHUNPATTERNS:
- for patt in opts.PATHUNPATTERNS:
- if re.search(patt, analysis):
- matched = False
- break
- if not matched:
- blocked_analyses.add(analysis)
- continue
- analyses.add(analysis)
+ if matched and opts.PATHUNPATTERNS:
+ for patt in opts.PATHUNPATTERNS:
+ if re.search(patt, analysis) is not None:
+ matched = False
+ break
+ if matched:
+ analyses.add(analysis)
+ else:
+ blocked_analyses.add(analysis)
## Sort analyses: group ascending by analysis name (could specialise grouping by collider), then
## descending by year, and finally descending by bibliographic archive ID code (INSPIRE first).
def anasort(name):
rtn = (1, name)
if name.startswith("MC"):
rtn = (99999999, name)
else:
stdparts = name.split("_")
try:
year = int(stdparts[1])
rtn = (0, stdparts[0], -year, 0)
idcode = (0 if stdparts[2][0] == "I" else 1e10) - int(stdparts[2][1:])
rtn = (0, stdparts[0], -year, idcode)
if len(stdparts) > 3:
rtn += tuple(stdparts[3:])
except:
pass
return rtn
analyses = sorted(analyses, key=anasort)
## Uncomment to test analysis ordering on index page
# print analyses
# sys.exit(0)
## Run rivet-cmphistos to get plain .dat files from .yoda
## We do this here since it also makes the necessary directories
ch_cmd = ["rivet-cmphistos"]
if opts.MC_ERRS:
ch_cmd.append("--mc-errs")
if not opts.SHOW_RATIO:
ch_cmd.append("--no-ratio")
if opts.REF_ID is not None:
ch_cmd.append("--refid=%s" % os.path.abspath(opts.REF_ID))
ch_cmd.append("--hier-out")
# TODO: Need to be able to override this: provide a --plotinfodir cmd line option?
ch_cmd.append("--plotinfodir=%s" % os.path.abspath("../"))
for af in yodafiles:
yodafilepath = os.path.abspath(af.split(":")[0])
if not os.access(yodafilepath, os.R_OK):
continue
newarg = yodafilepath
if ":" in af:
for opt in af.split(":")[1:]:
newarg += ":%s" % opt
# print newarg
ch_cmd.append(newarg)
for configfile in opts.CONFIGFILES:
configfile = os.path.abspath(os.path.expanduser(configfile))
if os.access(configfile, os.R_OK):
ch_cmd += ["-c", configfile]
# TODO: Pass rivet-mkhtml -m and -M args to rivet-cmphistos
if opts.VERBOSE:
# ch_cmd.append("--verbose")
print "Calling rivet-cmphistos with the following command:"
print " ".join(ch_cmd)
## Run rivet-cmphistos in a subdir, after fixing any relative paths in Rivet env vars
for var in ("RIVET_ANALYSIS_PATH", "RIVET_REF_PATH", "RIVET_INFO_PATH", "RIVET_PLOT_PATH"):
if var in os.environ:
abspaths = map(os.path.abspath, os.environ[var].split(":"))
os.environ[var] = ":".join(abspaths)
subproc = Popen(ch_cmd, cwd=opts.OUTPUTDIR, stdout=PIPE, stderr=PIPE)
out, err = subproc.communicate()
retcode = subproc.returncode
if (opts.VERBOSE or retcode != 0) and out:
print 'Output from rivet-cmphistos\n', out
if err:
print 'Errors from rivet-cmphistos\n', err
if retcode != 0:
print 'Crash in rivet-cmphistos, code =', retcode, '- exiting'
sys.exit(retcode)
## Write web page containing all (matched) plots
## Make web pages first so that we can load it locally in
## a browser to view the output before all plots are made
style = '''<style>
html { font-family: sans-serif; }
img { border: 0; }
a { text-decoration: none; font-weight: bold; }
</style>
'''
## Include MathJax configuration
script = '''\
<script type="text/x-mathjax-config">
MathJax.Hub.Config({
tex2jax: {inlineMath: [["$","$"]]}
});
</script>
<script type="text/javascript"
src="http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML">
</script>
'''
## A helper function for metadata LaTeX -> HTML conversion
from rivet.util import htmlify
## A timestamp HTML fragment to be used on each page:
import datetime
timestamp = '<p>Generated at %s</p>\n' % datetime.datetime.now().strftime("%A, %d. %B %Y %I:%M%p")
index = open(os.path.join(opts.OUTPUTDIR, "index.html"), "w")
index.write('<html>\n<head>\n<title>%s</title>\n%s</head>\n<body>' % (opts.TITLE, style + script))
if opts.BOOKLET and opts.VECTORFORMAT == "PDF":
index.write('<h2><a href="booklet.pdf">%s</a></h2>\n\n' % opts.TITLE)
else:
index.write('<h2>%s</h2>\n\n' % opts.TITLE)
if opts.SINGLE:
## Write table of contents
index.write('<ul>\n')
for analysis in analyses:
summary = analysis
ana = rivet.AnalysisLoader.getAnalysis(analysis)
if ana:
summary = "%s (%s)" % (ana.summary(), analysis)
if opts.IGNORE_UNVALIDATED and ana.status().upper() != "VALIDATED":
continue
index.write('<li><a href="#%s">%s</a></li>\n' % (analysis, htmlify(summary)) )
index.write('</ul>\n')
for analysis in analyses:
references = []
summary = analysis
description = "NONE"
if analysis.find("_S") > 0:
spiresid = analysis[analysis.rfind('_S')+2:len(analysis)]
inspireid = "NONE"
elif analysis.find("_I") > 0:
inspireid = analysis[analysis.rfind('_I')+2:len(analysis)]
else:
inspireid = "NONE"
spiresid = "NONE"
ana = rivet.AnalysisLoader.getAnalysis(analysis)
if ana:
if ana.summary() and ana.summary() != "NONE":
summary = "%s (%s)" % (ana.summary(), analysis)
references = ana.references()
description = ana.description()
spiresid = ana.spiresId()
if opts.IGNORE_UNVALIDATED and ana.status().upper() != "VALIDATED":
continue
if opts.SINGLE:
index.write('\n<h3 style="clear:left; padding-top:2em;"><a name="%s">%s</a></h3>\n' % (analysis, htmlify(summary)) )
else:
index.write('\n<h3><a href="%s/index.html" style="text-decoration:none;">%s</a></h3>\n' % (analysis, htmlify(summary)))
reflist = []
if inspireid and inspireid !="NONE":
reflist.append('<a href="http://inspirehep.net/record/%s">Inspire</a>' % inspireid)
elif spiresid and spiresid != "NONE":
reflist.append('<a href="http://durpdg.dur.ac.uk/cgi-bin/spiface/hep/www?irn+%s">Spires</a>' % spiresid)
reflist += references
index.write('<p>%s</p>\n' % " &#124; ".join(reflist))
index.write('<p style="font-size:smaller;">%s</p>\n' % htmlify(description))
anapath = os.path.join(opts.OUTPUTDIR, analysis)
if not opts.SINGLE:
if not os.path.exists(anapath):
os.makedirs(anapath)
anaindex = open(os.path.join(anapath, "index.html"), 'w')
anaindex.write('<html>\n<head>\n<title>%s - %s</title>\n%s</head>\n<body>\n' %
(opts.OUTPUTDIR, analysis, style + script))
anaindex.write('<h3>%s</h3>\n' % analysis)
anaindex.write('<p><a href="../index.html">Back to index</a></p>\n')
anaindex.write('<p>\n %s\n</p>\n' % htmlify(description))
else:
anaindex = index
datfiles = glob.glob("%s/*.dat" % anapath)
anaindex.write('<div style="float:none; overflow:auto; width:100%">\n')
for datfile in sorted(datfiles):
obsname = os.path.basename(datfile).replace(".dat", "")
pngfile = obsname+".png"
vecfile = obsname+"."+opts.VECTORFORMAT.lower()
srcfile = obsname+".dat"
if opts.SINGLE:
pngfile = os.path.join(analysis, pngfile)
vecfile = os.path.join(analysis, vecfile)
srcfile = os.path.join(analysis, srcfile)
anaindex.write(' <div style="float:left; font-size:smaller; font-weight:bold;">\n')
anaindex.write(' <a href="#%s-%s">&#9875;</a><a href="%s">&#8984;</a> %s:<br/>\n' %
(analysis, obsname, srcfile, os.path.splitext(vecfile)[0]) )
anaindex.write(' <a name="%s-%s"><a href="%s">\n' % (analysis, obsname, vecfile) )
anaindex.write(' <img src="%s">\n' % pngfile )
anaindex.write(' </a></a>\n')
anaindex.write(' </div>\n')
anaindex.write('</div>\n')
if not opts.SINGLE:
anaindex.write('<div style="float:none">%s</div>\n</body>\n</html>\n' % timestamp)
anaindex.close()
index.write('<br>%s</body>\n</html>' % timestamp)
index.close()
## Run make-plots on all generated .dat files
# sys.exit(0)
mp_cmd = ["make-plots"]
if opts.NUMTHREADS:
mp_cmd.append("--num-threads=%d" % opts.NUMTHREADS)
if opts.VECTORFORMAT == "PDF":
mp_cmd.append("--pdfpng")
elif opts.VECTORFORMAT == "PS":
mp_cmd.append("--pspng")
if opts.OUTPUT_FONT == "CM":
mp_cmd.append("--cm")
elif opts.OUTPUT_FONT == "TIMES":
mp_cmd.append("--times")
elif opts.OUTPUT_FONT == "MINION":
mp_cmd.append("--minion")
datfiles = []
for analysis in analyses:
anapath = os.path.join(opts.OUTPUTDIR, analysis)
#print anapath
anadatfiles = glob.glob("%s/*.dat" % anapath)
datfiles += sorted(anadatfiles)
if datfiles:
mp_cmd += datfiles
if opts.VERBOSE:
mp_cmd.append("--verbose")
print "Calling make-plots with the following options:"
print " ".join(mp_cmd)
Popen(mp_cmd).wait()
if opts.BOOKLET and opts.VECTORFORMAT=="PDF":
bookletcmd = ["pdftk"]
for analysis in analyses:
anapath = os.path.join(opts.OUTPUTDIR, analysis)
bookletcmd += sorted(glob.glob("%s/*.pdf" % anapath))
bookletcmd += ["cat", "output", "%s/booklet.pdf" % opts.OUTPUTDIR]
print bookletcmd
Popen(bookletcmd).wait()
diff --git a/data/anainfo/ATLAS_2012_I1091481.info b/data/anainfo/ATLAS_2012_I1091481.info
--- a/data/anainfo/ATLAS_2012_I1091481.info
+++ b/data/anainfo/ATLAS_2012_I1091481.info
@@ -1,45 +1,43 @@
Name: ATLAS_2012_I1091481
Year: 2012
Summary: Azimuthal ordering of charged hadrons
Experiment: ATLAS
Collider: LHC
InspireID: 1091481
-Status: UNVALIDATED
+Status: VALIDATED
Authors:
- Sharka Todorova <sarka.todorova@cern.ch>
- - Holger Schulz <hschulz@physik.hu-berlin.de>
+ - Holger Schulz <holger.schulz@durham.ac.uk>
+ - Christian Guetschow <chris.g@cern.ch>
References:
- CERN-PH-EP-2011-197
- arXiv:1203.0419 [hep-ex]
RunInfo:
QCD events with diffractives switched on.
NumEvents: 1000000
NeedCrossSection: no
Beams: [p+, p+]
Energies: [900, 7000]
Description:
'Measurement of the ordering of charged hadrons in the azimuthal angle relative
to the beam axis at the Large Hadron Collider (LHC). A spectral analysis of
correlations between longitudinal and transverse components of the momentum of
the charged hadrons is performed. Data were recorded with the ATLAS detector at
centre-of-mass energies of $\sqrt{s} = 900\;\GeV$ and $\sqrt{s} = 7\;\TeV$.
The correlations measured in a phase space region dominated by low-$\pT$ particles
are not well described by conventional models of hadron production. The measured
spectra show features consistent with the fragmentation of a QCD string
represented by a helix-like ordered gluon chain.'
BibKey: :2012fa
BibTeX: '@article{:2012fa,
author = "Aad, Georges and others",
title = "{Measurement of the azimuthal ordering of charged hadrons
with the ATLAS detector}",
collaboration = "ATLAS Collaboration",
year = "2012",
eprint = "1203.0419",
archivePrefix = "arXiv",
primaryClass = "hep-ex",
reportNumber = "CERN-PH-EP-2011-197",
SLACcitation = "%%CITATION = ARXIV:1203.0419;%%",
}'
-ToDo:
- - Check normalisation(s)
- - Check the -1 in the fill expression
diff --git a/data/anainfo/ATLAS_2013_I1219109.info b/data/anainfo/ATLAS_2013_I1219109.info
new file mode 100644
--- /dev/null
+++ b/data/anainfo/ATLAS_2013_I1219109.info
@@ -0,0 +1,46 @@
+Name: ATLAS_2013_I1219109
+Year: 2013
+Summary: W + b production at 7 TeV
+Experiment: ATLAS
+Collider: LHC
+InspireID: 1219109
+Status: VALIDATED
+Authors:
+ - Christian Gutschow <chris.g@cern.ch>
+References:
+ - JHEP 1306 (2013) 084
+ - doi:10.1007/JHEP06(2013)084
+ - arXiv:1302.2929 [hep-ex]
+RunInfo:
+ W+b in the electron channel
+NumEvents: 1000000
+Beams: [p+, p+]
+Energies: [7000]
+PtCuts: [25,25]
+NeedCrossSection: True
+Description:
+ Measurements of the W+b-jets ($W+b+X$ and $W+b\bar{b}+X$) production cross-section in proton-proton collisions at a
+ centre-of-mass energy of 7 TeV at the LHC. These results are based on data corresponding to an integrated luminosity of
+ 4.6~$fb^{-1}$, collected with the ATLAS detector. Cross-sections are presented as a function of jet multiplicity and of the
+ transverse momentum of the leading b-jet for both the combined muon and electron decay modes of the W boson.
+ The default routine will consider the electron decay channel of the W boson. Use ATLAS_2013_I1219109_EL and
+ ATLAS_2013_I1219109_MU to specify the decay channel directly.
+BibKey: Aad:2013vka
+BibTeX: '@article{Aad:2013vka,
+ author = "Aad, Georges and others",
+ title = "{Measurement of the cross-section for W boson production
+ in association with b-jets in pp collisions at $\sqrt{s}$
+ = 7 TeV with the ATLAS detector}",
+ collaboration = "ATLAS Collaboration",
+ journal = "JHEP",
+ volume = "1306",
+ pages = "084",
+ doi = "10.1007/JHEP06(2013)084",
+ year = "2013",
+ eprint = "1302.2929",
+ archivePrefix = "arXiv",
+ primaryClass = "hep-ex",
+ reportNumber = "CERN-PH-EP-2012-357",
+ SLACcitation = "%%CITATION = ARXIV:1302.2929;%%",
+}'
+
diff --git a/data/anainfo/ATLAS_2013_I1219109_EL.info b/data/anainfo/ATLAS_2013_I1219109_EL.info
new file mode 100644
--- /dev/null
+++ b/data/anainfo/ATLAS_2013_I1219109_EL.info
@@ -0,0 +1,46 @@
+Name: ATLAS_2013_I1219109_EL
+Year: 2013
+Summary: W + b production at 7 TeV
+Experiment: ATLAS
+Collider: LHC
+InspireID: 1219109
+Status: VALIDATED
+Authors:
+ - Christian Gutschow <chris.g@cern.ch>
+References:
+ - JHEP 1306 (2013) 084
+ - doi:10.1007/JHEP06(2013)084
+ - arXiv:1302.2929 [hep-ex]
+RunInfo:
+ W+b in the electron channel
+NumEvents: 1000000
+Beams: [p+, p+]
+Energies: [7000]
+PtCuts: [25,25]
+NeedCrossSection: True
+Description:
+ Measurements of the W+b-jets ($W+b+X$ and $W+b\bar{b}+X$) production cross-section in proton-proton collisions at a
+ centre-of-mass energy of 7 TeV at the LHC. These results are based on data corresponding to an integrated luminosity of
+ 4.6~$fb^{-1}$, collected with the ATLAS detector. Cross-sections are presented as a function of jet multiplicity and of the
+ transverse momentum of the leading b-jet for both the combined muon and electron decay modes of the W boson.
+ The default routine will consider the electron decay channel of the W boson. Use ATLAS_2013_I1219109_EL and
+ ATLAS_2013_I1219109_MU to specify the decay channel directly.
+BibKey: Aad:2013vka
+BibTeX: '@article{Aad:2013vka,
+ author = "Aad, Georges and others",
+ title = "{Measurement of the cross-section for W boson production
+ in association with b-jets in pp collisions at $\sqrt{s}$
+ = 7 TeV with the ATLAS detector}",
+ collaboration = "ATLAS Collaboration",
+ journal = "JHEP",
+ volume = "1306",
+ pages = "084",
+ doi = "10.1007/JHEP06(2013)084",
+ year = "2013",
+ eprint = "1302.2929",
+ archivePrefix = "arXiv",
+ primaryClass = "hep-ex",
+ reportNumber = "CERN-PH-EP-2012-357",
+ SLACcitation = "%%CITATION = ARXIV:1302.2929;%%",
+}'
+
diff --git a/data/anainfo/ATLAS_2013_I1219109_MU.info b/data/anainfo/ATLAS_2013_I1219109_MU.info
new file mode 100644
--- /dev/null
+++ b/data/anainfo/ATLAS_2013_I1219109_MU.info
@@ -0,0 +1,46 @@
+Name: ATLAS_2013_I1219109_MU
+Year: 2013
+Summary: W + b production at 7 TeV
+Experiment: ATLAS
+Collider: LHC
+InspireID: 1219109
+Status: VALIDATED
+Authors:
+ - Christian Gutschow <chris.g@cern.ch>
+References:
+ - JHEP 1306 (2013) 084
+ - doi:10.1007/JHEP06(2013)084
+ - arXiv:1302.2929 [hep-ex]
+RunInfo:
+ W+b in the muon channel
+NumEvents: 1000000
+Beams: [p+, p+]
+Energies: [7000]
+PtCuts: [25,25]
+NeedCrossSection: True
+Description:
+ Measurements of the W+b-jets ($W+b+X$ and $W+b\bar{b}+X$) production cross-section in proton-proton collisions at a
+ centre-of-mass energy of 7 TeV at the LHC. These results are based on data corresponding to an integrated luminosity of
+ 4.6~$fb^{-1}$, collected with the ATLAS detector. Cross-sections are presented as a function of jet multiplicity and of the
+ transverse momentum of the leading b-jet for both the combined muon and electron decay modes of the W boson.
+ The default routine will consider the electron decay channel of the W boson. Use ATLAS_2013_I1219109_EL and
+ ATLAS_2013_I1219109_MU to specify the decay channel directly.
+BibKey: Aad:2013vka
+BibTeX: '@article{Aad:2013vka,
+ author = "Aad, Georges and others",
+ title = "{Measurement of the cross-section for W boson production
+ in association with b-jets in pp collisions at $\sqrt{s}$
+ = 7 TeV with the ATLAS detector}",
+ collaboration = "ATLAS Collaboration",
+ journal = "JHEP",
+ volume = "1306",
+ pages = "084",
+ doi = "10.1007/JHEP06(2013)084",
+ year = "2013",
+ eprint = "1302.2929",
+ archivePrefix = "arXiv",
+ primaryClass = "hep-ex",
+ reportNumber = "CERN-PH-EP-2012-357",
+ SLACcitation = "%%CITATION = ARXIV:1302.2929;%%",
+}'
+
diff --git a/data/anainfo/ATLAS_2014_I1306294.info b/data/anainfo/ATLAS_2014_I1306294.info
new file mode 100644
--- /dev/null
+++ b/data/anainfo/ATLAS_2014_I1306294.info
@@ -0,0 +1,52 @@
+Name: ATLAS_2014_I1306294
+Year: 2014
+Summary: Measurement of Z boson in association with b-jets at 7 TeV in ATLAS (electron channel)
+Experiment: ATLAS
+Collider: LHC
+InspireID: 1306294
+Status: VALIDATED
+Authors:
+ - Gavin Hesketh <gavin.hesketh@ucl.ac.uk>
+References:
+ - arXiv:1407.3643 [hep-ex]
+ - JHEP 1410 (2014) 141
+RunInfo: Z+b(b) production in $pp$ collisions at $\sqrt{s} = 7$ TeV, electronic Z-decays
+NumEvents: 1000000
+Beams: [p+, p+]
+Energies: [7000]
+PtCuts: pT(jets) > 20 GeV, pT(leptons) > 20 GeV
+NeedsCrossSection: True
+Description:
+ 'Measurements of differential production cross-sections of a $Z$
+ boson in association with $b$-jets in $pp$ collisions at $\sqrt{s}=7$
+ TeV are reported. The data analysed correspond to an integrated
+ luminosity of 4.6~fb$^{-1}$ recorded with the ATLAS detector at the
+ Large Hadron Collider. Particle-level cross-sections are determined
+ for events with a $Z$ boson decaying into an electron or muon pair,
+ and containing $b$-jets. For events with at least one $b$-jet, the
+ cross-section is presented as a function of the $Z$ boson transverse
+ momentum and rapidity, together with the inclusive $b$-jet
+ cross-section as a function of $b$-jet transverse momentum, rapidity
+ and angular separations between the $b$-jet and the $Z$ boson. For
+ events with at least two $b$-jets, the cross-section is determined as
+ a function of the invariant mass and angular separation of the two
+ highest transverse momentum $b$-jets, and as a function of the $Z$
+ boson transverse momentum and rapidity. Results are compared to
+ leading-order and next-to-leading-order perturbative QCD
+ calculations.
+ This Rivet module implements the event selection for Z decaying into electrons.
+ If you want to use muonic events, please refer to ATLAS\_2014\_I1306294\_MU'
+BibKey: Aad:2014dvb
+BibTeX: '@article{Aad:2014dvb,
+ author = "Aad, Georges and others",
+ title = "{Measurement of differential production cross-sections
+ for a $Z$ boson in association with $b$-jets in 7 TeV
+ proton-proton collisions with the ATLAS detector}",
+ collaboration = "ATLAS Collaboration",
+ year = "2014",
+ eprint = "1407.3643",
+ archivePrefix = "arXiv",
+ primaryClass = "hep-ex",
+ reportNumber = "CERN-PH-EP-2014-118",
+ SLACcitation = "%%CITATION = ARXIV:1407.3643;%%",
+}'
diff --git a/data/anainfo/ATLAS_2014_I1306294_EL.info b/data/anainfo/ATLAS_2014_I1306294_EL.info
new file mode 100644
--- /dev/null
+++ b/data/anainfo/ATLAS_2014_I1306294_EL.info
@@ -0,0 +1,52 @@
+Name: ATLAS_2014_I1306294_EL
+Year: 2014
+Summary: Measurement of Z boson in association with b-jets at 7 TeV in ATLAS (electron channel)
+Experiment: ATLAS
+Collider: LHC
+InspireID: 1306294
+Status: VALIDATED
+Authors:
+ - Gavin Hesketh <gavin.hesketh@ucl.ac.uk>
+References:
+ - arXiv:1407.3643 [hep-ex]
+ - JHEP 1410 (2014) 141
+RunInfo: Z+b(b) production in $pp$ collisions at $\sqrt{s} = 7$ TeV, electronic Z-decays
+NumEvents: 1000000
+Beams: [p+, p+]
+Energies: [7000]
+PtCuts: pT(jets) > 20 GeV, pT(leptons) > 20 GeV
+NeedsCrossSection: True
+Description:
+ 'Measurements of differential production cross-sections of a $Z$
+ boson in association with $b$-jets in $pp$ collisions at $\sqrt{s}=7$
+ TeV are reported. The data analysed correspond to an integrated
+ luminosity of 4.6~fb$^{-1}$ recorded with the ATLAS detector at the
+ Large Hadron Collider. Particle-level cross-sections are determined
+ for events with a $Z$ boson decaying into an electron or muon pair,
+ and containing $b$-jets. For events with at least one $b$-jet, the
+ cross-section is presented as a function of the $Z$ boson transverse
+ momentum and rapidity, together with the inclusive $b$-jet
+ cross-section as a function of $b$-jet transverse momentum, rapidity
+ and angular separations between the $b$-jet and the $Z$ boson. For
+ events with at least two $b$-jets, the cross-section is determined as
+ a function of the invariant mass and angular separation of the two
+ highest transverse momentum $b$-jets, and as a function of the $Z$
+ boson transverse momentum and rapidity. Results are compared to
+ leading-order and next-to-leading-order perturbative QCD
+ calculations.
+ This Rivet module implements the event selection for Z decaying into electrons.
+ If you want to use muonic events, please refer to ATLAS\_2014\_I1306294\_MU'
+BibKey: Aad:2014dvb
+BibTeX: '@article{Aad:2014dvb,
+ author = "Aad, Georges and others",
+ title = "{Measurement of differential production cross-sections
+ for a $Z$ boson in association with $b$-jets in 7 TeV
+ proton-proton collisions with the ATLAS detector}",
+ collaboration = "ATLAS Collaboration",
+ year = "2014",
+ eprint = "1407.3643",
+ archivePrefix = "arXiv",
+ primaryClass = "hep-ex",
+ reportNumber = "CERN-PH-EP-2014-118",
+ SLACcitation = "%%CITATION = ARXIV:1407.3643;%%",
+}'
diff --git a/data/anainfo/ATLAS_2014_I1306294_MU.info b/data/anainfo/ATLAS_2014_I1306294_MU.info
new file mode 100644
--- /dev/null
+++ b/data/anainfo/ATLAS_2014_I1306294_MU.info
@@ -0,0 +1,52 @@
+Name: ATLAS_2014_I1306294_MU
+Year: 2014
+Summary: Measurement of Z boson in association with b-jets at 7 TeV in ATLAS (muon channel)
+Experiment: ATLAS
+Collider: LHC
+InspireID: 1306294
+Status: VALIDATED
+Authors:
+ - Gavin Hesketh <gavin.hesketh@ucl.ac.uk>
+References:
+ - arXiv:1407.3643 [hep-ex]
+ - JHEP 1410 (2014) 141
+RunInfo: Z+b(b) production in $pp$ collisions at $\sqrt{s} = 7$ TeV, muonic Z-decays
+NumEvents: 1000000
+Beams: [p+, p+]
+Energies: [7000]
+PtCuts: pT(jets) > 20 GeV, pT(leptons) > 20 GeV
+NeedsCrossSection: True
+Description:
+ 'Measurements of differential production cross-sections of a $Z$
+ boson in association with $b$-jets in $pp$ collisions at $\sqrt{s}=7$
+ TeV are reported. The data analysed correspond to an integrated
+ luminosity of 4.6~fb$^{-1}$ recorded with the ATLAS detector at the
+ Large Hadron Collider. Particle-level cross-sections are determined
+ for events with a $Z$ boson decaying into an electron or muon pair,
+ and containing $b$-jets. For events with at least one $b$-jet, the
+ cross-section is presented as a function of the $Z$ boson transverse
+ momentum and rapidity, together with the inclusive $b$-jet
+ cross-section as a function of $b$-jet transverse momentum, rapidity
+ and angular separations between the $b$-jet and the $Z$ boson. For
+ events with at least two $b$-jets, the cross-section is determined as
+ a function of the invariant mass and angular separation of the two
+ highest transverse momentum $b$-jets, and as a function of the $Z$
+ boson transverse momentum and rapidity. Results are compared to
+ leading-order and next-to-leading-order perturbative QCD
+ calculations.
+ This Rivet module implements the event selection for Z decaying into muons.
+ If you want to use electronic events, please refer to ATLAS\_2014\_I1306294\_EL'
+BibKey: Aad:2014dvb
+BibTeX: '@article{Aad:2014dvb,
+ author = "Aad, Georges and others",
+ title = "{Measurement of differential production cross-sections
+ for a $Z$ boson in association with $b$-jets in 7 TeV
+ proton-proton collisions with the ATLAS detector}",
+ collaboration = "ATLAS Collaboration",
+ year = "2014",
+ eprint = "1407.3643",
+ archivePrefix = "arXiv",
+ primaryClass = "hep-ex",
+ reportNumber = "CERN-PH-EP-2014-118",
+ SLACcitation = "%%CITATION = ARXIV:1407.3643;%%",
+}'
diff --git a/data/anainfo/Makefile.am b/data/anainfo/Makefile.am
--- a/data/anainfo/Makefile.am
+++ b/data/anainfo/Makefile.am
@@ -1,297 +1,303 @@
dist_pkgdata_DATA = \
ALEPH_1991_S2435284.info \
ALEPH_1996_S3486095.info \
ALEPH_1996_S3196992.info \
ALEPH_1999_S4193598.info \
ALEPH_2001_S4656318.info \
ALEPH_2002_S4823664.info \
ALEPH_2004_S5765862.info \
ALICE_2010_S8624100.info \
ALICE_2010_S8625980.info \
ALICE_2010_S8706239.info \
ALICE_2011_S8909580.info \
ALICE_2011_S8945144.info \
ALICE_2012_I1181770.info \
ARGUS_1993_S2653028.info \
ARGUS_1993_S2669951.info \
ARGUS_1993_S2789213.info \
ATLAS_2010_S8591806.info \
ATLAS_2010_S8817804.info \
ATLAS_2010_S8894728.info \
ATLAS_2010_S8914702.info \
ATLAS_2010_S8918562.info \
ATLAS_2010_S8919674.info \
ATLAS_2010_CONF_2010_049.info \
ATLAS_2011_S8924791.info \
ATLAS_2011_S8971293.info \
ATLAS_2011_S8983313.info \
ATLAS_2011_S8994773.info \
ATLAS_2011_S9002537.info \
ATLAS_2011_S9019561.info \
ATLAS_2011_S9041966.info \
ATLAS_2011_S9120807.info \
ATLAS_2011_S9126244.info \
ATLAS_2011_S9128077.info \
ATLAS_2011_S9131140.info \
ATLAS_2011_S9108483.info \
ATLAS_2011_S9212183.info \
ATLAS_2011_I894867.info \
ATLAS_2011_I921594.info \
ATLAS_2011_I930220.info \
ATLAS_2011_S9035664.info \
ATLAS_2011_I919017.info \
ATLAS_2011_I925932.info \
ATLAS_2011_I926145.info \
ATLAS_2011_I944826.info \
ATLAS_2011_I945498.info \
ATLAS_2011_I954993.info \
ATLAS_2011_S9225137.info \
ATLAS_2011_S9212353.info \
ATLAS_2011_CONF_2011_090.info \
ATLAS_2011_CONF_2011_098.info \
ATLAS_2012_I943401.info \
ATLAS_2012_I946427.info \
ATLAS_2012_I1083318.info \
ATLAS_2012_I1082936.info \
ATLAS_2012_I1084540.info \
ATLAS_2012_I1093734.info \
ATLAS_2012_I1093738.info \
ATLAS_2012_I1094061.info \
ATLAS_2012_I1094564.info \
ATLAS_2012_I1094568.info \
ATLAS_2012_I1095236.info \
ATLAS_2012_I1082009.info \
ATLAS_2012_I1091481.info \
ATLAS_2012_I1119557.info \
ATLAS_2012_I1124167.info \
ATLAS_2012_I1125575.info \
ATLAS_2012_I1183818.info \
ATLAS_2012_I1188891.info \
ATLAS_2012_I1112263.info \
ATLAS_2012_I1125961.info \
ATLAS_2012_I1126136.info \
ATLAS_2012_I1117704.info \
ATLAS_2012_I1118269.info \
ATLAS_2012_I1180197.info \
ATLAS_2012_I1186556.info \
ATLAS_2012_I1190891.info \
ATLAS_2012_I1199269.info \
ATLAS_2012_I1203852.info \
ATLAS_2012_I1204447.info \
ATLAS_2012_I1204784.info \
ATLAS_2012_CONF_2012_001.info \
ATLAS_2012_CONF_2012_103.info \
ATLAS_2012_CONF_2012_104.info \
ATLAS_2012_CONF_2012_105.info \
ATLAS_2012_CONF_2012_109.info \
ATLAS_2012_CONF_2012_153.info \
ATLAS_2013_I1190187.info \
ATLAS_2013_I1217867.info \
+ ATLAS_2013_I1219109.info \
+ ATLAS_2013_I1219109_EL.info \
+ ATLAS_2013_I1219109_MU.info \
ATLAS_2013_I1230812.info \
ATLAS_2013_I1230812_EL.info \
ATLAS_2013_I1230812_MU.info \
ATLAS_2013_I1243871.info \
ATLAS_2013_I1263495.info \
ATLAS_2014_I1268975.info \
ATLAS_2014_I1279489.info \
ATLAS_2014_I1282441.info \
ATLAS_2014_I1298811.info \
ATLAS_2014_I1304688.info \
ATLAS_2014_I1307756.info \
+ ATLAS_2014_I1306294.info \
+ ATLAS_2014_I1306294_EL.info \
+ ATLAS_2014_I1306294_MU.info \
BABAR_2003_I593379.info \
BABAR_2005_S6181155.info \
BABAR_2007_S6895344.info \
BABAR_2007_S7266081.info \
BELLE_2008_I786560.info \
BABAR_2013_I1238276.info \
BELLE_2001_S4598261.info \
BELLE_2013_I1216515.info \
CDF_1988_S1865951.info \
CDF_1990_S2089246.info \
CDF_1993_S2742446.info \
CDF_1994_S2952106.info \
CDF_1996_S3108457.info \
CDF_1996_S3349578.info \
CDF_1996_S3418421.info \
CDF_1997_S3541940.info \
CDF_1998_S3618439.info \
CDF_2000_S4155203.info \
CDF_2000_S4266730.info \
CDF_2001_S4517016.info \
CDF_2001_S4563131.info \
CDF_2001_S4751469.info \
CDF_2002_S4796047.info \
CDF_2004_S5839831.info \
CDF_2005_S6080774.info \
CDF_2005_S6217184.info \
CDF_2006_S6450792.info \
CDF_2006_S6653332.info \
CDF_2007_S7057202.info \
CDF_2008_S7540469.info \
CDF_2008_S7541902.info \
CDF_2008_S7782535.info \
CDF_2008_S7828950.info \
CDF_2008_S8093652.info \
CDF_2008_S8095620.info \
CDF_2009_S8233977.info \
CDF_2009_NOTE_9936.info \
CDF_2009_S8383952.info \
CDF_2009_S8436959.info \
CDF_2010_S8591881_DY.info \
CDF_2010_S8591881_QCD.info \
CDF_2012_NOTE10874.info \
CLEO_2004_S5809304.info\
CMS_2010_S8547297.info \
CMS_2010_S8656010.info \
CMS_2011_S8884919.info \
CMS_2011_S9215166.info \
CMS_2012_I941555.info \
CMS_2011_I954992.info \
CMS_2011_S8941262.info \
CMS_2011_S8950903.info \
CMS_2011_S8957746.info \
CMS_2011_S8968497.info \
CMS_2011_S8973270.info \
CMS_2011_S8978280.info \
CMS_2011_S9086218.info \
CMS_2011_S9088458.info \
CMS_2011_S9120041.info \
CMS_2012_I1087342.info \
CMS_2012_I1090423.info \
CMS_2012_I1102908.info \
CMS_2012_I1107658.info \
CMS_2012_I1184941.info \
CMS_2012_I1193338.info \
CMS_2013_I1209721.info \
CMS_2013_I1218372.info \
CMS_2013_I1224539_DIJET.info \
CMS_2013_I1224539_WJET.info \
CMS_2013_I1224539_ZJET.info \
CMS_2013_I1256943.info \
CMS_2013_I1258128.info \
CMS_2013_I1261026.info \
CMS_2013_I1265659.info \
CMS_2013_I1272853.info \
CMS_2013_I1273574.info \
CMSTOTEM_2014_I1294140.info \
CMS_2012_PAS_QCD_11_010.info \
CMS_QCD_10_024.info \
D0_1996_S3214044.info \
D0_1996_S3324664.info \
D0_2000_S4480767.info \
D0_2000_I499943.info \
D0_2001_S4674421.info \
D0_2004_S5992206.info \
D0_2006_S6438750.info \
D0_2007_S7075677.info \
D0_2008_S6879055.info \
D0_2008_S7554427.info \
D0_2008_S7662670.info \
D0_2008_S7719523.info \
D0_2008_S7837160.info \
D0_2008_S7863608.info \
D0_2009_S8202443.info \
D0_2009_S8320160.info \
D0_2009_S8349509.info \
D0_2010_S8566488.info \
D0_2010_S8570965.info \
D0_2010_S8671338.info \
D0_2010_S8821313.info \
D0_2011_I895662.info \
E735_1998_S3905616.info \
DELPHI_1995_S3137023.info \
DELPHI_1996_S3430090.info \
DELPHI_1999_S3960137.info \
DELPHI_2000_S4328825.info \
DELPHI_2002_069_CONF_603.info \
DELPHI_2003_WUD_03_11.info \
EXAMPLE.info \
H1_1994_S2919893.info \
H1_1995_S3167097.info \
H1_2000_S4129130.info \
JADE_OPAL_2000_S4300807.info \
JADE_1998_S3612880.info \
LHCB_2010_S8758301.info \
LHCB_2010_I867355.info \
LHCB_2011_I917009.info \
LHCB_2011_I919315.info \
LHCB_2012_I1119400.info \
LHCB_2013_I1208105.info \
LHCB_2013_I1218996.info \
LHCF_2012_I1115479.info \
MC_DIJET.info \
MC_DIPHOTON.info \
MC_ELECTRONS.info \
MC_GENERIC.info \
MC_HFJETS.info \
MC_HINC.info \
MC_HJETS.info \
MC_HKTSPLITTINGS.info \
MC_IDENTIFIED.info \
MC_JETS.info \
MC_JETTAGS.info \
MC_KTSPLITTINGS.info \
MC_LEADJETUE.info \
MC_MUONS.info \
MC_PDFS.info \
MC_PHOTONINC.info \
MC_PHOTONJETS.info \
MC_PHOTONKTSPLITTINGS.info \
MC_PHOTONJETUE.info \
MC_PHOTONS.info \
MC_PRINTEVENT.info \
MC_QCD_PARTONS.info \
MC_SUSY.info \
MC_TAUS.info \
MC_TTBAR.info \
MC_VH2BB.info \
MC_WINC.info \
MC_WJETS.info \
MC_WKTSPLITTINGS.info \
MC_WPOL.info \
MC_WWINC.info \
MC_WWJETS.info \
MC_WWKTSPLITTINGS.info \
MC_XS.info \
MC_ZINC.info \
MC_ZJETS.info \
MC_ZKTSPLITTINGS.info \
MC_ZZINC.info \
MC_ZZJETS.info \
MC_ZZKTSPLITTINGS.info \
OPAL_1993_S2692198.info \
OPAL_1994_S2927284.info \
OPAL_1995_S3198391.info \
OPAL_1996_S3257789.info \
OPAL_1997_S3396100.info \
OPAL_1997_S3608263.info \
OPAL_1998_S3702294.info \
OPAL_1998_S3780481.info \
OPAL_1998_S3749908.info \
OPAL_2000_S4418603.info \
OPAL_2001_S4553896.info \
OPAL_2002_S5361494.info \
OPAL_2004_S6132243.info \
PDG_HADRON_MULTIPLICITIES.info \
PDG_HADRON_MULTIPLICITIES_RATIOS.info \
SFM_1984_S1178091.info \
SLD_1996_S3398250.info \
SLD_1999_S3743934.info \
SLD_2002_S4869273.info \
SLD_2004_S5693039.info \
STAR_2006_S6500200.info \
STAR_2006_S6860818.info \
STAR_2006_S6870392.info \
STAR_2008_S7869363.info \
STAR_2008_S7993412.info \
STAR_2009_UE_HELEN.info \
TASSO_1990_S2148048.info \
TOTEM_2012_I1115294.info \
TOTEM_2012_002.info \
ZEUS_2001_S4815815.info \
UA1_1990_S2044935.info \
UA5_1982_S875503.info \
UA5_1986_S1583476.info \
UA5_1987_S1640666.info \
UA5_1988_S1867512.info \
UA5_1989_S1926373.info
diff --git a/data/plotinfo/ATLAS_2013_I1219109.plot b/data/plotinfo/ATLAS_2013_I1219109.plot
new file mode 100644
--- /dev/null
+++ b/data/plotinfo/ATLAS_2013_I1219109.plot
@@ -0,0 +1,34 @@
+# BEGIN PLOT /ATLAS_2013_I1219109/d..-x..-y..
+LogY=1
+LegendYPos=0.90
+LegendXPos=0.75
+XTwosidedTicks=1
+YTwosidedTicks=1
+ErrorBars=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109/d01-x..-y..
+LogY=0
+LegendXPos=0.20
+YLabel=$\sigma_\text{fiducial}$ [pb]
+XMinorTickMarks=0
+XCustomMajorTicks=1 1 jet 2 2 jet 3 1+2 jet
+Title=electron channel, dressed level
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109/d02-x..-y..
+LeftMargin=1.5
+XLabel=b-jet $p_\text{T}$ [GeV]
+YLabel=$\text{d}\sigma / \text{d} p_\text{T}^\text{b-jet}$ [pb/GeV]
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109/d02-x01-y01
+Title=$N_\text{jet} = 1$, electron channel, dressed level
+RatioPlotYMax=1.6
+RatioPlotYMin=0.4
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109/d02-x02-y01
+Title=$N_\text{jet} = 2$, electron channel, dressed level
+# END PLOT
+
diff --git a/data/plotinfo/ATLAS_2013_I1219109_EL.plot b/data/plotinfo/ATLAS_2013_I1219109_EL.plot
new file mode 100644
--- /dev/null
+++ b/data/plotinfo/ATLAS_2013_I1219109_EL.plot
@@ -0,0 +1,34 @@
+# BEGIN PLOT /ATLAS_2013_I1219109_EL/d..-x..-y..
+LogY=1
+LegendYPos=0.90
+LegendXPos=0.75
+XTwosidedTicks=1
+YTwosidedTicks=1
+ErrorBars=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109_EL/d01-x..-y..
+LogY=0
+LegendXPos=0.20
+YLabel=$\sigma_\text{fiducial}$ [pb]
+XMinorTickMarks=0
+XCustomMajorTicks=1 1 jet 2 2 jet 3 1+2 jet
+Title=electron channel, dressed level
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109_EL/d02-x..-y..
+LeftMargin=1.5
+XLabel=b-jet $p_\text{T}$ [GeV]
+YLabel=$\text{d}\sigma / \text{d} p_\text{T}^\text{b-jet}$ [pb/GeV]
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109_EL/d02-x01-y01
+Title=$N_\text{jet} = 1$, electron channel, dressed level
+RatioPlotYMax=1.6
+RatioPlotYMin=0.4
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109_EL/d02-x02-y01
+Title=$N_\text{jet} = 2$, electron channel, dressed level
+# END PLOT
+
diff --git a/data/plotinfo/ATLAS_2013_I1219109_MU.plot b/data/plotinfo/ATLAS_2013_I1219109_MU.plot
new file mode 100644
--- /dev/null
+++ b/data/plotinfo/ATLAS_2013_I1219109_MU.plot
@@ -0,0 +1,34 @@
+# BEGIN PLOT /ATLAS_2013_I1219109_MU/d..-x..-y..
+LogY=1
+LegendYPos=0.90
+LegendXPos=0.75
+XTwosidedTicks=1
+YTwosidedTicks=1
+ErrorBars=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109_MU/d01-x..-y..
+LogY=0
+LegendXPos=0.20
+YLabel=$\sigma_\text{fiducial}$ [pb]
+XMinorTickMarks=0
+XCustomMajorTicks=1 1 jet 2 2 jet 3 1+2 jet
+Title=muon channel, dressed level
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109_MU/d02-x..-y..
+LeftMargin=1.5
+XLabel=b-jet $p_\text{T}$ [GeV]
+YLabel=$\text{d}\sigma / \text{d} p_\text{T}^\text{b-jet}$ [pb/GeV]
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109_MU/d02-x01-y01
+Title=$N_\text{jet} = 1$, muon channel, dressed level
+RatioPlotYMax=1.6
+RatioPlotYMin=0.4
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2013_I1219109_MU/d02-x02-y01
+Title=$N_\text{jet} = 2$, muon channel, dressed level
+# END PLOT
+
diff --git a/data/plotinfo/ATLAS_2014_I1306294.plot b/data/plotinfo/ATLAS_2014_I1306294.plot
new file mode 100644
--- /dev/null
+++ b/data/plotinfo/ATLAS_2014_I1306294.plot
@@ -0,0 +1,85 @@
+# BEGIN PLOT /ATLAS_2014_I1306294/d03-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$b$-jet $p_{T}$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} p_T / N_{b\text{-jets}}$ [pb/GeV]
+LogX=1
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d05-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$b$-jet $|y|$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} |y| / N_{b\text{-jets}}$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d07-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$y_{\text{boost}}(Z,b)$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} y_{\text{boost}}(Z,b) / N_{b\text{-jets}}$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d09-x01-y01
+Title=$Z + \ge 1$ b-jet, $p_{T}(Z) > 20$ GeV
+XLabel=$\Delta y(Z,b)$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} \Delta y(Z,b) / N_{b\text{-jets}}$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d11-x01-y01
+Title=$Z + \ge 1$ b-jet, $p_{T}(Z) > 20$ GeV
+XLabel=$\Delta\phi(Z,b)$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} \Delta\phi(Z,b) / N_{b\text{-jets}}$ [pb]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d13-x01-y01
+Title=$Z + \ge 1$ b-jet, $p_{T}(Z) > 20$ GeV
+XLabel=$\Delta R(Z,b)$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} \Delta R(Z,b) / N_{b\text{-jets}}$ [pb]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d15-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$p_{T}(Z)$ [GeV]
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} p_{T}(Z)$ [pb/GeV]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d17-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$|y(Z)|$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} |y(Z)|$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d21-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$\Delta R(b,b)$
+YLabel=$\mathrm{d} \sigma(Zbb) / \mathrm{d} \Delta R(b,b)$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d23-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$m(b,b)$ [GeV]
+YLabel=$\mathrm{d} \sigma(Zbb) / \mathrm{d} m(b,b)$ [pb/GeV]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d25-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$p_{T}(Z)$ [GeV]
+YLabel=$\mathrm{d} \sigma(Zbb) / \mathrm{d} p_{T}(Z)$ [pb/GeV]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294/d27-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$|y(Z)|$
+YLabel=$\mathrm{d} \sigma(Zbb) / \mathrm{d} |y(Z)|$ [pb]
+LogY=0
+# END PLOT
+
diff --git a/data/plotinfo/ATLAS_2014_I1306294_EL.plot b/data/plotinfo/ATLAS_2014_I1306294_EL.plot
new file mode 100644
--- /dev/null
+++ b/data/plotinfo/ATLAS_2014_I1306294_EL.plot
@@ -0,0 +1,85 @@
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d03-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$b$-jet $p_{T}$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} p_T / N_{b\text{-jets}}$ [pb/GeV]
+LogX=1
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d05-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$b$-jet $|y|$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} |y| / N_{b\text{-jets}}$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d07-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$y_{\text{boost}}(Z,b)$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} y_{\text{boost}}(Z,b) / N_{b\text{-jets}}$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d09-x01-y01
+Title=$Z + \ge 1$ b-jet, $p_{T}(Z) > 20$ GeV
+XLabel=$\Delta y(Z,b)$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} \Delta y(Z,b) / N_{b\text{-jets}}$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d11-x01-y01
+Title=$Z + \ge 1$ b-jet, $p_{T}(Z) > 20$ GeV
+XLabel=$\Delta\phi(Z,b)$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} \Delta\phi(Z,b) / N_{b\text{-jets}}$ [pb]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d13-x01-y01
+Title=$Z + \ge 1$ b-jet, $p_{T}(Z) > 20$ GeV
+XLabel=$\Delta R(Z,b)$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} \Delta R(Z,b) / N_{b\text{-jets}}$ [pb]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d15-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$p_{T}(Z)$ [GeV]
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} p_{T}(Z)$ [pb/GeV]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d17-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$|y(Z)|$
+YLabel=$\mathrm{d} \sigma(Zb) / \mathrm{d} |y(Z)|$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d21-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$\Delta R(b,b)$
+YLabel=$\mathrm{d} \sigma(Zbb) / \mathrm{d} \Delta R(b,b)$ [pb]
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d23-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$m(b,b)$ [GeV]
+YLabel=$\mathrm{d} \sigma(Zbb) / \mathrm{d} m(b,b)$ [pb/GeV]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d25-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$p_{T}(Z)$ [GeV]
+YLabel=$\mathrm{d} \sigma(Zbb) / \mathrm{d} p_{T}(Z)$ [pb/GeV]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_EL/d27-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$|y(Z)|$
+YLabel=$\mathrm{d} \sigma(Zbb) / \mathrm{d} |y(Z)|$ [pb]
+LogY=0
+# END PLOT
+
diff --git a/data/plotinfo/ATLAS_2014_I1306294_MU.plot b/data/plotinfo/ATLAS_2014_I1306294_MU.plot
new file mode 100644
--- /dev/null
+++ b/data/plotinfo/ATLAS_2014_I1306294_MU.plot
@@ -0,0 +1,85 @@
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d..
+LeftMargin=1.8
+YLabelSep=7.5
+LogY=0
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d03-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$b$-jet $p_\text{T}$
+YLabel=$\dfrac{\mathrm{d} \sigma(Zb)}{\mathrm{d} p_\text{T}} / N_{b\text{-jets}}$ [pb/GeV]
+LogX=1
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d05-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$b$-jet $|y|$
+YLabel=$\dfrac{\mathrm{d} \sigma(Zb)}{\mathrm{d} |y|} / N_{b\text{-jets}}$ [pb]
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d07-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$y_{\text{boost}}(Z,b)$
+YLabel=$\dfrac{\mathrm{d} \sigma(Zb)}{\mathrm{d} y_{\text{boost}}(Z,b)} / N_{b\text{-jets}}$ [pb]
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d09-x01-y01
+Title=$Z + \ge 1$ b-jet, $p_{T}(Z) > 20$ GeV
+XLabel=$\Delta y(Z,b)$
+YLabel=$\dfrac{\mathrm{d} \sigma(Zb)}{\mathrm{d} \Delta y(Z,b)} / N_{b\text{-jets}}$ [pb]
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d11-x01-y01
+Title=$Z + \ge 1$ b-jet, $p_{T}(Z) > 20$ GeV
+XLabel=$\Delta\phi(Z,b)$
+YLabel=$\dfrac{\mathrm{d} \sigma(Zb)}{\mathrm{d} \Delta\phi(Z,b)} / N_{b\text{-jets}}$ [pb]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d13-x01-y01
+Title=$Z + \ge 1$ b-jet, $p_{T}(Z) > 20$ GeV
+XLabel=$\Delta R(Z,b)$
+YLabel=$\dfrac{\mathrm{d} \sigma(Zb)}{\mathrm{d} \Delta R(Z,b)} / N_{b\text{-jets}}$ [pb]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d15-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$p_\text{T}(Z)$ [GeV]
+YLabel=$\dfrac{\mathrm{d} \sigma(Zb)}{\mathrm{d} p_\text{T}(Z)}$ [pb/GeV]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d17-x01-y01
+Title=$Z + \ge 1$ b-jet
+XLabel=$|y(Z)|$
+YLabel=$\dfrac{\mathrm{d} \sigma(Zb)}{\mathrm{d} |y(Z)|}$ [pb]
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d21-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$\Delta R(b,b)$
+YLabel=$\dfrac{\mathrm{d} \sigma(Zbb)}{\mathrm{d} \Delta R(b,b)}$ [pb]
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d23-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$m(b,b)$ [GeV]
+YLabel=$\dfrac{\mathrm{d} \sigma(Zbb)}{\mathrm{d} m(b,b)}$ [pb/GeV]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d25-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$p_\text{T}(Z)$ [GeV]
+YLabel=$\dfrac{\mathrm{d} \sigma(Zbb)}{\mathrm{d} p_\text{T}(Z)}$ [pb/GeV]
+LogY=1
+# END PLOT
+
+# BEGIN PLOT /ATLAS_2014_I1306294_MU/d27-x01-y01
+Title=$Z + \ge 2$ b-jets
+XLabel=$|y(Z)|$
+YLabel=$\dfrac{\mathrm{d} \sigma(Zbb)}{\mathrm{d} |y(Z)|}$ [pb]
+# END PLOT
+
diff --git a/data/plotinfo/Makefile.am b/data/plotinfo/Makefile.am
--- a/data/plotinfo/Makefile.am
+++ b/data/plotinfo/Makefile.am
@@ -1,292 +1,298 @@
dist_pkgdata_DATA = \
ALEPH_1991_S2435284.plot \
ALEPH_1996_S3486095.plot \
ALEPH_1996_S3196992.plot \
ALEPH_1999_S4193598.plot \
ALEPH_2001_S4656318.plot \
ALEPH_2002_S4823664.plot \
ALEPH_2004_S5765862.plot \
ALICE_2010_S8624100.plot \
ALICE_2010_S8625980.plot \
ALICE_2010_S8706239.plot \
ALICE_2011_S8909580.plot \
ALICE_2011_S8945144.plot \
ALICE_2012_I1181770.plot \
ARGUS_1993_S2653028.plot \
ARGUS_1993_S2669951.plot \
ARGUS_1993_S2789213.plot \
ATLAS_2010_S8591806.plot \
ATLAS_2010_S8817804.plot \
ATLAS_2010_S8894728.plot \
ATLAS_2010_S8914702.plot \
ATLAS_2010_S8918562.plot \
ATLAS_2010_S8919674.plot \
ATLAS_2010_CONF_2010_049.plot \
ATLAS_2011_S8924791.plot \
ATLAS_2011_S8971293.plot \
ATLAS_2011_S8994773.plot \
ATLAS_2011_S9002537.plot \
ATLAS_2011_S9120807.plot \
ATLAS_2011_S9126244.plot \
ATLAS_2011_S9128077.plot \
ATLAS_2011_S9131140.plot \
ATLAS_2011_I894867.plot \
ATLAS_2011_I919017.plot \
ATLAS_2011_I921594.plot \
ATLAS_2011_I925932.plot \
ATLAS_2011_I926145.plot \
ATLAS_2011_I930220.plot \
ATLAS_2011_I944826.plot \
ATLAS_2011_I945498.plot \
ATLAS_2011_I954993.plot \
ATLAS_2011_S9225137.plot \
ATLAS_2011_S9212183.plot \
ATLAS_2011_S8983313.plot \
ATLAS_2011_S9212353.plot \
ATLAS_2011_CONF_2011_090.plot \
ATLAS_2011_CONF_2011_098.plot \
ATLAS_2012_I1082936.plot \
ATLAS_2012_I1083318.plot \
ATLAS_2012_I1084540.plot \
ATLAS_2012_I1091481.plot \
ATLAS_2012_I1093734.plot \
ATLAS_2012_I1093738.plot \
ATLAS_2012_I1094061.plot \
ATLAS_2012_I1094564.plot \
ATLAS_2012_I1094568.plot \
ATLAS_2012_I1095236.plot \
ATLAS_2012_I943401.plot \
ATLAS_2012_I946427.plot \
ATLAS_2012_I1119557.plot \
ATLAS_2012_I1124167.plot \
ATLAS_2012_I1125575.plot \
ATLAS_2012_I1112263.plot \
ATLAS_2012_I1125961.plot \
ATLAS_2012_I1126136.plot \
ATLAS_2012_I1117704.plot \
ATLAS_2012_I1118269.plot \
ATLAS_2012_I1180197.plot \
ATLAS_2012_I1082009.plot \
ATLAS_2012_I1183818.plot \
ATLAS_2012_I1188891.plot \
ATLAS_2012_I1186556.plot \
ATLAS_2012_I1190891.plot \
ATLAS_2012_I1199269.plot \
ATLAS_2012_I1203852.plot \
ATLAS_2012_I1204447.plot \
ATLAS_2012_I1204784.plot \
ATLAS_2012_CONF_2012_001.plot \
ATLAS_2012_CONF_2012_103.plot \
ATLAS_2012_CONF_2012_104.plot \
ATLAS_2012_CONF_2012_105.plot \
ATLAS_2012_CONF_2012_109.plot \
ATLAS_2012_CONF_2012_153.plot \
ATLAS_2013_I1190187.plot \
+ ATLAS_2013_I1219109.plot \
+ ATLAS_2013_I1219109_EL.plot \
+ ATLAS_2013_I1219109_MU.plot \
ATLAS_2013_I1217867.plot \
ATLAS_2013_I1230812.plot \
ATLAS_2013_I1230812_EL.plot \
ATLAS_2013_I1230812_MU.plot \
ATLAS_2013_I1243871.plot \
ATLAS_2013_I1263495.plot \
ATLAS_2014_I1268975.plot \
ATLAS_2014_I1279489.plot \
ATLAS_2014_I1282441.plot \
ATLAS_2014_I1298811.plot \
ATLAS_2014_I1304688.plot \
ATLAS_2014_I1307756.plot \
+ ATLAS_2014_I1306294.plot \
+ ATLAS_2014_I1306294_EL.plot \
+ ATLAS_2014_I1306294_MU.plot \
BABAR_2003_I593379.plot \
BABAR_2005_S6181155.plot \
BABAR_2007_S6895344.plot \
BABAR_2007_S7266081.plot \
BABAR_2013_I1238276.plot \
BELLE_2001_S4598261.plot \
BELLE_2008_I786560.plot \
BELLE_2013_I1216515.plot \
CDF_1988_S1865951.plot \
CDF_1990_S2089246.plot \
CDF_1993_S2742446.plot \
CDF_1994_S2952106.plot \
CDF_1996_S3108457.plot \
CDF_1996_S3349578.plot \
CDF_1996_S3418421.plot \
CDF_1997_S3541940.plot \
CDF_1998_S3618439.plot \
CDF_2000_S4155203.plot \
CDF_2000_S4266730.plot \
CDF_2001_S4517016.plot \
CDF_2001_S4563131.plot \
CDF_2001_S4751469.plot \
CDF_2002_S4796047.plot \
CDF_2004_S5839831.plot \
CDF_2005_S6080774.plot \
CDF_2005_S6217184.plot \
CDF_2006_S6450792.plot \
CDF_2006_S6653332.plot \
CDF_2007_S7057202.plot \
CDF_2008_S7540469.plot \
CDF_2008_S7541902.plot \
CDF_2008_S7782535.plot \
CDF_2008_S7828950.plot \
CDF_2008_S8093652.plot \
CDF_2008_S8095620.plot \
CDF_2009_S8233977.plot \
CDF_2009_NOTE_9936.plot \
CDF_2009_S8383952.plot \
CDF_2009_S8436959.plot \
CDF_2010_S8591881_DY.plot \
CDF_2010_S8591881_QCD.plot \
CDF_2012_NOTE10874.plot \
CLEO_2004_S5809304.plot \
CMS_2010_S8547297.plot \
CMS_2010_S8656010.plot \
CMS_2011_S8884919.plot \
CMS_2011_S8941262.plot \
CMS_2011_S8950903.plot \
CMS_2011_S8957746.plot \
CMS_2011_S8968497.plot \
CMS_2011_S8973270.plot \
CMS_2011_S8978280.plot \
CMS_2011_S9086218.plot \
CMS_2011_S9088458.plot \
CMS_2011_S9120041.plot \
CMS_2011_S9215166.plot \
CMS_2012_I941555.plot \
CMS_2011_I954992.plot \
CMS_2012_I1087342.plot \
CMS_2012_I1090423.plot \
CMS_2012_I1102908.plot \
CMS_2012_I1107658.plot \
CMS_2012_I1184941.plot \
CMS_2012_I1193338.plot \
CMS_2013_I1209721.plot \
CMS_2013_I1218372.plot \
CMS_2013_I1224539_DIJET.plot \
CMS_2013_I1224539_WJET.plot \
CMS_2013_I1224539_ZJET.plot \
CMS_2013_I1256943.plot \
CMS_2013_I1258128.plot \
CMS_2013_I1261026.plot \
CMS_2013_I1265659.plot \
CMS_2013_I1272853.plot \
CMS_2013_I1273574.plot \
CMSTOTEM_2014_I1294140.plot \
CMS_2012_PAS_QCD_11_010.plot \
CMS_QCD_10_024.plot \
D0_1996_S3214044.plot \
D0_1996_S3324664.plot \
D0_2000_S4480767.plot \
D0_2000_I499943.plot \
D0_2001_S4674421.plot \
D0_2004_S5992206.plot \
D0_2006_S6438750.plot \
D0_2007_S7075677.plot \
D0_2008_S6879055.plot \
D0_2008_S7554427.plot \
D0_2008_S7662670.plot \
D0_2008_S7719523.plot \
D0_2008_S7837160.plot \
D0_2008_S7863608.plot \
D0_2009_S8202443.plot \
D0_2009_S8320160.plot \
D0_2009_S8349509.plot \
D0_2010_S8566488.plot \
D0_2010_S8570965.plot \
D0_2010_S8671338.plot \
D0_2010_S8821313.plot \
D0_2011_I895662.plot \
E735_1998_S3905616.plot \
DELPHI_1995_S3137023.plot \
DELPHI_1996_S3430090.plot \
DELPHI_1999_S3960137.plot \
DELPHI_2000_S4328825.plot \
DELPHI_2002_069_CONF_603.plot \
DELPHI_2003_WUD_03_11.plot \
EXAMPLE.plot \
H1_1994_S2919893.plot \
H1_1995_S3167097.plot \
H1_2000_S4129130.plot \
JADE_OPAL_2000_S4300807.plot \
JADE_1998_S3612880.plot \
LHCB_2010_S8758301.plot \
LHCB_2010_I867355.plot \
LHCB_2011_I917009.plot \
LHCB_2011_I919315.plot \
LHCB_2012_I1119400.plot \
LHCB_2013_I1208105.plot \
LHCB_2013_I1218996.plot \
LHCF_2012_I1115479.plot \
MC_DIJET.plot \
MC_DIPHOTON.plot \
MC_ELECTRONS.plot \
MC_GENERIC.plot \
MC_HFJETS.plot \
MC_HINC.plot \
MC_HJETS.plot \
MC_HKTSPLITTINGS.plot \
MC_IDENTIFIED.plot \
MC_JETS.plot \
MC_JETTAGS.plot \
MC_KTSPLITTINGS.plot \
MC_LEADJETUE.plot \
MC_MUONS.plot \
MC_PDFS.plot \
MC_PHOTONINC.plot \
MC_PHOTONJETS.plot \
MC_PHOTONKTSPLITTINGS.plot \
MC_PHOTONS.plot \
MC_PHOTONJETUE.plot \
MC_QCD_PARTONS.plot \
MC_SUSY.plot \
MC_TAUS.plot \
MC_TTBAR.plot \
MC_VH2BB.plot \
MC_WINC.plot \
MC_WJETS.plot \
MC_WKTSPLITTINGS.plot \
MC_WPOL.plot \
MC_WWINC.plot \
MC_WWJETS.plot \
MC_WWKTSPLITTINGS.plot \
MC_XS.plot \
MC_ZINC.plot \
MC_ZJETS.plot \
MC_ZKTSPLITTINGS.plot \
MC_ZZINC.plot \
MC_ZZJETS.plot \
MC_ZZKTSPLITTINGS.plot \
OPAL_1993_S2692198.plot \
OPAL_1994_S2927284.plot \
OPAL_1995_S3198391.plot \
OPAL_1996_S3257789.plot \
OPAL_1997_S3396100.plot \
OPAL_1997_S3608263.plot \
OPAL_1998_S3702294.plot \
OPAL_1998_S3749908.plot \
OPAL_1998_S3780481.plot \
OPAL_2000_S4418603.plot \
OPAL_2001_S4553896.plot \
OPAL_2002_S5361494.plot \
OPAL_2004_S6132243.plot \
PDG_HADRON_MULTIPLICITIES.plot \
PDG_HADRON_MULTIPLICITIES_RATIOS.plot \
SFM_1984_S1178091.plot \
SLD_1996_S3398250.plot \
SLD_1999_S3743934.plot \
SLD_2002_S4869273.plot \
SLD_2004_S5693039.plot \
STAR_2006_S6500200.plot \
STAR_2006_S6860818.plot \
STAR_2006_S6870392.plot \
STAR_2008_S7869363.plot \
STAR_2008_S7993412.plot \
STAR_2009_UE_HELEN.plot \
TASSO_1990_S2148048.plot \
TOTEM_2012_I1115294.plot \
TOTEM_2012_002.plot \
ZEUS_2001_S4815815.plot \
UA1_1990_S2044935.plot \
UA5_1982_S875503.plot \
UA5_1986_S1583476.plot \
UA5_1987_S1640666.plot \
UA5_1988_S1867512.plot \
UA5_1989_S1926373.plot
diff --git a/data/refdata/ATLAS_2013_I1219109.yoda b/data/refdata/ATLAS_2013_I1219109.yoda
new file mode 100644
--- /dev/null
+++ b/data/refdata/ATLAS_2013_I1219109.yoda
@@ -0,0 +1,29 @@
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2013_I1219109/d01-x01-y01
+Path=/REF/ATLAS_2013_I1219109/d01-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+1.000000e+00 0.500000e+00 0.500000e+00 5.000000e+00 1.300000e+00 1.300000e+00
+2.000000e+00 0.500000e+00 0.500000e+00 2.200000e+00 0.538516e+00 0.538516e+00
+3.000000e+00 0.500000e+00 0.500000e+00 7.100000e+00 1.486607e+00 1.486607e+00
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2013_I1219109/d02-x01-y01
+Path=/REF/ATLAS_2013_I1219109/d02-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+2.750000e+01 2.500000e+00 2.500000e+00 2.590000e-01 6.638691e-02 6.638691e-02
+3.500000e+01 5.000000e+00 5.000000e+00 1.430000e-01 2.849255e-02 2.849255e-02
+5.000000e+01 1.000000e+01 1.000000e+01 6.500000e-02 2.282416e-02 2.282416e-02
+1.000000e+02 4.000000e+01 4.000000e+01 1.030000e-02 5.862863e-03 5.862863e-03
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2013_I1219109/d02-x02-y01
+Path=/REF/ATLAS_2013_I1219109/d02-x02-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+2.750000e+01 2.500000e+00 2.500000e+00 7.300000e-02 2.090402e-02 2.090402e-02
+3.500000e+01 5.000000e+00 5.000000e+00 5.800000e-02 1.357745e-02 1.357745e-02
+5.000000e+01 1.000000e+01 1.000000e+01 3.800000e-02 9.590766e-03 9.590766e-03
+1.000000e+02 4.000000e+01 4.000000e+01 9.300000e-03 3.589848e-03 3.589848e-03
+# END YODA_SCATTER2D
+
diff --git a/data/refdata/ATLAS_2013_I1219109_EL.yoda b/data/refdata/ATLAS_2013_I1219109_EL.yoda
new file mode 100644
--- /dev/null
+++ b/data/refdata/ATLAS_2013_I1219109_EL.yoda
@@ -0,0 +1,29 @@
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2013_I1219109_EL/d01-x01-y02
+Path=/REF/ATLAS_2013_I1219109_EL/d01-x01-y02
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+1.000000e+00 0.500000e+00 0.500000e+00 5.000000e+00 1.300000e+00 1.300000e+00
+2.000000e+00 0.500000e+00 0.500000e+00 2.200000e+00 0.538516e+00 0.538516e+00
+3.000000e+00 0.500000e+00 0.500000e+00 7.100000e+00 1.486607e+00 1.486607e+00
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2013_I1219109_EL/d02-x01-y02
+Path=/REF/ATLAS_2013_I1219109_EL/d02-x01-y02
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+2.750000e+01 2.500000e+00 2.500000e+00 2.590000e+02 6.638691e+01 6.638691e+01
+3.500000e+01 5.000000e+00 0.000000e+00 1.430000e+02 2.849255e+01 2.849255e+01
+5.000000e+01 1.000000e+01 1.000000e+01 6.500000e+01 2.282416e+01 2.282416e+01
+1.000000e+02 4.000000e+01 4.000000e+01 1.030000e+01 5.862863e+00 5.862863e+00
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2013_I1219109_EL/d02-x02-y02
+Path=/REF/ATLAS_2013_I1219109_EL/d02-x02-y02
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+2.750000e+01 2.500000e+00 2.500000e+00 7.300000e+01 2.090402e+01 2.090402e+01
+3.500000e+01 5.000000e+00 0.000000e+00 5.800000e+01 1.357745e+01 1.357745e+01
+5.000000e+01 1.000000e+01 1.000000e+01 3.800000e+01 9.590766e+00 9.590766e+00
+1.000000e+02 4.000000e+01 4.000000e+01 9.300000e+00 3.589848e+00 3.589848e+00
+# END YODA_SCATTER2D
+
diff --git a/data/refdata/ATLAS_2013_I1219109_MU.yoda b/data/refdata/ATLAS_2013_I1219109_MU.yoda
new file mode 100644
--- /dev/null
+++ b/data/refdata/ATLAS_2013_I1219109_MU.yoda
@@ -0,0 +1,29 @@
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2013_I1219109_MU/d01-x01-y03
+Path=/REF/ATLAS_2013_I1219109_MU/d01-x01-y03
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+1.000000e+00 0.500000e+00 0.500000e+00 5.000000e+00 1.300000e+00 1.300000e+00
+2.000000e+00 0.500000e+00 0.500000e+00 2.200000e+00 0.538516e+00 0.538516e+00
+3.000000e+00 0.500000e+00 0.500000e+00 7.100000e+00 1.486607e+00 1.486607e+00
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2013_I1219109_MU/d02-x01-y03
+Path=/REF/ATLAS_2013_I1219109_MU/d02-x01-y03
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+2.750000e+01 2.500000e+00 2.500000e+00 2.590000e+02 6.638691e+01 6.638691e+01
+3.500000e+01 5.000000e+00 0.000000e+00 1.430000e+02 2.849255e+01 2.849255e+01
+5.000000e+01 1.000000e+01 1.000000e+01 6.500000e+01 2.282416e+01 2.282416e+01
+1.000000e+02 4.000000e+01 4.000000e+01 1.030000e+01 5.862863e+00 5.862863e+00
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2013_I1219109_MU/d02-x02-y03
+Path=/REF/ATLAS_2013_I1219109_MU/d02-x02-y03
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+2.750000e+01 2.500000e+00 2.500000e+00 7.300000e+01 2.090402e+01 2.090402e+01
+3.500000e+01 5.000000e+00 0.000000e+00 5.800000e+01 1.357745e+01 1.357745e+01
+5.000000e+01 1.000000e+01 1.000000e+01 3.800000e+01 9.590766e+00 9.590766e+00
+1.000000e+02 4.000000e+01 4.000000e+01 9.300000e+00 3.589848e+00 3.589848e+00
+# END YODA_SCATTER2D
+
diff --git a/data/refdata/ATLAS_2014_I1306294.yoda b/data/refdata/ATLAS_2014_I1306294.yoda
new file mode 100644
--- /dev/null
+++ b/data/refdata/ATLAS_2014_I1306294.yoda
@@ -0,0 +1,160 @@
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d03-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d03-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+25.0 5.0 5.0 0.196336 0.022767002685686488 0.022442750699509982
+40.0 10.0 10.0 0.093764 0.007448250031418763 0.008162343348859616
+62.5 12.5 12.5 0.035126 0.003307914470330743 0.0030853847425376625
+92.5 17.5 17.5 0.01184 0.0014315010480918273 0.0013816970553518597
+155.0 45.0 45.0 0.002142 2.77101624652906E-4 2.9025364621895793E-4
+350.0 150.0 150.0 1.01E-4 2.6031861948965543E-5 2.8642877036010193E-5
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d05-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d05-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 2.995969 0.2502050286854681 0.2540329091205163
+0.3 0.09999999999999998 0.10000000000000003 2.935592 0.2423499662186646 0.2455083521104788
+0.5 0.09999999999999998 0.09999999999999998 3.036675 0.23848341077489127 0.25181855164208083
+0.7 0.09999999999999998 0.10000000000000009 2.706656 0.22988195239694798 0.22508346896326376
+1.0 0.19999999999999996 0.19999999999999996 2.540045 0.20006301281941774 0.20460988758462773
+1.4 0.19999999999999996 0.20000000000000018 2.136394 0.19707166424050004 0.193061234261348
+1.8 0.19999999999999996 0.19999999999999996 1.637068 0.1729878996186148 0.1748039869207207
+2.2 0.20000000000000018 0.19999999999999973 1.21209 0.1527242057761174 0.14945113936795842
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d07-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d07-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 4.328249 0.36738854687035144 0.35840515301630893
+0.3 0.09999999999999998 0.10000000000000003 4.075786 0.34044403930699496 0.3298406468813218
+0.5 0.09999999999999998 0.09999999999999998 4.008528 0.3269278371498574 0.31803065980564604
+0.7 0.09999999999999998 0.10000000000000009 3.789802 0.32872966247795 0.3308358227088082
+1.0 0.19999999999999996 0.19999999999999996 2.781049 0.23059072437308237 0.22663501859483245
+1.4 0.19999999999999996 0.20000000000000018 1.769883 0.16233918363101357 0.16397495579522062
+1.8 0.19999999999999996 0.19999999999999996 0.689683 0.09141848976274591 0.09262596750446248
+2.25 0.25 0.25 0.087968 0.02147261220284261 0.02042288853165655
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d09-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d09-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 2.589466 0.21621296809537036 0.21667456364495416
+0.3 0.09999999999999998 0.10000000000000003 2.561052 0.21860840440486368 0.2217777873995027
+0.5 0.09999999999999998 0.09999999999999998 2.611447 0.21472256724541364 0.21564544422947177
+0.7 0.09999999999999998 0.10000000000000009 2.276628 0.20224505276474977 0.2138479713641168
+1.0 0.19999999999999996 0.19999999999999996 2.151346 0.1772141151664671 0.17902766702700815
+1.4 0.19999999999999996 0.20000000000000018 1.465778 0.11420169794409608 0.11904472841562758
+1.8 0.19999999999999996 0.19999999999999996 1.132133 0.09573334320826167 0.09652587469299748
+2.5 0.5 0.5 0.516235 0.044834437637901334 0.04576046929160248
+4.0 1.0 1.0 0.057008 0.008426920168918616 0.008709303795375427
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d11-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d11-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.25 0.25 0.25 0.258716 0.0344730211411511 0.03659045549750295
+0.75 0.25 0.25 0.321299 0.04080957718260836 0.043362570652852955
+1.25 0.25 0.25 0.428997 0.053340131726965556 0.053734336002738525
+1.75 0.25 0.25 0.680451 0.0692593833935523 0.07565027030372867
+2.2 0.20000000000000018 0.19999999999999973 1.318914 0.12003168137137961 0.12572599157414382
+2.6 0.20000000000000018 0.19999999999999973 2.662429 0.20994143217637612 0.2241724111237131
+2.9 0.10000000000000009 0.10000000000000009 5.325996 0.3725963416767095 0.39016147065873164
+3.070795 0.07079499999999994 0.07079499999999994 7.152546 0.5176002564632969 0.5388910622031428
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d13-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d13-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.5 0.5 0.5 0.118174 0.014808729989338689 0.014197242582948176
+1.25 0.25 0.25 0.291116 0.03823580093309602 0.038326833833866945
+1.75 0.25 0.25 0.479091 0.053722851714553134 0.05427982630226803
+2.25 0.25 0.25 1.030053 0.09660650673504886 0.09438798925871045
+2.75 0.25 0.25 2.77592 0.21067934404788868 0.21015199483058047
+3.25 0.25 0.25 3.315635 0.22790088986624232 0.23297686294194836
+3.75 0.25 0.25 0.689159 0.056576973901365166 0.05652066019367535
+4.25 0.25 0.25 0.194539 0.02980817439669081 0.03157319424774677
+5.25 0.75 0.75 0.014181 0.00423741948213287 0.004300101990031343
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d15-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d15-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+10.0 10.0 10.0 0.038103 0.0049386221247694605 0.004374931306206956
+25.0 5.0 5.0 0.077657 0.006873627207595276 0.007200043624535636
+35.0 5.0 5.0 0.088236 0.007052617981193254 0.007271428060409394
+50.0 10.0 10.0 0.056102 0.004085658723838573 0.004170080944310314
+70.0 10.0 10.0 0.031694 0.002216888253705874 0.0022395012695023415
+95.0 15.0 15.0 0.013054 0.0011267538000508364 0.0011290804483923191
+155.0 45.0 45.0 0.002612 2.114340995033677E-4 2.1631681355715278E-4
+350.0 150.0 150.0 1.43E-4 1.989508281787236E-5 2.0539524630575075E-5
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d17-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d17-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 3.077631 0.25726675788078296 0.24764853516322485
+0.3 0.09999999999999998 0.10000000000000003 3.135388 0.26213437578787635 0.2638590610187275
+0.5 0.09999999999999998 0.09999999999999998 3.087059 0.2657964969859084 0.2618162747323344
+0.7 0.09999999999999998 0.10000000000000009 2.80385 0.23395771990181527 0.23395771990181527
+1.0 0.19999999999999996 0.19999999999999996 2.543685 0.20424460061212973 0.2093012277174697
+1.4 0.19999999999999996 0.20000000000000018 1.94042 0.16099418122118603 0.16496366250614766
+1.8 0.19999999999999996 0.19999999999999996 1.107935 0.12274304788507884 0.1210094296178372
+2.25 0.25 0.25 0.332534 0.0454225498641802 0.044843982269960926
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d21-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d21-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.775 0.375 0.3749999999999999 0.137041 0.018809220473416707 0.023622533553299376
+1.525 0.375 0.375 0.115713 0.021170236488912513 0.02234154649052398
+2.15 0.25 0.25 0.151525 0.031117292203442223 0.03396421974511283
+2.6 0.20000000000000018 0.19999999999999973 0.177919 0.03396883299763693 0.038858715259500694
+3.0 0.20000000000000018 0.20000000000000018 0.231309 0.04204910700533564 0.046064727110737516
+4.1 0.8999999999999995 0.9000000000000004 0.059726 0.009801043879217286 0.010137761470412524
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d23-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d23-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+27.5 17.5 17.5 0.00246 3.85041868710404E-4 4.387361551456638E-4
+65.0 20.0 20.0 0.00391 6.693157141110614E-4 7.285736037141341E-4
+100.0 15.0 15.0 0.003441 5.900896647853442E-4 6.312958794462703E-4
+140.0 25.0 25.0 0.00202 3.3106561355719205E-4 4.2444579164835637E-4
+257.5 92.5 92.5 4.76E-4 9.232467581313243E-5 1.0148232926455719E-4
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d25-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d25-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+10.0 10.0 10.0 0.003396 7.842258175089111E-4 8.062115443968289E-4
+30.0 10.0 10.0 0.00611 0.0010294202271337006 0.0011206454587357232
+50.0 10.0 10.0 0.005461 0.0010411164122046102 0.0011391033388129803
+70.0 10.0 10.0 0.005774 8.597889278654617E-4 0.0010068678948983328
+95.0 15.0 15.0 0.002199 4.141261167110932E-4 4.6202685532904905E-4
+180.0 70.0 70.0 3.94E-4 6.630523677206801E-5 7.461745259790098E-5
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294/d27-x01-y01
+Path=/REF/ATLAS_2014_I1306294/d27-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 0.343007 0.06493028286579952 0.0770227622120602
+0.3 0.09999999999999998 0.10000000000000003 0.387994 0.0655121406231226 0.0765024998806512
+0.5 0.09999999999999998 0.09999999999999998 0.442316 0.07247425608158785 0.08206392346989141
+0.7 0.09999999999999998 0.10000000000000009 0.322771 0.05806269016573667 0.06220352405136918
+1.0 0.19999999999999996 0.19999999999999996 0.266841 0.04834102190742222 0.047535018591356316
+1.4 0.19999999999999996 0.20000000000000018 0.186557 0.033072431175901404 0.036228163772138534
+2.05 0.44999999999999973 0.4500000000000002 0.051962 0.013870587988761523 0.017058158297853083
+# END YODA_SCATTER2D
+
diff --git a/data/refdata/ATLAS_2014_I1306294_EL.yoda b/data/refdata/ATLAS_2014_I1306294_EL.yoda
new file mode 100644
--- /dev/null
+++ b/data/refdata/ATLAS_2014_I1306294_EL.yoda
@@ -0,0 +1,160 @@
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d03-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d03-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+25.0 5.0 5.0 0.196336 0.022767002685686488 0.022442750699509982
+40.0 10.0 10.0 0.093764 0.007448250031418763 0.008162343348859616
+62.5 12.5 12.5 0.035126 0.003307914470330743 0.0030853847425376625
+92.5 17.5 17.5 0.01184 0.0014315010480918273 0.0013816970553518597
+155.0 45.0 45.0 0.002142 2.77101624652906E-4 2.9025364621895793E-4
+350.0 150.0 150.0 1.01E-4 2.6031861948965543E-5 2.8642877036010193E-5
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d05-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d05-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 2.995969 0.2502050286854681 0.2540329091205163
+0.3 0.09999999999999998 0.10000000000000003 2.935592 0.2423499662186646 0.2455083521104788
+0.5 0.09999999999999998 0.09999999999999998 3.036675 0.23848341077489127 0.25181855164208083
+0.7 0.09999999999999998 0.10000000000000009 2.706656 0.22988195239694798 0.22508346896326376
+1.0 0.19999999999999996 0.19999999999999996 2.540045 0.20006301281941774 0.20460988758462773
+1.4 0.19999999999999996 0.20000000000000018 2.136394 0.19707166424050004 0.193061234261348
+1.8 0.19999999999999996 0.19999999999999996 1.637068 0.1729878996186148 0.1748039869207207
+2.2 0.20000000000000018 0.19999999999999973 1.21209 0.1527242057761174 0.14945113936795842
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d07-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d07-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 4.328249 0.36738854687035144 0.35840515301630893
+0.3 0.09999999999999998 0.10000000000000003 4.075786 0.34044403930699496 0.3298406468813218
+0.5 0.09999999999999998 0.09999999999999998 4.008528 0.3269278371498574 0.31803065980564604
+0.7 0.09999999999999998 0.10000000000000009 3.789802 0.32872966247795 0.3308358227088082
+1.0 0.19999999999999996 0.19999999999999996 2.781049 0.23059072437308237 0.22663501859483245
+1.4 0.19999999999999996 0.20000000000000018 1.769883 0.16233918363101357 0.16397495579522062
+1.8 0.19999999999999996 0.19999999999999996 0.689683 0.09141848976274591 0.09262596750446248
+2.25 0.25 0.25 0.087968 0.02147261220284261 0.02042288853165655
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d09-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d09-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 2.589466 0.21621296809537036 0.21667456364495416
+0.3 0.09999999999999998 0.10000000000000003 2.561052 0.21860840440486368 0.2217777873995027
+0.5 0.09999999999999998 0.09999999999999998 2.611447 0.21472256724541364 0.21564544422947177
+0.7 0.09999999999999998 0.10000000000000009 2.276628 0.20224505276474977 0.2138479713641168
+1.0 0.19999999999999996 0.19999999999999996 2.151346 0.1772141151664671 0.17902766702700815
+1.4 0.19999999999999996 0.20000000000000018 1.465778 0.11420169794409608 0.11904472841562758
+1.8 0.19999999999999996 0.19999999999999996 1.132133 0.09573334320826167 0.09652587469299748
+2.5 0.5 0.5 0.516235 0.044834437637901334 0.04576046929160248
+4.0 1.0 1.0 0.057008 0.008426920168918616 0.008709303795375427
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d11-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d11-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.25 0.25 0.25 0.258716 0.0344730211411511 0.03659045549750295
+0.75 0.25 0.25 0.321299 0.04080957718260836 0.043362570652852955
+1.25 0.25 0.25 0.428997 0.053340131726965556 0.053734336002738525
+1.75 0.25 0.25 0.680451 0.0692593833935523 0.07565027030372867
+2.2 0.20000000000000018 0.19999999999999973 1.318914 0.12003168137137961 0.12572599157414382
+2.6 0.20000000000000018 0.19999999999999973 2.662429 0.20994143217637612 0.2241724111237131
+2.9 0.10000000000000009 0.10000000000000009 5.325996 0.3725963416767095 0.39016147065873164
+3.070795 0.07079499999999994 0.07079499999999994 7.152546 0.5176002564632969 0.5388910622031428
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d13-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d13-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.5 0.5 0.5 0.118174 0.014808729989338689 0.014197242582948176
+1.25 0.25 0.25 0.291116 0.03823580093309602 0.038326833833866945
+1.75 0.25 0.25 0.479091 0.053722851714553134 0.05427982630226803
+2.25 0.25 0.25 1.030053 0.09660650673504886 0.09438798925871045
+2.75 0.25 0.25 2.77592 0.21067934404788868 0.21015199483058047
+3.25 0.25 0.25 3.315635 0.22790088986624232 0.23297686294194836
+3.75 0.25 0.25 0.689159 0.056576973901365166 0.05652066019367535
+4.25 0.25 0.25 0.194539 0.02980817439669081 0.03157319424774677
+5.25 0.75 0.75 0.014181 0.00423741948213287 0.004300101990031343
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d15-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d15-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+10.0 10.0 10.0 0.038103 0.0049386221247694605 0.004374931306206956
+25.0 5.0 5.0 0.077657 0.006873627207595276 0.007200043624535636
+35.0 5.0 5.0 0.088236 0.007052617981193254 0.007271428060409394
+50.0 10.0 10.0 0.056102 0.004085658723838573 0.004170080944310314
+70.0 10.0 10.0 0.031694 0.002216888253705874 0.0022395012695023415
+95.0 15.0 15.0 0.013054 0.0011267538000508364 0.0011290804483923191
+155.0 45.0 45.0 0.002612 2.114340995033677E-4 2.1631681355715278E-4
+350.0 150.0 150.0 1.43E-4 1.989508281787236E-5 2.0539524630575075E-5
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d17-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d17-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 3.077631 0.25726675788078296 0.24764853516322485
+0.3 0.09999999999999998 0.10000000000000003 3.135388 0.26213437578787635 0.2638590610187275
+0.5 0.09999999999999998 0.09999999999999998 3.087059 0.2657964969859084 0.2618162747323344
+0.7 0.09999999999999998 0.10000000000000009 2.80385 0.23395771990181527 0.23395771990181527
+1.0 0.19999999999999996 0.19999999999999996 2.543685 0.20424460061212973 0.2093012277174697
+1.4 0.19999999999999996 0.20000000000000018 1.94042 0.16099418122118603 0.16496366250614766
+1.8 0.19999999999999996 0.19999999999999996 1.107935 0.12274304788507884 0.1210094296178372
+2.25 0.25 0.25 0.332534 0.0454225498641802 0.044843982269960926
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d21-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d21-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.775 0.375 0.3749999999999999 0.137041 0.018809220473416707 0.023622533553299376
+1.525 0.375 0.375 0.115713 0.021170236488912513 0.02234154649052398
+2.15 0.25 0.25 0.151525 0.031117292203442223 0.03396421974511283
+2.6 0.20000000000000018 0.19999999999999973 0.177919 0.03396883299763693 0.038858715259500694
+3.0 0.20000000000000018 0.20000000000000018 0.231309 0.04204910700533564 0.046064727110737516
+4.1 0.8999999999999995 0.9000000000000004 0.059726 0.009801043879217286 0.010137761470412524
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d23-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d23-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+27.5 17.5 17.5 0.00246 3.85041868710404E-4 4.387361551456638E-4
+65.0 20.0 20.0 0.00391 6.693157141110614E-4 7.285736037141341E-4
+100.0 15.0 15.0 0.003441 5.900896647853442E-4 6.312958794462703E-4
+140.0 25.0 25.0 0.00202 3.3106561355719205E-4 4.2444579164835637E-4
+257.5 92.5 92.5 4.76E-4 9.232467581313243E-5 1.0148232926455719E-4
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d25-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d25-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+10.0 10.0 10.0 0.003396 7.842258175089111E-4 8.062115443968289E-4
+30.0 10.0 10.0 0.00611 0.0010294202271337006 0.0011206454587357232
+50.0 10.0 10.0 0.005461 0.0010411164122046102 0.0011391033388129803
+70.0 10.0 10.0 0.005774 8.597889278654617E-4 0.0010068678948983328
+95.0 15.0 15.0 0.002199 4.141261167110932E-4 4.6202685532904905E-4
+180.0 70.0 70.0 3.94E-4 6.630523677206801E-5 7.461745259790098E-5
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_EL/d27-x01-y01
+Path=/REF/ATLAS_2014_I1306294_EL/d27-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 0.343007 0.06493028286579952 0.0770227622120602
+0.3 0.09999999999999998 0.10000000000000003 0.387994 0.0655121406231226 0.0765024998806512
+0.5 0.09999999999999998 0.09999999999999998 0.442316 0.07247425608158785 0.08206392346989141
+0.7 0.09999999999999998 0.10000000000000009 0.322771 0.05806269016573667 0.06220352405136918
+1.0 0.19999999999999996 0.19999999999999996 0.266841 0.04834102190742222 0.047535018591356316
+1.4 0.19999999999999996 0.20000000000000018 0.186557 0.033072431175901404 0.036228163772138534
+2.05 0.44999999999999973 0.4500000000000002 0.051962 0.013870587988761523 0.017058158297853083
+# END YODA_SCATTER2D
+
diff --git a/data/refdata/ATLAS_2014_I1306294_MU.yoda b/data/refdata/ATLAS_2014_I1306294_MU.yoda
new file mode 100644
--- /dev/null
+++ b/data/refdata/ATLAS_2014_I1306294_MU.yoda
@@ -0,0 +1,160 @@
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d03-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d03-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+25.0 5.0 5.0 0.196336 0.022767002685686488 0.022442750699509982
+40.0 10.0 10.0 0.093764 0.007448250031418763 0.008162343348859616
+62.5 12.5 12.5 0.035126 0.003307914470330743 0.0030853847425376625
+92.5 17.5 17.5 0.01184 0.0014315010480918273 0.0013816970553518597
+155.0 45.0 45.0 0.002142 2.77101624652906E-4 2.9025364621895793E-4
+350.0 150.0 150.0 1.01E-4 2.6031861948965543E-5 2.8642877036010193E-5
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d05-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d05-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 2.995969 0.2502050286854681 0.2540329091205163
+0.3 0.09999999999999998 0.10000000000000003 2.935592 0.2423499662186646 0.2455083521104788
+0.5 0.09999999999999998 0.09999999999999998 3.036675 0.23848341077489127 0.25181855164208083
+0.7 0.09999999999999998 0.10000000000000009 2.706656 0.22988195239694798 0.22508346896326376
+1.0 0.19999999999999996 0.19999999999999996 2.540045 0.20006301281941774 0.20460988758462773
+1.4 0.19999999999999996 0.20000000000000018 2.136394 0.19707166424050004 0.193061234261348
+1.8 0.19999999999999996 0.19999999999999996 1.637068 0.1729878996186148 0.1748039869207207
+2.2 0.20000000000000018 0.19999999999999973 1.21209 0.1527242057761174 0.14945113936795842
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d07-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d07-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 4.328249 0.36738854687035144 0.35840515301630893
+0.3 0.09999999999999998 0.10000000000000003 4.075786 0.34044403930699496 0.3298406468813218
+0.5 0.09999999999999998 0.09999999999999998 4.008528 0.3269278371498574 0.31803065980564604
+0.7 0.09999999999999998 0.10000000000000009 3.789802 0.32872966247795 0.3308358227088082
+1.0 0.19999999999999996 0.19999999999999996 2.781049 0.23059072437308237 0.22663501859483245
+1.4 0.19999999999999996 0.20000000000000018 1.769883 0.16233918363101357 0.16397495579522062
+1.8 0.19999999999999996 0.19999999999999996 0.689683 0.09141848976274591 0.09262596750446248
+2.25 0.25 0.25 0.087968 0.02147261220284261 0.02042288853165655
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d09-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d09-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 2.589466 0.21621296809537036 0.21667456364495416
+0.3 0.09999999999999998 0.10000000000000003 2.561052 0.21860840440486368 0.2217777873995027
+0.5 0.09999999999999998 0.09999999999999998 2.611447 0.21472256724541364 0.21564544422947177
+0.7 0.09999999999999998 0.10000000000000009 2.276628 0.20224505276474977 0.2138479713641168
+1.0 0.19999999999999996 0.19999999999999996 2.151346 0.1772141151664671 0.17902766702700815
+1.4 0.19999999999999996 0.20000000000000018 1.465778 0.11420169794409608 0.11904472841562758
+1.8 0.19999999999999996 0.19999999999999996 1.132133 0.09573334320826167 0.09652587469299748
+2.5 0.5 0.5 0.516235 0.044834437637901334 0.04576046929160248
+4.0 1.0 1.0 0.057008 0.008426920168918616 0.008709303795375427
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d11-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d11-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.25 0.25 0.25 0.258716 0.0344730211411511 0.03659045549750295
+0.75 0.25 0.25 0.321299 0.04080957718260836 0.043362570652852955
+1.25 0.25 0.25 0.428997 0.053340131726965556 0.053734336002738525
+1.75 0.25 0.25 0.680451 0.0692593833935523 0.07565027030372867
+2.2 0.20000000000000018 0.19999999999999973 1.318914 0.12003168137137961 0.12572599157414382
+2.6 0.20000000000000018 0.19999999999999973 2.662429 0.20994143217637612 0.2241724111237131
+2.9 0.10000000000000009 0.10000000000000009 5.325996 0.3725963416767095 0.39016147065873164
+3.070795 0.07079499999999994 0.07079499999999994 7.152546 0.5176002564632969 0.5388910622031428
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d13-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d13-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.5 0.5 0.5 0.118174 0.014808729989338689 0.014197242582948176
+1.25 0.25 0.25 0.291116 0.03823580093309602 0.038326833833866945
+1.75 0.25 0.25 0.479091 0.053722851714553134 0.05427982630226803
+2.25 0.25 0.25 1.030053 0.09660650673504886 0.09438798925871045
+2.75 0.25 0.25 2.77592 0.21067934404788868 0.21015199483058047
+3.25 0.25 0.25 3.315635 0.22790088986624232 0.23297686294194836
+3.75 0.25 0.25 0.689159 0.056576973901365166 0.05652066019367535
+4.25 0.25 0.25 0.194539 0.02980817439669081 0.03157319424774677
+5.25 0.75 0.75 0.014181 0.00423741948213287 0.004300101990031343
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d15-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d15-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+10.0 10.0 10.0 0.038103 0.0049386221247694605 0.004374931306206956
+25.0 5.0 5.0 0.077657 0.006873627207595276 0.007200043624535636
+35.0 5.0 5.0 0.088236 0.007052617981193254 0.007271428060409394
+50.0 10.0 10.0 0.056102 0.004085658723838573 0.004170080944310314
+70.0 10.0 10.0 0.031694 0.002216888253705874 0.0022395012695023415
+95.0 15.0 15.0 0.013054 0.0011267538000508364 0.0011290804483923191
+155.0 45.0 45.0 0.002612 2.114340995033677E-4 2.1631681355715278E-4
+350.0 150.0 150.0 1.43E-4 1.989508281787236E-5 2.0539524630575075E-5
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d17-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d17-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 3.077631 0.25726675788078296 0.24764853516322485
+0.3 0.09999999999999998 0.10000000000000003 3.135388 0.26213437578787635 0.2638590610187275
+0.5 0.09999999999999998 0.09999999999999998 3.087059 0.2657964969859084 0.2618162747323344
+0.7 0.09999999999999998 0.10000000000000009 2.80385 0.23395771990181527 0.23395771990181527
+1.0 0.19999999999999996 0.19999999999999996 2.543685 0.20424460061212973 0.2093012277174697
+1.4 0.19999999999999996 0.20000000000000018 1.94042 0.16099418122118603 0.16496366250614766
+1.8 0.19999999999999996 0.19999999999999996 1.107935 0.12274304788507884 0.1210094296178372
+2.25 0.25 0.25 0.332534 0.0454225498641802 0.044843982269960926
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d21-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d21-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.775 0.375 0.3749999999999999 0.137041 0.018809220473416707 0.023622533553299376
+1.525 0.375 0.375 0.115713 0.021170236488912513 0.02234154649052398
+2.15 0.25 0.25 0.151525 0.031117292203442223 0.03396421974511283
+2.6 0.20000000000000018 0.19999999999999973 0.177919 0.03396883299763693 0.038858715259500694
+3.0 0.20000000000000018 0.20000000000000018 0.231309 0.04204910700533564 0.046064727110737516
+4.1 0.8999999999999995 0.9000000000000004 0.059726 0.009801043879217286 0.010137761470412524
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d23-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d23-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+27.5 17.5 17.5 0.00246 3.85041868710404E-4 4.387361551456638E-4
+65.0 20.0 20.0 0.00391 6.693157141110614E-4 7.285736037141341E-4
+100.0 15.0 15.0 0.003441 5.900896647853442E-4 6.312958794462703E-4
+140.0 25.0 25.0 0.00202 3.3106561355719205E-4 4.2444579164835637E-4
+257.5 92.5 92.5 4.76E-4 9.232467581313243E-5 1.0148232926455719E-4
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d25-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d25-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+10.0 10.0 10.0 0.003396 7.842258175089111E-4 8.062115443968289E-4
+30.0 10.0 10.0 0.00611 0.0010294202271337006 0.0011206454587357232
+50.0 10.0 10.0 0.005461 0.0010411164122046102 0.0011391033388129803
+70.0 10.0 10.0 0.005774 8.597889278654617E-4 0.0010068678948983328
+95.0 15.0 15.0 0.002199 4.141261167110932E-4 4.6202685532904905E-4
+180.0 70.0 70.0 3.94E-4 6.630523677206801E-5 7.461745259790098E-5
+# END YODA_SCATTER2D
+
+# BEGIN YODA_SCATTER2D /REF/ATLAS_2014_I1306294_MU/d27-x01-y01
+Path=/REF/ATLAS_2014_I1306294_MU/d27-x01-y01
+Type=Scatter2D
+# xval xerr- xerr+ yval yerr- yerr+
+0.1 0.1 0.1 0.343007 0.06493028286579952 0.0770227622120602
+0.3 0.09999999999999998 0.10000000000000003 0.387994 0.0655121406231226 0.0765024998806512
+0.5 0.09999999999999998 0.09999999999999998 0.442316 0.07247425608158785 0.08206392346989141
+0.7 0.09999999999999998 0.10000000000000009 0.322771 0.05806269016573667 0.06220352405136918
+1.0 0.19999999999999996 0.19999999999999996 0.266841 0.04834102190742222 0.047535018591356316
+1.4 0.19999999999999996 0.20000000000000018 0.186557 0.033072431175901404 0.036228163772138534
+2.05 0.44999999999999973 0.4500000000000002 0.051962 0.013870587988761523 0.017058158297853083
+# END YODA_SCATTER2D
+
diff --git a/data/refdata/Makefile.am b/data/refdata/Makefile.am
--- a/data/refdata/Makefile.am
+++ b/data/refdata/Makefile.am
@@ -1,243 +1,249 @@
dist_pkgdata_DATA = \
ALEPH_1991_S2435284.yoda \
ALEPH_1996_S3486095.yoda \
ALEPH_1996_S3196992.yoda \
ALEPH_1999_S4193598.yoda \
ALEPH_2001_S4656318.yoda \
ALEPH_2002_S4823664.yoda \
ALEPH_2004_S5765862.yoda \
ALICE_2010_S8624100.yoda \
ALICE_2010_S8625980.yoda \
ALICE_2010_S8706239.yoda \
ALICE_2011_S8909580.yoda \
ALICE_2011_S8945144.yoda \
ALICE_2012_I1181770.yoda \
ARGUS_1993_S2653028.yoda \
ARGUS_1993_S2669951.yoda \
ARGUS_1993_S2789213.yoda \
ATLAS_2010_S8591806.yoda \
ATLAS_2010_S8817804.yoda \
ATLAS_2010_S8894728.yoda \
ATLAS_2010_S8914702.yoda \
ATLAS_2010_S8918562.yoda \
ATLAS_2010_S8919674.yoda \
ATLAS_2011_S8924791.yoda \
ATLAS_2011_S8971293.yoda \
ATLAS_2011_S8994773.yoda \
ATLAS_2011_S9002537.yoda \
ATLAS_2010_CONF_2010_049.yoda \
ATLAS_2011_S9120807.yoda \
ATLAS_2011_S9126244.yoda \
ATLAS_2011_S9128077.yoda \
ATLAS_2011_S9131140.yoda \
ATLAS_2011_I894867.yoda \
ATLAS_2011_S9035664.yoda \
ATLAS_2011_I919017.yoda \
ATLAS_2011_I921594.yoda \
ATLAS_2011_I925932.yoda \
ATLAS_2011_I926145.yoda \
ATLAS_2011_I930220.yoda \
ATLAS_2011_I944826.yoda \
ATLAS_2011_I945498.yoda \
ATLAS_2011_I954993.yoda \
ATLAS_2011_S9225137.yoda \
ATLAS_2011_S9212183.yoda \
ATLAS_2012_I1082936.yoda \
ATLAS_2012_I1083318.yoda \
ATLAS_2012_I1084540.yoda \
ATLAS_2012_I1091481.yoda \
ATLAS_2012_I1093734.yoda \
ATLAS_2012_I1093738.yoda \
ATLAS_2012_I1094061.yoda \
ATLAS_2012_I1094564.yoda \
ATLAS_2012_I1094568.yoda \
ATLAS_2012_I943401.yoda \
ATLAS_2012_I1082009.yoda \
ATLAS_2012_I1118269.yoda \
ATLAS_2012_I1119557.yoda \
ATLAS_2012_I1124167.yoda \
ATLAS_2012_I1125575.yoda \
ATLAS_2012_I1183818.yoda \
ATLAS_2012_I1188891.yoda \
ATLAS_2012_I1199269.yoda \
ATLAS_2012_CONF_2012_001.yoda \
ATLAS_2012_I1203852.yoda \
ATLAS_2012_I1204784.yoda \
ATLAS_2013_I1190187.yoda \
ATLAS_2013_I1217867.yoda \
+ ATLAS_2013_I1219109.yoda \
+ ATLAS_2013_I1219109_EL.yoda \
+ ATLAS_2013_I1219109_MU.yoda \
ATLAS_2013_I1230812.yoda \
ATLAS_2013_I1230812_EL.yoda \
ATLAS_2013_I1230812_MU.yoda \
ATLAS_2013_I1243871.yoda \
ATLAS_2013_I1263495.yoda \
ATLAS_2014_I1268975.yoda \
ATLAS_2014_I1279489.yoda \
ATLAS_2014_I1282441.yoda \
ATLAS_2014_I1298811.yoda \
ATLAS_2014_I1304688.yoda \
ATLAS_2014_I1307756.yoda \
+ ATLAS_2014_I1306294.yoda \
+ ATLAS_2014_I1306294_EL.yoda \
+ ATLAS_2014_I1306294_MU.yoda \
BABAR_2003_I593379.yoda \
BABAR_2005_S6181155.yoda \
BABAR_2006_S6511112.yoda \
BABAR_2007_S6895344.yoda \
BABAR_2007_S7266081.yoda \
BELLE_2001_S4598261.yoda \
BELLE_2008_I786560.yoda \
BELLE_2013_I1216515.yoda \
BABAR_2013_I1238276.yoda \
CLEO_2001_S4557530.yoda \
CLEO_2004_S5809304.yoda \
CMS_2010_S8547297.yoda \
CMS_2010_S8656010.yoda \
CMS_2011_S8884919.yoda \
CMS_2011_S8941262.yoda \
CMS_2011_S8950903.yoda \
CMS_2011_S8957746.yoda \
CMS_2011_S8968497.yoda \
CMS_2011_S8973270.yoda \
CMS_2011_S8978280.yoda \
CMS_2011_S9086218.yoda \
CMS_2011_S9088458.yoda \
CMS_2011_S9120041.yoda \
CMS_2011_S9215166.yoda \
CMS_2012_I941555.yoda \
CMS_2011_I954992.yoda \
CMS_2012_I1087342.yoda \
CMS_2012_I1090423.yoda \
CMS_2012_I1102908.yoda \
CMS_2012_I1107658.yoda \
CMS_2012_I1184941.yoda \
CMS_2012_I1193338.yoda \
CMS_2013_I1209721.yoda \
CMS_2013_I1218372.yoda \
CMS_2013_I1224539_DIJET.yoda \
CMS_2013_I1224539_WJET.yoda \
CMS_2013_I1224539_ZJET.yoda \
CMS_2013_I1256943.yoda \
CMS_2013_I1258128.yoda \
CMS_2013_I1261026.yoda \
CMS_2013_I1265659.yoda \
CMS_2013_I1272853.yoda \
CMS_2013_I1273574.yoda \
CMSTOTEM_2014_I1294140.yoda \
CMS_2012_PAS_QCD_11_010.yoda \
CMS_QCD_10_024.yoda \
LHCB_2010_S8758301.yoda \
LHCB_2010_I867355.yoda \
LHCB_2011_I917009.yoda \
LHCB_2011_I919315.yoda \
LHCB_2012_I1119400.yoda \
LHCB_2013_I1208105.yoda \
LHCB_2013_I1218996.yoda \
LHCF_2012_I1115479.yoda \
DELPHI_1994_S3021912.yoda \
DELPHI_1995_S3137023.yoda \
DELPHI_1996_S3430090.yoda \
DELPHI_1999_S3960137.yoda \
DELPHI_2000_S4328825.yoda \
DELPHI_2002_069_CONF_603.yoda \
DELPHI_2003_WUD_03_11.yoda \
OPAL_1993_S2692198.yoda \
OPAL_1994_S2927284.yoda \
OPAL_1995_S3198391.yoda \
OPAL_1996_S3257789.yoda \
OPAL_1997_S3396100.yoda \
OPAL_1997_S3608263.yoda \
OPAL_1998_S3702294.yoda \
OPAL_1998_S3780481.yoda \
OPAL_1998_S3749908.yoda \
OPAL_2000_S4418603.yoda \
OPAL_2001_S4553896.yoda \
OPAL_2002_S5361494.yoda \
OPAL_2004_S6132243.yoda \
JADE_OPAL_2000_S4300807.yoda \
JADE_1998_S3612880.yoda \
TASSO_1990_S2148048.yoda \
H1_1994_S2919893.yoda \
H1_1995_S3167097.yoda \
H1_2000_S4129130.yoda \
H1_2007_S7223935.yoda \
ZEUS_2001_S4815815.yoda \
PHENIX_2003_S5538505.yoda \
STAR_2006_S6500200.yoda \
STAR_2006_S6860818.yoda \
STAR_2006_S6870392.yoda \
STAR_2008_S7993412.yoda \
STAR_2009_UE_HELEN.yoda \
BRAHMS_2007_S7052448.yoda \
UA1_1990_S2044935.yoda \
UA5_1982_S875503.yoda \
UA5_1986_S1583476.yoda \
UA5_1989_S1926373.yoda \
UA5_1988_S1867512.yoda \
UA5_1987_S1640666.yoda \
CDF_1988_S1865951.yoda \
CDF_1990_S2089246.yoda \
CDF_1993_S2742446.yoda \
CDF_1994_S2952106.yoda \
CDF_1996_S3108457.yoda \
CDF_1996_S3349578.yoda \
CDF_1996_S3418421.yoda \
CDF_1997_S3541940.yoda \
CDF_1998_S3618439.yoda \
CDF_2000_S4155203.yoda \
CDF_2000_S4266730.yoda \
CDF_2001_S4517016.yoda \
CDF_2001_S4563131.yoda \
CDF_2001_S4751469.yoda \
CDF_2002_S4796047.yoda \
CDF_2004_S5839831.yoda \
CDF_2005_S6080774.yoda \
CDF_2005_S6217184.yoda \
CDF_2006_S6450792.yoda \
CDF_2006_S6653332.yoda \
CDF_2007_S7057202.yoda \
CDF_2008_S7541902.yoda \
CDF_2008_S7554427.yoda \
CDF_2008_S7540469.yoda \
CDF_2008_S7782535.yoda \
CDF_2008_S7828950.yoda \
CDF_2008_S8093652.yoda \
CDF_2008_S8095620.yoda \
CDF_2009_S8233977.yoda \
CDF_2009_NOTE_9936.yoda \
CDF_2009_S8383952.yoda \
CDF_2009_S8436959.yoda \
CDF_2010_S8591881_DY.yoda \
CDF_2010_S8591881_QCD.yoda \
CDF_2012_NOTE10874.yoda \
D0_1996_S3214044.yoda \
D0_1996_S3324664.yoda \
D0_2000_S4480767.yoda \
D0_2000_I499943.yoda \
D0_2001_S4674421.yoda \
D0_2004_S5992206.yoda \
D0_2006_S6438750.yoda \
D0_2007_S7075677.yoda \
D0_2008_S6879055.yoda \
D0_2008_S7554427.yoda \
D0_2008_S7662670.yoda \
D0_2008_S7719523.yoda \
D0_2008_S7837160.yoda \
D0_2008_S7863608.yoda \
D0_2009_S8202443.yoda \
D0_2009_S8320160.yoda \
D0_2009_S8349509.yoda \
D0_2010_S8566488.yoda \
D0_2010_S8570965.yoda \
D0_2010_S8671338.yoda \
D0_2010_S8821313.yoda \
D0_2011_I895662.yoda \
E735_1992_S2485869.yoda \
E735_1993_S2896508.yoda \
E735_1998_S3905616.yoda \
SFM_1984_S1178091.yoda \
SLD_1996_S3398250.yoda \
SLD_1999_S3743934.yoda \
SLD_2002_S4869273.yoda \
SLD_2004_S5693039.yoda \
STAR_2008_S7869363.yoda \
TOTEM_2012_I1115294.yoda \
TOTEM_2012_002.yoda \
PDG_HADRON_MULTIPLICITIES.yoda \
PDG_HADRON_MULTIPLICITIES_RATIOS.yoda
diff --git a/include/Rivet/Analysis.hh b/include/Rivet/Analysis.hh
--- a/include/Rivet/Analysis.hh
+++ b/include/Rivet/Analysis.hh
@@ -1,903 +1,902 @@
// -*- C++ -*-
#ifndef RIVET_Analysis_HH
#define RIVET_Analysis_HH
#include "Rivet/Config/RivetCommon.hh"
#include "Rivet/AnalysisInfo.hh"
#include "Rivet/Event.hh"
#include "Rivet/Projection.hh"
#include "Rivet/ProjectionApplier.hh"
#include "Rivet/ProjectionHandler.hh"
#include "Rivet/AnalysisLoader.hh"
#include "Rivet/Tools/RivetYODA.hh"
#include "Rivet/Tools/Logging.hh"
#include "Rivet/Tools/ParticleUtils.hh"
/// @def vetoEvent
/// Preprocessor define for vetoing events, including the log message and return.
#define vetoEvent \
do { MSG_DEBUG("Vetoing event on line " << __LINE__ << " of " << __FILE__); return; } while(0)
namespace Rivet {
// Forward declaration
class AnalysisHandler;
/// @brief This is the base class of all analysis classes in Rivet.
///
/// There are
/// three virtual functions which should be implemented in base classes:
///
/// void init() is called by Rivet before a run is started. Here the
/// analysis class should book necessary histograms. The needed
/// projections should probably rather be constructed in the
/// constructor.
///
/// void analyze(const Event&) is called once for each event. Here the
/// analysis class should apply the necessary Projections and fill the
/// histograms.
///
/// void finalize() is called after a run is finished. Here the analysis
/// class should do whatever manipulations are necessary on the
/// histograms. Writing the histograms to a file is, however, done by
/// the Rivet class.
class Analysis : public ProjectionApplier {
/// The AnalysisHandler is a friend.
friend class AnalysisHandler;
public:
/// @name Standard constructors and destructors.
//@{
// /// The default constructor.
// Analysis();
/// Constructor
Analysis(const std::string& name);
/// The destructor.
virtual ~Analysis() {}
//@}
public:
/// @name Main analysis methods
//@{
/// Initialize this analysis object. A concrete class should here
/// book all necessary histograms. An overridden function must make
/// sure it first calls the base class function.
virtual void init() { }
/// Analyze one event. A concrete class should here apply the
/// necessary projections on the \a event and fill the relevant
/// histograms. An overridden function must make sure it first calls
/// the base class function.
virtual void analyze(const Event& event) = 0;
/// Finalize this analysis object. A concrete class should here make
/// all necessary operations on the histograms. Writing the
/// histograms to a file is, however, done by the Rivet class. An
/// overridden function must make sure it first calls the base class
/// function.
virtual void finalize() { }
//@}
public:
/// @name Metadata
/// Metadata is used for querying from the command line and also for
/// building web pages and the analysis pages in the Rivet manual.
//@{
/// Get the actual AnalysisInfo object in which all this metadata is stored.
const AnalysisInfo& info() const {
assert(_info.get() != 0 && "No AnalysisInfo object :O");
return *_info;
}
/// @brief Get the name of the analysis.
///
/// By default this is computed by combining the results of the experiment,
/// year and Spires ID metadata methods and you should only override it if
/// there's a good reason why those won't work.
virtual std::string name() const {
return (info().name().empty()) ? _defaultname : info().name();
}
/// Get the Inspire ID code for this analysis.
virtual std::string inspireId() const {
return info().inspireId();
}
/// Get the SPIRES ID code for this analysis (~deprecated).
virtual std::string spiresId() const {
return info().spiresId();
}
/// @brief Names & emails of paper/analysis authors.
///
/// Names and email of authors in 'NAME \<EMAIL\>' format. The first
/// name in the list should be the primary contact person.
virtual std::vector<std::string> authors() const {
return info().authors();
}
/// @brief Get a short description of the analysis.
///
/// Short (one sentence) description used as an index entry.
/// Use @a description() to provide full descriptive paragraphs
/// of analysis details.
virtual std::string summary() const {
return info().summary();
}
/// @brief Get a full description of the analysis.
///
/// Full textual description of this analysis, what it is useful for,
/// what experimental techniques are applied, etc. Should be treated
/// as a chunk of restructuredText (http://docutils.sourceforge.net/rst.html),
/// with equations to be rendered as LaTeX with amsmath operators.
virtual std::string description() const {
return info().description();
}
/// @brief Information about the events needed as input for this analysis.
///
/// Event types, energies, kinematic cuts, particles to be considered
/// stable, etc. etc. Should be treated as a restructuredText bullet list
/// (http://docutils.sourceforge.net/rst.html)
virtual std::string runInfo() const {
return info().runInfo();
}
/// Experiment which performed and published this analysis.
virtual std::string experiment() const {
return info().experiment();
}
/// Collider on which the experiment ran.
virtual std::string collider() const {
return info().collider();
}
/// When the original experimental analysis was published.
virtual std::string year() const {
return info().year();
}
/// Journal, and preprint references.
virtual std::vector<std::string> references() const {
return info().references();
}
/// BibTeX citation key for this article.
virtual std::string bibKey() const {
return info().bibKey();
}
/// BibTeX citation entry for this article.
virtual std::string bibTeX() const {
return info().bibTeX();
}
/// Whether this analysis is trusted (in any way!)
virtual std::string status() const {
return (info().status().empty()) ? "UNVALIDATED" : info().status();
}
/// Any work to be done on this analysis.
virtual std::vector<std::string> todos() const {
return info().todos();
}
/// Return the allowed pairs of incoming beams required by this analysis.
virtual const std::vector<PdgIdPair>& requiredBeams() const {
return info().beams();
}
/// Declare the allowed pairs of incoming beams required by this analysis.
virtual Analysis& setRequiredBeams(const std::vector<PdgIdPair>& requiredBeams) {
info().setBeams(requiredBeams);
return *this;
}
/// Sets of valid beam energy pairs, in GeV
virtual const std::vector<std::pair<double, double> >& requiredEnergies() const {
return info().energies();
}
/// Declare the list of valid beam energy pairs, in GeV
virtual Analysis& setRequiredEnergies(const std::vector<std::pair<double, double> >& requiredEnergies) {
info().setEnergies(requiredEnergies);
return *this;
}
/// Return true if this analysis needs to know the process cross-section.
/// @todo Remove this and require HepMC >= 2.06
bool needsCrossSection() const {
return info().needsCrossSection();
}
/// Declare whether this analysis needs to know the process cross-section from the generator.
/// @todo Remove this and require HepMC >= 2.06
Analysis& setNeedsCrossSection(bool needed=true) {
info().setNeedsCrossSection(needed);
return *this;
}
//@}
/// @name Internal metadata modifying methods
//@{
/// Get the actual AnalysisInfo object in which all this metadata is stored (non-const).
AnalysisInfo& info() {
assert(_info.get() != 0 && "No AnalysisInfo object :O");
return *_info;
}
//@}
/// @name Run conditions
//@{
/// Incoming beams for this run
const ParticlePair& beams() const;
/// Incoming beam IDs for this run
const PdgIdPair beamIds() const;
/// Centre of mass energy for this run
double sqrtS() const;
//@}
/// @name Analysis / beam compatibility testing
//@{
/// Check if analysis is compatible with the provided beam particle IDs and energies
bool isCompatible(const ParticlePair& beams) const;
/// Check if analysis is compatible with the provided beam particle IDs and energies
bool isCompatible(PdgId beam1, PdgId beam2, double e1, double e2) const;
/// Check if analysis is compatible with the provided beam particle IDs and energies
bool isCompatible(const PdgIdPair& beams, const std::pair<double,double>& energies) const;
//@}
/// Set the cross section from the generator
Analysis& setCrossSection(double xs);
/// Access the controlling AnalysisHandler object.
AnalysisHandler& handler() const { return *_analysishandler; }
protected:
/// Get a Log object based on the name() property of the calling analysis object.
Log& getLog() const;
/// Get the process cross-section in pb. Throws if this hasn't been set.
double crossSection() const;
/// Get the process cross-section per generated event in pb. Throws if this
/// hasn't been set.
double crossSectionPerEvent() const;
/// Get the number of events seen (via the analysis handler). Use in the
/// finalize phase only.
size_t numEvents() const;
/// Get the sum of event weights seen (via the analysis handler). Use in the
/// finalize phase only.
double sumOfWeights() const;
protected:
/// @name Histogram paths
//@{
/// Get the canonical histogram "directory" path for this analysis.
const std::string histoDir() const;
/// Get the canonical histogram path for the named histogram in this analysis.
const std::string histoPath(const std::string& hname) const;
/// Get the canonical histogram path for the numbered histogram in this analysis.
const std::string histoPath(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) const;
/// Get the internal histogram name for given d, x and y (cf. HepData)
const std::string makeAxisCode(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) const;
//@}
/// @name Histogram reference data
//@{
/// Get reference data for a named histo
const YODA::Scatter2D& refData(const string& hname) const;
/// Get reference data for a numbered histo
const YODA::Scatter2D& refData(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) const;
/// @todo Provide 3D versions as well? (How to distinguish the signatures? Template magic or explicit name?)
//@}
/// @name 1D histogram booking
//@{
/// Book a 1D histogram with @a nbins uniformly distributed across the range @a lower - @a upper .
Histo1DPtr bookHisto1D(const std::string& name,
size_t nbins, double lower, double upper,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// Book a 1D histogram with non-uniform bins defined by the vector of bin edges @a binedges .
Histo1DPtr bookHisto1D(const std::string& name,
const std::vector<double>& binedges,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// Book a 1D histogram with binning from a reference scatter.
Histo1DPtr bookHisto1D(const std::string& name,
const Scatter2D& refscatter,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// Book a 1D histogram, using the binnings in the reference data histogram.
Histo1DPtr bookHisto1D(const std::string& name,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// Book a 1D histogram, using the binnings in the reference data histogram.
///
/// The paper, dataset and x/y-axis IDs will be used to build the histo name in the HepData standard way.
Histo1DPtr bookHisto1D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
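/// @par Example
/// Illustrative booking calls for an analysis' init() method (the member
/// names _h_njets and _h_pt are hypothetical):
/// @code
/// // Book with an explicit uniform binning...
/// _h_njets = bookHisto1D("njets", 10, -0.5, 9.5);
/// // ...or auto-book from the reference data for HepData table d01-x01-y01:
/// _h_pt = bookHisto1D(1, 1, 1);
/// @endcode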
//@}
/// @name 2D histogram booking
//@{
/// Book a 2D histogram with @a nxbins and @a nybins uniformly
/// distributed across the ranges @a xlower - @a xupper and @a
/// ylower - @a yupper respectively along the x- and y-axis.
Histo2DPtr bookHisto2D(const std::string& name,
size_t nxbins, double xlower, double xupper,
size_t nybins, double ylower, double yupper,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="",
const std::string& ztitle="");
/// Book a 2D histogram with non-uniform bins defined by the
/// vectors of bin edges @a xbinedges and @a ybinedges.
Histo2DPtr bookHisto2D(const std::string& name,
const std::vector<double>& xbinedges,
const std::vector<double>& ybinedges,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="",
const std::string& ztitle="");
// /// Book a 2D histogram with binning from a reference scatter.
// Histo2DPtr bookHisto2D(const std::string& name,
// const Scatter3D& refscatter,
// const std::string& title="",
// const std::string& xtitle="",
// const std::string& ytitle="",
// const std::string& ztitle="");
// /// Book a 2D histogram, using the binnings in the reference data histogram.
// Histo2DPtr bookHisto2D(const std::string& name,
// const std::string& title="",
// const std::string& xtitle="",
// const std::string& ytitle="",
// const std::string& ztitle="");
// /// Book a 2D histogram, using the binnings in the reference data histogram.
// ///
// /// The paper, dataset and x/y-axis IDs will be used to build the histo name in the HepData standard way.
// Histo2DPtr bookHisto2D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId,
// const std::string& title="",
// const std::string& xtitle="",
// const std::string& ytitle="",
// const std::string& ztitle="");
//@}
/// @name 1D profile histogram booking
//@{
/// Book a 1D profile histogram with @a nbins uniformly distributed across the range @a lower - @a upper .
Profile1DPtr bookProfile1D(const std::string& name,
size_t nbins, double lower, double upper,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// Book a 1D profile histogram with non-uniform bins defined by the vector of bin edges @a binedges .
Profile1DPtr bookProfile1D(const std::string& name,
const std::vector<double>& binedges,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// Book a 1D profile histogram with binning from a reference scatter.
Profile1DPtr bookProfile1D(const std::string& name,
const Scatter2D& refscatter,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// Book a 1D profile histogram, using the binnings in the reference data histogram.
Profile1DPtr bookProfile1D(const std::string& name,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// Book a 1D profile histogram, using the binnings in the reference data histogram.
///
/// The paper, dataset and x/y-axis IDs will be used to build the histo name in the HepData standard way.
Profile1DPtr bookProfile1D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
//@}
/// @name 2D profile histogram booking
//@{
/// Book a 2D profile histogram with @a nxbins and @a nybins uniformly
/// distributed across the ranges @a xlower - @a xupper and @a ylower - @a
/// yupper respectively along the x- and y-axis.
Profile2DPtr bookProfile2D(const std::string& name,
size_t nxbins, double xlower, double xupper,
size_t nybins, double ylower, double yupper,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="",
const std::string& ztitle="");
/// Book a 2D profile histogram with non-uniform bins defined by the vectors
/// of bin edges @a xbinedges and @a ybinedges.
Profile2DPtr bookProfile2D(const std::string& name,
const std::vector<double>& xbinedges,
const std::vector<double>& ybinedges,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="",
const std::string& ztitle="");
/// Book a 2D profile histogram with binning from a reference scatter.
// Profile2DPtr bookProfile2D(const std::string& name,
// const Scatter3D& refscatter,
// const std::string& title="",
// const std::string& xtitle="",
// const std::string& ytitle="",
// const std::string& ztitle="");
// /// Book a 2D profile histogram, using the binnings in the reference data histogram.
// Profile2DPtr bookProfile2D(const std::string& name,
// const std::string& title="",
// const std::string& xtitle="",
// const std::string& ytitle="",
// const std::string& ztitle="");
// /// Book a 2D profile histogram, using the binnings in the reference data histogram.
// ///
// /// The paper, dataset and x/y-axis IDs will be used to build the histo name in the HepData standard way.
// Profile2DPtr bookProfile2D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId,
// const std::string& title="",
// const std::string& xtitle="",
// const std::string& ytitle="",
// const std::string& ztitle="");
//@}
/// @name 2D scatter booking
//@{
/// @brief Book a 2-dimensional data point set with the given name.
///
/// @note Unlike histogram booking, scatter booking by default makes no
/// attempt to use reference data to pre-fill the data object. If you want
/// this, which is sometimes useful e.g. when the x-position is not really
/// meaningful and can't be extracted from the data, then set the @a
/// copy_pts parameter to true. This creates points to match the reference
/// data's x values and errors, but with the y values and errors zeroed,
/// assuming that there is a reference histo with the same name: if there
/// isn't, an exception will be thrown.
Scatter2DPtr bookScatter2D(const std::string& name,
bool copy_pts=false,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// @brief Book a 2-dimensional data point set, using the binnings in the reference data histogram.
///
/// The paper, dataset and x/y-axis IDs will be used to build the histo name in the HepData standard way.
///
/// @note Unlike histogram booking, scatter booking by default makes no
/// attempt to use reference data to pre-fill the data object. If you want
/// this, which is sometimes useful e.g. when the x-position is not really
/// meaningful and can't be extracted from the data, then set the @a
/// copy_pts parameter to true. This creates points to match the reference
/// data's x values and errors, but with the y values and errors zeroed.
Scatter2DPtr bookScatter2D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId,
bool copy_pts=false,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// @brief Book a 2-dimensional data point set with equally spaced x-points in a range.
///
/// The y values and errors will be set to 0.
Scatter2DPtr bookScatter2D(const std::string& name,
size_t npts, double lower, double upper,
const std::string& title="",
const std::string& xtitle="",
const std::string& ytitle="");
/// @brief Book a 2-dimensional data point set based on provided contiguous "bin edges".
///
/// The y values and errors will be set to 0.
Scatter2DPtr bookScatter2D(const std::string& hname,
const std::vector<double>& binedges,
const std::string& title,
const std::string& xtitle,
const std::string& ytitle);
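/// @par Example
/// Illustrative scatter booking (the member names _s_ratio and _s_eff are
/// hypothetical):
/// @code
/// // Empty scatter, to be filled in finalize():
/// _s_ratio = bookScatter2D("ratio");
/// // Scatter pre-filled with the reference data's x points (y zeroed), useful
/// // when the x positions can't be computed from the analysis itself:
/// _s_eff = bookScatter2D(2, 1, 1, true);
/// @endcode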
//@}
/// @todo What follows should really be protected: only public to keep BinnedHistogram happy for now...
public:
/// @name Histogram manipulation
//@{
/// Normalize the given histogram, @a histo, to area = @a norm.
///
/// @note The histogram is no longer invalidated by this procedure.
void normalize(Histo1DPtr histo, double norm=1.0, bool includeoverflows=true);
/// Multiplicatively scale the given histogram, @a histo, by factor @a scale.
///
/// @note The histogram is no longer invalidated by this procedure.
void scale(Histo1DPtr histo, double scale);
/// Normalize the given histogram, @a histo, to area = @a norm.
///
/// @note The histogram is no longer invalidated by this procedure.
void normalize(Histo2DPtr histo, double norm=1.0, bool includeoverflows=true);
/// Multiplicatively scale the given histogram, @a histo, by factor @a scale.
///
/// @note The histogram is no longer invalidated by this procedure.
void scale(Histo2DPtr histo, double scale);
/// Helper for histogram division.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void divide(Histo1DPtr h1, Histo1DPtr h2, Scatter2DPtr s) const;
/// Helper for histogram division with raw YODA objects.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
- void divide(const YODA::Histo1D& h1,
- const YODA::Histo1D& h2, Scatter2DPtr s) const;
+ void divide(const YODA::Histo1D& h1, const YODA::Histo1D& h2, Scatter2DPtr s) const;
/// Helper for profile histogram division.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void divide(Profile1DPtr p1, Profile1DPtr p2, Scatter2DPtr s) const;
/// Helper for profile histogram division with raw YODA objects.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void divide(const YODA::Profile1D& p1, const YODA::Profile1D& p2, Scatter2DPtr s) const;
/// Helper for 2D histogram division.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void divide(Histo2DPtr h1, Histo2DPtr h2, Scatter3DPtr s) const;
/// Helper for 2D histogram division with raw YODA objects.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void divide(const YODA::Histo2D& h1, const YODA::Histo2D& h2, Scatter3DPtr s) const;
/// Helper for 2D profile histogram division.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void divide(Profile2DPtr p1, Profile2DPtr p2, Scatter3DPtr s) const;
/// Helper for 2D profile histogram division with raw YODA objects
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void divide(const YODA::Profile2D& p1, const YODA::Profile2D& p2, Scatter3DPtr s) const;
/// Helper for histogram efficiency calculation.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void efficiency(Histo1DPtr h1, Histo1DPtr h2, Scatter2DPtr s) const;
/// Helper for histogram efficiency calculation.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void efficiency(const YODA::Histo1D& h1, const YODA::Histo1D& h2, Scatter2DPtr s) const;
/// Helper for histogram asymmetry calculation.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void asymm(Histo1DPtr h1, Histo1DPtr h2, Scatter2DPtr s) const;
/// Helper for histogram asymmetry calculation.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void asymm(const YODA::Histo1D& h1, const YODA::Histo1D& h2, Scatter2DPtr s) const;
/// Helper for converting a differential histo to an integral one.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void integrate(Histo1DPtr h, Scatter2DPtr s) const;
/// Helper for converting a differential histo to an integral one.
///
/// @note Assigns to the (already registered) output scatter, @a s. Preserves the path information of the target.
void integrate(const Histo1D& h, Scatter2DPtr s) const;
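/// @par Example
/// Typical use of these helpers in a finalize() implementation (the
/// histogram/scatter member names are hypothetical):
/// @code
/// void finalize() {
///   scale(_h_pt, crossSection()/sumOfWeights()); // -> differential cross-section in pb
///   normalize(_h_shape);                         // -> unit area
///   divide(_h_num, _h_den, _s_ratio);            // ratio into the pre-booked scatter
/// }
/// @endcode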
//@}
public:
/// List of registered analysis data objects
const vector<AnalysisObjectPtr>& analysisObjects() const {
return _analysisobjects;
}
protected:
/// @name Data object registration, retrieval, and removal
//@{
/// Register a data object in the histogram system
void addAnalysisObject(AnalysisObjectPtr ao);
/// Get a data object from the histogram system
/// @todo Use this default function template arg in C++11
// template <typename AO=AnalysisObjectPtr>
template <typename AO>
const shared_ptr<AO> getAnalysisObject(const std::string& name) const {
foreach (const AnalysisObjectPtr& ao, analysisObjects()) {
if (ao->path() == histoPath(name)) return dynamic_pointer_cast<AO>(ao);
}
throw Exception("Data object " + histoPath(name) + " not found");
}
/// Get a data object from the histogram system (non-const)
/// @todo Use this default function template arg in C++11
// template <typename AO=AnalysisObjectPtr>
template <typename AO>
shared_ptr<AO> getAnalysisObject(const std::string& name) {
foreach (const AnalysisObjectPtr& ao, analysisObjects()) {
if (ao->path() == histoPath(name)) return dynamic_pointer_cast<AO>(ao);
}
throw Exception("Data object " + histoPath(name) + " not found");
}
/// Unregister a data object from the histogram system (by name)
void removeAnalysisObject(const std::string& path);
/// Unregister a data object from the histogram system (by pointer)
void removeAnalysisObject(AnalysisObjectPtr ao);
/// Get a named Histo1D object from the histogram system
const Histo1DPtr getHisto1D(const std::string& name) const {
return getAnalysisObject<Histo1D>(name);
}
/// Get a named Histo1D object from the histogram system (non-const)
Histo1DPtr getHisto1D(const std::string& name) {
return getAnalysisObject<Histo1D>(name);
}
/// Get a Histo1D object from the histogram system by axis ID codes
const Histo1DPtr getHisto1D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) const {
return getAnalysisObject<Histo1D>(makeAxisCode(datasetId, xAxisId, yAxisId));
}
/// Get a Histo1D object from the histogram system by axis ID codes (non-const)
Histo1DPtr getHisto1D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) {
return getAnalysisObject<Histo1D>(makeAxisCode(datasetId, xAxisId, yAxisId));
}
// /// Get a named Histo2D object from the histogram system
// const Histo2DPtr getHisto2D(const std::string& name) const {
// return getAnalysisObject<Histo2D>(name);
// }
// /// Get a named Histo2D object from the histogram system (non-const)
// Histo2DPtr getHisto2D(const std::string& name) {
// return getAnalysisObject<Histo2D>(name);
// }
// /// Get a Histo2D object from the histogram system by axis ID codes (non-const)
// const Histo2DPtr getHisto2D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) const {
// return getAnalysisObject<Histo2D>(makeAxisCode(datasetId, xAxisId, yAxisId));
// }
// /// Get a Histo2D object from the histogram system by axis ID codes (non-const)
// Histo2DPtr getHisto2D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) {
// return getAnalysisObject<Histo2D>(makeAxisCode(datasetId, xAxisId, yAxisId));
// }
/// Get a named Profile1D object from the histogram system
const Profile1DPtr getProfile1D(const std::string& name) const {
return getAnalysisObject<Profile1D>(name);
}
/// Get a named Profile1D object from the histogram system (non-const)
Profile1DPtr getProfile1D(const std::string& name) {
return getAnalysisObject<Profile1D>(name);
}
/// Get a Profile1D object from the histogram system by axis ID codes
const Profile1DPtr getProfile1D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) const {
return getAnalysisObject<Profile1D>(makeAxisCode(datasetId, xAxisId, yAxisId));
}
/// Get a Profile1D object from the histogram system by axis ID codes (non-const)
Profile1DPtr getProfile1D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) {
return getAnalysisObject<Profile1D>(makeAxisCode(datasetId, xAxisId, yAxisId));
}
// /// Get a named Profile2D object from the histogram system
// const Profile2DPtr getProfile2D(const std::string& name) const {
// return getAnalysisObject<Profile2D>(name);
// }
// /// Get a named Profile2D object from the histogram system (non-const)
// Profile2DPtr getProfile2D(const std::string& name) {
// return getAnalysisObject<Profile2D>(name);
// }
// /// Get a Profile2D object from the histogram system by axis ID codes (non-const)
// const Profile2DPtr getProfile2D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) const {
// return getAnalysisObject<Profile2D>(makeAxisCode(datasetId, xAxisId, yAxisId));
// }
// /// Get a Profile2D object from the histogram system by axis ID codes (non-const)
// Profile2DPtr getProfile2D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) {
// return getAnalysisObject<Profile2D>(makeAxisCode(datasetId, xAxisId, yAxisId));
// }
/// Get a named Scatter2D object from the histogram system
const Scatter2DPtr getScatter2D(const std::string& name) const {
return getAnalysisObject<Scatter2D>(name);
}
/// Get a named Scatter2D object from the histogram system (non-const)
Scatter2DPtr getScatter2D(const std::string& name) {
return getAnalysisObject<Scatter2D>(name);
}
/// Get a Scatter2D object from the histogram system by axis ID codes
const Scatter2DPtr getScatter2D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) const {
return getAnalysisObject<Scatter2D>(makeAxisCode(datasetId, xAxisId, yAxisId));
}
/// Get a Scatter2D object from the histogram system by axis ID codes (non-const)
Scatter2DPtr getScatter2D(unsigned int datasetId, unsigned int xAxisId, unsigned int yAxisId) {
return getAnalysisObject<Scatter2D>(makeAxisCode(datasetId, xAxisId, yAxisId));
}
//@}
private:
/// Name passed to constructor (used to find .info analysis data file, and as a fallback)
string _defaultname;
/// Pointer to analysis metadata object
shared_ptr<AnalysisInfo> _info;
/// Storage of all plot objects
/// @todo Make this a map for fast lookup by path?
vector<AnalysisObjectPtr> _analysisobjects;
/// @name Cross-section variables
//@{
double _crossSection;
bool _gotCrossSection;
//@}
/// The controlling AnalysisHandler object.
AnalysisHandler* _analysishandler;
/// Collection of cached refdata to speed up many autobookings: the
/// reference data file should only be read once.
mutable std::map<std::string, Scatter2DPtr> _refdata;
private:
/// @name Utility functions
//@{
/// Get the reference data for this paper and cache it.
void _cacheRefData() const;
//@}
/// The assignment operator is private and must never be called.
/// In fact, it should not even be implemented.
Analysis& operator=(const Analysis&);
};
}
// Include definition of analysis plugin system so that analyses automatically see it when including Analysis.hh
#include "Rivet/AnalysisBuilder.hh"
/// @def DECLARE_RIVET_PLUGIN
/// Preprocessor define to prettify the global-object plugin hook mechanism.
#define DECLARE_RIVET_PLUGIN(clsname) Rivet::AnalysisBuilder<clsname> plugin_ ## clsname
/// @def DEFAULT_RIVET_ANA_CONSTRUCTOR
/// Preprocessor define to prettify the manky constructor with name string argument
#define DEFAULT_RIVET_ANA_CONSTRUCTOR(clsname) clsname() : Analysis(# clsname) {}
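/// @par Example
/// A minimal sketch of a plugin analysis using both macros (the class name
/// MY_TEST_ANALYSIS is hypothetical):
/// @code
/// namespace Rivet {
///   class MY_TEST_ANALYSIS : public Analysis {
///   public:
///     DEFAULT_RIVET_ANA_CONSTRUCTOR(MY_TEST_ANALYSIS);
///     void init() { }
///     void analyze(const Event&) { }
///     void finalize() { }
///   };
///   DECLARE_RIVET_PLUGIN(MY_TEST_ANALYSIS);
/// }
/// @endcode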
#endif
diff --git a/include/Rivet/Makefile.am b/include/Rivet/Makefile.am
--- a/include/Rivet/Makefile.am
+++ b/include/Rivet/Makefile.am
@@ -1,131 +1,132 @@
## Internal headers - not to be installed
nobase_dist_noinst_HEADERS =
## Public headers - to be installed
nobase_pkginclude_HEADERS =
## Rivet interface
nobase_pkginclude_HEADERS += \
Rivet.hh Exceptions.hh \
Event.hh Run.hh \
ParticleBase.hh ParticleName.hh \
Particle.fhh Particle.hh \
Jet.fhh Jet.hh \
ProjectionApplier.hh ProjectionHandler.hh \
Projection.fhh Projection.hh \
Cmp.fhh Cmp.hh \
BeamConstraint.hh AnalysisHandler.hh \
Analysis.hh AnalysisInfo.hh \
AnalysisBuilder.hh AnalysisLoader.hh \
Cuts.hh
## Build config stuff
nobase_pkginclude_HEADERS += \
Config/RivetCommon.hh \
Config/RivetConfig.hh \
Config/BuildOptions.hh
## Projections
nobase_pkginclude_HEADERS += \
Projections/AxesDefinition.hh \
Projections/Beam.hh \
Projections/BeamThrust.hh \
Projections/CentralEtHCM.hh \
Projections/ChargedFinalState.hh \
Projections/ChargedLeptons.hh \
Projections/ConstLossyFinalState.hh \
Projections/DISFinalState.hh \
Projections/DISKinematics.hh \
Projections/DISLepton.hh \
Projections/DressedLeptons.hh \
Projections/FastJets.hh \
+ Projections/FinalPartons.hh \
Projections/FinalState.hh \
Projections/FoxWolframMoments.hh \
Projections/FParameter.hh \
Projections/HadronicFinalState.hh \
Projections/HeavyHadrons.hh \
Projections/Hemispheres.hh \
Projections/IdentifiedFinalState.hh \
Projections/InitialQuarks.hh \
Projections/InvMassFinalState.hh \
Projections/JetAlg.hh \
Projections/JetShape.hh \
Projections/LeadingParticlesFinalState.hh \
Projections/LossyFinalState.hh \
Projections/MergedFinalState.hh \
Projections/MissingMomentum.hh \
Projections/NeutralFinalState.hh \
Projections/NonHadronicFinalState.hh \
Projections/ParticleFinder.hh \
Projections/ParisiTensor.hh \
Projections/PrimaryHadrons.hh \
Projections/PromptFinalState.hh \
Projections/Sphericity.hh \
Projections/Spherocity.hh \
Projections/TauFinder.hh \
Projections/Thrust.hh \
Projections/TriggerCDFRun0Run1.hh \
Projections/TriggerCDFRun2.hh \
Projections/TriggerUA5.hh \
Projections/UnstableFinalState.hh \
Projections/VetoedFinalState.hh \
Projections/VisibleFinalState.hh \
Projections/WFinder.hh \
Projections/ZFinder.hh
## Analysis base class headers
nobase_pkginclude_HEADERS += \
Analyses/MC_ParticleAnalysis.hh \
Analyses/MC_JetAnalysis.hh \
Analyses/MC_JetSplittings.hh
## Tools
nobase_pkginclude_HEADERS += \
Tools/Logging.hh \
Tools/Utils.hh \
Tools/RivetPaths.hh \
Tools/RivetSTL.hh \
Tools/RivetBoost.hh \
Tools/RivetFastJet.hh \
Tools/RivetHepMC.hh \
Tools/RivetYODA.hh \
Tools/RivetMT2.hh \
Tools/BinnedHistogram.hh \
Tools/ParticleUtils.hh \
Tools/ParticleIdUtils.hh \
Tools/TypeTraits.hh
nobase_dist_noinst_HEADERS += \
Tools/osdir.hh
## Maths
nobase_pkginclude_HEADERS += \
Math/Matrices.hh \
Math/Vector3.hh \
Math/VectorN.hh \
Math/MatrixN.hh \
Math/MatrixDiag.hh \
Math/MathHeader.hh \
Math/Vectors.hh \
Math/LorentzTrans.hh \
Math/Matrix3.hh \
Math/MathUtils.hh \
Math/Vector4.hh \
Math/Math.hh \
Math/Units.hh \
Math/Constants.hh \
Math/eigen/util.h \
Math/eigen/regressioninternal.h \
Math/eigen/regression.h \
Math/eigen/vector.h \
Math/eigen/ludecompositionbase.h \
Math/eigen/ludecomposition.h \
Math/eigen/linearsolver.h \
Math/eigen/linearsolverbase.h \
Math/eigen/matrix.h \
Math/eigen/vectorbase.h \
Math/eigen/projective.h \
Math/eigen/matrixbase.h
diff --git a/include/Rivet/Math/Vector3.hh b/include/Rivet/Math/Vector3.hh
--- a/include/Rivet/Math/Vector3.hh
+++ b/include/Rivet/Math/Vector3.hh
@@ -1,322 +1,322 @@
#ifndef RIVET_MATH_VECTOR3
#define RIVET_MATH_VECTOR3
#include "Rivet/Math/MathHeader.hh"
#include "Rivet/Math/MathUtils.hh"
#include "Rivet/Math/VectorN.hh"
#include <cfloat>
namespace Rivet {
class Vector3;
typedef Vector3 ThreeVector;
class Matrix3;
Vector3 multiply(const double, const Vector3&);
Vector3 multiply(const Vector3&, const double);
Vector3 add(const Vector3&, const Vector3&);
Vector3 operator*(const double, const Vector3&);
Vector3 operator*(const Vector3&, const double);
Vector3 operator/(const Vector3&, const double);
Vector3 operator+(const Vector3&, const Vector3&);
Vector3 operator-(const Vector3&, const Vector3&);
/// @brief Three-dimensional specialisation of Vector.
class Vector3 : public Vector<3> {
friend class Matrix3;
friend Vector3 multiply(const double, const Vector3&);
friend Vector3 multiply(const Vector3&, const double);
friend Vector3 add(const Vector3&, const Vector3&);
friend Vector3 subtract(const Vector3&, const Vector3&);
public:
Vector3() : Vector<3>() { }
template<typename V3>
Vector3(const V3& other) {
this->setX(other.x());
this->setY(other.y());
this->setZ(other.z());
}
Vector3(const Vector<3>& other) {
this->setX(other.get(0));
this->setY(other.get(1));
this->setZ(other.get(2));
}
Vector3(double x, double y, double z) {
this->setX(x);
this->setY(y);
this->setZ(z);
}
~Vector3() { }
public:
static Vector3 mkX() { return Vector3(1,0,0); }
static Vector3 mkY() { return Vector3(0,1,0); }
static Vector3 mkZ() { return Vector3(0,0,1); }
public:
double x() const { return get(0); }
double y() const { return get(1); }
double z() const { return get(2); }
Vector3& setX(double x) { set(0, x); return *this; }
Vector3& setY(double y) { set(1, y); return *this; }
Vector3& setZ(double z) { set(2, z); return *this; }
double dot(const Vector3& v) const {
return _vec.dot(v._vec);
}
Vector3 cross(const Vector3& v) const {
Vector3 result;
result._vec = _vec.cross(v._vec);
return result;
}
double angle(const Vector3& v) const {
const double localDotOther = unit().dot(v.unit());
- if (fuzzyEquals(localDotOther, 1.0)) return 0.0;
- else if (fuzzyEquals(localDotOther, -1.0)) return M_PI;
+ if (localDotOther > 1.0) return 0.0;
+ if (localDotOther < -1.0) return M_PI;
return acos(localDotOther);
}
Vector3 unit() const {
/// @todo What to do in this situation?
if (isZero()) return *this;
else return *this * 1.0/this->mod();
}
double polarRadius2() const {
return x()*x() + y()*y();
}
/// Synonym for polarRadius2
double perp2() const {
return polarRadius2();
}
/// Synonym for polarRadius2
double rho2() const {
return polarRadius2();
}
double polarRadius() const {
return sqrt(polarRadius2());
}
/// Synonym for polarRadius
double perp() const {
return polarRadius();
}
/// Synonym for polarRadius
double rho() const {
return polarRadius();
}
/// Angle subtended by the vector's projection in x-y and the x-axis.
double azimuthalAngle(const PhiMapping mapping = ZERO_2PI) const {
// If this is a null vector, return zero rather than let atan2 set an error state
if (Rivet::isZero(mod2())) return 0.0;
// Calculate the arctan and return in the requested range
const double value = atan2( y(), x() );
return mapAngle(value, mapping);
}
/// Synonym for azimuthalAngle.
double phi(const PhiMapping mapping = ZERO_2PI) const {
return azimuthalAngle(mapping);
}
/// Angle between the vector and the z-axis.
double polarAngle() const {
// Get a number in the range [0,PI]
const double polarangle = atan2(polarRadius(), z());
return mapAngle0ToPi(polarangle);
}
/// Synonym for polarAngle
double theta() const {
return polarAngle();
}
/// Purely geometric approximation to rapidity; exact for massless particles
/// and in the central region.
// cut-off such that |eta| < log(2/DBL_EPSILON)
double pseudorapidity() const {
const double epsilon = DBL_EPSILON;
double m = mod();
if ( m == 0.0 ) return 0.0;
double pt = max(epsilon*m, perp());
double rap = std::log((m + fabs(z()))/pt);
return z() > 0.0 ? rap: -rap;
}
/// Synonym for pseudorapidity.
double eta() const {
return pseudorapidity();
}
public:
Vector3& operator*=(const double a) {
_vec = multiply(a, *this)._vec;
return *this;
}
Vector3& operator/=(const double a) {
_vec = multiply(1.0/a, *this)._vec;
return *this;
}
Vector3& operator+=(const Vector3& v) {
_vec = add(*this, v)._vec;
return *this;
}
Vector3& operator-=(const Vector3& v) {
_vec = subtract(*this, v)._vec;
return *this;
}
Vector3 operator-() const {
Vector3 rtn;
rtn._vec = -_vec;
return rtn;
}
};
inline double dot(const Vector3& a, const Vector3& b) {
return a.dot(b);
}
inline Vector3 cross(const Vector3& a, const Vector3& b) {
return a.cross(b);
}
inline Vector3 multiply(const double a, const Vector3& v) {
Vector3 result;
result._vec = a * v._vec;
return result;
}
inline Vector3 multiply(const Vector3& v, const double a) {
return multiply(a, v);
}
inline Vector3 operator*(const double a, const Vector3& v) {
return multiply(a, v);
}
inline Vector3 operator*(const Vector3& v, const double a) {
return multiply(a, v);
}
inline Vector3 operator/(const Vector3& v, const double a) {
return multiply(1.0/a, v);
}
inline Vector3 add(const Vector3& a, const Vector3& b) {
Vector3 result;
result._vec = a._vec + b._vec;
return result;
}
inline Vector3 subtract(const Vector3& a, const Vector3& b) {
Vector3 result;
result._vec = a._vec - b._vec;
return result;
}
inline Vector3 operator+(const Vector3& a, const Vector3& b) {
return add(a, b);
}
inline Vector3 operator-(const Vector3& a, const Vector3& b) {
return subtract(a, b);
}
// More physicsy coordinates etc.
/// Angle (in radians) between two 3-vectors.
inline double angle(const Vector3& a, const Vector3& b) {
return a.angle(b);
}
/////////////////////////////////////////////////////
/// @name \f$ |\Delta \eta| \f$ calculations from 3-vectors
//@{
/// Calculate the difference in pseudorapidity between two spatial vectors.
inline double deltaEta(const Vector3& a, const Vector3& b) {
return deltaEta(a.pseudorapidity(), b.pseudorapidity());
}
/// Calculate the difference in pseudorapidity between two spatial vectors.
inline double deltaEta(const Vector3& v, double eta2) {
return deltaEta(v.pseudorapidity(), eta2);
}
/// Calculate the difference in pseudorapidity between two spatial vectors.
inline double deltaEta(double eta1, const Vector3& v) {
return deltaEta(eta1, v.pseudorapidity());
}
//@}
/// @name \f$ \Delta \phi \f$ calculations from 3-vectors
//@{
/// Calculate the difference in azimuthal angle between two spatial vectors.
inline double deltaPhi(const Vector3& a, const Vector3& b) {
return deltaPhi(a.azimuthalAngle(), b.azimuthalAngle());
}
/// Calculate the difference in azimuthal angle between two spatial vectors.
inline double deltaPhi(const Vector3& v, double phi2) {
return deltaPhi(v.azimuthalAngle(), phi2);
}
/// Calculate the difference in azimuthal angle between two spatial vectors.
inline double deltaPhi(double phi1, const Vector3& v) {
return deltaPhi(phi1, v.azimuthalAngle());
}
//@}
/// @name \f$ \Delta R \f$ calculations from 3-vectors
//@{
/// Calculate the 2D rapidity-azimuthal ("eta-phi") distance between two spatial vectors.
inline double deltaR(const Vector3& a, const Vector3& b) {
return deltaR(a.pseudorapidity(), a.azimuthalAngle(),
b.pseudorapidity(), b.azimuthalAngle());
}
/// Calculate the 2D rapidity-azimuthal ("eta-phi") distance between two spatial vectors.
inline double deltaR(const Vector3& v, double eta2, double phi2) {
return deltaR(v.pseudorapidity(), v.azimuthalAngle(), eta2, phi2);
}
/// Calculate the 2D rapidity-azimuthal ("eta-phi") distance between two spatial vectors.
inline double deltaR(double eta1, double phi1, const Vector3& v) {
return deltaR(eta1, phi1, v.pseudorapidity(), v.azimuthalAngle());
}
//@}
}
#endif
diff --git a/include/Rivet/Particle.hh b/include/Rivet/Particle.hh
--- a/include/Rivet/Particle.hh
+++ b/include/Rivet/Particle.hh
@@ -1,305 +1,319 @@
// -*- C++ -*-
#ifndef RIVET_Particle_HH
#define RIVET_Particle_HH
#include "Rivet/Particle.fhh"
#include "Rivet/ParticleBase.hh"
#include "Rivet/Config/RivetCommon.hh"
#include "Rivet/Tools/Utils.hh"
// NOTE: Rivet/Tools/ParticleUtils.hh included at the end
#include "fastjet/PseudoJet.hh"
namespace Rivet {
/// Representation of particles from a HepMC::GenEvent.
class Particle : public ParticleBase {
public:
/// Default constructor.
/// @note A particle without info is useless. This only exists to keep STL containers happy.
Particle()
: ParticleBase(),
_original(0), _id(0), _momentum()
{ }
/// Constructor without GenParticle.
Particle(PdgId pid, const FourMomentum& mom)
: ParticleBase(),
_original(0), _id(pid), _momentum(mom)
{ }
/// Constructor from a HepMC GenParticle.
Particle(const GenParticle& gp)
: ParticleBase(),
_original(&gp), _id(gp.pdg_id()),
_momentum(gp.momentum())
{ }
/// Constructor from a HepMC GenParticle pointer.
Particle(const GenParticle* gp)
: ParticleBase(),
_original(gp), _id(gp->pdg_id()),
_momentum(gp->momentum())
{ }
public:
/// Converter to FastJet3 PseudoJet
virtual fastjet::PseudoJet pseudojet() const {
return fastjet::PseudoJet(mom().px(), mom().py(), mom().pz(), mom().E());
}
/// Cast operator to FastJet3 PseudoJet
operator PseudoJet () const { return pseudojet(); }
public:
/// @name Basic particle specific properties
//@{
/// Get a const reference to the original GenParticle.
const GenParticle* genParticle() const {
return _original;
}
/// Cast operator for conversion to GenParticle*
operator const GenParticle* () const { return genParticle(); }
/// The momentum.
const FourMomentum& momentum() const {
return _momentum;
}
/// Set the momentum.
Particle& setMomentum(const FourMomentum& momentum) {
_momentum = momentum;
return *this;
}
//@}
/// @name Particle ID code accessors
//@{
/// This Particle's PDG ID code.
PdgId pid() const { return _id; }
/// Absolute value of the PDG ID code.
PdgId abspid() const { return std::abs(_id); }
/// This Particle's PDG ID code (alias).
/// @deprecated Prefer the pid/abspid form
PdgId pdgId() const { return _id; }
//@}
/// @name Particle species properties
//@{
/// The charge of this Particle.
double charge() const {
return PID::charge(pid());
}
/// Three times the charge of this Particle (i.e. integer multiple of smallest quark charge).
int threeCharge() const {
return PID::threeCharge(pid());
}
/// Is this a hadron?
bool isHadron() const { return PID::isHadron(pid()); }
/// Is this a meson?
bool isMeson() const { return PID::isMeson(pid()); }
/// Is this a baryon?
bool isBaryon() const { return PID::isBaryon(pid()); }
/// Is this a lepton?
bool isLepton() const { return PID::isLepton(pid()); }
/// Is this a neutrino?
bool isNeutrino() const { return PID::isNeutrino(pid()); }
/// Does this (hadron) contain a b quark?
bool hasBottom() const { return PID::hasBottom(pid()); }
/// Does this (hadron) contain a c quark?
bool hasCharm() const { return PID::hasCharm(pid()); }
// /// Does this (hadron) contain an s quark?
// bool hasStrange() const { return PID::hasStrange(pid()); }
//@}
/// @name Ancestry properties
//@{
/// Check whether a given PID is found in the GenParticle's ancestor list
///
/// @note This question is valid in MC, but may not be answerable
/// experimentally -- use this function with care when replicating
/// experimental analyses!
bool hasAncestor(PdgId pdg_id) const;
- /// @brief Determine whether the particle is from a hadron or tau decay
- ///
- /// Specifically, walk up the ancestor chain until a status 2 hadron or
- /// tau is found, if at all.
- ///
- /// @note This question is valid in MC, but may not be perfectly answerable
- /// experimentally -- use this function with care when replicating
- /// experimental analyses!
- bool fromDecay() const;
-
/// @brief Determine whether the particle is from a b-hadron decay
///
/// @note This question is valid in MC, but may not be perfectly answerable
/// experimentally -- use this function with care when replicating
/// experimental analyses!
bool fromBottom() const;
/// @brief Determine whether the particle is from a c-hadron decay
///
/// @note If a hadron contains b and c quarks it is considered a bottom
/// hadron and NOT a charm hadron.
///
/// @note This question is valid in MC, but may not be perfectly answerable
/// experimentally -- use this function with care when replicating
/// experimental analyses!
bool fromCharm() const;
// /// @brief Determine whether the particle is from a s-hadron decay
// ///
// /// @note If a hadron contains b or c quarks as well as strange it is
// /// considered a b or c hadron, but NOT a strange hadron.
// ///
// /// @note This question is valid in MC, but may not be perfectly answerable
// /// experimentally -- use this function with care when replicating
// /// experimental analyses!
// bool fromStrange() const;
+ /// @brief Determine whether the particle is from a hadron decay
+ ///
+ /// @note This question is valid in MC, but may not be perfectly answerable
+ /// experimentally -- use this function with care when replicating
+ /// experimental analyses!
+ bool fromHadron() const;
+
/// @brief Determine whether the particle is from a tau decay
///
/// @note This question is valid in MC, but may not be perfectly answerable
/// experimentally -- use this function with care when replicating
/// experimental analyses!
- bool fromTau() const;
+ bool fromTau(bool prompt_taus_only=false) const;
+
+ /// @brief Determine whether the particle is from a prompt tau decay
+ ///
+ /// @note This question is valid in MC, but may not be perfectly answerable
+ /// experimentally -- use this function with care when replicating
+ /// experimental analyses!
+ bool fromPromptTau() const { return fromTau(true); }
+
+ /// @brief Determine whether the particle is from a hadron or tau decay
+ ///
+ /// Specifically, walk up the ancestor chain until a status 2 hadron or
+ /// tau is found, if at all.
+ ///
+ /// @note This question is valid in MC, but may not be perfectly answerable
+ /// experimentally -- use this function with care when replicating
+ /// experimental analyses!
+ bool fromDecay() const { return fromHadron() || fromPromptTau(); }
//@}
/// @name Decay info
//@{
/// Whether this particle is stable according to the generator
bool isStable() const {
return genParticle() != NULL && genParticle()->status() == 1 && genParticle()->end_vertex() == NULL;
}
/// Get a list of the direct descendants from the current particle
vector<Particle> children() const {
vector<Particle> rtn;
if (isStable()) return rtn;
/// @todo Remove this const mess crap when HepMC doesn't suck
HepMC::GenVertex* gv = const_cast<HepMC::GenVertex*>( genParticle()->end_vertex() );
/// @todo Would like to do this, but the range objects are broken
// foreach (const GenParticle* gp, gv->particles(HepMC::children))
// rtn += Particle(gp);
for (GenVertex::particle_iterator it = gv->particles_begin(HepMC::children); it != gv->particles_end(HepMC::children); ++it)
rtn += Particle(*it);
return rtn;
}
/// Get a list of all the descendants (including duplication of parents and children) from the current particle
/// @todo Use recursion through replica-avoiding MCUtils functions to avoid bookkeeping duplicates
/// @todo Insist that the current particle is post-hadronization, otherwise throw an exception?
vector<Particle> allDescendants() const {
vector<Particle> rtn;
if (isStable()) return rtn;
/// @todo Remove this const mess crap when HepMC doesn't suck
HepMC::GenVertex* gv = const_cast<HepMC::GenVertex*>( genParticle()->end_vertex() );
/// @todo Would like to do this, but the range objects are broken
// foreach (const GenParticle* gp, gv->particles(HepMC::descendants))
// rtn += Particle(gp);
for (GenVertex::particle_iterator it = gv->particles_begin(HepMC::descendants); it != gv->particles_end(HepMC::descendants); ++it)
rtn += Particle(*it);
return rtn;
}
/// Get a list of all the stable descendants from the current particle
/// @todo Use recursion through replica-avoiding MCUtils functions to avoid bookkeeping duplicates
/// @todo Insist that the current particle is post-hadronization, otherwise throw an exception?
vector<Particle> stableDescendants() const {
vector<Particle> rtn;
if (isStable()) return rtn;
/// @todo Remove this const mess crap when HepMC doesn't suck
HepMC::GenVertex* gv = const_cast<HepMC::GenVertex*>( genParticle()->end_vertex() );
/// @todo Would like to do this, but the range objects are broken
// foreach (const GenParticle* gp, gv->particles(HepMC::descendants))
// if (gp->status() == 1 && gp->end_vertex() == NULL)
// rtn += Particle(gp);
for (GenVertex::particle_iterator it = gv->particles_begin(HepMC::descendants); it != gv->particles_end(HepMC::descendants); ++it)
if ((*it)->status() == 1 && (*it)->end_vertex() == NULL)
rtn += Particle(*it);
return rtn;
}
/// Flight length (divide by mm or cm to get the appropriate units)
double flightLength() const {
if (isStable()) return -1;
if (genParticle() == NULL) return 0;
if (genParticle()->production_vertex() == NULL) return 0;
const HepMC::FourVector v1 = genParticle()->production_vertex()->position();
const HepMC::FourVector v2 = genParticle()->end_vertex()->position();
return sqrt(sqr(v2.x()-v1.x()) + sqr(v2.y()-v1.y()) + sqr(v2.z()-v1.z()));
}
//@}
private:
/// A pointer to the original GenParticle from which this Particle is projected.
const GenParticle* _original;
/// The PDG ID code for this Particle.
PdgId _id;
/// The momentum of this projection of the Particle.
FourMomentum _momentum;
};
/// @name String representation
//@{
/// Print a ParticlePair as a string.
inline std::string to_str(const ParticlePair& pair) {
stringstream out;
out << "["
<< PID::toParticleName(pair.first.pid()) << " @ "
<< pair.first.momentum().E()/GeV << " GeV, "
<< PID::toParticleName(pair.second.pid()) << " @ "
<< pair.second.momentum().E()/GeV << " GeV]";
return out.str();
}
/// Allow ParticlePair to be passed to an ostream.
inline std::ostream& operator<<(std::ostream& os, const ParticlePair& pp) {
os << to_str(pp);
return os;
}
//@}
}
#include "Rivet/Tools/ParticleUtils.hh"
#endif
diff --git a/include/Rivet/Projections/FinalPartons.hh b/include/Rivet/Projections/FinalPartons.hh
new file mode 100644
--- /dev/null
+++ b/include/Rivet/Projections/FinalPartons.hh
@@ -0,0 +1,25 @@
+// -*- C++ -*-
+#ifndef RIVET_FinalPartons_HH
+#define RIVET_FinalPartons_HH
+
+#include "Rivet/Projections/FinalState.hh"
+
+namespace Rivet {
+
+ class FinalPartons : public FinalState {
+ public:
+ FinalPartons(Cut c=Cuts::open()) : FinalState(c) { }
+
+ const Projection* clone() const {
+ return new FinalPartons(*this);
+ }
+
+ void project(const Event& e);
+
+ protected:
+ bool accept(const Particle& p) const;
+ };
+
+}
+
+#endif
diff --git a/include/Rivet/Projections/HeavyHadrons.hh b/include/Rivet/Projections/HeavyHadrons.hh
--- a/include/Rivet/Projections/HeavyHadrons.hh
+++ b/include/Rivet/Projections/HeavyHadrons.hh
@@ -1,88 +1,99 @@
// -*- C++ -*-
#ifndef RIVET_HeavyHadrons_HH
#define RIVET_HeavyHadrons_HH
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
#include "Rivet/Particle.hh"
#include "Rivet/Event.hh"
namespace Rivet {
/// @brief Project out the last pre-decay b and c hadrons.
///
/// This currently defines a c-hadron as one which contains a @a c quark and
/// @a not a @a b quark.
///
/// @todo This assumes that the heavy hadrons are unstable... should we also look for stable ones in case the decays are disabled?
class HeavyHadrons : public FinalState {
public:
/// @name Constructors and destructors.
//@{
/// Constructor with specification of the minimum and maximum pseudorapidity
/// \f$ \eta \f$ and the min \f$ p_T \f$ (in GeV).
- HeavyHadrons(double mineta = -MAXDOUBLE,
- double maxeta = MAXDOUBLE,
- double minpt = 0.0*GeV) {
+ HeavyHadrons(const Cut& c=Cuts::open()) {
setName("HeavyHadrons");
- addProjection(UnstableFinalState(mineta, maxeta, minpt), "UFS");
+ addProjection(UnstableFinalState(c), "UFS");
+ }
+
+ /// Constructor with specification of the minimum and maximum pseudorapidity
+ /// \f$ \eta \f$ and the min \f$ p_T \f$ (in GeV).
+ DEPRECATED("Use the version with a Cut argument")
+ HeavyHadrons(double mineta, double maxeta, double minpt=0.0*GeV) {
+ setName("HeavyHadrons");
+ addProjection(UnstableFinalState(Cuts::etaIn(mineta, maxeta) && Cuts::pT > minpt), "UFS");
}
/// Clone on the heap.
virtual const Projection* clone() const {
return new HeavyHadrons(*this);
}
//@}
/// @name Particle accessors
//@{
/// Get all weakly decaying b hadrons (return by reference)
const Particles& bHadrons() const {
return _theBs;
}
/// Get weakly decaying b hadrons with a pTmin cut (return by value)
Particles bHadrons(double pTmin) const {
Particles rtn;
foreach (const Particle& p, bHadrons())
if (p.pT() > pTmin) rtn += p;
return rtn;
}
/// Get all weakly decaying c hadrons (return by reference)
const Particles& cHadrons() const {
return _theCs;
}
/// Get weakly decaying c hadrons with a pTmin cut (return by value)
const Particles cHadrons(double pTmin) const {
Particles rtn;
foreach (const Particle& p, cHadrons())
if (p.pT() > pTmin) rtn += p;
return rtn;
}
//@}
protected:
/// Apply the projection to the event.
virtual void project(const Event& e);
+ /// Compare projections (only difference is in UFS definition)
+ virtual int compare(const Projection& p) const {
+ return mkNamedPCmp(p, "UFS");
+ }
+
/// b and c hadron containers
Particles _theBs, _theCs;
};
}
#endif
diff --git a/include/Rivet/Projections/ParticleFinder.hh b/include/Rivet/Projections/ParticleFinder.hh
--- a/include/Rivet/Projections/ParticleFinder.hh
+++ b/include/Rivet/Projections/ParticleFinder.hh
@@ -1,198 +1,201 @@
// -*- C++ -*-
#ifndef RIVET_ParticleFinder_HH
#define RIVET_ParticleFinder_HH
#include "Rivet/Projection.hh"
#include "Rivet/Cuts.hh"
namespace Rivet {
/// @brief Base class for projections which return subsets of an event's particles
class ParticleFinder : public Projection {
public:
/// @name Object lifetime management
//@{
/// Construction using Cuts object
- ParticleFinder(const Cut & c = Cuts::open()) : _cuts(c), _theParticles() {}
+ ParticleFinder(const Cut& c=Cuts::open())
+ : _cuts(c), _theParticles()
+ { }
/// Virtual destructor for inheritance
virtual ~ParticleFinder() {}
/// Clone on the heap.
virtual const Projection* clone() const = 0;
//@}
/// @name Particle accessors
//@{
/// Get the final-state particles in no particular order, with no cuts.
virtual const Particles& particles() const { return _theParticles; }
/// Access the projected final-state particles.
size_t size() const { return particles().size(); }
/// Is this final state empty?
bool empty() const { return particles().empty(); }
/// @deprecated Is this final state empty?
DEPRECATED("Use empty()")
bool isEmpty() const { return particles().empty(); }
/// @brief Get the final-state particles, with optional cuts.
/// @note Returns a copy rather than a reference, due to cuts
/// @todo Can't this be a const Cut& arg?
Particles particles(const Cut & c) const {
// Just return a copy of particles() if the cut is open
if (c == Cuts::open()) return particles();
// If there is a non-trivial cut...
Particles rtn;
rtn.reserve(size());
foreach (const Particle& p, particles())
if (c->accept(p)) rtn.push_back(p);
return rtn;
}
/// Get the final-state particles, ordered by supplied sorting function object.
/// @note Returns a copy rather than a reference, due to cuts and sorting
/// @todo Can't this be a const Cut& arg?
/// @todo Use a std::function instead of typename F?
template <typename F>
Particles particles(F sorter, const Cut & c=Cuts::open()) const {
/// @todo Will the vector be efficiently std::move'd by value through this function chain?
return sortBy(particles(c), sorter);
}
/// Get the final-state particles, ordered by supplied sorting function object.
/// @note Returns a copy rather than a reference, due to cuts and sorting
/// @todo Can't this be a const Cut& arg?
/// @todo Use a std::function instead of typename F?
template <typename F>
Particles particles(const Cut & c, F sorter) const {
/// @todo Will the vector be efficiently std::move'd by value through this function chain?
return sortBy(particles(c), sorter);
}
/// Get the final-state particles, ordered by decreasing \f$ p_T \f$ and with optional cuts.
///
/// This is a very common use-case, so is available as syntactic sugar for particles(c, cmpMomByPt).
Particles particlesByPt(const Cut & c=Cuts::open()) const {
return particles(c, cmpMomByPt);
}
/// Get the final-state particles, ordered by decreasing \f$ p_T \f$ and with a cut on minimum \f$ p_T \f$.
///
/// This is a very common use-case, so is available as syntactic sugar for particles(Cuts::pT >= ptmin, cmpMomByPt).
Particles particlesByPt(double ptmin) const {
return particles(Cuts::pT >= ptmin, cmpMomByPt);
}
/// @name Little-used sorted accessors
/// @deprecated Use the versions with a sorter function argument
//@{
/// Get the final-state particles, ordered by decreasing \f$ p \f$.
/// @todo Remove, since there is the templated method or sortByX methods available for these unusual cases?
/// @deprecated Use the version with a sorter function argument
DEPRECATED("Use the version with a sorter function argument")
Particles particlesByP(const Cut & c=Cuts::open()) const {
return particles(c, cmpMomByP);
}
/// Get the final-state particles, ordered by decreasing \f$ E \f$.
/// @todo Remove, since there is the templated method or sortByX methods available for these unusual cases?
/// @deprecated Use the version with a sorter function argument
DEPRECATED("Use the version with a sorter function argument")
Particles particlesByE(const Cut & c=Cuts::open()) const {
return particles(c, cmpMomByE);
}
/// Get the final-state particles, ordered by decreasing \f$ E_T \f$.
/// @todo Remove, since there is the templated method or sortByX methods available for these unusual cases?
/// @deprecated Use the version with a sorter function argument
DEPRECATED("Use the version with a sorter function argument")
Particles particlesByEt(const Cut & c=Cuts::open()) const {
return particles(c, cmpMomByEt);
}
/// Get the final-state particles, ordered by increasing \f$ \eta \f$.
/// @todo Remove, since there is the templated method or sortByX methods available for these unusual cases?
/// @deprecated Use the version with a sorter function argument
DEPRECATED("Use the version with a sorter function argument")
Particles particlesByEta(const Cut & c=Cuts::open()) const {
return particles(c, cmpMomByEta);
}
/// Get the final-state particles, ordered by increasing \f$ |\eta| \f$.
/// @todo Remove, since there is the templated method or sortByX methods available for these unusual cases?
/// @deprecated Use the version with a sorter function argument
DEPRECATED("Use the version with a sorter function argument")
Particles particlesByModEta(const Cut & c=Cuts::open()) const {
return particles(c, cmpMomByAbsEta);
}
/// Get the final-state particles, ordered by increasing \f$ y \f$.
/// @todo Remove, since there is the templated method or sortByX methods available for these unusual cases?
/// @deprecated Use the version with a sorter function argument
DEPRECATED("Use the version with a sorter function argument")
Particles particlesByRapidity(const Cut & c=Cuts::open()) const {
return particles(c, cmpMomByRap);
}
/// Get the final-state particles, ordered by increasing \f$ |y| \f$.
/// @todo Remove, since there is the templated method or sortByX methods available for these unusual cases?
/// @deprecated Use the version with a sorter function argument
DEPRECATED("Use the version with a sorter function argument")
Particles particlesByModRapidity(const Cut & c=Cuts::open()) const {
return particles(c, cmpMomByAbsRap);
}
//@}
//@}
/// @todo Replace with cuts() accessor
///virtual Cut cuts() const { return _cuts; }
/// @name For JetAlg compatibility
//@{
typedef Particle entity_type;
typedef Particles collection_type;
/// Template-usable interface common to JetAlg.
const collection_type& entities() const {
return particles();
}
//@}
protected:
/// Apply the projection to the event.
virtual void project(const Event& e) = 0;
/// Compare projections.
virtual int compare(const Projection& p) const;
protected:
+
/// The applicable cuts
Cut _cuts;
/// The final-state particles.
Particles _theParticles;
};
}
#endif
diff --git a/include/Rivet/Projections/PrimaryHadrons.hh b/include/Rivet/Projections/PrimaryHadrons.hh
--- a/include/Rivet/Projections/PrimaryHadrons.hh
+++ b/include/Rivet/Projections/PrimaryHadrons.hh
@@ -1,51 +1,56 @@
// -*- C++ -*-
#ifndef RIVET_PrimaryHadrons_HH
#define RIVET_PrimaryHadrons_HH
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
#include "Rivet/Particle.hh"
#include "Rivet/Event.hh"
namespace Rivet {
/// @brief Project out the first hadrons from hadronisation.
///
/// @todo Also be able to return taus? Prefer a separate tau finder.
/// @todo This assumes that the primary hadrons are unstable... should we also look for stable primary hadrons?
class PrimaryHadrons : public FinalState {
public:
/// @name Constructors and destructors.
//@{
+ /// Constructor with cuts argument
+ PrimaryHadrons(const Cut& c=Cuts::open()) {
+ setName("PrimaryHadrons");
+ addProjection(UnstableFinalState(c), "UFS");
+ }
+
/// Constructor with specification of the minimum and maximum pseudorapidity
/// \f$ \eta \f$ and the min \f$ p_T \f$ (in GeV).
- PrimaryHadrons(double mineta = -MAXDOUBLE,
- double maxeta = MAXDOUBLE,
- double minpt = 0.0*GeV) {
+ PrimaryHadrons(double mineta, double maxeta, double minpt=0.0*GeV) {
setName("PrimaryHadrons");
- addProjection(UnstableFinalState(mineta, maxeta, minpt), "UFS");
+ addProjection(UnstableFinalState(Cuts::etaIn(mineta, maxeta) && Cuts::pT > minpt), "UFS");
}
/// Clone on the heap.
virtual const Projection* clone() const {
return new PrimaryHadrons(*this);
}
//@}
+
protected:
/// Apply the projection to the event.
virtual void project(const Event& e);
};
}
#endif
diff --git a/include/Rivet/Projections/TauFinder.hh b/include/Rivet/Projections/TauFinder.hh
--- a/include/Rivet/Projections/TauFinder.hh
+++ b/include/Rivet/Projections/TauFinder.hh
@@ -1,79 +1,77 @@
// -*- C++ -*-
#ifndef RIVET_TauFinder_HH
#define RIVET_TauFinder_HH
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
namespace Rivet {
/// @brief Convenience finder of unstable taus
/// @todo Inherit directly from ParticleFinder, not FinalState
class TauFinder : public FinalState {
public:
enum DecayType { ANY=0, LEPTONIC=1, HADRONIC };
/// @name Constructors
//@{
/// @todo Why accept a FinalState? Find taus which decay to particles in this FS? Document the logic.
TauFinder(const FinalState& inputfs, DecayType decaytype=ANY) {
_init(inputfs, decaytype);
}
TauFinder(DecayType decaytype=ANY) {
_init(UnstableFinalState(), decaytype);
}
/// Clone on the heap.
virtual const Projection* clone() const {
return new TauFinder(*this);
}
//@}
- /// Access to the found bosons
- ///
- /// @note Currently either 0 or 1 boson can be found.
+ /// Access to the found taus
const Particles& taus() const { return _taus; }
protected:
/// Apply the projection on the supplied event.
void project(const Event& e);
/// Compare projections.
//int compare(const Projection& p) const;
public:
/// Clear the projection
void clear() {
_theParticles.clear();
_taus.clear();
}
private:
/// Common implementation of constructor operation, taking FS params.
void _init(const FinalState& inputfs, DecayType decaytype);
/// List of found taus
/// @todo Fill _theParticles instead, when inheriting from ParticleFinder?
Particles _taus;
};
}
#endif
diff --git a/include/Rivet/Projections/UnstableFinalState.hh b/include/Rivet/Projections/UnstableFinalState.hh
--- a/include/Rivet/Projections/UnstableFinalState.hh
+++ b/include/Rivet/Projections/UnstableFinalState.hh
@@ -1,67 +1,67 @@
// -*- C++ -*-
#ifndef RIVET_UnstableFinalState_HH
#define RIVET_UnstableFinalState_HH
#include "Rivet/Projections/FinalState.hh"
namespace Rivet {
/// @brief Project out all physical-but-decayed particles in an event.
///
/// The particles returned by the UFS are unique unstable particles, such as
/// hadrons which are decayed by the generator. If, for example, you set Ks
/// and Lambda particles stable in the generator, they will not be returned by
/// the UFS. Also, you should be aware that all unstable particles in a decay
/// chain are returned: if you are looking for something like the number of B
/// hadrons in an event and there is a decay chain from e.g. B** -> B, you
/// will count both B mesons unless you are careful to check for
/// ancestor/descendant relations between the particles. Duplicate particles
/// in the event record, i.e. those which differ only in bookkeeping details
/// or photon emissions, are stripped from the returned particles collection.
///
/// @todo Inherit directly from ParticleFinder, rename as UnstableFinder, and make TauFinder inherit/use
class UnstableFinalState : public FinalState {
public:
/// @name Standard constructors and destructors.
//@{
/// Cut-based / default constructor
- UnstableFinalState(const Cut& c=Cuts::open()) : FinalState(c)
+ UnstableFinalState(const Cut& c=Cuts::open())
+ : FinalState(c)
{
setName("UnstableFinalState");
}
/// Constructor from cuts.
///
/// May specify the minimum and maximum pseudorapidity \f$ \eta \f$ and the
/// min \f$ p_T \f$
- UnstableFinalState(double mineta,
- double maxeta,
- double minpt = 0.0*GeV)
- : FinalState(mineta, maxeta, minpt)
+ DEPRECATED("Use the version with a Cut argument")
+ UnstableFinalState(double mineta, double maxeta, double minpt=0.0*GeV)
+ : FinalState(Cuts::etaIn(mineta, maxeta) && Cuts::pT > minpt)
{
setName("UnstableFinalState");
}
/// Clone on the heap.
virtual const Projection* clone() const {
return new UnstableFinalState(*this);
}
//@}
protected:
/// Apply the projection to the event.
virtual void project(const Event& e);
};
}
#endif
diff --git a/src/Analyses/ALICE_2011_S8909580.cc b/src/Analyses/ALICE_2011_S8909580.cc
--- a/src/Analyses/ALICE_2011_S8909580.cc
+++ b/src/Analyses/ALICE_2011_S8909580.cc
@@ -1,103 +1,103 @@
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
namespace Rivet {
class ALICE_2011_S8909580 : public Analysis {
public:
ALICE_2011_S8909580()
: Analysis("ALICE_2011_S8909580")
{}
public:
void init() {
- const UnstableFinalState ufs(-15, 15);
+ const UnstableFinalState ufs(Cuts::abseta < 15);
addProjection(ufs, "UFS");
_histPtK0s = bookHisto1D(1, 1, 1);
_histPtLambda = bookHisto1D(2, 1, 1);
_histPtAntiLambda = bookHisto1D(3, 1, 1);
_histPtXi = bookHisto1D(4, 1, 1);
_histPtPhi = bookHisto1D(5, 1, 1);
_temp_h_Lambdas = bookHisto1D("TMP/h_Lambdas", refData(6, 1, 1));
_temp_h_Kzeros = bookHisto1D("TMP/h_Kzeros", refData(6, 1, 1));
_h_LamKzero = bookScatter2D(6, 1, 1);
}
void analyze(const Event& event) {
const double weight = event.weight();
const UnstableFinalState& ufs = applyProjection<UnstableFinalState>(event, "UFS");
foreach (const Particle& p, ufs.particles()) {
const double absrap = p.absrap();
const double pT = p.pT()/GeV;
if (absrap < 0.8) {
switch(p.pid()) {
case 3312:
case -3312:
if ( !( p.hasAncestor(3334) || p.hasAncestor(-3334) ) ) {
_histPtXi->fill(pT, weight);
}
break;
if (absrap < 0.75) {
case 310:
_histPtK0s->fill(pT, weight);
_temp_h_Kzeros->fill(pT, 2*weight);
break;
case 3122:
if ( !( p.hasAncestor(3322) || p.hasAncestor(-3322) ||
p.hasAncestor(3312) || p.hasAncestor(-3312) ||
p.hasAncestor(3334) || p.hasAncestor(-3334) ) ) {
_histPtLambda->fill(pT, weight);
_temp_h_Lambdas->fill(pT, weight);
}
break;
case -3122:
if ( !( p.hasAncestor(3322) || p.hasAncestor(-3322) ||
p.hasAncestor(3312) || p.hasAncestor(-3312) ||
p.hasAncestor(3334) || p.hasAncestor(-3334) ) ) {
_histPtAntiLambda->fill(pT, weight);
_temp_h_Lambdas->fill(pT, weight);
}
break;
}
if (absrap<0.6) {
case 333:
_histPtPhi->fill(pT, weight);
break;
}
}
}
}
}
void finalize() {
scale(_histPtK0s, 1./(1.5*sumOfWeights()));
scale(_histPtLambda, 1./(1.5*sumOfWeights()));
scale(_histPtAntiLambda, 1./(1.5*sumOfWeights()));
scale(_histPtXi, 1./(1.6*sumOfWeights()));
scale(_histPtPhi, 1./(1.2*sumOfWeights()));
divide(_temp_h_Lambdas, _temp_h_Kzeros, _h_LamKzero);
}
private:
Histo1DPtr _histPtK0s, _histPtLambda, _histPtAntiLambda, _histPtXi, _histPtPhi;
Histo1DPtr _temp_h_Lambdas, _temp_h_Kzeros;
Scatter2DPtr _h_LamKzero;
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(ALICE_2011_S8909580);
}
diff --git a/src/Analyses/ATLAS_2010_S8914702.cc b/src/Analyses/ATLAS_2010_S8914702.cc
--- a/src/Analyses/ATLAS_2010_S8914702.cc
+++ b/src/Analyses/ATLAS_2010_S8914702.cc
@@ -1,202 +1,202 @@
// -*- C++ -*-
#include <iostream>
#include <sstream>
#include <string>
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/LeadingParticlesFinalState.hh"
#include "Rivet/Jet.hh"
#include "Rivet/Projections/FastJets.hh"
#include "fastjet/internal/base.hh"
#include "fastjet/JetDefinition.hh"
#include "fastjet/AreaDefinition.hh"
#include "fastjet/ClusterSequence.hh"
#include "fastjet/ClusterSequenceArea.hh"
#include "fastjet/PseudoJet.hh"
namespace Rivet {
class ATLAS_2010_S8914702 : public Analysis {
public:
/// Constructor
ATLAS_2010_S8914702()
: Analysis("ATLAS_2010_S8914702")
{
_eta_bins.push_back( 0.00);
_eta_bins.push_back( 0.60);
_eta_bins.push_back( 1.37);
_eta_bins.push_back( 1.52);
_eta_bins.push_back( 1.81);
_eta_bins_areaoffset.push_back(0.0);
_eta_bins_areaoffset.push_back(1.5);
_eta_bins_areaoffset.push_back(3.0);
}
/// Book histograms and initialise projections before the run
void init() {
FinalState fs;
addProjection(fs, "FS");
FastJets fj(fs, FastJets::KT, 0.5);
_area_def = new fastjet::AreaDefinition(fastjet::VoronoiAreaSpec());
fj.useJetArea(_area_def);
addProjection(fj, "KtJetsD05");
LeadingParticlesFinalState photonfs(FinalState(-1.81, 1.81, 15.0*GeV));
photonfs.addParticleId(PID::PHOTON);
addProjection(photonfs, "LeadingPhoton");
int hist_bin = 0;
for (int i = 0; i < (int)_eta_bins.size()-1; ++i) {
if (fabs(_eta_bins[i] - 1.37) < .0001) continue;
_h_Et_photon[i] = bookHisto1D(1, 1, hist_bin+1);
hist_bin += 1;
}
}
int getEtaBin(double eta_w, bool area_eta) const {
double eta = fabs(eta_w);
int v_iter = 0;
if (!area_eta) {
for (v_iter=0; v_iter < (int)_eta_bins.size()-1; ++v_iter) {
if (eta >= _eta_bins.at(v_iter) && eta < _eta_bins.at(v_iter+1)) break;
}
return min(v_iter,(int)_eta_bins.size()-2);
} else {
for (v_iter=0; v_iter < (int)_eta_bins_areaoffset.size()-1; ++v_iter) {
if (eta >= _eta_bins_areaoffset.at(v_iter) && eta < _eta_bins_areaoffset.at(v_iter+1)) break;
}
return v_iter;
}
}
/// Perform the per-event analysis
void analyze(const Event& event) {
const double weight = event.weight();
Particles photons = applyProjection<LeadingParticlesFinalState>(event, "LeadingPhoton").particles();
if (photons.size() != 1) {
vetoEvent;
}
FourMomentum leadingPhoton = photons[0].momentum();
double eta_P = leadingPhoton.eta();
double phi_P = leadingPhoton.phi();
if(fabs(eta_P)>=1.37 && fabs(eta_P)<1.52){
vetoEvent;
}
int eta_bin = getEtaBin(eta_P,false);
Particles fs = applyProjection<FinalState>(event, "FS").particles();
FourMomentum mom_in_EtCone;
foreach (const Particle& p, fs) {
// check if it's in the cone of .4
if (deltaR(eta_P, phi_P, p.eta(), p.phi()) >= 0.4) continue;
// check if it's in the 5x7 central core
- if (fabs(eta_P-p.eta()) < .025*7.0*0.5 &&
- fabs(phi_P-p.phi()) < (PI/128.)*5.0*0.5) continue;
+ if (fabs(eta_P-p.eta()) < .025*5.0*0.5 &&
+ fabs(phi_P-p.phi()) < (PI/128.)*7.0*0.5) continue;
mom_in_EtCone += p.momentum();
}
MSG_DEBUG("Done with initial EtCone.");
// Now compute the median energy density
_ptDensity.clear();
_sigma.clear();
_Njets.clear();
std::vector< std::vector<double> > ptDensities;
std::vector<double> emptyVec;
ptDensities.assign(_eta_bins_areaoffset.size()-1,emptyVec);
FastJets fast_jets = applyProjection<FastJets>(event, "KtJetsD05");
const fastjet::ClusterSequenceArea* clust_seq_area = fast_jets.clusterSeqArea();
foreach (const fastjet::PseudoJet& jet, fast_jets.pseudoJets(0.0*GeV)) {
//const double y = jet.absrap();
const double eta = fabs(jet.eta());
const double pt = fabs(jet.perp());
// Get the cluster sequence
double area = clust_seq_area->area(jet);
if (area > 10e-4 && fabs(eta)<_eta_bins_areaoffset[_eta_bins_areaoffset.size()-1]) {
ptDensities.at(getEtaBin(fabs(eta),true)).push_back(pt/area);
}
}
for (int b = 0; b < (int)_eta_bins_areaoffset.size()-1; ++b) {
double median = 0.0;
double sigma = 0.0;
int Njets = 0;
if (ptDensities[b].size() > 0) {
std::sort(ptDensities[b].begin(), ptDensities[b].end());
int nDens = ptDensities[b].size();
if (nDens % 2 == 0) {
median = (ptDensities[b][nDens/2]+ptDensities[b][(nDens-2)/2])/2;
} else {
median = ptDensities[b][(nDens-1)/2];
}
sigma = ptDensities[b][(int)(.15865*nDens)];
Njets = nDens;
}
_ptDensity.push_back(median);
_sigma.push_back(sigma);
_Njets.push_back(Njets);
}
// Now figure out the correction
float EtCone_area = PI*.4*.4 - (7.0*.025)*(5.0*PI/128.);
float correction = _ptDensity[getEtaBin(eta_P,true)]*EtCone_area;
MSG_DEBUG("Jet area correction done.");
// Shouldn't need to subtract photon
// NB. Using expected cut at hadron/particle level, not cut at reco level
if (mom_in_EtCone.Et() - correction/*-leadingPhoton.Et()*/ > 4.0*GeV) {
vetoEvent;
}
MSG_DEBUG("Passed isolation cut.");
_h_Et_photon[eta_bin]->fill(leadingPhoton.Et(), weight);
}
/// Normalise histograms etc., after the run
void finalize() {
for (int i = 0; i < (int)_eta_bins.size()-1; ++i) {
if (fabs(_eta_bins[i] - 1.37) < .0001) continue;
scale(_h_Et_photon[i], crossSection()/sumOfWeights());
}
}
private:
Histo1DPtr _h_Et_photon[6];
fastjet::AreaDefinition* _area_def;
std::vector<float> _eta_bins;
std::vector<float> _eta_bins_areaoffset;
std::vector<float> _ptDensity;
std::vector<float> _sigma;
std::vector<float> _Njets;
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(ATLAS_2010_S8914702);
}
diff --git a/src/Analyses/ATLAS_2011_I921594.cc b/src/Analyses/ATLAS_2011_I921594.cc
--- a/src/Analyses/ATLAS_2011_I921594.cc
+++ b/src/Analyses/ATLAS_2011_I921594.cc
@@ -1,145 +1,145 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/LeadingParticlesFinalState.hh"
#include "Rivet/Projections/FastJets.hh"
namespace Rivet {
/// Inclusive isolated prompt photon analysis with full 2010 LHC data
class ATLAS_2011_I921594 : public Analysis {
public:
/// Constructor
ATLAS_2011_I921594()
: Analysis("ATLAS_2011_I921594")
{
_eta_bins.push_back( 0.00);
_eta_bins.push_back( 0.60);
_eta_bins.push_back( 1.37);
_eta_bins.push_back( 1.52);
_eta_bins.push_back( 1.81);
_eta_bins.push_back( 2.37);
_eta_bins_areaoffset.push_back(0.0);
_eta_bins_areaoffset.push_back(1.5);
_eta_bins_areaoffset.push_back(3.0);
}
/// Book histograms and initialise projections before the run
void init() {
FinalState fs;
addProjection(fs, "FS");
// Consider the final state jets for the energy density calculation
FastJets fj(fs, FastJets::KT, 0.5);
/// @todo Memory leak! (Once per run, not serious)
_area_def = new fastjet::AreaDefinition(fastjet::VoronoiAreaSpec()); ///< @todo FastJets should deep copy the pointer if possible
fj.useJetArea(_area_def);
addProjection(fj, "KtJetsD05");
// Consider the leading pt photon with |eta|<2.37 and pT>45 GeV
LeadingParticlesFinalState photonfs(FinalState(-2.37, 2.37, 45*GeV));
photonfs.addParticleId(PID::PHOTON);
addProjection(photonfs, "LeadingPhoton");
// Book the dsigma/dEt (in eta bins) histograms
for (size_t i = 0; i < _eta_bins.size()-1; i++) {
if (fuzzyEquals(_eta_bins[i], 1.37)) continue; // skip this bin
_h_Et_photon[i] = bookHisto1D(1, 1, i+1);
}
}
/// Return eta bin for either dsigma/dET histogram (area_eta=false) or energy density correction (area_eta=true)
size_t _getEtaBin(double eta_w, bool area_eta) const {
const double eta = fabs(eta_w);
if (!area_eta) {
return binIndex(eta, _eta_bins);
} else {
return binIndex(eta, _eta_bins_areaoffset);
}
}
/// Perform the per-event analysis
void analyze(const Event& event) {
// Retrieve leading photon
const Particles& photons = applyProjection<LeadingParticlesFinalState>(event, "LeadingPhoton").particles();
if (photons.size() != 1) vetoEvent;
const Particle& leadingPhoton = photons[0];
// Veto events with photon in ECAL crack
if (inRange(leadingPhoton.abseta(), 1.37, 1.52)) vetoEvent;
// Compute isolation energy in cone of radius .4 around photon (all particles)
FourMomentum mom_in_EtCone;
Particles fs = applyProjection<FinalState>(event, "FS").particles();
foreach (const Particle& p, fs) {
// Check if it's outside the cone of 0.4
if (deltaR(leadingPhoton, p) >= 0.4) continue;
// Don't count particles in the 5x7 central core
- if (deltaEta(leadingPhoton, p) < .025*7.0*0.5 &&
- deltaPhi(leadingPhoton, p) < (PI/128.)*5.0*0.5) continue;
+ if (deltaEta(leadingPhoton, p) < .025*5.0*0.5 &&
+ deltaPhi(leadingPhoton, p) < (PI/128.)*7.0*0.5) continue;
// Increment isolation energy
mom_in_EtCone += p.momentum();
}
// Get the area-filtered jet inputs for computing median energy density, etc.
vector<double> ptDensity, ptSigma, nJets;
vector< vector<double> > ptDensities(_eta_bins_areaoffset.size()-1);
FastJets fast_jets = applyProjection<FastJets>(event, "KtJetsD05");
const fastjet::ClusterSequenceArea* clust_seq_area = fast_jets.clusterSeqArea();
foreach (const Jet& jet, fast_jets.jets()) {
const double area = clust_seq_area->area(jet);
if (area > 10e-4 && jet.abseta() < _eta_bins_areaoffset.back())
ptDensities.at( _getEtaBin(jet.abseta(), true) ).push_back(jet.pT()/area);
}
// Compute the median energy density, etc.
for (size_t b = 0; b < _eta_bins_areaoffset.size()-1; b++) {
const int njets = ptDensities[b].size();
const double ptmedian = (njets > 0) ? median(ptDensities[b]) : 0;
const double ptsigma = (njets > 0) ? ptDensities[b][(size_t)(0.15865*njets)] : 0;
nJets.push_back(njets);
ptDensity.push_back(ptmedian);
ptSigma.push_back(ptsigma);
}
// Compute the isolation energy correction (cone area*energy density)
const double etCone_area = PI*sqr(0.4) - (7.0*.025)*(5.0*PI/128.);
const double correction = ptDensity[_getEtaBin(leadingPhoton.abseta(), true)]*etCone_area;
// Apply isolation cut on area-corrected value
if (mom_in_EtCone.Et() - correction > 4*GeV) vetoEvent;
// Fill histograms
const size_t eta_bin = _getEtaBin(leadingPhoton.abseta(), false);
_h_Et_photon[eta_bin]->fill(leadingPhoton.Et(), event.weight());
}
/// Normalise histograms etc., after the run
void finalize() {
for (size_t i = 0; i < _eta_bins.size()-1; i++) {
if (fuzzyEquals(_eta_bins[i], 1.37)) continue;
scale(_h_Et_photon[i], crossSection()/picobarn/sumOfWeights());
}
}
private:
Histo1DPtr _h_Et_photon[5];
fastjet::AreaDefinition* _area_def;
vector<double> _eta_bins, _eta_bins_areaoffset;
};
DECLARE_RIVET_PLUGIN(ATLAS_2011_I921594);
}
diff --git a/src/Analyses/ATLAS_2011_I930220.cc b/src/Analyses/ATLAS_2011_I930220.cc
--- a/src/Analyses/ATLAS_2011_I930220.cc
+++ b/src/Analyses/ATLAS_2011_I930220.cc
@@ -1,141 +1,141 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FastJets.hh"
#include "Rivet/Projections/HeavyHadrons.hh"
#include "Rivet/Tools/BinnedHistogram.hh"
namespace Rivet {
/// @brief ATLAS inclusive b-jet pT spectrum, di-jet mass and di-jet chi
class ATLAS_2011_I930220: public Analysis {
public:
ATLAS_2011_I930220()
: Analysis("ATLAS_2011_I930220")
{ }
void init() {
FinalState fs(-3.5, 3.5);
addProjection(fs, "FinalState");
FastJets fj(fs, FastJets::ANTIKT, 0.4);
fj.useInvisibles();
addProjection(fj, "Jets");
- addProjection(HeavyHadrons(-3.5, 3.5, 5*GeV), "BHadrons");
+ addProjection(HeavyHadrons(Cuts::abseta < 3.5 && Cuts::pT > 5*GeV), "BHadrons");
double ybins[] = { 0.0, 0.3, 0.8, 1.2, 2.1 };
for (size_t i = 0; i < 4; ++i)
_bjetpT_SV0.addHistogram(ybins[i], ybins[i+1], bookHisto1D(i+1, 1, 1));
_bjetpT_SV0_All = bookHisto1D(5, 1, 1);
_bjetpT_pTRel = bookHisto1D(6, 1, 1);
_dijet_mass = bookHisto1D(7, 1, 1);
_dijet_phi = bookHisto1D(8, 1, 1);
_dijet_chi_110_370 = bookHisto1D(9, 1, 1);
_dijet_chi_370_850 = bookHisto1D(10, 1, 1);
_chiCounter1 = 0.0;
_chiCounter2 = 0.0;
_phiCounter = 0.0;
}
void analyze(const Event& evt) {
const double weight = evt.weight();
const Particles& bHadrons = applyProjection<HeavyHadrons>(evt, "BHadrons").bHadrons();
const Jets& jets = applyProjection<JetAlg>(evt, "Jets").jetsByPt(15*GeV);
FourMomentum leadingJet, subleadingJet;
int leadJet = 0, subJet = 0;
foreach (const Jet& j, jets) {
bool hasB = false;
foreach (const Particle& b, bHadrons)
if (deltaR(j, b) < 0.3) { hasB = true; break; }
// Identify and classify the leading and subleading jets
if (j.absrap() < 2.1) { ///< Move this into the jets defn
if (!leadJet) {
leadingJet = j.momentum();
leadJet = (hasB && j.pT() > 40*GeV) ? 2 : 1;
}
else if (leadJet && !subJet) {
subleadingJet = j.momentum();
subJet = (hasB && j.pT() > 40*GeV) ? 2 : 1;
}
if (hasB) {
_bjetpT_SV0.fill(j.absrap(), j.pT()/GeV, weight);
_bjetpT_SV0_All->fill(j.pT()/GeV, weight);
_bjetpT_pTRel->fill(j.pT()/GeV, weight);
}
}
}
// Di-b-jet plots require both the leading and subleading jets to be b-tagged and have pT > 40 GeV
if (leadJet == 2 && subJet == 2) {
const double mass = FourMomentum( leadingJet + subleadingJet ).mass();
_dijet_mass->fill(mass/GeV, weight);
// Plot dphi for high-mass di-b-jets
if (mass > 110*GeV) {
_phiCounter += weight;
const double d_phi = deltaPhi( leadingJet.phi(), subleadingJet.phi() );
_dijet_phi->fill(fabs(d_phi), weight);
}
// Plot chi for low y_boost di-b-jets (in two high-mass bins)
const double y_boost = 0.5 * (leadingJet.rapidity() + subleadingJet.rapidity());
const double chi = exp( fabs( leadingJet.rapidity() - subleadingJet.rapidity() ) );
if ( fabs(y_boost) < 1.1 ) {
if (inRange(mass/GeV, 110, 370)) {
_chiCounter1 += weight;
_dijet_chi_110_370->fill(chi, weight);
} else if (inRange(mass/GeV, 370, 850)) {
_chiCounter2 += weight;
_dijet_chi_370_850->fill(chi, weight);
}
}
}
}
void finalize() {
// Normalizing to cross-section and mass
// Additional factors represent the division by rapidity
const double xsec = crossSectionPerEvent()/(picobarn);
const double chiScale1 = 1 / _chiCounter1 / 260.0;
const double chiScale2 = 1 / _chiCounter2 / 480.0;
const double phiScale = 1 / _phiCounter;
_bjetpT_SV0.scale(xsec/2, this);
scale(_bjetpT_SV0_All, xsec);
scale(_bjetpT_pTRel, xsec);
scale(_dijet_mass, xsec);
scale(_dijet_phi, phiScale );
scale(_dijet_chi_110_370, chiScale1);
scale(_dijet_chi_370_850, chiScale2);
}
private:
BinnedHistogram<double> _bjetpT_SV0;
Histo1DPtr _bjetpT_SV0_All;
Histo1DPtr _bjetpT_pTRel;
Histo1DPtr _dijet_mass;
Histo1DPtr _dijet_phi;
Histo1DPtr _dijet_chi_110_370;
Histo1DPtr _dijet_chi_370_850;
double _chiCounter1;
double _chiCounter2;
double _phiCounter;
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(ATLAS_2011_I930220);
}
diff --git a/src/Analyses/ATLAS_2011_S9120807.cc b/src/Analyses/ATLAS_2011_S9120807.cc
--- a/src/Analyses/ATLAS_2011_S9120807.cc
+++ b/src/Analyses/ATLAS_2011_S9120807.cc
@@ -1,223 +1,223 @@
// -*- C++ -*-
#include <iostream>
#include <sstream>
#include <string>
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/IdentifiedFinalState.hh"
#include "Rivet/Jet.hh"
#include "Rivet/Projections/FastJets.hh"
#include "fastjet/internal/base.hh"
#include "fastjet/JetDefinition.hh"
#include "fastjet/AreaDefinition.hh"
#include "fastjet/ClusterSequence.hh"
#include "fastjet/ClusterSequenceArea.hh"
#include "fastjet/PseudoJet.hh"
namespace Rivet {
/// @brief Measurement of isolated diphoton + X differential cross-sections
///
/// Inclusive isolated gamma gamma cross-sections, differential in M(gg), pT(gg),
/// dphi(gg)
///
/// @author Giovanni Marchiori
class ATLAS_2011_S9120807 : public Analysis {
public:
/// Constructor
ATLAS_2011_S9120807()
: Analysis("ATLAS_2011_S9120807")
{
_eta_bins_areaoffset.push_back(0.0);
_eta_bins_areaoffset.push_back(1.5);
_eta_bins_areaoffset.push_back(3.0);
}
public:
/// Book histograms and initialise projections before the run
void init() {
FinalState fs;
addProjection(fs, "FS");
FastJets fj(fs, FastJets::KT, 0.5);
_area_def = new fastjet::AreaDefinition(fastjet::VoronoiAreaSpec());
fj.useJetArea(_area_def);
addProjection(fj, "KtJetsD05");
IdentifiedFinalState photonfs(Cuts::abseta < 2.37 && Cuts::pT > 16*GeV);
photonfs.acceptId(PID::PHOTON);
addProjection(photonfs, "Photon");
_h_M = bookHisto1D(1, 1, 1);
_h_pT = bookHisto1D(2, 1, 1);
_h_dPhi = bookHisto1D(3, 1, 1);
}
/// @todo Prefer to use Rivet::binIndex()
size_t getEtaBin(double eta_w) const {
const double aeta = fabs(eta_w);
size_t v_iter = 0;
for (; v_iter+1 < _eta_bins_areaoffset.size(); ++v_iter) {
if (inRange(aeta, _eta_bins_areaoffset[v_iter], _eta_bins_areaoffset[v_iter+1])) break;
}
return v_iter;
}
/// Perform the per-event analysis
void analyze(const Event& event) {
const double weight = event.weight();
///
/// require at least 2 photons in final state
///
Particles photons = applyProjection<IdentifiedFinalState>(event, "Photon").particlesByPt();
if (photons.size() < 2) {
vetoEvent;
}
///
/// compute the median energy density
///
std::vector<double> _ptDensity;
std::vector< std::vector<double> > ptDensities;
std::vector<double> emptyVec;
ptDensities.assign(_eta_bins_areaoffset.size()-1, emptyVec);
const fastjet::ClusterSequenceArea* clust_seq_area = applyProjection<FastJets>(event, "KtJetsD05").clusterSeqArea();
foreach (const fastjet::PseudoJet& jet, applyProjection<FastJets>(event, "KtJetsD05").pseudoJets(0.0*GeV)) {
double eta = fabs(jet.eta());
double pt = fabs(jet.perp());
/// get the cluster sequence
double area = clust_seq_area->area(jet);
if(area > 10e-4 && fabs(eta) < _eta_bins_areaoffset[_eta_bins_areaoffset.size()-1]) {
ptDensities.at(getEtaBin(fabs(eta))).push_back(pt/area);
}
}
for(int b=0; b<(int)_eta_bins_areaoffset.size()-1; b++) {
double median = 0.0;
if(ptDensities[b].size() > 0) {
std::sort(ptDensities[b].begin(), ptDensities[b].end());
int nDens = ptDensities[b].size();
if( nDens%2 == 0 )
median = (ptDensities[b][nDens/2] + ptDensities[b][(nDens-2)/2]) / 2;
else
median = ptDensities[b][(nDens-1)/2];
}
_ptDensity.push_back(median);
}
///
/// Loop over photons and fill vector of isolated ones
///
Particles isolated_photons;
foreach (const Particle& photon, photons) {
///
/// remove photons in crack
///
double eta_P = photon.eta();
if (fabs(eta_P)>=1.37 && fabs(eta_P)<1.52) continue;
double phi_P = photon.phi();
///
/// compute isolation
///
/// std EtCone
Particles fs = applyProjection<FinalState>(event, "FS").particles();
FourMomentum mom_in_EtCone;
foreach (const Particle& p, fs) {
/// check if it's in the cone of .4
if (deltaR(eta_P, phi_P, p.eta(), p.phi()) >= 0.4) continue;
/// check if it's in the 5x7 central core
- if (fabs(eta_P-p.eta()) < .025*7.0*0.5 &&
- fabs(phi_P-p.phi()) < (M_PI/128.)*5.0*0.5) continue;
+ if (fabs(eta_P-p.eta()) < .025*5.0*0.5 &&
+ fabs(phi_P-p.phi()) < (M_PI/128.)*7.0*0.5) continue;
mom_in_EtCone += p.momentum();
}
/// now figure out the correction (area*density)
double EtCone_area = M_PI*.4*.4 - (7.0*.025)*(5.0*M_PI/128.);
double correction = _ptDensity[getEtaBin(eta_P)]*EtCone_area;
/// shouldn't need to subtract photon
/// note: using expected cut at hadron/particle level, not cut at reco level
if (mom_in_EtCone.Et()-correction > 4.0*GeV) {
continue;
}
/// add photon to list of isolated ones
isolated_photons.push_back(photon);
}
///
/// require at least two isolated photons
///
if (isolated_photons.size() < 2) {
vetoEvent;
}
///
/// select leading pT pair
///
std::sort(isolated_photons.begin(), isolated_photons.end(), cmpMomByPt);
FourMomentum y1=isolated_photons[0].momentum();
FourMomentum y2=isolated_photons[1].momentum();
///
/// require the two photons to be separated (dR>0.4)
///
if (deltaR(y1, y2)<0.4) {
vetoEvent;
}
FourMomentum yy = y1+y2;
double Myy = yy.mass()/GeV;
double pTyy = yy.pT()/GeV;
double dPhiyy = deltaPhi(y1.phi(), y2.phi());
_h_M->fill(Myy, weight);
_h_pT->fill(pTyy, weight);
_h_dPhi->fill(dPhiyy, weight);
}
/// Normalise histograms etc., after the run
void finalize() {
scale(_h_M, crossSection()/sumOfWeights());
scale(_h_pT, crossSection()/sumOfWeights());
scale(_h_dPhi, crossSection()/sumOfWeights());
}
private:
Histo1DPtr _h_M;
Histo1DPtr _h_pT;
Histo1DPtr _h_dPhi;
fastjet::AreaDefinition* _area_def;
std::vector<double> _eta_bins_areaoffset;
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(ATLAS_2011_S9120807);
}
diff --git a/src/Analyses/ATLAS_2012_I1091481.cc b/src/Analyses/ATLAS_2012_I1091481.cc
--- a/src/Analyses/ATLAS_2012_I1091481.cc
+++ b/src/Analyses/ATLAS_2012_I1091481.cc
@@ -1,161 +1,179 @@
-// -*- C++ -*-
-#include "Rivet/Analysis.hh"
-#include "Rivet/Projections/ChargedFinalState.hh"
-#include <iostream>
-#include <fstream>
-
-namespace Rivet {
-
-
- class ATLAS_2012_I1091481 : public Analysis {
- public:
-
- /// Constructor
- ATLAS_2012_I1091481()
- : Analysis("ATLAS_2012_I1091481")
- {
- }
-
-
- public:
-
- /// Book histograms and initialise projections before the run
- void init() {
-
- /// @todo Initialise and register projections here
- ChargedFinalState cfs100(-2.5, 2.5, 0.1*GeV);
- addProjection(cfs100,"CFS100");
- ChargedFinalState cfs500(-2.5, 2.5, 0.5*GeV);
- addProjection(cfs500,"CFS500");
-
-
- // collision energy
- int isqrts = -1;
- if (fuzzyEquals(sqrtS(), 900*GeV)) isqrts = 2;
- else if (fuzzyEquals(sqrtS(), 7*TeV)) isqrts = 1;
- assert(isqrts >= 0);
-
- _sE_10_100 = bookHisto1D(isqrts, 1, 1);
- _sE_1_100 = bookHisto1D(isqrts, 1, 2);
- _sE_10_500 = bookHisto1D(isqrts, 1, 3);
-
- _sEta_10_100 = bookHisto1D(isqrts, 2, 1);
- _sEta_1_100 = bookHisto1D(isqrts, 2, 2);
- _sEta_10_500 = bookHisto1D(isqrts, 2, 3);
- }
-
- // Recalculate particle energy assuming pion mass
- double getPionEnergy(const Particle& p) {
- double m_pi = 0.1396*GeV;
- double p2 = p.p3().mod2()/(GeV*GeV);
- return sqrt(pow(m_pi,2) + p2);
- }
-
- // S_eta core for one event
- //
- // -1 + 1/Nch * |sum_j^Nch exp[i*(xi eta_j - Phi_j)]|^2
- //
- double getSeta(const Particles& part, double xi) {
- std::complex<double> c_eta (0.0, 0.0);
- foreach (const Particle& p, part) {
- double eta = p.eta();
- double phi = p.phi();
- double arg = xi*eta-phi;
- std::complex<double> temp(cos(arg), sin(arg));
- c_eta += temp;
- }
- // Not 100% sure about the -1 here
- return std::norm(c_eta)/part.size() - 1.0;
- }
-
- // S_E core for one event
- //
- // -1 + 1/Nch * |sum_j^Nch exp[i*(omega X_j - Phi_j)]|^2
- double getSE(const Particles& part, double omega) {
- double Xj = 0.0;
- std::complex<double> c_E (0.0, 0.0);
- for (unsigned int i=0; i<part.size(); i++) {
- Xj += 0.5*getPionEnergy(part[i]);
- double phi = part[i].phi();
- double arg = omega*Xj - phi;
- std::complex<double> temp(cos(arg), sin(arg));
- c_E += temp;
- Xj += 0.5*getPionEnergy(part[i]);
- }
- // Not 100% sure about the -1 here
- return std::norm(c_E)/part.size() - 1.0;
- }
-
- // Convenient fill function
- void fillS(Histo1DPtr h, const Particles& part, double weight, bool SE=true) {
- // Loop over bins, take bin centers as parameter values
- for (size_t i=0; i< h->numBins(); i++) {
- double x = h->bin(i).xMid();
- double y;
- if (SE) y = getSE(part, x);
- else y = getSeta(part, x);
- h->fill(x, y*weight);
- }
- }
-
- /// Perform the per-event analysis
- void analyze(const Event& event) {
- double weight = event.weight();
-
- // Charged fs
- const ChargedFinalState& cfs100 = applyProjection<ChargedFinalState>(event, "CFS100");
- const Particles part100 = cfs100.particles(cmpMomByEta);
- const ChargedFinalState& cfs500 = applyProjection<ChargedFinalState>(event, "CFS500");
- const Particles part500 = cfs500.particles(cmpMomByEta);
-
- // Veto event if the most inclusive phase space has less than 10 particles and the max pT is > 10 GeV
- if (part100.size() < 11) vetoEvent;
- double ptmax = cfs100.particlesByPt()[0].pT()/GeV;
- if (ptmax > 10.0) vetoEvent;
-
- // Fill the pt>100, pTmax<10 GeV histos
- fillS(_sE_10_100, part100, weight, true);
- fillS(_sEta_10_100, part100, weight, false);
-
- // Fill the pt>100, pTmax<1 GeV histos
- if (ptmax < 1.0) {
- fillS(_sE_1_100, part100, weight, true);
- fillS(_sEta_1_100, part100, weight, false);
- }
-
- // Fill the pt>500, pTmax<10 GeV histos
- if (part500.size() > 10) {
- fillS(_sE_10_500, part500, weight, true);
- fillS(_sEta_10_500, part500, weight, false);
- }
- }
-
- /// Normalise histograms etc., after the run
- void finalize() {
- // The scaling takes the multiple fills per event into account
- // --- not sure about the normalisation
- scale(_sE_10_100, 1.0/(sumOfWeights()*_sE_10_100->numBins()));
- scale(_sE_1_100 , 1.0/(sumOfWeights()*_sE_1_100 ->numBins()));
- scale(_sE_10_500, 1.0/(sumOfWeights()*_sE_10_500->numBins()));
-
- scale(_sEta_10_100, 1.0/(sumOfWeights()*_sEta_10_100->numBins()));
- scale(_sEta_1_100 , 1.0/(sumOfWeights()*_sEta_1_100 ->numBins()));
- scale(_sEta_10_500, 1.0/(sumOfWeights()*_sEta_10_500->numBins()));
- }
-
- //@}
-
- private:
-
- Histo1DPtr _sE_10_100;
- Histo1DPtr _sE_1_100;
- Histo1DPtr _sE_10_500;
-
- Histo1DPtr _sEta_10_100;
- Histo1DPtr _sEta_1_100;
- Histo1DPtr _sEta_10_500;
- };
-
- // The hook for the plugin system
- DECLARE_RIVET_PLUGIN(ATLAS_2012_I1091481);
-}
+// -*- C++ -*-
+#include "Rivet/Analysis.hh"
+#include "Rivet/Projections/ChargedFinalState.hh"
+
+namespace Rivet {
+
+
+ class ATLAS_2012_I1091481 : public Analysis {
+ public:
+
+ /// Constructor
+ ATLAS_2012_I1091481()
+ : Analysis("ATLAS_2012_I1091481")
+ { }
+
+
+ /// Book histograms and initialise projections before the run
+ void init() {
+
+ ChargedFinalState cfs100(Cuts::abseta < 2.5 && Cuts::pT > 0.1*GeV);
+ addProjection(cfs100,"CFS100");
+ ChargedFinalState cfs500(Cuts::abseta < 2.5 && Cuts::pT > 0.5*GeV);
+ addProjection(cfs500,"CFS500");
+
+ // collision energy
+ int isqrts = -1;
+ if (fuzzyEquals(sqrtS(), 900*GeV)) isqrts = 2;
+ if (fuzzyEquals(sqrtS(), 7*TeV)) isqrts = 1;
+ assert(isqrts > 0);
+
+ _sE_10_100 = bookHisto1D(isqrts, 1, 1);
+ _sE_1_100 = bookHisto1D(isqrts, 1, 2);
+ _sE_10_500 = bookHisto1D(isqrts, 1, 3);
+
+ _sEta_10_100 = bookHisto1D(isqrts, 2, 1);
+ _sEta_1_100 = bookHisto1D(isqrts, 2, 2);
+ _sEta_10_500 = bookHisto1D(isqrts, 2, 3);
+
+ norm_inclusive = 0.;
+ norm_lowPt = 0.;
+ norm_pt500 = 0.;
+ }
+
+
+ // Recalculate particle energy assuming pion mass
+ double getPionEnergy(const Particle& p) {
+ double m_pi = 0.1396*GeV;
+ double p2 = p.p3().mod2()/(GeV*GeV);
+ return sqrt(sqr(m_pi) + p2);
+ }
+
+
+ // S_eta core for one event
+ //
+ // -1 + 1/Nch * |sum_j^Nch exp[i*(xi eta_j - Phi_j)]|^2
+ //
+ double getSeta(const Particles& part, double xi) {
+ std::complex<double> c_eta (0.0, 0.0);
+ foreach (const Particle& p, part) {
+ double eta = p.eta();
+ double phi = p.phi();
+ double arg = xi*eta-phi;
+ std::complex<double> temp(cos(arg), sin(arg));
+ c_eta += temp;
+ }
+ return std::norm(c_eta)/part.size() - 1.0;
+ }
+
+
+ // S_E core for one event
+ //
+ // -1 + 1/Nch * |sum_j^Nch exp[i*(omega X_j - Phi_j)]|^2
+ //
+ double getSE(const Particles& part, double omega) {
+ double Xj = 0.0;
+ std::complex<double> c_E (0.0, 0.0);
+ for (unsigned int i=0; i < part.size(); ++i) {
+ Xj += 0.5*getPionEnergy(part[i]);
+ double phi = part[i].phi();
+ double arg = omega*Xj - phi;
+ std::complex<double> temp(cos(arg), sin(arg));
+ c_E += temp;
+ Xj += 0.5*getPionEnergy(part[i]);
+ }
+ return std::norm(c_E)/part.size() - 1.0;
+ }
+
+
+ // Convenient fill function
+ void fillS(Histo1DPtr h, const Particles& part, double weight, bool SE=true) {
+ // Loop over bins, take bin centers as parameter values
+ for(size_t i=0; i < h->numBins(); ++i) {
+ double x = h->bin(i).xMid();
+ double width = h->bin(i).xMax() - h->bin(i).xMin();
+ double y;
+ if(SE) y = getSE(part, x);
+ else y = getSeta(part, x);
+ h->fill(x, y * width * weight);
+ // Histo1D objects will be converted to Scatter2D objects for plotting
+ // As part of this conversion, Rivet will divide by bin width
+ // However, we want the (x,y) of the Scatter2D to be the (binCenter, sumW) of
+ // the current Histo1D. This is why in the above line we multiply by bin width,
+ // so as to undo later division by bin width.
+ //
+ // Could have used Scatter2D objects in the first place, but they cannot be merged
+ // as easily as Histo1Ds can using yodamerge (missing ScaledBy attribute)
+ }
+ }
+
+
+ /// Perform the per-event analysis
+ void analyze(const Event& event) {
+ double weight = event.weight();
+
+ // Charged fs
+ const ChargedFinalState& cfs100 = applyProjection<ChargedFinalState>(event, "CFS100");
+ const Particles part100 = cfs100.particles(cmpMomByEta);
+ const ChargedFinalState& cfs500 = applyProjection<ChargedFinalState>(event, "CFS500");
+ const Particles& part500 = cfs500.particles(cmpMomByEta);
+
+ // Veto event if the most inclusive phase space has fewer than 11 particles or the max pT is > 10 GeV
+ if (part100.size() < 11) vetoEvent;
+ double ptmax = cfs100.particlesByPt()[0].pT()/GeV;
+ if (ptmax > 10.0) vetoEvent;
+
+ // Fill the pt>100, pTmax<10 GeV histos
+ fillS(_sE_10_100, part100, weight, true);
+ fillS(_sEta_10_100, part100, weight, false);
+ norm_inclusive += weight;
+
+ // Fill the pt>100, pTmax<1 GeV histos
+ if (ptmax < 1.0) {
+ fillS(_sE_1_100, part100, weight, true);
+ fillS(_sEta_1_100, part100, weight, false);
+ norm_lowPt += weight;
+ }
+
+ // Fill the pt>500, pTmax<10 GeV histos
+ if (part500.size() > 10) {
+ fillS(_sE_10_500, part500, weight, true );
+ fillS(_sEta_10_500, part500, weight, false);
+ norm_pt500 += weight;
+ }
+ }
+
+ /// Normalise histograms etc., after the run
+ void finalize() {
+ // The scaling takes the multiple fills per event into account
+ scale(_sE_10_100, 1.0/norm_inclusive);
+ scale(_sE_1_100 , 1.0/norm_lowPt);
+ scale(_sE_10_500, 1.0/norm_pt500);
+
+ scale(_sEta_10_100, 1.0/norm_inclusive);
+ scale(_sEta_1_100 , 1.0/norm_lowPt);
+ scale(_sEta_10_500, 1.0/norm_pt500);
+ }
+
+ //@}
+
+
+ private:
+
+ Histo1DPtr _sE_10_100;
+ Histo1DPtr _sE_1_100;
+ Histo1DPtr _sE_10_500;
+
+ Histo1DPtr _sEta_10_100;
+ Histo1DPtr _sEta_1_100;
+ Histo1DPtr _sEta_10_500;
+
+ double norm_inclusive;
+ double norm_lowPt;
+ double norm_pt500;
+ };
+
+
+ DECLARE_RIVET_PLUGIN(ATLAS_2012_I1091481);
+
+}
diff --git a/src/Analyses/ATLAS_2012_I1093738.cc b/src/Analyses/ATLAS_2012_I1093738.cc
--- a/src/Analyses/ATLAS_2012_I1093738.cc
+++ b/src/Analyses/ATLAS_2012_I1093738.cc
@@ -1,288 +1,288 @@
// -*- C++ -*-
#include <iostream>
#include <sstream>
#include <string>
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/LeadingParticlesFinalState.hh"
#include "Rivet/Projections/VetoedFinalState.hh"
#include "Rivet/Jet.hh"
#include "Rivet/Projections/FastJets.hh"
#include "fastjet/internal/base.hh"
#include "fastjet/JetDefinition.hh"
#include "fastjet/AreaDefinition.hh"
#include "fastjet/ClusterSequence.hh"
#include "fastjet/ClusterSequenceArea.hh"
#include "fastjet/PseudoJet.hh"
namespace Rivet {
/// @brief Measurement of isolated gamma + jet + X differential cross-sections
///
/// Inclusive isolated gamma + jet cross-sections, differential in pT(gamma), for
/// various photon and jet rapidity configurations.
///
/// @author Giovanni Marchiori
class ATLAS_2012_I1093738 : public Analysis {
public:
// Constructor
ATLAS_2012_I1093738()
: Analysis("ATLAS_2012_I1093738")
{
_eta_bins_ph.push_back(0.0);
_eta_bins_ph.push_back(1.37);
_eta_bins_ph.push_back(1.52);
_eta_bins_ph.push_back(2.37);
_eta_bins_jet.push_back(0.0);
_eta_bins_jet.push_back(1.2);
_eta_bins_jet.push_back(2.8);
_eta_bins_jet.push_back(4.4);
_eta_bins_areaoffset.push_back(0.0);
_eta_bins_areaoffset.push_back(1.5);
_eta_bins_areaoffset.push_back(3.0);
}
public:
// Book histograms and initialise projections before the run
void init() {
// Final state
FinalState fs;
addProjection(fs, "FS");
// Voronoi eta-phi tessellation with KT jets, for ambient energy density calculation
FastJets fj(fs, FastJets::KT, 0.5);
_area_def = new fastjet::AreaDefinition(fastjet::VoronoiAreaSpec());
fj.useJetArea(_area_def);
addProjection(fj, "KtJetsD05");
// Leading photon
LeadingParticlesFinalState photonfs(FinalState(-1.37, 1.37, 25.0*GeV));
photonfs.addParticleId(PID::PHOTON);
addProjection(photonfs, "LeadingPhoton");
// FS excluding the leading photon
VetoedFinalState vfs(fs);
vfs.addVetoOnThisFinalState(photonfs);
addProjection(vfs, "JetFS");
// Jets
FastJets jetpro(vfs, FastJets::ANTIKT, 0.4);
jetpro.useInvisibles();
addProjection(jetpro, "Jets");
_h_phbarrel_jetcentral_SS = bookHisto1D(1, 1, 1);
_h_phbarrel_jetmedium_SS = bookHisto1D(2, 1, 1);
_h_phbarrel_jetforward_SS = bookHisto1D(3, 1, 1);
_h_phbarrel_jetcentral_OS = bookHisto1D(4, 1, 1);
_h_phbarrel_jetmedium_OS = bookHisto1D(5, 1, 1);
_h_phbarrel_jetforward_OS = bookHisto1D(6, 1, 1);
}
int getEtaBin(double eta_w, int what) const {
double eta = fabs(eta_w);
int v_iter = 0;
if (what==0) {
for (v_iter=0; v_iter < (int)_eta_bins_ph.size()-1; v_iter++){
if (eta >= _eta_bins_ph.at(v_iter) && eta < _eta_bins_ph.at(v_iter+1))
break;
}
}
else if (what==1) {
for (v_iter=0; v_iter < (int)_eta_bins_jet.size()-1; v_iter++){
if (eta >= _eta_bins_jet.at(v_iter) && eta < _eta_bins_jet.at(v_iter+1))
break;
}
}
else {
for (v_iter=0; v_iter < (int)_eta_bins_areaoffset.size()-1; v_iter++){
if (eta >= _eta_bins_areaoffset.at(v_iter) && eta < _eta_bins_areaoffset.at(v_iter+1))
break;
}
}
return v_iter;
}
// Perform the per-event analysis
void analyze(const Event& event) {
const double weight = event.weight();
// Get the photon
const FinalState& photonfs = applyProjection<FinalState>(event, "LeadingPhoton");
if (photonfs.particles().size() < 1) vetoEvent;
const FourMomentum photon = photonfs.particles().front().momentum();
double eta_P = photon.eta();
double phi_P = photon.phi();
// Get the jet
Jets jets = applyProjection<FastJets>(event, "Jets").jetsByPt(20.0*GeV);
if (jets.size()==0) {
vetoEvent;
}
FourMomentum leadingJet = jets[0].momentum();
// Require jet separated from photon
if (deltaR(eta_P, phi_P, leadingJet.eta(), leadingJet.phi())<1.0) {
vetoEvent;
}
// Veto if leading jet is outside plotted rapidity regions
const double abs_y1 = fabs(leadingJet.rapidity());
if (abs_y1 > 4.4) {
vetoEvent;
}
// compute the median event energy density
const unsigned int skipnhardjets = 0;
_ptDensity.clear();
_sigma.clear();
_Njets.clear();
std::vector< std::vector<double> > ptDensities;
std::vector<double> emptyVec;
ptDensities.assign(_eta_bins_areaoffset.size()-1,emptyVec);
FastJets fast_jets = applyProjection<FastJets>(event, "KtJetsD05");
const fastjet::ClusterSequenceArea* clust_seq_area = fast_jets.clusterSeqArea();
foreach (const fastjet::PseudoJet& jet, fast_jets.pseudoJets(0.0*GeV)) {
double eta = fabs(jet.eta());
double pt = fabs(jet.perp());
double area = clust_seq_area->area(jet);
if (area > 10e-4 && fabs(eta)<_eta_bins_areaoffset[_eta_bins_areaoffset.size()-1]) {
ptDensities.at(getEtaBin(fabs(eta),2)).push_back(pt/area);
}
}
for (int b = 0; b<(int)_eta_bins_areaoffset.size()-1; b++) {
double median = 0.0;
double sigma = 0.0;
int Njets = 0;
if (ptDensities[b].size() > skipnhardjets) {
std::sort(ptDensities[b].begin(), ptDensities[b].end());
int nDens = ptDensities[b].size() - skipnhardjets;
if ( nDens%2 == 0 ) {
median = (ptDensities[b][nDens/2]+ptDensities[b][(nDens-2)/2])/2;
} else {
median = ptDensities[b][(nDens-1)/2];
}
sigma = ptDensities[b][(int)(.15865*nDens)];
Njets = nDens;
}
_ptDensity.push_back(median);
_sigma.push_back(sigma);
_Njets.push_back(Njets);
}
// compute photon isolation
// std EtCone
Particles fs = applyProjection<FinalState>(event, "JetFS").particles();
FourMomentum mom_in_EtCone;
float iso_dR = 0.4;
- float cluster_eta_width = 0.25*7.0;
- float cluster_phi_width = (PI/128.)*5.0;
+ float cluster_eta_width = 0.25*5.0;
+ float cluster_phi_width = (PI/128.)*7.0;
foreach (const Particle& p, fs) {
// check if it's in the cone of .4
if (deltaR(eta_P, phi_P, p.eta(), p.phi()) >= iso_dR) continue;
// check if it's in the 5x7 central core
if (fabs(eta_P-p.eta()) < cluster_eta_width*0.5 &&
fabs(phi_P-p.phi()) < cluster_phi_width*0.5) continue;
mom_in_EtCone += p.momentum();
}
// now figure out the correction (area*density)
float EtCone_area = PI*iso_dR*iso_dR - cluster_eta_width*cluster_phi_width;
float correction = _ptDensity[getEtaBin(eta_P,2)]*EtCone_area;
// require photon to be isolated
if (mom_in_EtCone.Et()-correction > 4.0*GeV) vetoEvent;
int photon_jet_sign = sign( leadingJet.rapidity() * photon.rapidity() );
// Fill histos
float abs_jet_rapidity = fabs(leadingJet.rapidity());
float photon_pt = photon.pT()/GeV;
float abs_photon_eta = fabs(photon.eta());
if (abs_photon_eta<1.37) {
if (abs_jet_rapidity < 1.2) {
if (photon_jet_sign >= 1) {
_h_phbarrel_jetcentral_SS->fill(photon_pt, weight);
} else {
_h_phbarrel_jetcentral_OS->fill(photon_pt, weight);
}
} else if (abs_jet_rapidity < 2.8) {
if (photon_jet_sign >= 1) {
_h_phbarrel_jetmedium_SS->fill(photon_pt, weight);
} else {
_h_phbarrel_jetmedium_OS->fill(photon_pt, weight);
}
} else if (abs_jet_rapidity < 4.4) {
if (photon_jet_sign >= 1) {
_h_phbarrel_jetforward_SS->fill(photon_pt, weight);
} else {
_h_phbarrel_jetforward_OS->fill(photon_pt, weight);
}
}
}
}
/// Normalise histograms etc., after the run
void finalize() {
scale(_h_phbarrel_jetcentral_SS, crossSection()/sumOfWeights());
scale(_h_phbarrel_jetcentral_OS, crossSection()/sumOfWeights());
scale(_h_phbarrel_jetmedium_SS, crossSection()/sumOfWeights());
scale(_h_phbarrel_jetmedium_OS, crossSection()/sumOfWeights());
scale(_h_phbarrel_jetforward_SS, crossSection()/sumOfWeights());
scale(_h_phbarrel_jetforward_OS, crossSection()/sumOfWeights());
}
private:
Histo1DPtr _h_phbarrel_jetcentral_SS;
Histo1DPtr _h_phbarrel_jetmedium_SS;
Histo1DPtr _h_phbarrel_jetforward_SS;
Histo1DPtr _h_phbarrel_jetcentral_OS;
Histo1DPtr _h_phbarrel_jetmedium_OS;
Histo1DPtr _h_phbarrel_jetforward_OS;
fastjet::AreaDefinition* _area_def;
std::vector<float> _eta_bins_ph;
std::vector<float> _eta_bins_jet;
std::vector<float> _eta_bins_areaoffset;
std::vector<float> _ptDensity;
std::vector<float> _sigma;
std::vector<float> _Njets;
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(ATLAS_2012_I1093738);
}
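The even/odd median logic in the ambient-density loop above can be factored into a tiny standalone helper for clarity. This is a sketch with our own name `medianDensity`, not a Rivet function; an empty list returns 0, as in the analysis.

```cpp
#include <algorithm>
#include <vector>

// Median of the pT/area densities, reproducing the even/odd handling in
// the analysis loop; returns 0 for an empty list, as the analysis does.
double medianDensity(std::vector<double> v) {
  if (v.empty()) return 0.0;
  std::sort(v.begin(), v.end());
  const std::size_t n = v.size();
  return (n % 2 == 0) ? 0.5 * (v[n/2] + v[(n-2)/2])
                      : v[(n-1)/2];
}
```

The companion `sigma = ptDensities[b][(int)(.15865*nDens)]` in the loop is the 15.87th percentile of the same sorted list, i.e. a one-sided 1-sigma spread estimate.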
diff --git a/src/Analyses/ATLAS_2012_I1199269.cc b/src/Analyses/ATLAS_2012_I1199269.cc
--- a/src/Analyses/ATLAS_2012_I1199269.cc
+++ b/src/Analyses/ATLAS_2012_I1199269.cc
@@ -1,193 +1,193 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/IdentifiedFinalState.hh"
#include "Rivet/Projections/FastJets.hh"
/// @todo These should be unnecessary from 2.2.0 onward when Rivet::Jet is rewritten
#include "fastjet/internal/base.hh"
#include "fastjet/JetDefinition.hh"
#include "fastjet/AreaDefinition.hh"
#include "fastjet/ClusterSequence.hh"
#include "fastjet/ClusterSequenceArea.hh"
#include "fastjet/PseudoJet.hh"
namespace Rivet {
/// @brief Measurement of isolated diphoton + X differential cross-sections
///
/// Inclusive isolated gamma gamma cross-sections, differential in M(gg), pT(gg),
/// dphi(gg), cos(theta*)_CS
///
/// @author Giovanni Marchiori
///
class ATLAS_2012_I1199269 : public Analysis {
public:
/// Constructor
ATLAS_2012_I1199269()
: Analysis("ATLAS_2012_I1199269")
{
_eta_bins_areaoffset += 0.0, 1.5, 3.0;
}
/// Book histograms and initialise projections before the run
void init() {
FinalState fs;
addProjection(fs, "FS");
/// @todo Clean-up when jets are better
FastJets fj(fs, FastJets::KT, 0.5);
_area_def = new fastjet::AreaDefinition(fastjet::VoronoiAreaSpec());
fj.useJetArea(_area_def);
addProjection(fj, "KtJetsD05");
IdentifiedFinalState photonfs(Cuts::abseta < 2.37 && Cuts::pT > 22*GeV);
photonfs.acceptId(PID::PHOTON);
addProjection(photonfs, "Photon");
_h_M = bookHisto1D(1, 1, 1);
_h_pT = bookHisto1D(2, 1, 1);
_h_dPhi = bookHisto1D(3, 1, 1);
_h_cosThetaStar = bookHisto1D(4, 1, 1);
}
// int getBinIndex(double x, const vector<double>& binedges) const {
// /// @todo Add linear and log bin guessing, use binary split if sufficiently large, etc.
// for (size_t i = 0; i < _binedges.size()-1; ++i)
// if (inRange(x, binedges[i], binedges[i+1])) return (int) i;
// return -1;
// }
/// Perform the per-event analysis
void analyze(const Event& event) {
const double weight = event.weight();
/// Require at least 2 photons in final state
const Particles photons = applyProjection<IdentifiedFinalState>(event, "Photon").particlesByPt();
if (photons.size() < 2) vetoEvent;
/// Compute the median energy density
_ptDensity.clear();
_sigma.clear();
_Njets.clear();
/// @todo Nicer way to do this... in C++11?
vector<vector<double> > ptDensities;
vector<double> emptyVec;
ptDensities.assign(_eta_bins_areaoffset.size()-1, emptyVec);
// Get jets, and corresponding jet areas
const fastjet::ClusterSequenceArea* clust_seq_area = applyProjection<FastJets>(event, "KtJetsD05").clusterSeqArea();
foreach (const fastjet::PseudoJet& jet, applyProjection<FastJets>(event, "KtJetsD05").pseudoJets(0.0*GeV)) {
const double aeta = fabs(jet.eta());
const double pt = jet.perp();
const double area = clust_seq_area->area(jet);
if (area < 1e-3) continue;
const int ieta = binIndex(aeta, _eta_bins_areaoffset);
if (ieta != -1) ptDensities[ieta].push_back(pt/area);
}
// Compute median jet properties over the jets in the event
for (size_t b = 0; b < _eta_bins_areaoffset.size()-1; ++b) {
double median = 0.0;
double sigma = 0.0;
int Njets = 0;
if (ptDensities[b].size() > 0) {
std::sort(ptDensities[b].begin(), ptDensities[b].end());
int nDens = ptDensities[b].size();
median = (nDens % 2 == 0) ? (ptDensities[b][nDens/2]+ptDensities[b][(nDens-2)/2])/2 : ptDensities[b][(nDens-1)/2];
sigma = ptDensities[b][(int)(.15865*nDens)];
Njets = nDens;
}
_ptDensity.push_back(median);
_sigma.push_back(sigma);
_Njets.push_back(Njets);
}
// Loop over photons and fill vector of isolated ones
Particles isolated_photons;
foreach (const Particle& photon, photons) {
/// Remove photons in ECAL crack region
if (inRange(photon.abseta(), 1.37, 1.52)) continue;
const double eta_P = photon.eta();
const double phi_P = photon.phi();
// Compute isolation via particles within an R=0.4 cone of the photon
Particles fs = applyProjection<FinalState>(event, "FS").particles();
FourMomentum mom_in_EtCone;
foreach (const Particle& p, fs) {
// Reject if not in cone
if (deltaR(photon.momentum(), p.momentum()) > 0.4) continue;
// Reject if in the 5x7 cell central core
- if (fabs(eta_P - p.eta()) < 0.025 * 7 * 0.5 &&
- fabs(phi_P - p.phi()) < PI/128. * 5 * 0.5) continue;
+ if (fabs(eta_P - p.eta()) < 0.025 * 5 * 0.5 &&
+ fabs(phi_P - p.phi()) < PI/128. * 7 * 0.5) continue;
// Sum momentum
mom_in_EtCone += p.momentum();
}
// Now figure out the correction (area*density)
const double EtCone_area = PI*sqr(0.4) - (7*.025)*(5*PI/128.); // cone area - central core rectangle
const double correction = _ptDensity[binIndex(fabs(eta_P), _eta_bins_areaoffset)] * EtCone_area;
// Discard the photon if there is more than 4 GeV of cone activity
// NOTE: Shouldn't need to subtract photon itself (it's in the central core)
// NOTE: using expected cut at hadron/particle level, not at reco level
if (mom_in_EtCone.Et() - correction > 4*GeV) continue;
// Add isolated photon to list
isolated_photons.push_back(photon);
}
// Require at least two isolated photons and select leading pT pair
if (isolated_photons.size() < 2) vetoEvent;
sortByPt(isolated_photons);
FourMomentum y1 = isolated_photons[0].momentum();
FourMomentum y2 = isolated_photons[1].momentum();
// Leading photon should have pT > 25 GeV
if (y1.pT() < 25*GeV) vetoEvent;
// Require the two photons to be separated by dR > 0.4
if (deltaR(y1, y2) < 0.4) vetoEvent;
FourMomentum yy = y1 + y2;
const double Myy = yy.mass();
_h_M->fill(Myy/GeV, weight);
const double pTyy = yy.pT();
_h_pT->fill(pTyy/GeV, weight);
const double dPhiyy = mapAngle0ToPi(y1.phi() - y2.phi());
_h_dPhi->fill(dPhiyy, weight);
const double costhetayy = 2 * y1.pT() * y2.pT() * sinh(y1.eta() - y2.eta()) / Myy / add_quad(Myy, pTyy);
_h_cosThetaStar->fill(costhetayy, weight);
}
/// Normalise histograms etc., after the run
void finalize() {
scale(_h_M, crossSection()/sumOfWeights());
scale(_h_pT, crossSection()/sumOfWeights());
scale(_h_dPhi, crossSection()/sumOfWeights());
scale(_h_cosThetaStar, crossSection()/sumOfWeights());
}
private:
Histo1DPtr _h_M, _h_pT, _h_dPhi, _h_cosThetaStar;
fastjet::AreaDefinition* _area_def;
vector<double> _eta_bins;
vector<double> _eta_bins_areaoffset;
vector<double> _ptDensity;
vector<double> _sigma;
vector<double> _Njets;
};
DECLARE_RIVET_PLUGIN(ATLAS_2012_I1199269);
}
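The Collins-Soper cos(theta*) filled just above can be written as a standalone function. This is our own transcription of that one line (in Rivet, `add_quad(a,b)` is sqrt(a^2 + b^2)):

```cpp
#include <cmath>

// cos(theta*) in the Collins-Soper frame, transcribed from the fill above:
// 2 pT1 pT2 sinh(eta1 - eta2) / (Myy * sqrt(Myy^2 + pTyy^2)).
double cosThetaStarCS(double pt1, double eta1,
                      double pt2, double eta2,
                      double myy, double ptyy) {
  return 2.0 * pt1 * pt2 * std::sinh(eta1 - eta2)
         / (myy * std::sqrt(myy*myy + ptyy*ptyy));
}
```

Note the expected properties: two photons at equal pseudorapidity give zero, and swapping the two photons flips the sign.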
diff --git a/src/Analyses/ATLAS_2013_I1219109.cc b/src/Analyses/ATLAS_2013_I1219109.cc
new file mode 100644
--- /dev/null
+++ b/src/Analyses/ATLAS_2013_I1219109.cc
@@ -0,0 +1,159 @@
+// -*- C++ -*-
+#include "Rivet/Analysis.hh"
+#include "Rivet/Projections/FinalState.hh"
+#include "Rivet/Projections/WFinder.hh"
+#include "Rivet/Projections/VetoedFinalState.hh"
+#include "Rivet/Projections/FastJets.hh"
+#include "Rivet/Projections/HeavyHadrons.hh"
+
+namespace Rivet {
+
+
+ /// @brief ATLAS W+b measurement
+ class ATLAS_2013_I1219109: public Analysis {
+ public:
+
+ ATLAS_2013_I1219109(string name = "ATLAS_2013_I1219109")
+ : Analysis(name)
+ {
+ // the electron mode is used by default
+ _mode = 1;
+ }
+
+
+ void init() {
+ FinalState fs;
+ addProjection(fs, "FinalState");
+
+ Cut cuts = Cuts::abseta < 2.5 && Cuts::pT >= 25*GeV;
+
+ // W finder for electrons and muons
+ WFinder wf(fs, cuts, _mode==3? PID::MUON : PID::ELECTRON, 0.0*GeV, MAXDOUBLE, 0.0, 0.1,
+ WFinder::CLUSTERNODECAY, WFinder::NOTRACK, WFinder::TRANSMASS);
+ addProjection(wf, "WF");
+
+ // jets
+ VetoedFinalState jet_fs(fs);
+ jet_fs.addVetoOnThisFinalState(getProjection<WFinder>("WF"));
+ FastJets fj(jet_fs, FastJets::ANTIKT, 0.4);
+ fj.useInvisibles();
+ addProjection(fj, "Jets");
+ addProjection(HeavyHadrons(Cuts::abseta < 2.5 && Cuts::pT > 5*GeV), "BHadrons");
+
+
+ // book histograms
+ _njet = bookHisto1D(1, 1, _mode); // dSigma / dNjet
+ _jet1_bPt = bookHisto1D(2, 1, _mode); // dSigma / dBjetPt for Njet = 1
+ _jet2_bPt = bookHisto1D(2, 2, _mode); // dSigma / dBjetPt for Njet = 2
+
+ }
+
+
+ void analyze(const Event& event) {
+
+ const double weight = event.weight();
+
+ // retrieve W boson candidate
+ const WFinder& wf = applyProjection<WFinder>(event, "WF");
+ if( wf.bosons().size() != 1 ) vetoEvent; // only one W boson candidate
+ if( !(wf.mT() > 60.0*GeV) ) vetoEvent;
+ //const Particle& Wboson = wf.boson();
+
+
+ // retrieve constituent neutrino
+ const Particle& neutrino = wf.constituentNeutrino();
+ if( !(neutrino.pT() > 25.0*GeV) ) vetoEvent;
+
+ // retrieve constituent lepton
+ const Particle& lepton = wf.constituentLepton();
+
+ // count good jets, check if good jet contains B hadron
+ const Particles& bHadrons = applyProjection<HeavyHadrons>(event, "BHadrons").bHadrons();
+ const Jets& jets = applyProjection<JetAlg>(event, "Jets").jetsByPt(25*GeV);
+ int goodjets = 0, bjets = 0;
+ double bPt = 0.;
+ foreach(const Jet& j, jets) {
+ if( (j.abseta() < 2.1) && (deltaR(lepton, j) > 0.5) ) {
+ // this jet passes the selection!
+ ++goodjets;
+ // j.bTagged() uses ghost association, which is more
+ // elegant, but is not what was used in this analysis
+ // originally; match B hadrons in eta-phi space instead
+ foreach(const Particle& b, bHadrons) {
+ if( deltaR(j, b) < 0.3 ) {
+ // jet matched to B hadron!
+ if(!bPt) bPt = j.pT() * GeV; // leading b-jet pT
+ ++bjets; // count number of b-jets
+ break;
+ }
+ }
+ }
+ }
+ if( goodjets > 2 ) vetoEvent; // at most two jets
+ if( !bjets ) vetoEvent; // at least one of them b-tagged
+
+ double njets = double(goodjets);
+ double ncomb = 3.0;
+ _njet->fill(njets, weight);
+ _njet->fill(ncomb, weight);
+
+ if( goodjets == 1) _jet1_bPt->fill(bPt, weight);
+ else if(goodjets == 2) _jet2_bPt->fill(bPt, weight);
+ }
+
+
+ void finalize() {
+
+ // Print summary info
+ const double xs_pb(crossSection() / picobarn);
+ const double sumw(sumOfWeights());
+ MSG_INFO("Cross-Section/pb: " << xs_pb );
+ MSG_INFO("Sum of weights : " << sumw );
+ MSG_INFO("nEvents : " << numEvents());
+
+ const double sf(xs_pb / sumw);
+
+ scale(_njet, sf);
+ scale(_jet1_bPt, sf);
+ scale(_jet2_bPt, sf);
+ }
+
+ protected:
+
+ size_t _mode;
+
+ private:
+
+ Histo1DPtr _njet;
+ Histo1DPtr _jet1_bPt;
+ Histo1DPtr _jet2_bPt;
+
+ bool _isMuon;
+
+ };
+
+ class ATLAS_2013_I1219109_EL : public ATLAS_2013_I1219109 {
+ public:
+ ATLAS_2013_I1219109_EL()
+ : ATLAS_2013_I1219109("ATLAS_2013_I1219109_EL")
+ {
+ _mode = 2;
+ }
+ };
+
+ class ATLAS_2013_I1219109_MU : public ATLAS_2013_I1219109 {
+ public:
+ ATLAS_2013_I1219109_MU()
+ : ATLAS_2013_I1219109("ATLAS_2013_I1219109_MU")
+ {
+ _mode = 3;
+ }
+ };
+
+ // The hook for the plugin system
+ DECLARE_RIVET_PLUGIN(ATLAS_2013_I1219109);
+ DECLARE_RIVET_PLUGIN(ATLAS_2013_I1219109_EL);
+ DECLARE_RIVET_PLUGIN(ATLAS_2013_I1219109_MU);
+
+}
diff --git a/src/Analyses/ATLAS_2013_I1263495.cc b/src/Analyses/ATLAS_2013_I1263495.cc
--- a/src/Analyses/ATLAS_2013_I1263495.cc
+++ b/src/Analyses/ATLAS_2013_I1263495.cc
@@ -1,149 +1,149 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/LeadingParticlesFinalState.hh"
#include "Rivet/Projections/FastJets.hh"
namespace Rivet {
/// Inclusive isolated prompt photon analysis with 2011 LHC data
class ATLAS_2013_I1263495 : public Analysis {
public:
/// Constructor
ATLAS_2013_I1263495()
: Analysis("ATLAS_2013_I1263495")
{
_eta_bins.push_back( 0.00);
_eta_bins.push_back( 1.37);
_eta_bins.push_back( 1.52);
_eta_bins.push_back( 2.37);
_eta_bins_areaoffset.push_back(0.0);
_eta_bins_areaoffset.push_back(1.5);
_eta_bins_areaoffset.push_back(3.0);
}
/// Book histograms and initialise projections before the run
void init() {
FinalState fs;
addProjection(fs, "FS");
// Consider the final state jets for the energy density calculation
FastJets fj(fs, FastJets::KT, 0.5);
/// @todo Memory leak! (Once per run, not serious)
_area_def = new fastjet::AreaDefinition(fastjet::VoronoiAreaSpec()); ///< @todo FastJets should deep copy the pointer if possible
fj.useJetArea(_area_def);
addProjection(fj, "KtJetsD05");
// Consider the leading pt photon with |eta| < 2.37 and pT > 100 GeV
LeadingParticlesFinalState photonfs(FinalState(-2.37, 2.37, 100.0*GeV));
photonfs.addParticleId(PID::PHOTON);
addProjection(photonfs, "LeadingPhoton");
// Book the dsigma/dEt (in eta bins) histograms
for (size_t i = 0; i < _eta_bins.size() - 1; i++) {
if (fuzzyEquals(_eta_bins[i], 1.37)) continue; // skip this bin
_h_Et_photon[i] = bookHisto1D(1, 1, i+1);
}
// Book the dsigma/d|eta| histogram
_h_eta_photon = bookHisto1D(1,2,1);
}
/// Return eta bin for either dsigma/dET histogram (area_eta=false) or energy density correction (area_eta=true)
size_t _getEtaBin(double eta_w, bool area_eta) const {
const double eta = fabs(eta_w);
if (!area_eta) {
return binIndex(eta, _eta_bins);
} else {
return binIndex(eta, _eta_bins_areaoffset);
}
}
/// Perform the per-event analysis
void analyze(const Event& event) {
// Retrieve leading photon
Particles photons = applyProjection<LeadingParticlesFinalState>(event, "LeadingPhoton").particles();
if (photons.size() != 1) vetoEvent;
const Particle& leadingPhoton = photons[0];
// Veto events with photon in ECAL crack
if (inRange(leadingPhoton.abseta(), 1.37, 1.52)) vetoEvent;
// Compute isolation energy in cone of radius .4 around photon (all particles)
FourMomentum mom_in_EtCone;
Particles fs = applyProjection<FinalState>(event, "FS").particles();
foreach (const Particle& p, fs) {
// Check if it's outside the cone of 0.4
if (deltaR(leadingPhoton, p) >= 0.4) continue;
// Don't count particles in the 5x7 central core
- if (deltaEta(leadingPhoton, p) < .025*7.0*0.5 &&
- deltaPhi(leadingPhoton, p) < (PI/128.)*5.0*0.5) continue;
+ if (deltaEta(leadingPhoton, p) < .025*5.0*0.5 &&
+ deltaPhi(leadingPhoton, p) < (PI/128.)*7.0*0.5) continue;
// Increment isolation energy
mom_in_EtCone += p.momentum();
}
// Get the area-filtered jet inputs for computing median energy density, etc.
vector<double> ptDensity, ptSigma, nJets;
vector< vector<double> > ptDensities(_eta_bins_areaoffset.size()-1);
FastJets fast_jets =applyProjection<FastJets>(event, "KtJetsD05");
const fastjet::ClusterSequenceArea* clust_seq_area = fast_jets.clusterSeqArea();
foreach (const Jet& jet, fast_jets.jets()) {
const double area = clust_seq_area->area(jet);
if (area > 10e-4 && jet.abseta() < _eta_bins_areaoffset.back())
ptDensities.at( _getEtaBin(jet.abseta(), true) ).push_back(jet.pT()/area);
}
// Compute the median energy density, etc.
for (size_t b = 0; b < _eta_bins_areaoffset.size()-1; b++) {
const int njets = ptDensities[b].size();
const double ptmedian = (njets > 0) ? median(ptDensities[b]) : 0;
const double ptsigma = (njets > 0) ? ptDensities[b][(size_t)(0.15865*njets)] : 0;
nJets.push_back(njets);
ptDensity.push_back(ptmedian);
ptSigma.push_back(ptsigma);
}
// Compute the isolation energy correction (cone area*energy density)
const double etCone_area = PI*sqr(0.4) - (7.0*.025)*(5.0*PI/128.);
const double correction = ptDensity[_getEtaBin(leadingPhoton.abseta(), true)]*etCone_area;
// Apply isolation cut on area-corrected value
if (mom_in_EtCone.Et() - correction > 7*GeV) vetoEvent;
// Fill histograms
const size_t eta_bin = _getEtaBin(leadingPhoton.abseta(), false);
_h_Et_photon[eta_bin]->fill(leadingPhoton.Et(), event.weight());
_h_eta_photon->fill(leadingPhoton.abseta(), event.weight());
}
/// Normalise histograms etc., after the run
void finalize() {
for (size_t i = 0; i < _eta_bins.size()-1; i++) {
if (fuzzyEquals(_eta_bins[i], 1.37)) continue;
scale(_h_Et_photon[i], crossSection()/picobarn/sumOfWeights());
}
scale(_h_eta_photon, crossSection()/picobarn/sumOfWeights());
}
private:
Histo1DPtr _h_Et_photon[3];
Histo1DPtr _h_eta_photon;
fastjet::AreaDefinition* _area_def;
vector<double> _eta_bins, _eta_bins_areaoffset;
};
DECLARE_RIVET_PLUGIN(ATLAS_2013_I1263495);
}
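The area-based isolation correction used in this and the preceding photon analyses reduces to simple geometry: the dR = 0.4 cone area minus the 5x7-cell core rectangle, multiplied by the ambient pT density. A standalone sketch of that arithmetic (cell sizes 0.025 in eta and pi/128 in phi, as in the code above; the function names are ours):

```cpp
#include <cmath>

// Cone area minus the 5x7-cell central core rectangle excluded from the
// isolation sum (cells of 0.025 in eta and pi/128 in phi).
double etConeArea(double iso_dR) {
  const double core_eta = 5.0 * 0.025;
  const double core_phi = 7.0 * M_PI / 128.0;
  return M_PI * iso_dR * iso_dR - core_eta * core_phi;
}

// The correction subtracted from the summed cone Et: effective cone area
// times the median ambient pT density estimated from the kT jet areas.
double isoCorrection(double ptDensity, double iso_dR) {
  return ptDensity * etConeArea(iso_dR);
}
```

The photon is then kept only if the area-corrected cone Et stays below the isolation threshold (7 GeV in this analysis, 4 GeV in the earlier ones).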
diff --git a/src/Analyses/ATLAS_2014_I1306294.cc b/src/Analyses/ATLAS_2014_I1306294.cc
new file mode 100644
--- /dev/null
+++ b/src/Analyses/ATLAS_2014_I1306294.cc
@@ -0,0 +1,233 @@
+// -*- C++ -*-
+#include "Rivet/Analysis.hh"
+#include "Rivet/Projections/FinalState.hh"
+#include "Rivet/Projections/ZFinder.hh"
+#include "Rivet/Projections/FastJets.hh"
+#include "Rivet/Projections/HeavyHadrons.hh"
+#include "Rivet/Projections/VetoedFinalState.hh"
+
+namespace Rivet {
+
+ class ATLAS_2014_I1306294 : public Analysis {
+ public:
+
+ /// @name Constructors etc.
+ //@{
+
+ /// Constructors
+ ATLAS_2014_I1306294(std::string name="ATLAS_2014_I1306294")
+ : Analysis(name)
+ {
+ _mode = 1;
+ setNeedsCrossSection(true);
+ }
+
+ //@}
+
+ public:
+
+ /// @name Analysis methods
+ //@{
+
+ /// Book histograms and initialise projections before the run
+ void init() {
+
+ FinalState fs;
+
+ Cut cuts = Cuts::etaIn(-2.5,2.5) & (Cuts::pT > 20.0*GeV);
+
+ ZFinder zfinder(fs, cuts, _mode==1? PID::ELECTRON : PID::MUON, 76.0*GeV, 106.0*GeV, 0.1, ZFinder::CLUSTERNODECAY, ZFinder::NOTRACK);
+ addProjection(zfinder, "ZFinder");
+
+ //FastJets jetpro1( getProjection<ZFinder>("ZFinder").remainingFinalState(), FastJets::ANTIKT, 0.4);
+ VetoedFinalState jet_fs(fs);
+ jet_fs.addVetoOnThisFinalState(getProjection<ZFinder>("ZFinder"));
+ FastJets jetpro1(jet_fs, FastJets::ANTIKT, 0.4);
+ jetpro1.useInvisibles();
+ addProjection(jetpro1, "AntiKtJets04");
+ addProjection(HeavyHadrons(), "BHadrons");
+
+ //Histograms with data binning
+ _h_bjet_Pt = bookHisto1D( 3, 1, 1);
+ _h_bjet_Y = bookHisto1D( 5, 1, 1);
+ _h_bjet_Yboost = bookHisto1D( 7, 1, 1);
+ _h_bjet_DY20 = bookHisto1D( 9, 1, 1);
+ _h_bjet_ZdPhi20 = bookHisto1D(11, 1, 1);
+ _h_bjet_ZdR20 = bookHisto1D(13, 1, 1);
+ _h_bjet_ZPt = bookHisto1D(15, 1, 1);
+ _h_bjet_ZY = bookHisto1D(17, 1, 1);
+ _h_2bjet_dR = bookHisto1D(21, 1, 1);
+ _h_2bjet_Mbb = bookHisto1D(23, 1, 1);
+ _h_2bjet_ZPt = bookHisto1D(25, 1, 1);
+ _h_2bjet_ZY = bookHisto1D(27, 1, 1);
+ }
+
+ //==========================================================================================
+
+
+ /// Perform the per-event analysis
+ void analyze(const Event& e) {
+
+
+ //---------------------------
+ const double weight = e.weight();
+
+ // -- check we have a Z:
+ const ZFinder& zfinder = applyProjection<ZFinder>(e, "ZFinder");
+
+ if(zfinder.bosons().size() != 1) vetoEvent;
+
+ const ParticleVector boson_s = zfinder.bosons();
+ const Particle boson_f = boson_s[0] ;
+ const ParticleVector zleps = zfinder.constituents();
+ //---------------------------
+
+
+ //---------------------------
+ //------------- stop processing the event if no true b-hadrons are found
+ const Particles& allBs = applyProjection<HeavyHadrons>(e, "BHadrons").bHadrons(5.0*GeV);
+ Particles stableBs;
+ foreach(Particle p, allBs) {
+ if(p.abseta() < 2.5) stableBs += p;
+ }
+ if( stableBs.empty() ) vetoEvent;
+
+
+ //---------------------------
+ // -- get the b-jets:
+ const Jets& jets = applyProjection<JetAlg>(e, "AntiKtJets04").jetsByPt(Cuts::pT >20.0*GeV && Cuts::abseta <2.4);
+ Jets b_jets;
+ foreach(const Jet& jet, jets) {
+ //veto overlaps with Z leptons:
+ bool veto = false;
+ foreach(const Particle& zlep, zleps) {
+ if(deltaR(jet, zlep) < 0.5) veto = true;
+ }
+ if(veto) continue;
+
+ foreach(const Particle& bhadron, stableBs) {
+ if( deltaR(jet, bhadron) <= 0.3 ) {
+ b_jets.push_back(jet);
+ break; // match
+ }
+ } // end loop on b-hadrons
+ }
+
+ //and make sure we have at least 1:
+ if(b_jets.empty()) vetoEvent;
+
+ //---------------------------
+ // fill the plots:
+ const double ZpT = boson_f.pT()/GeV;
+ const double ZY = boson_f.absrap();
+
+ _h_bjet_ZPt->fill(ZpT, weight);
+ _h_bjet_ZY ->fill(ZY, weight);
+
+ foreach(const Jet& jet, b_jets) {
+
+ _h_bjet_Pt->fill(jet.pT()/GeV, weight );
+ _h_bjet_Y ->fill(jet.absrap(), weight );
+
+ const double Yboost = 0.5 * fabs(boson_f.rapidity() + jet.rapidity());
+
+ _h_bjet_Yboost->fill(Yboost, weight );
+
+ if(ZpT > 20.) {
+
+ const double ZBDY = fabs( boson_f.rapidity() - jet.rapidity() );
+ const double ZBDPHI = fabs( deltaPhi(jet.phi(), boson_f.phi()) );
+ const double ZBDR = deltaR(jet, boson_f, RAPIDITY);
+ _h_bjet_DY20->fill( ZBDY, weight);
+ _h_bjet_ZdPhi20->fill(ZBDPHI, weight);
+ _h_bjet_ZdR20->fill( ZBDR, weight);
+ }
+
+ } //loop over b-jets
+
+ if (b_jets.size() < 2) return;
+
+ _h_2bjet_ZPt->fill(ZpT, weight);
+ _h_2bjet_ZY ->fill(ZY, weight);
+
+ const double BBDR = deltaR(b_jets[0], b_jets[1], RAPIDITY);
+ const double Mbb = (b_jets[0].momentum() + b_jets[1].momentum()).mass();
+
+ _h_2bjet_dR ->fill(BBDR, weight);
+ _h_2bjet_Mbb->fill(Mbb, weight);
+
+ } // end of analysis loop
+
+
+ /// Normalise histograms etc., after the run
+ void finalize() {
+
+ const double normfac = crossSection() / sumOfWeights();
+
+ scale( _h_bjet_Pt, normfac);
+ scale( _h_bjet_Y, normfac);
+ scale( _h_bjet_Yboost, normfac);
+ scale( _h_bjet_DY20, normfac);
+ scale( _h_bjet_ZdPhi20, normfac);
+ scale( _h_bjet_ZdR20, normfac);
+ scale( _h_bjet_ZPt, normfac);
+ scale( _h_bjet_ZY, normfac);
+ scale( _h_2bjet_dR, normfac);
+ scale( _h_2bjet_Mbb, normfac);
+ scale( _h_2bjet_ZPt, normfac);
+ scale( _h_2bjet_ZY, normfac);
+ }
+
+ //@}
+
+
+ protected:
+
+ // Data members like post-cuts event weight counters go here
+ size_t _mode;
+
+
+ private:
+
+ Histo1DPtr _h_bjet_Pt;
+ Histo1DPtr _h_bjet_Y;
+ Histo1DPtr _h_bjet_Yboost;
+ Histo1DPtr _h_bjet_DY20;
+ Histo1DPtr _h_bjet_ZdPhi20;
+ Histo1DPtr _h_bjet_ZdR20;
+ Histo1DPtr _h_bjet_ZPt;
+ Histo1DPtr _h_bjet_ZY;
+ Histo1DPtr _h_2bjet_dR;
+ Histo1DPtr _h_2bjet_Mbb;
+ Histo1DPtr _h_2bjet_ZPt;
+ Histo1DPtr _h_2bjet_ZY;
+
+ };
+
+
+ class ATLAS_2014_I1306294_EL : public ATLAS_2014_I1306294 {
+ public:
+ ATLAS_2014_I1306294_EL()
+ : ATLAS_2014_I1306294("ATLAS_2014_I1306294_EL")
+ {
+ _mode = 1;
+ }
+ };
+
+ class ATLAS_2014_I1306294_MU : public ATLAS_2014_I1306294 {
+ public:
+ ATLAS_2014_I1306294_MU()
+ : ATLAS_2014_I1306294("ATLAS_2014_I1306294_MU")
+ {
+ _mode = 2;
+ }
+ };
+
+
+ // The hook for the plugin system
+ DECLARE_RIVET_PLUGIN(ATLAS_2014_I1306294);
+ DECLARE_RIVET_PLUGIN(ATLAS_2014_I1306294_MU);
+ DECLARE_RIVET_PLUGIN(ATLAS_2014_I1306294_EL);
+
+}
+
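The Z + b-jet angular observables filled above come down to simple rapidity arithmetic. As a standalone sketch (the names are ours): the average boost 0.5|yZ + yjet| and the separation |yZ - yjet|.

```cpp
#include <cmath>

// Average longitudinal boost of the Z + b-jet system, as filled above.
double yBoost(double yZ, double yJet) { return 0.5 * std::fabs(yZ + yJet); }

// Rapidity separation between the Z and the b-jet (the "DY" observable).
double yDiff(double yZ, double yJet) { return std::fabs(yZ - yJet); }
```

Both are filled per b-jet, with the dY, dPhi and dR observables restricted to events with Z pT > 20 GeV.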
diff --git a/src/Analyses/CMS_2011_S8978280.cc b/src/Analyses/CMS_2011_S8978280.cc
--- a/src/Analyses/CMS_2011_S8978280.cc
+++ b/src/Analyses/CMS_2011_S8978280.cc
@@ -1,134 +1,114 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
namespace Rivet {
/// @brief CMS strange particle spectra (Ks, Lambda, Cascade) in pp at 900 and 7000 GeV
/// @author Kevin Stenson
class CMS_2011_S8978280 : public Analysis {
public:
/// Constructor
- CMS_2011_S8978280() : Analysis("CMS_2011_S8978280") {}
+ CMS_2011_S8978280()
+ : Analysis("CMS_2011_S8978280")
+ { }
void init() {
- // Need wide range of eta because cut on rapidity not pseudorapidity
- UnstableFinalState ufs(-8.0, 8.0, 0.0*GeV);
+ UnstableFinalState ufs(Cuts::absrap < 2);
addProjection(ufs, "UFS");
// Particle distributions versus rapidity and transverse momentum
- if (fuzzyEquals(sqrtS()/GeV, 900.0*GeV)){
- _h_dNKshort_dy = bookHisto1D(1, 1, 1);
- _h_dNKshort_dpT = bookHisto1D(2, 1, 1);
- _h_dNLambda_dy = bookHisto1D(3, 1, 1);
- _h_dNLambda_dpT = bookHisto1D(4, 1, 1);
- _h_dNXi_dy = bookHisto1D(5, 1, 1);
- _h_dNXi_dpT = bookHisto1D(6, 1, 1);
+ if (fuzzyEquals(sqrtS()/GeV, 900*GeV)){
+ _h_dNKshort_dy = bookHisto1D(1, 1, 1);
+ _h_dNKshort_dpT = bookHisto1D(2, 1, 1);
+ _h_dNLambda_dy = bookHisto1D(3, 1, 1);
+ _h_dNLambda_dpT = bookHisto1D(4, 1, 1);
+ _h_dNXi_dy = bookHisto1D(5, 1, 1);
+ _h_dNXi_dpT = bookHisto1D(6, 1, 1);
+ //
+ _h_LampT_KpT = bookScatter2D(7, 1, 1);
+ _h_XipT_LampT = bookScatter2D(8, 1, 1);
+ _h_Lamy_Ky = bookScatter2D(9, 1, 1);
+ _h_Xiy_Lamy = bookScatter2D(10, 1, 1);
- _h_LampT_KpT = bookScatter2D(7, 1, 1);
- _h_XipT_LampT = bookScatter2D(8, 1, 1);
- _h_Lamy_Ky = bookScatter2D(9, 1, 1);
- _h_Xiy_Lamy = bookScatter2D(10, 1, 1);
- } else if (fuzzyEquals(sqrtS()/GeV, 7000.0*GeV)){
- _h_dNKshort_dy = bookHisto1D(1, 1, 2);
- _h_dNKshort_dpT = bookHisto1D(2, 1, 2);
- _h_dNLambda_dy = bookHisto1D(3, 1, 2);
- _h_dNLambda_dpT = bookHisto1D(4, 1, 2);
- _h_dNXi_dy = bookHisto1D(5, 1, 2);
- _h_dNXi_dpT = bookHisto1D(6, 1, 2);
-
- _h_LampT_KpT = bookScatter2D(7, 1, 2);
- _h_XipT_LampT = bookScatter2D(8, 1, 2);
- _h_Lamy_Ky = bookScatter2D(9, 1, 2);
- _h_Xiy_Lamy = bookScatter2D(10, 1, 2);
+ } else if (fuzzyEquals(sqrtS()/GeV, 7000)){
+ _h_dNKshort_dy = bookHisto1D(1, 1, 2);
+ _h_dNKshort_dpT = bookHisto1D(2, 1, 2);
+ _h_dNLambda_dy = bookHisto1D(3, 1, 2);
+ _h_dNLambda_dpT = bookHisto1D(4, 1, 2);
+ _h_dNXi_dy = bookHisto1D(5, 1, 2);
+ _h_dNXi_dpT = bookHisto1D(6, 1, 2);
+ //
+ _h_LampT_KpT = bookScatter2D(7, 1, 2);
+ _h_XipT_LampT = bookScatter2D(8, 1, 2);
+ _h_Lamy_Ky = bookScatter2D(9, 1, 2);
+ _h_Xiy_Lamy = bookScatter2D(10, 1, 2);
}
}
void analyze(const Event& event) {
const double weight = event.weight();
const UnstableFinalState& parts = applyProjection<UnstableFinalState>(event, "UFS");
+ foreach (const Particle& p, parts.particles()) {
+ switch (p.abspid()) {
+ case PID::K0S:
+ _h_dNKshort_dy->fill(p.absrap(), weight);
+ _h_dNKshort_dpT->fill(p.pT(), weight);
+ break;
- foreach (const Particle& p, parts.particles()) {
- const double pT = p.pT();
- const double y = p.absrap();
- const PdgId pid = p.abspid();
+ case PID::LAMBDA:
+ // Lambda should not have Cascade or Omega ancestors since they should not decay. But just in case...
+ if ( !( p.hasAncestor(3322) || p.hasAncestor(-3322) || p.hasAncestor(3312) || p.hasAncestor(-3312) || p.hasAncestor(3334) || p.hasAncestor(-3334) ) ) {
+ _h_dNLambda_dy->fill(p.absrap(), weight);
+ _h_dNLambda_dpT->fill(p.pT(), weight);
+ }
+ break;
- if (y < 2.0) {
- switch (pid) {
- case PID::K0S:
- _h_dNKshort_dy->fill(y, weight);
- _h_dNKshort_dpT->fill(pT, weight);
- break;
- case PID::LAMBDA:
- // Lambda should not have Cascade or Omega ancestors since they should not decay. But just in case...
- if ( !( p.hasAncestor(3322) || p.hasAncestor(-3322) || p.hasAncestor(3312) || p.hasAncestor(-3312) || p.hasAncestor(3334) || p.hasAncestor(-3334) ) ) {
- _h_dNLambda_dy->fill(y, weight);
- _h_dNLambda_dpT->fill(pT, weight);
- }
- break;
- case PID::XIMINUS:
- // Cascade should not have Omega ancestors since it should not decay. But just in case...
- if ( !( p.hasAncestor(3334) || p.hasAncestor(-3334) ) ) {
- _h_dNXi_dy->fill(y, weight);
- _h_dNXi_dpT->fill(pT, weight);
- }
- break;
+ case PID::XIMINUS:
+ // Cascade should not have Omega ancestors since it should not decay. But just in case...
+ if ( !( p.hasAncestor(3334) || p.hasAncestor(-3334) ) ) {
+ _h_dNXi_dy->fill(p.absrap(), weight);
+ _h_dNXi_dpT->fill(p.pT(), weight);
}
+ break;
}
+
}
}
void finalize() {
-
- divide(_h_dNLambda_dpT,_h_dNKshort_dpT,
- _h_LampT_KpT);
-
- divide(_h_dNXi_dpT,_h_dNLambda_dpT,
- _h_XipT_LampT);
-
- divide(_h_dNLambda_dy,_h_dNKshort_dy,
- _h_Lamy_Ky);
-
- divide(_h_dNXi_dy,_h_dNLambda_dy,
- _h_Xiy_Lamy);
-
- double normpT = 1.0/sumOfWeights();
- double normy = 0.5*normpT; // Accounts for using |y| instead of y
+ divide(_h_dNLambda_dpT,_h_dNKshort_dpT, _h_LampT_KpT);
+ divide(_h_dNXi_dpT,_h_dNLambda_dpT, _h_XipT_LampT);
+ divide(_h_dNLambda_dy,_h_dNKshort_dy, _h_Lamy_Ky);
+ divide(_h_dNXi_dy,_h_dNLambda_dy, _h_Xiy_Lamy);
+ const double normpT = 1.0/sumOfWeights();
+ const double normy = 0.5*normpT; // Accounts for using |y| instead of y
scale(_h_dNKshort_dy, normy);
scale(_h_dNKshort_dpT, normpT);
scale(_h_dNLambda_dy, normy);
scale(_h_dNLambda_dpT, normpT);
scale(_h_dNXi_dy, normy);
scale(_h_dNXi_dpT, normpT);
}
private:
// Particle distributions versus rapidity and transverse momentum
- Histo1DPtr _h_dNKshort_dy;
- Histo1DPtr _h_dNKshort_dpT;
- Histo1DPtr _h_dNLambda_dy;
- Histo1DPtr _h_dNLambda_dpT;
- Histo1DPtr _h_dNXi_dy;
- Histo1DPtr _h_dNXi_dpT;
-
- Scatter2DPtr _h_LampT_KpT;
- Scatter2DPtr _h_XipT_LampT;
- Scatter2DPtr _h_Lamy_Ky;
- Scatter2DPtr _h_Xiy_Lamy;
+ Histo1DPtr _h_dNKshort_dy, _h_dNKshort_dpT, _h_dNLambda_dy, _h_dNLambda_dpT, _h_dNXi_dy, _h_dNXi_dpT;
+ Scatter2DPtr _h_LampT_KpT, _h_XipT_LampT, _h_Lamy_Ky, _h_Xiy_Lamy;
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(CMS_2011_S8978280);
}
diff --git a/src/Analyses/CMS_2012_PAS_QCD_11_010.cc b/src/Analyses/CMS_2012_PAS_QCD_11_010.cc
--- a/src/Analyses/CMS_2012_PAS_QCD_11_010.cc
+++ b/src/Analyses/CMS_2012_PAS_QCD_11_010.cc
@@ -1,89 +1,89 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/ChargedFinalState.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
#include "Rivet/Projections/FastJets.hh"
namespace Rivet {
class CMS_2012_PAS_QCD_11_010 : public Analysis {
public:
CMS_2012_PAS_QCD_11_010()
: Analysis("CMS_2012_PAS_QCD_11_010")
{ }
void init() {
- const FastJets jets(ChargedFinalState(-2.5, 2.5, 0.5*GeV), FastJets::ANTIKT, 0.5);
+ const FastJets jets(ChargedFinalState(Cuts::abseta < 2.5 && Cuts::pT > 0.5*GeV), FastJets::ANTIKT, 0.5);
addProjection(jets, "Jets");
- const UnstableFinalState ufs(-2.0, 2.0, 0.6*GeV);
+ const UnstableFinalState ufs(Cuts::abseta < 2 && Cuts::pT > 0.6*GeV);
addProjection(ufs, "UFS");
_h_nTrans_Lambda = bookProfile1D(1, 1, 1);
_h_nTrans_Kaon = bookProfile1D(2, 1, 1);
_h_ptsumTrans_Lambda = bookProfile1D(3, 1, 1);
_h_ptsumTrans_Kaon = bookProfile1D(4, 1, 1);
}
void analyze(const Event& event) {
const double weight = event.weight();
Jets jets = applyProjection<FastJets>(event, "Jets").jetsByPt(1.0*GeV);
if (jets.size() < 1) vetoEvent;
if (fabs(jets[0].eta()) >= 2) { // cuts on leading jets
vetoEvent;
}
FourMomentum p_lead = jets[0].momentum();
const double pTlead = p_lead.pT();
const UnstableFinalState& ufs = applyProjection<UnstableFinalState>(event, "UFS");
int numTrans_Kaon = 0;
int numTrans_Lambda = 0;
double ptSumTrans_Kaon = 0.;
double ptSumTrans_Lambda = 0.;
foreach (const Particle& p, ufs.particles()) {
double dphi = deltaPhi(p, p_lead);
double pT = p.pT();
const PdgId id = p.abspid();
if (dphi > PI/3. && dphi < 2./3.*PI) {
if (id == 310 && pT > 0.6*GeV) {
ptSumTrans_Kaon += pT/GeV;
numTrans_Kaon++;
}
else if (id == 3122 && pT > 1.5*GeV) {
ptSumTrans_Lambda += pT/GeV;
numTrans_Lambda++;
}
}
}
_h_nTrans_Kaon->fill(pTlead/GeV, numTrans_Kaon / (8.0 * PI/3.0), weight);
_h_nTrans_Lambda->fill(pTlead/GeV, numTrans_Lambda / (8.0 * PI/3.0), weight);
_h_ptsumTrans_Kaon->fill(pTlead/GeV, ptSumTrans_Kaon / (GeV * (8.0 * PI/3.0)), weight);
_h_ptsumTrans_Lambda->fill(pTlead/GeV, ptSumTrans_Lambda / (GeV * (8.0 * PI/3.0)), weight);
}
void finalize() { }
private:
Profile1DPtr _h_nTrans_Kaon;
Profile1DPtr _h_nTrans_Lambda;
Profile1DPtr _h_ptsumTrans_Kaon;
Profile1DPtr _h_ptsumTrans_Lambda;
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(CMS_2012_PAS_QCD_11_010);
}
diff --git a/src/Analyses/CMS_2013_I1218372.cc b/src/Analyses/CMS_2013_I1218372.cc
--- a/src/Analyses/CMS_2013_I1218372.cc
+++ b/src/Analyses/CMS_2013_I1218372.cc
@@ -1,164 +1,164 @@
// Samantha Dooling DESY
// February 2012
//
// -*- C++ -*-
// =============================
//
-// Ratio of the energy deposited in the pseudorapditiy range
+// Ratio of the energy deposited in the pseudorapidity range
// -6.6 < eta < -5.2 for events with a charged particle jet
//
// =============================
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/ChargedFinalState.hh"
#include "Rivet/Projections/FastJets.hh"
#include "Rivet/Projections/Beam.hh"
#include "Rivet/Projections/VetoedFinalState.hh"
namespace Rivet {
class CMS_2013_I1218372 : public Analysis {
public:
/// Constructor
CMS_2013_I1218372()
: Analysis("CMS_2013_I1218372")
{ }
void init() {
// gives the range of eta and min pT for the final state from which I get the jets
FastJets jetpro (ChargedFinalState(-2.5, 2.5, 0.3*GeV), FastJets::ANTIKT, 0.5);
addProjection(jetpro, "Jets");
// skip Neutrinos and Muons
VetoedFinalState fsv(FinalState(-7.0, -4.0, 0.*GeV));
fsv.vetoNeutrinos();
fsv.addVetoPairId(PID::MUON);
addProjection(fsv, "fsv");
// for the hadron level selection
VetoedFinalState sfsv(FinalState(-MAXDOUBLE, MAXDOUBLE, 0.*GeV));
sfsv.vetoNeutrinos();
sfsv.addVetoPairId(PID::MUON);
addProjection(sfsv, "sfsv");
//counters
passedSumOfWeights = 0.;
inclEflow = 0.;
// Temporary histograms to fill the energy flow for leading jet events.
// Ratios are calculated in finalize().
int id = 0;
if (fuzzyEquals(sqrtS()/GeV, 900, 1e-3)) id=1;
if (fuzzyEquals(sqrtS()/GeV, 2760, 1e-3)) id=2;
if (fuzzyEquals(sqrtS()/GeV, 7000, 1e-3)) id=3;
_h_ratio = bookScatter2D(id, 1, 1);
_tmp_jet = bookHisto1D ("TMP/eflow_jet" ,refData(id, 1, 1)); // Leading jet energy flow in pt
_tmp_njet = bookHisto1D ("TMP/number_jet" ,refData(id, 1, 1)); // Number of events in pt
}
/// Perform the per-event analysis
void analyze(const Event& event) {
const double weight = event.weight();
// Skip if the event is empty
const FinalState& fsv = applyProjection<FinalState>(event, "fsv");
if (fsv.empty()) vetoEvent;
// ====================== Minimum Bias selection
const FinalState& sfsv = applyProjection<FinalState>(event, "sfsv");
- Particles parts = sfsv.particlesByRapidity();
+ Particles parts = sfsv.particles(cmpMomByRap);
if (parts.empty()) vetoEvent;
// find dymax
double dymax = 0;
int gap_pos = -1;
- for (size_t i=0; i < parts.size()-1; ++i) {
+ for (size_t i = 0; i < parts.size()-1; ++i) {
double dy = parts[i+1].rapidity() - parts[i].rapidity();
if (dy > dymax) {
- dymax = dy;
+ dymax = dy;
gap_pos = i;
}
}
// calculate mx2 and my2
FourMomentum xmom;
for (int i=0; i<=gap_pos; ++i) {
xmom += parts[i].momentum();
}
double mx2 = xmom.mass2();
if (mx2<0) vetoEvent;
FourMomentum ymom;
for (size_t i=gap_pos+1; i<parts.size(); ++i) {
ymom += parts[i].momentum();
}
double my2 = ymom.mass2();
if (my2<0) vetoEvent;
// calculate xix and xiy and xidd
double xix = mx2 / sqr(sqrtS());
double xiy = my2 / sqr(sqrtS());
double xidd = mx2*my2 / sqr(sqrtS()*0.938*GeV);
// combine the selection: xi cuts
bool passedHadronCuts = false;
if (fuzzyEquals(sqrtS()/GeV, 900, 1e-3) && (xix > 0.1 || xiy > 0.4 || xidd > 0.5)) passedHadronCuts = true;
if (fuzzyEquals(sqrtS()/GeV, 2760, 1e-3) && (xix > 0.07 || xiy > 0.2 || xidd > 0.5)) passedHadronCuts = true;
if (fuzzyEquals(sqrtS()/GeV, 7000, 1e-3) && (xix > 0.04 || xiy > 0.1 || xidd > 0.5)) passedHadronCuts = true;
if (!passedHadronCuts) vetoEvent;
// ============================== MINIMUM BIAS EVENTS
// loop over particles to calculate the energy
passedSumOfWeights += weight;
foreach (const Particle& p, fsv.particles()) {
if (-5.2 > p.eta() && p.eta() > -6.6) inclEflow += weight*p.E()/GeV;
}
// ============================== JET EVENTS
const FastJets& jetpro = applyProjection<FastJets>(event, "Jets");
const Jets& jets = jetpro.jetsByPt(1.0*GeV);
if (jets.size()<1) vetoEvent;
if (fabs(jets[0].eta()) < 2.0) {
_tmp_njet->fill(jets[0].pT()/GeV, weight);
// energy flow
foreach (const Particle& p, fsv.particles()) {
if (p.eta() > -6.6 && p.eta() < -5.2) { // ask for the CASTOR region
_tmp_jet->fill(jets[0].pT()/GeV, weight * p.E()/GeV);
}
}
}
}// analysis
void finalize() {
scale(_tmp_jet, passedSumOfWeights/inclEflow);
divide(_tmp_jet, _tmp_njet, _h_ratio);
}
private:
// counters
double passedSumOfWeights;
double inclEflow;
// histograms
Scatter2DPtr _h_ratio;
Histo1DPtr _tmp_jet;
Histo1DPtr _tmp_njet;
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(CMS_2013_I1218372);
}
diff --git a/src/Analyses/CMS_2013_I1256943.cc b/src/Analyses/CMS_2013_I1256943.cc
--- a/src/Analyses/CMS_2013_I1256943.cc
+++ b/src/Analyses/CMS_2013_I1256943.cc
@@ -1,187 +1,188 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/ZFinder.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
namespace Rivet {
/// CMS cross-section and angular correlations in Z boson + b-hadrons events at 7 TeV
class CMS_2013_I1256943 : public Analysis {
public:
/// Constructor
CMS_2013_I1256943()
: Analysis("CMS_2013_I1256943")
{ }
/// Add projections and book histograms
void init() {
_sumW = 0;
_sumW50 = 0;
_sumWpT = 0;
- FinalState fs(-2.4, 2.4, 20.0*GeV);
+ FinalState fs(Cuts::abseta < 2.4 && Cuts::pT > 20*GeV);
addProjection(fs, "FS");
- UnstableFinalState ufs(-2, 2, 15.0*GeV);
+ UnstableFinalState ufs(Cuts::abseta < 2 && Cuts::pT > 15*GeV);
addProjection(ufs, "UFS");
- Cut etacut = Cuts::abseta < 2.4;
+ Cut zetacut = Cuts::abseta < 2.4;
- ZFinder zfindermu(fs, etacut, PID::MUON, 81.0*GeV, 101.0*GeV, 0.1, ZFinder::NOCLUSTER, ZFinder::TRACK, 91.2*GeV);
+ ZFinder zfindermu(fs, zetacut, PID::MUON, 81.0*GeV, 101.0*GeV, 0.1, ZFinder::NOCLUSTER, ZFinder::TRACK, 91.2*GeV);
addProjection(zfindermu, "ZFinderMu");
- ZFinder zfinderel(fs, etacut, PID::ELECTRON, 81.0*GeV, 101.0*GeV, 0.1, ZFinder::NOCLUSTER, ZFinder::TRACK, 91.2*GeV);
+ ZFinder zfinderel(fs, zetacut, PID::ELECTRON, 81.0*GeV, 101.0*GeV, 0.1, ZFinder::NOCLUSTER, ZFinder::TRACK, 91.2*GeV);
addProjection(zfinderel, "ZFinderEl");
+
// Histograms in non-boosted region of Z pT
_h_dR_BB = bookHisto1D(1, 1, 1);
_h_dphi_BB = bookHisto1D(2, 1, 1);
_h_min_dR_ZB = bookHisto1D(3, 1, 1);
_h_A_ZBB = bookHisto1D(4, 1, 1);
// Histograms in boosted region of Z pT (pT > 50 GeV)
_h_dR_BB_boost = bookHisto1D(5, 1, 1);
_h_dphi_BB_boost = bookHisto1D(6, 1, 1);
_h_min_dR_ZB_boost = bookHisto1D(7, 1, 1);
_h_A_ZBB_boost = bookHisto1D(8, 1, 1);
_h_min_ZpT = bookHisto1D(9,1,1);
}
/// Do the analysis
void analyze(const Event& e) {
vector<FourMomentum> Bmom;
const UnstableFinalState& ufs = applyProjection<UnstableFinalState>(e, "UFS");
const ZFinder& zfindermu = applyProjection<ZFinder>(e, "ZFinderMu");
const ZFinder& zfinderel = applyProjection<ZFinder>(e, "ZFinderEl");
// Look for a Z --> mu+ mu- event in the final state
if (zfindermu.empty() && zfinderel.empty()) vetoEvent;
const Particles& z = !zfindermu.empty() ? zfindermu.bosons() : zfinderel.bosons();
const bool is_boosted = ( z[0].pT() > 50*GeV );
// Loop over the unstable particles
foreach (const Particle& p, ufs.particles()) {
const PdgId pid = p.pid();
// Look for particles with a bottom quark
if (PID::hasBottom(pid)) {
bool good_B = false;
const HepMC::GenParticle* pgen = p.genParticle();
HepMC::GenVertex* vgen = pgen -> end_vertex();
// Loop over the decay products of each unstable particle.
// Look for a couple of B hadrons.
for (HepMC::GenVertex::particles_out_const_iterator it = vgen->particles_out_const_begin(); it != vgen->particles_out_const_end(); ++it) {
// Accept the candidate only if none of its decay products itself contains a bottom quark, i.e. it is a weakly decaying B hadron.
if (!( PID::hasBottom( (*it)->pdg_id() ) ) ) {
good_B = true;
continue;
} else {
good_B = false;
break;
}
}
if (good_B ) Bmom.push_back( p.momentum() );
}
else continue;
}
// Veto the event unless exactly two B hadrons were found in the final state
if (Bmom.size() != 2 ) vetoEvent;
// Calculate the observables
double dphiBB = fabs(Bmom[0].phi() - Bmom[1].phi());
double dRBB = deltaR(Bmom[0], Bmom[1]);
const FourMomentum& pZ = z[0].momentum();
const bool closest_B = ( deltaR(pZ, Bmom[0]) < deltaR(pZ, Bmom[1]) );
const double mindR_ZB = closest_B ? deltaR(pZ, Bmom[0]) : deltaR(pZ, Bmom[1]);
const double maxdR_ZB = closest_B ? deltaR(pZ, Bmom[1]) : deltaR(pZ, Bmom[0]);
const double AZBB = ( maxdR_ZB - mindR_ZB ) / ( maxdR_ZB + mindR_ZB );
// Get event weight for histogramming
const double weight = e.weight();
// Fill the histograms in the non-boosted region
_h_dphi_BB->fill(dphiBB, weight);
_h_dR_BB->fill(dRBB, weight);
_h_min_dR_ZB->fill(mindR_ZB, weight);
_h_A_ZBB->fill(AZBB, weight);
_sumW += weight;
_sumWpT += weight;
// Fill the histograms in the boosted region
if (is_boosted) {
_sumW50 += weight;
_h_dphi_BB_boost->fill(dphiBB, weight);
_h_dR_BB_boost->fill(dRBB, weight);
_h_min_dR_ZB_boost->fill(mindR_ZB, weight);
_h_A_ZBB_boost->fill(AZBB, weight);
}
// Fill Z pT (cumulative) histogram
_h_min_ZpT->fill(0, weight);
if (pZ.pT() > 40*GeV ) {
_sumWpT += weight;
_h_min_ZpT->fill(40, weight);
}
if (pZ.pT() > 80*GeV ) {
_sumWpT += weight;
_h_min_ZpT->fill(80, weight);
}
if (pZ.pT() > 120*GeV ) {
_sumWpT += weight;
_h_min_ZpT->fill(120, weight);
}
Bmom.clear();
}
/// Finalize
void finalize() {
// Normalize excluding overflow bins (d'oh)
normalize(_h_dR_BB, 0.7*crossSection()*_sumW/sumOfWeights(), false); // d01-x01-y01
normalize(_h_dphi_BB, 0.53*crossSection()*_sumW/sumOfWeights(), false); // d02-x01-y01
normalize(_h_min_dR_ZB, 0.84*crossSection()*_sumW/sumOfWeights(), false); // d03-x01-y01
normalize(_h_A_ZBB, 0.2*crossSection()*_sumW/sumOfWeights(), false); // d04-x01-y01
normalize(_h_dR_BB_boost, 0.84*crossSection()*_sumW50/sumOfWeights(), false); // d05-x01-y01
normalize(_h_dphi_BB_boost, 0.63*crossSection()*_sumW50/sumOfWeights(), false); // d06-x01-y01
normalize(_h_min_dR_ZB_boost, 1*crossSection()*_sumW50/sumOfWeights(), false); // d07-x01-y01
normalize(_h_A_ZBB_boost, 0.25*crossSection()*_sumW50/sumOfWeights(), false); // d08-x01-y01
normalize(_h_min_ZpT, 40*crossSection()*_sumWpT/sumOfWeights(), false); // d09-x01-y01
}
private:
/// @name Weight counters
//@{
double _sumW, _sumW50, _sumWpT;
//@}
/// @name Histograms
//@{
Histo1DPtr _h_dphi_BB, _h_dR_BB, _h_min_dR_ZB, _h_A_ZBB;
Histo1DPtr _h_dphi_BB_boost, _h_dR_BB_boost, _h_min_dR_ZB_boost, _h_A_ZBB_boost, _h_min_ZpT;
//@}
};
// Hook for the plugin system
DECLARE_RIVET_PLUGIN(CMS_2013_I1256943);
}
diff --git a/src/Analyses/MC_HFJETS.cc b/src/Analyses/MC_HFJETS.cc
--- a/src/Analyses/MC_HFJETS.cc
+++ b/src/Analyses/MC_HFJETS.cc
@@ -1,151 +1,151 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FastJets.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
#include "Rivet/Projections/PrimaryHadrons.hh"
#include "Rivet/Projections/HeavyHadrons.hh"
namespace Rivet {
-
+
class MC_HFJETS : public Analysis {
public:
/// Constructor
MC_HFJETS()
: Analysis("MC_HFJETS")
{ }
public:
/// @name Analysis methods
//@{
/// Book histograms and initialise projections before the run
void init() {
FastJets fj(FinalState(-5, 5), FastJets::ANTIKT, 0.6);
fj.useInvisibles();
addProjection(fj, "Jets");
- addProjection(HeavyHadrons(-5, 5, 500*MeV), "BCHadrons");
+ addProjection(HeavyHadrons(Cuts::abseta < 5 && Cuts::pT > 500*MeV), "BCHadrons");
_h_ptCJetLead = bookHisto1D("ptCJetLead", linspace(5, 0, 20, false) + logspace(25, 20, 200));
_h_ptCHadrLead = bookHisto1D("ptCHadrLead", linspace(5, 0, 10, false) + logspace(25, 10, 200));
_h_ptFracC = bookHisto1D("ptfracC", 50, 0, 1.5);
_h_eFracC = bookHisto1D("efracC", 50, 0, 1.5);
_h_ptBJetLead = bookHisto1D("ptBJetLead", linspace(5, 0, 20, false) + logspace(25, 20, 200));
_h_ptBHadrLead = bookHisto1D("ptBHadrLead", linspace(5, 0, 10, false) + logspace(25, 10, 200));
_h_ptFracB = bookHisto1D("ptfracB", 50, 0, 1.5);
_h_eFracB = bookHisto1D("efracB", 50, 0, 1.5);
}
/// Perform the per-event analysis
void analyze(const Event& event) {
const double weight = event.weight();
// Get jets and heavy hadrons
const Jets& jets = applyProjection<JetAlg>(event, "Jets").jetsByPt();
const Particles bhadrons = sortByPt(applyProjection<HeavyHadrons>(event, "BCHadrons").bHadrons());
const Particles chadrons = sortByPt(applyProjection<HeavyHadrons>(event, "BCHadrons").cHadrons());
MSG_DEBUG("# b hadrons = " << bhadrons.size() << ", # c hadrons = " << chadrons.size());
// Max HF hadron--jet axis dR to be regarded as a jet tag
const double MAX_DR = 0.3;
// Tag the leading b and c jets with a deltaR < 0.3 match
// b-tagged jets are excluded from also being considered as c-tagged
/// @todo Do this again with the ghost match?
MSG_DEBUG("Getting b/c-tags");
bool gotLeadingB = false, gotLeadingC = false;
foreach (const Jet& j, jets) {
if (!gotLeadingB) {
FourMomentum leadBJet, leadBHadr;
double dRmin = MAX_DR;
foreach (const Particle& b, bhadrons) {
const double dRcand = min(dRmin, deltaR(j, b));
if (dRcand < dRmin) {
dRmin = dRcand;
leadBJet = j.momentum();
leadBHadr = b.momentum();
MSG_DEBUG("New closest b-hadron jet tag candidate: dR = " << dRmin
<< " for jet pT = " << j.pT()/GeV << " GeV, "
<< " b hadron pT = " << b.pT()/GeV << " GeV, PID = " << b.pid());
}
}
if (dRmin < MAX_DR) {
// A jet has been b-tagged, so fill the histos and move on to the next jet
_h_ptBJetLead->fill(leadBJet.pT()/GeV, weight);
_h_ptBHadrLead->fill(leadBHadr.pT()/GeV, weight);
_h_ptFracB->fill(leadBHadr.pT() / leadBJet.pT(), weight);
_h_eFracB->fill(leadBHadr.E() / leadBJet.E(), weight);
gotLeadingB = true;
continue; // escape this loop iteration so the same jet isn't c-tagged
}
}
if (!gotLeadingC) {
FourMomentum leadCJet, leadCHadr;
double dRmin = MAX_DR;
foreach (const Particle& c, chadrons) {
const double dRcand = min(dRmin, deltaR(j, c));
if (dRcand < dRmin) {
dRmin = dRcand;
leadCJet = j.momentum();
leadCHadr = c.momentum();
MSG_DEBUG("New closest c-hadron jet tag candidate: dR = " << dRmin
<< " for jet pT = " << j.pT()/GeV << " GeV, "
<< " c hadron pT = " << c.pT()/GeV << " GeV, PID = " << c.pid());
}
}
if (dRmin < MAX_DR) {
// A jet has been c-tagged, so fill the histos
_h_ptCJetLead->fill(leadCJet.pT()/GeV, weight);
_h_ptCHadrLead->fill(leadCHadr.pT()/GeV, weight);
_h_ptFracC->fill(leadCHadr.pT() / leadCJet.pT(), weight);
_h_eFracC->fill(leadCHadr.E() / leadCJet.E(), weight);
gotLeadingC = true;
}
}
// If we've found both a leading b and a leading c jet, break the loop over jets
if (gotLeadingB && gotLeadingC) break;
}
}
/// Normalise histograms etc., after the run
void finalize() {
normalize(_h_ptCJetLead);
normalize(_h_ptCHadrLead);
normalize(_h_ptFracC);
normalize(_h_eFracC);
normalize(_h_ptBJetLead);
normalize(_h_ptBHadrLead);
normalize(_h_ptFracB);
normalize(_h_eFracB);
}
//@}
private:
/// @name Histograms
//@{
Histo1DPtr _h_ptCJetLead, _h_ptCHadrLead, _h_ptFracC, _h_eFracC;
Histo1DPtr _h_ptBJetLead, _h_ptBHadrLead, _h_ptFracB, _h_eFracB;
//@}
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(MC_HFJETS);
}
diff --git a/src/Analyses/MC_IDENTIFIED.cc b/src/Analyses/MC_IDENTIFIED.cc
--- a/src/Analyses/MC_IDENTIFIED.cc
+++ b/src/Analyses/MC_IDENTIFIED.cc
@@ -1,106 +1,104 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/FinalState.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
namespace Rivet {
-
-
/// Generic analysis looking at various distributions of final state particles
/// @todo Rename as MC_HADRONS
class MC_IDENTIFIED : public Analysis {
public:
/// Constructor
MC_IDENTIFIED()
: Analysis("MC_IDENTIFIED")
{ }
public:
/// @name Analysis methods
//@{
/// Book histograms and initialise projections before the run
void init() {
// Projections
- const FinalState cnfs(-5.0, 5.0, 500*MeV);
+ const FinalState cnfs(Cuts::abseta < 5.0 && Cuts::pT > 500*MeV);
addProjection(cnfs, "FS");
- addProjection(UnstableFinalState(-5.0, 5.0, 500*MeV), "UFS");
+ addProjection(UnstableFinalState(Cuts::abseta < 5.0 && Cuts::pT > 500*MeV), "UFS");
// Histograms
// @todo Choose E/pT ranges based on input energies... can't do anything about kin. cuts, though
_histStablePIDs = bookHisto1D("MultsStablePIDs", 3335, -0.5, 3334.5);
_histDecayedPIDs = bookHisto1D("MultsDecayedPIDs", 3335, -0.5, 3334.5);
_histAllPIDs = bookHisto1D("MultsAllPIDs", 3335, -0.5, 3334.5);
_histEtaPi = bookHisto1D("EtaPi", 25, 0, 5);
_histEtaK = bookHisto1D("EtaK", 25, 0, 5);
_histEtaLambda = bookHisto1D("EtaLambda", 25, 0, 5);
}
/// Perform the per-event analysis
void analyze(const Event& event) {
const double weight = event.weight();
// Unphysical (debug) plotting of all PIDs in the event, physical or otherwise
foreach (const GenParticle* gp, particles(event.genEvent())) {
_histAllPIDs->fill(abs(gp->pdg_id()), weight);
}
// Charged + neutral final state PIDs
const FinalState& cnfs = applyProjection<FinalState>(event, "FS");
foreach (const Particle& p, cnfs.particles()) {
_histStablePIDs->fill(p.abspid(), weight);
}
// Unstable PIDs and identified particle eta spectra
const UnstableFinalState& ufs = applyProjection<UnstableFinalState>(event, "UFS");
foreach (const Particle& p, ufs.particles()) {
_histDecayedPIDs->fill(p.pid(), weight);
const double eta_abs = p.abseta();
const PdgId pid = p.abspid();
if (pid == 211 || pid == 111) _histEtaPi->fill(eta_abs, weight);
else if (pid == 321 || pid == 130 || pid == 310) _histEtaK->fill(eta_abs, weight);
else if (pid == 3122) _histEtaLambda->fill(eta_abs, weight);
}
}
/// Finalize
void finalize() {
scale(_histStablePIDs, 1/sumOfWeights());
scale(_histDecayedPIDs, 1/sumOfWeights());
scale(_histAllPIDs, 1/sumOfWeights());
scale(_histEtaPi, 1/sumOfWeights());
scale(_histEtaK, 1/sumOfWeights());
scale(_histEtaLambda, 1/sumOfWeights());
}
//@}
private:
/// @name Histograms
//@{
Histo1DPtr _histStablePIDs, _histDecayedPIDs, _histAllPIDs;
Histo1DPtr _histEtaPi, _histEtaK, _histEtaLambda;
//@}
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(MC_IDENTIFIED);
}
diff --git a/src/Analyses/Makefile.am b/src/Analyses/Makefile.am
--- a/src/Analyses/Makefile.am
+++ b/src/Analyses/Makefile.am
@@ -1,432 +1,434 @@
## Flags for building all plugins
AM_LDFLAGS = $(LDFLAGS) -module -avoid-version -L$(FASTJETLIBPATH)
LIBS = $(FASTJETCONFIGLIBADD)
lib_LTLIBRARIES =
noinst_LTLIBRARIES = libRivetAnalysisTools.la
libRivetAnalysisTools_la_SOURCES = \
MC_ParticleAnalysis.cc \
MC_JetAnalysis.cc \
MC_JetSplittings.cc
## ANALYSIS CATEGORIES
##
## Unvalidated analyses: add new standard analyses here, and only
## move them into the collider-specific standard plugin libraries
## once they have been finished and checked. The --enable-unvalidated
## flag is needed to install the unvalidated analyses.
##
## Preliminary analyses: validated analyses whose experimental paper has not
## been fully accepted for publication should go here. Analyses in this group
## are considered safe to use but the reference data may yet change. In
## progressing from preliminary status to a permanent published analyses
## collection, the analysis name is very likely to change, so you should ensure
## that any Rivet-using scripts are not broken by such name changes when
## upgrading between Rivet versions. These analyses will not be available if
## Rivet is built with the --disable-preliminary configure flag.
##
## Obsolete analyses: as mentioned above, when a preliminary analysis becomes
## permanent its name will change to reflect its newly published status via the
## publication's SPIRES ID. The previous form of the analysis, possibly with
## different reference histograms, will be retained for one major version of
## Rivet with a status of "obsolete" before being removed, to give users time to
## migrate their run scripts, i.e. if an analysis is marked as obsolete in
## version 1.4.2, it will remain in Rivet's distribution until version
## 1.5.0. Obsolete analyses will not be available if Rivet is built with the
## --disable-obsolete configure flag.
lib_LTLIBRARIES += RivetALICEAnalyses.la
RivetALICEAnalyses_la_SOURCES = \
ALICE_2010_S8624100.cc \
ALICE_2010_S8625980.cc \
ALICE_2010_S8706239.cc \
ALICE_2011_S8909580.cc \
ALICE_2011_S8945144.cc \
ALICE_2012_I1181770.cc
lib_LTLIBRARIES += RivetATLASAnalyses.la
RivetATLASAnalyses_la_SOURCES = \
ATLAS_2010_S8591806.cc \
ATLAS_2010_S8817804.cc \
ATLAS_2010_S8894728.cc \
ATLAS_2010_S8914702.cc \
ATLAS_2010_S8918562.cc \
ATLAS_2010_S8919674.cc \
ATLAS_2011_S8924791.cc \
ATLAS_2011_S8971293.cc \
ATLAS_2011_S8994773.cc \
ATLAS_2011_S8983313.cc \
ATLAS_2011_S9002537.cc \
ATLAS_2011_S9120807.cc \
ATLAS_2011_S9126244.cc \
ATLAS_2011_S9128077.cc \
ATLAS_2011_S9131140.cc \
ATLAS_2011_S9212183.cc \
ATLAS_2011_S9225137.cc \
ATLAS_2011_S9019561.cc \
ATLAS_2011_I894867.cc \
ATLAS_2011_I919017.cc \
ATLAS_2011_I921594.cc \
ATLAS_2011_I925932.cc \
ATLAS_2011_I926145.cc \
ATLAS_2011_I930220.cc \
ATLAS_2011_I944826.cc \
ATLAS_2011_I945498.cc \
ATLAS_2011_I954993.cc \
ATLAS_2012_I943401.cc \
ATLAS_2012_I1082936.cc \
ATLAS_2012_I1084540.cc \
ATLAS_2012_I1091481.cc \
ATLAS_2012_I1093734.cc \
ATLAS_2012_I1093738.cc \
ATLAS_2012_I1094564.cc \
ATLAS_2012_I1094568.cc \
ATLAS_2012_I1112263.cc \
ATLAS_2012_I1117704.cc \
ATLAS_2012_I1118269.cc \
ATLAS_2012_I1119557.cc \
ATLAS_2012_I1124167.cc \
ATLAS_2012_I1125575.cc \
ATLAS_2012_I1125961.cc \
ATLAS_2012_I1183818.cc \
ATLAS_2012_I1188891.cc \
ATLAS_2012_I1199269.cc \
ATLAS_2012_I1203852.cc \
ATLAS_2012_I1204447.cc \
ATLAS_2012_I1204784.cc \
ATLAS_2013_I1190187.cc \
ATLAS_2013_I1217867.cc \
+ ATLAS_2013_I1219109.cc \
ATLAS_2013_I1230812.cc \
ATLAS_2013_I1243871.cc \
ATLAS_2013_I1263495.cc \
ATLAS_2014_I1268975.cc \
ATLAS_2014_I1279489.cc \
ATLAS_2014_I1282441.cc \
ATLAS_2014_I1298811.cc \
ATLAS_2014_I1304688.cc \
- ATLAS_2014_I1307756.cc
+ ATLAS_2014_I1307756.cc \
+ ATLAS_2014_I1306294.cc
if ENABLE_PRELIMINARY
RivetATLASAnalyses_la_SOURCES += \
ATLAS_2012_CONF_2012_001.cc
endif
if ENABLE_UNVALIDATED
RivetATLASAnalyses_la_SOURCES += \
ATLAS_2010_CONF_2010_049.cc \
ATLAS_2011_S9041966.cc \
ATLAS_2011_CONF_2011_090.cc \
ATLAS_2011_CONF_2011_098.cc \
ATLAS_2011_S9108483.cc \
ATLAS_2011_S9212353.cc \
ATLAS_2011_S9035664.cc \
ATLAS_2012_I1083318.cc \
ATLAS_2012_I1095236.cc \
ATLAS_2012_I1082009.cc \
ATLAS_2012_I946427.cc \
ATLAS_2012_I1126136.cc \
ATLAS_2012_I1180197.cc \
ATLAS_2012_I1186556.cc \
ATLAS_2012_I1190891.cc \
ATLAS_2012_CONF_2012_103.cc \
ATLAS_2012_CONF_2012_104.cc \
ATLAS_2012_CONF_2012_105.cc \
ATLAS_2012_CONF_2012_109.cc \
ATLAS_2012_CONF_2012_153.cc
endif
lib_LTLIBRARIES += RivetCMSAnalyses.la
RivetCMSAnalyses_la_SOURCES = \
CMS_2010_S8547297.cc \
CMS_2010_S8656010.cc \
CMS_2011_S8884919.cc \
CMS_2011_S8941262.cc \
CMS_2011_S8950903.cc \
CMS_2011_S8957746.cc \
CMS_2011_S8968497.cc \
CMS_2011_S8973270.cc \
CMS_2011_S8978280.cc \
CMS_2011_S9086218.cc \
CMS_2011_S9088458.cc \
CMS_2011_S9120041.cc \
CMS_2011_S9215166.cc \
CMS_2012_I941555.cc \
CMS_2011_I954992.cc \
CMS_2012_I1087342.cc \
CMS_2012_I1090423.cc \
CMS_2012_I1102908.cc \
CMS_2012_I1107658.cc \
CMS_2012_I1184941.cc \
CMS_2012_I1193338.cc \
CMS_2013_I1209721.cc \
CMS_2013_I1218372.cc \
CMS_2013_I1224539_DIJET.cc \
CMS_2013_I1224539_WJET.cc \
CMS_2013_I1224539_ZJET.cc \
CMS_2013_I1256943.cc \
CMS_2013_I1258128.cc \
CMS_2013_I1261026.cc \
CMS_2013_I1265659.cc \
CMS_2013_I1272853.cc \
CMS_2013_I1273574.cc \
CMSTOTEM_2014_I1294140.cc
if ENABLE_PRELIMINARY
RivetCMSAnalyses_la_SOURCES += \
CMS_QCD_10_024.cc \
CMS_2012_PAS_QCD_11_010.cc
endif
lib_LTLIBRARIES += RivetLHCbAnalyses.la
RivetLHCbAnalyses_la_SOURCES = \
LHCB_2010_I867355.cc \
LHCB_2013_I1218996.cc \
LHCB_2013_I1208105.cc
if ENABLE_UNVALIDATED
RivetLHCbAnalyses_la_SOURCES += \
LHCB_2010_S8758301.cc \
LHCB_2011_I917009.cc \
LHCB_2011_I919315.cc \
LHCB_2012_I1119400.cc
endif
lib_LTLIBRARIES += RivetLHCfAnalyses.la
RivetLHCfAnalyses_la_SOURCES = \
LHCF_2012_I1115479.cc
lib_LTLIBRARIES += RivetTOTEMAnalyses.la
RivetTOTEMAnalyses_la_SOURCES = \
TOTEM_2012_I1115294.cc \
TOTEM_2012_002.cc # TODO: update to Inspire ID
lib_LTLIBRARIES += RivetCDFAnalyses.la
RivetCDFAnalyses_la_SOURCES = \
CDF_1988_S1865951.cc \
CDF_1990_S2089246.cc \
CDF_1994_S2952106.cc \
CDF_1996_S3418421.cc \
CDF_1998_S3618439.cc \
CDF_2000_S4155203.cc \
CDF_2000_S4266730.cc \
CDF_2001_S4517016.cc \
CDF_2001_S4563131.cc \
CDF_2001_S4751469.cc \
CDF_2002_S4796047.cc \
CDF_2004_S5839831.cc \
CDF_2005_S6080774.cc \
CDF_2005_S6217184.cc \
CDF_2006_S6450792.cc \
CDF_2006_S6653332.cc \
CDF_2007_S7057202.cc \
CDF_2008_S7540469.cc \
CDF_2008_S7828950.cc \
CDF_2008_S8093652.cc \
CDF_2008_S8095620.cc \
CDF_2009_S8233977.cc \
CDF_2009_S8383952.cc \
CDF_2009_S8436959.cc \
CDF_2010_S8591881_DY.cc \
CDF_2010_S8591881_QCD.cc
if ENABLE_PRELIMINARY
RivetCDFAnalyses_la_SOURCES += \
CDF_2009_NOTE_9936.cc \
CDF_2012_NOTE10874.cc
endif
# if ENABLE_OBSOLETE
# RivetCDFAnalyses_la_SOURCES +=
# endif
if ENABLE_UNVALIDATED
RivetCDFAnalyses_la_SOURCES += \
CDF_1993_S2742446.cc \
CDF_1996_S3108457.cc \
CDF_1996_S3349578.cc \
CDF_1997_S3541940.cc \
CDF_2008_S7541902.cc \
CDF_2008_S7782535.cc
endif
lib_LTLIBRARIES += RivetD0Analyses.la
RivetD0Analyses_la_SOURCES = \
D0_2000_S4480767.cc \
D0_2000_I499943.cc \
D0_2001_S4674421.cc \
D0_2004_S5992206.cc \
D0_2006_S6438750.cc \
D0_2007_S7075677.cc \
D0_2008_S6879055.cc \
D0_2008_S7554427.cc \
D0_2008_S7662670.cc \
D0_2008_S7719523.cc \
D0_2008_S7837160.cc \
D0_2008_S7863608.cc \
D0_2009_S8202443.cc \
D0_2009_S8320160.cc \
D0_2009_S8349509.cc \
D0_2010_S8566488.cc \
D0_2010_S8570965.cc \
D0_2010_S8671338.cc \
D0_2010_S8821313.cc \
D0_2011_I895662.cc
if ENABLE_UNVALIDATED
RivetD0Analyses_la_SOURCES += \
D0_1996_S3214044.cc \
D0_1996_S3324664.cc
endif
lib_LTLIBRARIES += RivetHERAAnalyses.la
RivetHERAAnalyses_la_SOURCES = \
H1_1994_S2919893.cc \
H1_2000_S4129130.cc
if ENABLE_UNVALIDATED
RivetHERAAnalyses_la_SOURCES += \
H1_1995_S3167097.cc \
ZEUS_2001_S4815815.cc
endif
lib_LTLIBRARIES += RivetPetraAnalyses.la
RivetPetraAnalyses_la_SOURCES = \
JADE_1998_S3612880.cc \
TASSO_1990_S2148048.cc
lib_LTLIBRARIES += RivetLEPAnalyses.la
RivetLEPAnalyses_la_SOURCES = \
ALEPH_1991_S2435284.cc \
ALEPH_1996_S3486095.cc \
ALEPH_1996_S3196992.cc \
ALEPH_1999_S4193598.cc \
ALEPH_2001_S4656318.cc \
ALEPH_2002_S4823664.cc \
ALEPH_2004_S5765862.cc \
DELPHI_1995_S3137023.cc \
DELPHI_1996_S3430090.cc \
DELPHI_1999_S3960137.cc \
DELPHI_2000_S4328825.cc \
OPAL_1994_S2927284.cc \
OPAL_1995_S3198391.cc \
OPAL_1996_S3257789.cc \
OPAL_1997_S3396100.cc \
OPAL_1997_S3608263.cc \
OPAL_1998_S3702294.cc \
OPAL_1998_S3749908.cc \
OPAL_1998_S3780481.cc \
OPAL_2000_S4418603.cc \
OPAL_2001_S4553896.cc \
OPAL_2002_S5361494.cc \
OPAL_2004_S6132243.cc \
SLD_1996_S3398250.cc \
SLD_1999_S3743934.cc \
SLD_2002_S4869273.cc \
SLD_2004_S5693039.cc
if ENABLE_PRELIMINARY
RivetLEPAnalyses_la_SOURCES += \
DELPHI_2002_069_CONF_603.cc
endif
if ENABLE_UNVALIDATED
RivetLEPAnalyses_la_SOURCES += \
OPAL_1993_S2692198.cc \
DELPHI_2003_WUD_03_11.cc
endif
lib_LTLIBRARIES += RivetRHICAnalyses.la
RivetRHICAnalyses_la_SOURCES = \
STAR_2006_S6500200.cc \
STAR_2006_S6860818.cc \
STAR_2006_S6870392.cc
if ENABLE_PRELIMINARY
RivetRHICAnalyses_la_SOURCES += \
STAR_2009_UE_HELEN.cc
endif
if ENABLE_UNVALIDATED
RivetRHICAnalyses_la_SOURCES += \
STAR_2008_S7869363.cc \
STAR_2008_S7993412.cc
endif
lib_LTLIBRARIES += RivetSPSAnalyses.la
RivetSPSAnalyses_la_SOURCES = \
UA1_1990_S2044935.cc \
UA5_1982_S875503.cc \
UA5_1986_S1583476.cc \
UA5_1987_S1640666.cc \
UA5_1988_S1867512.cc \
UA5_1989_S1926373.cc
lib_LTLIBRARIES += RivetMiscAnalyses.la
RivetMiscAnalyses_la_SOURCES = \
PDG_HADRON_MULTIPLICITIES.cc \
PDG_HADRON_MULTIPLICITIES_RATIOS.cc \
JADE_OPAL_2000_S4300807.cc \
ARGUS_1993_S2653028.cc \
ARGUS_1993_S2669951.cc \
ARGUS_1993_S2789213.cc \
BABAR_2003_I593379.cc \
BABAR_2005_S6181155.cc \
BABAR_2007_S6895344.cc \
BABAR_2007_S7266081.cc \
BABAR_2013_I1238276.cc \
BELLE_2001_S4598261.cc \
BELLE_2008_I786560.cc \
BELLE_2013_I1216515.cc \
CLEO_2004_S5809304.cc
if ENABLE_UNVALIDATED
RivetMiscAnalyses_la_SOURCES += \
E735_1998_S3905616.cc \
SFM_1984_S1178091.cc
endif
lib_LTLIBRARIES += RivetMCAnalyses.la
RivetMCAnalyses_la_SOURCES = \
EXAMPLE.cc \
EXAMPLE_CUTS.cc \
MC_QCD_PARTONS.cc \
MC_DIPHOTON.cc \
MC_ELECTRONS.cc \
MC_GENERIC.cc \
MC_HINC.cc \
MC_HJETS.cc \
MC_HKTSPLITTINGS.cc \
MC_IDENTIFIED.cc \
MC_JETS.cc \
MC_JETTAGS.cc \
MC_HFJETS.cc \
MC_LEADJETUE.cc \
MC_MUONS.cc \
MC_PDFS.cc \
MC_PHOTONINC.cc \
MC_PHOTONJETS.cc \
MC_PHOTONKTSPLITTINGS.cc \
MC_PHOTONS.cc \
MC_PRINTEVENT.cc \
MC_SUSY.cc \
MC_TTBAR.cc \
MC_VH2BB.cc \
MC_WINC.cc \
MC_WJETS.cc \
MC_WKTSPLITTINGS.cc \
MC_WPOL.cc \
MC_WWINC.cc \
MC_WWJETS.cc \
MC_WWKTSPLITTINGS.cc \
MC_XS.cc \
MC_ZINC.cc \
MC_ZJETS.cc \
MC_ZKTSPLITTINGS.cc \
MC_ZZINC.cc \
MC_ZZJETS.cc \
MC_ZZKTSPLITTINGS.cc
if ENABLE_UNVALIDATED
RivetMCAnalyses_la_SOURCES += \
MC_DIJET.cc \
MC_PHOTONJETUE.cc
endif
diff --git a/src/Analyses/STAR_2006_S6860818.cc b/src/Analyses/STAR_2006_S6860818.cc
--- a/src/Analyses/STAR_2006_S6860818.cc
+++ b/src/Analyses/STAR_2006_S6860818.cc
@@ -1,193 +1,193 @@
// -*- C++ -*-
#include "Rivet/Analysis.hh"
#include "Rivet/Projections/ChargedFinalState.hh"
#include "Rivet/Projections/IdentifiedFinalState.hh"
#include "Rivet/Projections/UnstableFinalState.hh"
namespace Rivet {
/// @brief STAR strange particle spectra in pp at 200 GeV
class STAR_2006_S6860818 : public Analysis {
public:
/// Constructor
STAR_2006_S6860818()
: Analysis("STAR_2006_S6860818"),
_sumWeightSelected(0.0)
{
for (size_t i = 0; i < 4; i++) {
_nBaryon[i] = 0;
_nAntiBaryon[i] = 0;
_nWeightedBaryon[i] = 0.;
_nWeightedAntiBaryon[i] = 0.;
}
}
/// Book projections and histograms
void init() {
- ChargedFinalState bbc1(-5.0,-3.5, 0.0*GeV); // beam-beam-counter trigger
- ChargedFinalState bbc2( 3.5, 5.0, 0.0*GeV); // beam-beam-counter trigger
+ ChargedFinalState bbc1(Cuts::etaIn(-5.0, -3.5)); // beam-beam-counter trigger
+ ChargedFinalState bbc2(Cuts::etaIn( 3.5, 5.0)); // beam-beam-counter trigger
addProjection(bbc1, "BBC1");
addProjection(bbc2, "BBC2");
- UnstableFinalState ufs(-2.5, 2.5, 0.0*GeV);
+ UnstableFinalState ufs(Cuts::abseta < 2.5);
addProjection(ufs, "UFS");
_h_pT_k0s = bookHisto1D(1, 1, 1);
_h_pT_kminus = bookHisto1D(1, 2, 1);
_h_pT_kplus = bookHisto1D(1, 3, 1);
_h_pT_lambda = bookHisto1D(1, 4, 1);
_h_pT_lambdabar = bookHisto1D(1, 5, 1);
_h_pT_ximinus = bookHisto1D(1, 6, 1);
_h_pT_xiplus = bookHisto1D(1, 7, 1);
//_h_pT_omega = bookHisto1D(1, 8, 1);
_h_antibaryon_baryon_ratio = bookScatter2D(2, 1, 1);
_h_lambar_lam = bookScatter2D(2, 2, 1);
_h_xiplus_ximinus = bookScatter2D(2, 3, 1);
_h_pT_vs_mass = bookProfile1D(3, 1, 1);
}
/// Do the analysis
void analyze(const Event& event) {
const ChargedFinalState& bbc1 = applyProjection<ChargedFinalState>(event, "BBC1");
const ChargedFinalState& bbc2 = applyProjection<ChargedFinalState>(event, "BBC2");
if (bbc1.size()<1 || bbc2.size()<1) {
MSG_DEBUG("Failed beam-beam-counter trigger");
vetoEvent;
}
const double weight = event.weight();
const UnstableFinalState& ufs = applyProjection<UnstableFinalState>(event, "UFS");
foreach (const Particle& p, ufs.particles()) {
if (p.absrap() < 0.5) {
const PdgId pid = p.pid();
const double pT = p.pT() / GeV;
switch (abs(pid)) {
case PID::PIPLUS:
if (pid < 0) _h_pT_vs_mass->fill(0.1396, pT, weight);
break;
case PID::PROTON:
if (pid < 0) _h_pT_vs_mass->fill(0.9383, pT, weight);
if (pT > 0.4) {
pid > 0 ? _nBaryon[0]++ : _nAntiBaryon[0]++;
pid > 0 ? _nWeightedBaryon[0]+=weight : _nWeightedAntiBaryon[0]+=weight;
}
break;
case PID::K0S:
if (pT > 0.2) {
_h_pT_k0s->fill(pT, weight/pT);
}
_h_pT_vs_mass->fill(0.5056, pT, weight);
break;
case PID::K0L:
_h_pT_vs_mass->fill(0.5056, pT, weight);
break;
case 113: // rho0(770)
_h_pT_vs_mass->fill(0.7755, pT, weight);
break;
case 313: // K0*(892)
_h_pT_vs_mass->fill(0.8960, pT, weight);
break;
case 333: // phi(1020)
_h_pT_vs_mass->fill(1.0190, pT, weight);
break;
case 3214: // Sigma(1385)
_h_pT_vs_mass->fill(1.3840, pT, weight);
break;
case 3124: // Lambda(1520)
_h_pT_vs_mass->fill(1.5200, pT, weight);
break;
case PID::KPLUS:
if (pid < 0) _h_pT_vs_mass->fill(0.4856, pT, weight);
if (pT > 0.2) {
pid > 0 ? _h_pT_kplus->fill(pT, weight/pT) : _h_pT_kminus->fill(pT, weight/pT);
}
break;
case PID::LAMBDA:
pid > 0 ? _h_pT_vs_mass->fill(1.1050, pT, weight) : _h_pT_vs_mass->fill(1.1250, pT, weight);
if (pT > 0.3) {
pid > 0 ? _h_pT_lambda->fill(pT, weight/pT) : _h_pT_lambdabar->fill(pT, weight/pT);
pid > 0 ? _nBaryon[1]++ : _nAntiBaryon[1]++;
pid > 0 ? _nWeightedBaryon[1]+=weight : _nWeightedAntiBaryon[1]+=weight;
}
break;
case PID::XIMINUS:
pid > 0 ? _h_pT_vs_mass->fill(1.3120, pT, weight) : _h_pT_vs_mass->fill(1.3320, pT, weight);
if (pT > 0.5) {
pid > 0 ? _h_pT_ximinus->fill(pT, weight/pT) : _h_pT_xiplus->fill(pT, weight/pT);
pid > 0 ? _nBaryon[2]++ : _nAntiBaryon[2]++;
pid > 0 ? _nWeightedBaryon[2]+=weight : _nWeightedAntiBaryon[2]+=weight;
}
break;
case PID::OMEGAMINUS:
_h_pT_vs_mass->fill(1.6720, pT, weight);
if (pT > 0.5) {
//_h_pT_omega->fill(pT, weight/pT);
pid > 0 ? _nBaryon[3]++ : _nAntiBaryon[3]++;
pid > 0 ? _nWeightedBaryon[3]+=weight : _nWeightedAntiBaryon[3]+=weight;
}
break;
}
}
}
_sumWeightSelected += event.weight();
}
/// Finalize
void finalize() {
std::vector<Point2D> points;
for (size_t i=0 ; i<4 ; i++) {
if (_nWeightedBaryon[i]==0 || _nWeightedAntiBaryon[i]==0) {
points.push_back(Point2D(i,0,0.5,0));
} else {
double y = _nWeightedAntiBaryon[i]/_nWeightedBaryon[i];
double dy = sqrt( 1./_nAntiBaryon[i] + 1./_nBaryon[i] );
points.push_back(Point2D(i,y,0.5,y*dy));
}
}
_h_antibaryon_baryon_ratio->addPoints( points );
divide(_h_pT_lambdabar,_h_pT_lambda, _h_lambar_lam);
divide(_h_pT_xiplus,_h_pT_ximinus, _h_xiplus_ximinus);
scale(_h_pT_k0s, 1./(2*M_PI*_sumWeightSelected));
scale(_h_pT_kminus, 1./(2*M_PI*_sumWeightSelected));
scale(_h_pT_kplus, 1./(2*M_PI*_sumWeightSelected));
scale(_h_pT_lambda, 1./(2*M_PI*_sumWeightSelected));
scale(_h_pT_lambdabar, 1./(2*M_PI*_sumWeightSelected));
scale(_h_pT_ximinus, 1./(2*M_PI*_sumWeightSelected));
scale(_h_pT_xiplus, 1./(2*M_PI*_sumWeightSelected));
//scale(_h_pT_omega, 1./(2*M_PI*_sumWeightSelected));
MSG_DEBUG("sumOfWeights() = " << sumOfWeights());
MSG_DEBUG("_sumWeightSelected = " << _sumWeightSelected);
}
private:
double _sumWeightSelected;
int _nBaryon[4];
int _nAntiBaryon[4];
double _nWeightedBaryon[4];
double _nWeightedAntiBaryon[4];
Histo1DPtr _h_pT_k0s, _h_pT_kminus, _h_pT_kplus, _h_pT_lambda, _h_pT_lambdabar, _h_pT_ximinus, _h_pT_xiplus;
//Histo1DPtr _h_pT_omega;
Scatter2DPtr _h_antibaryon_baryon_ratio;
Profile1DPtr _h_pT_vs_mass;
Scatter2DPtr _h_lambar_lam;
Scatter2DPtr _h_xiplus_ximinus;
};
// The hook for the plugin system
DECLARE_RIVET_PLUGIN(STAR_2006_S6860818);
}
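The finalize() step above builds the antibaryon/baryon ratio points from weighted counts, with the relative error taken from the raw (unweighted) counts under independent Poisson statistics. A minimal self-contained sketch of that error propagation (the function name `ratioWithError` is illustrative, not part of Rivet):

```cpp
#include <cassert>
#include <cmath>
#include <utility>

// Ratio y = wAnti/wBaryon of weighted counts, with relative error
// dy/y = sqrt(1/nAnti + 1/nBaryon) from the raw counts, as in the
// finalize() loop above. Empty bins yield a null (0, 0) point.
std::pair<double, double> ratioWithError(double wAnti, double wBaryon,
                                         long nAnti, long nBaryon) {
  if (wAnti == 0 || wBaryon == 0) return std::make_pair(0.0, 0.0);
  const double y  = wAnti / wBaryon;
  const double dy = std::sqrt(1.0/nAnti + 1.0/nBaryon);
  return std::make_pair(y, y * dy);
}
```

With 100 weighted and raw entries on both sides the ratio is 1 with absolute error sqrt(2/100) ≈ 0.14.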
diff --git a/src/Core/Particle.cc b/src/Core/Particle.cc
--- a/src/Core/Particle.cc
+++ b/src/Core/Particle.cc
@@ -1,70 +1,84 @@
#include "Rivet/Particle.hh"
#include "Rivet/Tools/RivetBoost.hh"
#include "Rivet/Tools/ParticleIdUtils.hh"
namespace Rivet {
  /// @todo Neaten this up with C++11, via one walker function and several uses with lambda tests
bool Particle::hasAncestor(PdgId pdg_id) const {
/// @todo Shouldn't a const vertex be being returned? Ah, HepMC...
GenVertex* prodVtx = genParticle()->production_vertex();
if (prodVtx == 0) return false;
foreach (const GenParticle* ancestor, particles(prodVtx, HepMC::ancestors)) {
if (ancestor->pdg_id() == pdg_id) return true;
}
return false;
}
- bool Particle::fromDecay() const {
- /// @todo Shouldn't a const vertex be being returned? Ah, HepMC...
- GenVertex* prodVtx = genParticle()->production_vertex();
- if (prodVtx == NULL) return false;
- foreach (const GenParticle* ancestor, particles(prodVtx, HepMC::ancestors)) {
- const PdgId pid = ancestor->pdg_id();
- if (ancestor->status() == 2 && (PID::isHadron(pid) || abs(pid) == PID::TAU)) return true;
- }
- return false;
- }
-
-
bool Particle::fromBottom() const {
/// @todo Shouldn't a const vertex be being returned? Ah, HepMC...
GenVertex* prodVtx = genParticle()->production_vertex();
if (prodVtx == NULL) return false;
foreach (const GenParticle* ancestor, particles(prodVtx, HepMC::ancestors)) {
const PdgId pid = ancestor->pdg_id();
if (ancestor->status() == 2 && (PID::isHadron(pid) && PID::hasBottom(pid))) return true;
}
return false;
}
bool Particle::fromCharm() const {
/// @todo Shouldn't a const vertex be being returned? Ah, HepMC...
GenVertex* prodVtx = genParticle()->production_vertex();
if (prodVtx == NULL) return false;
foreach (const GenParticle* ancestor, particles(prodVtx, HepMC::ancestors)) {
const PdgId pid = ancestor->pdg_id();
if (ancestor->status() == 2 && (PID::isHadron(pid) && PID::hasCharm(pid) && !PID::hasBottom(pid))) return true;
}
return false;
}
- bool Particle::fromTau() const {
+
+ bool Particle::fromHadron() const {
/// @todo Shouldn't a const vertex be being returned? Ah, HepMC...
GenVertex* prodVtx = genParticle()->production_vertex();
if (prodVtx == NULL) return false;
foreach (const GenParticle* ancestor, particles(prodVtx, HepMC::ancestors)) {
const PdgId pid = ancestor->pdg_id();
+ if (ancestor->status() == 2 && PID::isHadron(pid)) return true;
+ }
+ return false;
+ }
+
+
+ bool Particle::fromTau(bool prompt_taus_only) const {
+ /// @todo Shouldn't a const vertex be being returned? Ah, HepMC...
+ if (prompt_taus_only && fromHadron()) return false;
+ GenVertex* prodVtx = genParticle()->production_vertex();
+ if (prodVtx == NULL) return false;
+ foreach (const GenParticle* ancestor, particles(prodVtx, HepMC::ancestors)) {
+ const PdgId pid = ancestor->pdg_id();
if (ancestor->status() == 2 && abs(pid) == PID::TAU) return true;
}
return false;
}
+ // bool Particle::fromDecay() const {
+ // /// @todo Shouldn't a const vertex be being returned? Ah, HepMC...
+ // GenVertex* prodVtx = genParticle()->production_vertex();
+ // if (prodVtx == NULL) return false;
+ // foreach (const GenParticle* ancestor, particles(prodVtx, HepMC::ancestors)) {
+ // const PdgId pid = ancestor->pdg_id();
+ // if (ancestor->status() == 2 && (PID::isHadron(pid) || abs(pid) == PID::TAU)) return true;
+ // }
+ // return false;
+ // }
+
+
}
diff --git a/src/Projections/FinalPartons.cc b/src/Projections/FinalPartons.cc
new file mode 100644
--- /dev/null
+++ b/src/Projections/FinalPartons.cc
@@ -0,0 +1,46 @@
+// -*- C++ -*-
+
+#include "Rivet/Projections/FinalPartons.hh"
+#include "Rivet/Tools/RivetHepMC.hh"
+
+namespace Rivet {
+
+
+ bool FinalPartons::accept(const Particle& p) const {
+
+ // if *not* a parton, photon, electron, or muon: reject.
+ if (!isParton(p))
+ return false;
+
+ // accept partons if they end on a standard hadronization vertex
+ if (p.genParticle()->end_vertex() != NULL &&
+ p.genParticle()->end_vertex()->id() == 5)
+ return true;
+
+ // reject if p has a parton child.
+ foreach (const Particle& c, p.children()) {
+ if (isParton(c))
+ return false;
+ }
+
+ // reject if from a hadron or tau decay.
+ if (p.fromDecay())
+ return false;
+
+ return _cuts->accept(p);
+ }
+
+
+ void FinalPartons::project(const Event& e) {
+ _theParticles.clear();
+
+ foreach (const GenParticle* gp, Rivet::particles(e.genEvent())) {
+ if (!gp) continue;
+
+ const Particle p(gp);
+ if (accept(p)) _theParticles.push_back(p);
+ }
+ }
+
+
+}
diff --git a/src/Projections/Makefile.am b/src/Projections/Makefile.am
--- a/src/Projections/Makefile.am
+++ b/src/Projections/Makefile.am
@@ -1,44 +1,45 @@
noinst_LTLIBRARIES = libRivetProjections.la
libRivetProjections_la_SOURCES = \
Beam.cc \
BeamThrust.cc \
ChargedFinalState.cc \
ChargedLeptons.cc \
CentralEtHCM.cc \
DISFinalState.cc \
DISKinematics.cc \
DISLepton.cc \
DressedLeptons.cc \
FastJets.cc \
+ FinalPartons.cc \
FinalState.cc \
FoxWolframMoments.cc \
FParameter.cc \
HadronicFinalState.cc \
HeavyHadrons.cc \
Hemispheres.cc \
IdentifiedFinalState.cc \
InitialQuarks.cc \
InvMassFinalState.cc \
JetAlg.cc \
JetShape.cc \
LeadingParticlesFinalState.cc \
MergedFinalState.cc \
MissingMomentum.cc \
NeutralFinalState.cc \
NonHadronicFinalState.cc \
ParisiTensor.cc \
ParticleFinder.cc \
PrimaryHadrons.cc \
PromptFinalState.cc \
Sphericity.cc \
Spherocity.cc \
TauFinder.cc \
Thrust.cc \
TriggerCDFRun0Run1.cc \
TriggerCDFRun2.cc \
TriggerUA5.cc \
UnstableFinalState.cc \
VetoedFinalState.cc \
VisibleFinalState.cc \
WFinder.cc \
ZFinder.cc
diff --git a/src/Projections/UnstableFinalState.cc b/src/Projections/UnstableFinalState.cc
--- a/src/Projections/UnstableFinalState.cc
+++ b/src/Projections/UnstableFinalState.cc
@@ -1,67 +1,64 @@
// -*- C++ -*-
#include "Rivet/Projections/UnstableFinalState.hh"
+/// @todo Replace with PID::isParton()
#define IS_PARTON_PDGID(id) ( abs(id) <= 100 && abs(id) != 22 && (abs(id) < 11 || abs(id) > 18) )
namespace Rivet {
void UnstableFinalState::project(const Event& e) {
_theParticles.clear();
+ /// @todo Replace PID veto list with PID:: functions?
vector<PdgId> vetoIds;
vetoIds += 22; // status 2 photons don't count!
vetoIds += 110; vetoIds += 990; vetoIds += 9990; // Reggeons
//vetoIds += 9902210; // something weird from PYTHIA6
foreach (GenParticle* p, Rivet::particles(e.genEvent())) {
const int st = p->status();
bool passed =
- (st == 1 || (st == 2 && find(vetoIds.begin(), vetoIds.end(), abs(p->pdg_id())) == vetoIds.end())) &&
- !IS_PARTON_PDGID(p->pdg_id()) && ///< Always veto partons?
+ (st == 1 || (st == 2 && !contains(vetoIds, abs(p->pdg_id())))) &&
+ !IS_PARTON_PDGID(p->pdg_id()) && ///< Always veto partons
!p->is_beam() && // Filter beam particles
- _cuts->accept(p->momentum());
+ _cuts->accept(p->momentum());
// Avoid double counting by re-marking as unpassed if particle ID == parent ID
const GenVertex* pv = p->production_vertex();
const GenVertex* dv = p->end_vertex();
if (passed && pv) {
- for (GenVertex::particles_in_const_iterator pp = pv->particles_in_const_begin() ;
- pp != pv->particles_in_const_end() ; ++pp) {
- if ( p->pdg_id() == (*pp)->pdg_id() && (*pp)->status() == 2 ) {
+ for (GenVertex::particles_in_const_iterator pp = pv->particles_in_const_begin(); pp != pv->particles_in_const_end(); ++pp) {
+ if (p->pdg_id() == (*pp)->pdg_id() && (*pp)->status() == 2) {
passed = false;
break;
}
}
}
// Add to output particles collection
- if (passed) {
- _theParticles.push_back(Particle(*p));
- }
+ if (passed) _theParticles.push_back(Particle(*p));
// Log parents and children
if (getLog().isActive(Log::TRACE)) {
MSG_TRACE("ID = " << p->pdg_id()
<< ", status = " << st
<< ", pT = " << p->momentum().perp()
<< ", eta = " << p->momentum().eta()
<< ": result = " << std::boolalpha << passed);
if (pv) {
- for (GenVertex::particles_in_const_iterator pp = pv->particles_in_const_begin() ;
- pp != pv->particles_in_const_end() ; ++pp) {
+ for (GenVertex::particles_in_const_iterator pp = pv->particles_in_const_begin(); pp != pv->particles_in_const_end(); ++pp) {
MSG_TRACE(" parent ID = " << (*pp)->pdg_id());
}
}
if (dv) {
- for (GenVertex::particles_out_const_iterator pp = dv->particles_out_const_begin() ;
- pp != dv->particles_out_const_end() ; ++pp) {
+ for (GenVertex::particles_out_const_iterator pp = dv->particles_out_const_begin(); pp != dv->particles_out_const_end(); ++pp) {
MSG_TRACE(" child ID = " << (*pp)->pdg_id());
}
}
}
}
MSG_DEBUG("Number of unstable final-state particles = " << _theParticles.size());
}
}
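The IS_PARTON_PDGID macro above treats low-numbered PDG IDs as partonic while excluding the photon and the lepton block. A standalone sketch of the same predicate with its boundary cases (the function name `isPartonPdgId` is illustrative; the @todo above notes this should eventually move to a PID:: helper):

```cpp
#include <cassert>
#include <cstdlib>

// Same logic as IS_PARTON_PDGID: |id| <= 100 counts as a parton,
// except the photon (22) and the lepton/neutrino block (11-18).
bool isPartonPdgId(int id) {
  const int a = std::abs(id);
  return a <= 100 && a != 22 && (a < 11 || a > 18);
}
```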
diff --git a/src/Projections/WFinder.cc b/src/Projections/WFinder.cc
--- a/src/Projections/WFinder.cc
+++ b/src/Projections/WFinder.cc
@@ -1,155 +1,156 @@
// -*- C++ -*-
#include "Rivet/Projections/WFinder.hh"
#include "Rivet/Projections/ChargedFinalState.hh"
#include "Rivet/Projections/InvMassFinalState.hh"
#include "Rivet/Projections/MissingMomentum.hh"
#include "Rivet/Projections/MergedFinalState.hh"
#include "Rivet/Projections/DressedLeptons.hh"
#include "Rivet/Projections/VetoedFinalState.hh"
namespace Rivet {
+
WFinder::WFinder(const FinalState& inputfs,
const Cut& fsCut,
PdgId pid,
double minmass, double maxmass,
double missingET,
double dRmax,
ClusterPhotons clusterPhotons,
PhotonTracking trackPhotons,
MassWindow masstype,
double masstarget) {
setName("WFinder");
_minmass = minmass;
_maxmass = maxmass;
_masstarget = masstarget;
_pid = pid;
_trackPhotons = trackPhotons;
- _useTransverseMass = (masstype == MASS);
+ _useTransverseMass = (masstype == TRANSMASS);
// Check that the arguments are legal
assert(abs(_pid) == PID::ELECTRON || abs(_pid) == PID::MUON);
_nu_pid = abs(_pid) + 1;
assert(abs(_nu_pid) == PID::NU_E || abs(_nu_pid) == PID::NU_MU);
// Don't make pT or eta cuts on the neutrino
IdentifiedFinalState neutrinos(inputfs);
neutrinos.acceptNeutrinos();
addProjection(neutrinos, "Neutrinos");
// Lepton clusters
IdentifiedFinalState bareleptons(inputfs);
bareleptons.acceptIdPair(pid);
const bool doClustering = (clusterPhotons != NOCLUSTER);
const bool useDecayPhotons = (clusterPhotons == CLUSTERALL);
DressedLeptons leptons(inputfs, bareleptons, dRmax, fsCut, doClustering, useDecayPhotons);
addProjection(leptons, "DressedLeptons");
// Add MissingMomentum proj to calc MET
MissingMomentum vismom(inputfs);
addProjection(vismom, "MissingET");
// Set ETmiss
_etMiss = missingET;
VetoedFinalState remainingFS;
remainingFS.addVetoOnThisFinalState(*this);
addProjection(remainingFS, "RFS");
}
/////////////////////////////////////////////////////
const FinalState& WFinder::remainingFinalState() const {
return getProjection<FinalState>("RFS");
}
int WFinder::compare(const Projection& p) const {
PCmp dlcmp = mkNamedPCmp(p, "DressedLeptons");
if (dlcmp != EQUIVALENT) return dlcmp;
const WFinder& other = dynamic_cast<const WFinder&>(p);
return (cmp(_minmass, other._minmass) || cmp(_maxmass, other._maxmass) ||
cmp(_useTransverseMass, other._useTransverseMass) ||
cmp(_etMiss, other._etMiss) ||
cmp(_pid, other._pid) || cmp(_trackPhotons, other._trackPhotons));
}
void WFinder::project(const Event& e) {
clear();
const DressedLeptons& leptons = applyProjection<DressedLeptons>(e, "DressedLeptons");
const FinalState& neutrinos = applyProjection<FinalState>(e, "Neutrinos");
// Make and register an invariant mass final state for the W decay leptons
vector<pair<PdgId, PdgId> > l_nu_ids;
l_nu_ids += make_pair(abs(_pid), -abs(_nu_pid));
l_nu_ids += make_pair(-abs(_pid), abs(_nu_pid));
InvMassFinalState imfs(l_nu_ids, _minmass, _maxmass, _masstarget);
imfs.useTransverseMass(_useTransverseMass);
Particles tmp;
tmp.insert(tmp.end(), leptons.dressedLeptons().begin(), leptons.dressedLeptons().end());
tmp.insert(tmp.end(), neutrinos.particles().begin(), neutrinos.particles().end());
imfs.calc(tmp);
if (imfs.particlePairs().size() < 1) return;
ParticlePair Wconstituents(imfs.particlePairs()[0]);
Particle p1(Wconstituents.first), p2(Wconstituents.second);
if (threeCharge(p1) == 0) {
_constituentLeptons += p2;
_constituentNeutrinos += p1;
} else {
_constituentLeptons += p1;
_constituentNeutrinos += p2;
}
FourMomentum pW = p1.momentum() + p2.momentum();
const int w3charge = threeCharge(p1) + threeCharge(p2);
assert(abs(w3charge) == 3);
const int wcharge = w3charge/3;
stringstream msg;
string wsign = (wcharge == 1) ? "+" : "-";
string wstr = "W" + wsign;
msg << wstr << " reconstructed from: " << "\n"
<< " " << p1.momentum() << " " << p1.pid() << "\n"
<< " + " << p2.momentum() << " " << p2.pid();
MSG_DEBUG(msg.str());
// Check missing ET
const MissingMomentum& vismom = applyProjection<MissingMomentum>(e, "MissingET");
const double met = vismom.vectorEt().mod();
MSG_TRACE("MET = " << met/GeV << " GeV vs. required = " << _etMiss/GeV << " GeV");
if (met < _etMiss) {
MSG_DEBUG("Not enough missing ET: " << met/GeV << " GeV vs. " << _etMiss/GeV << " GeV");
return;
}
// Make W Particle and insert into particles list
const PdgId wpid = (wcharge == 1) ? PID::WPLUSBOSON : PID::WMINUSBOSON;
_bosons.push_back(Particle(wpid, pW));
// Find the DressedLeptons and neutrinos which survived the IMFS cut such that we can
// extract their original particles
foreach (const Particle& p, _constituentNeutrinos) {
_theParticles.push_back(p);
}
foreach (const Particle& p, _constituentLeptons) {
foreach (const DressedLepton& l, leptons.dressedLeptons()) {
if (p.pid() == l.pid() && p.momentum() == l.momentum()) {
_theParticles.push_back(l.constituentLepton());
if (_trackPhotons) {
_theParticles.insert(_theParticles.end(), l.constituentPhotons().begin(), l.constituentPhotons().end());
}
}
}
}
}
}
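The MASS → TRANSMASS fix in the WFinder constructor above (noted in the ChangeLog as "Fix WFinder use-transverse-mass property setting") selects the lepton-neutrino transverse mass rather than the invariant mass for the window cut. A self-contained sketch of that quantity, under the standard definition (the function name `transverseMass` is illustrative):

```cpp
#include <cassert>
#include <cmath>

// Transverse mass of a lepton-neutrino pair from the transverse momenta
// and azimuthal opening angle: mT = sqrt(2 pTl pTnu (1 - cos dphi)).
double transverseMass(double ptl, double ptnu, double dphi) {
  return std::sqrt(2.0 * ptl * ptnu * (1.0 - std::cos(dphi)));
}
```

A back-to-back pair with equal pT gives mT = 2 pT, the Jacobian-peak configuration.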
