=== TMVAnalysis: use method(s)...
=== - <CutsGA>
=== - <Likelihood>
=== - <LikelihoodPCA>
=== - <PDERS>
=== - <KNN>
=== - <HMatrix>
=== - <Fisher>
=== - <FDA_MT>
=== - <MLP>
=== - <SVM_Gauss>
=== - <BDT>
=== - <BDTD>
=== - <RuleFit>
TMVA -- Toolkit for Multivariate Data Analysis
Version 3.9.5, Aug 09, 2008
Copyright (C) 2005-2008 CERN, MPI-K Heidelberg and Victoria U.
Home page http://tmva.sourceforge.net
All rights reserved, please read http://tmva.sf.net/license.txt
TMVAlogon: use "TMVA" style [TMVA style based on "Plain" with modifications defined in tmvaglob.C]
########################################## TMVAout.1405006.root
--- Factory : You are running ROOT Version: 5.20/00, Jun 24, 2008
--- Factory :
--- Factory : _/_/_/_/_/ _| _| _| _| _|_|
--- Factory : _/ _|_| _|_| _| _| _| _|
--- Factory : _/ _| _| _| _| _| _|_|_|_|
--- Factory : _/ _| _| _| _| _| _|
--- Factory : _/ _| _| _| _| _|
--- Factory :
--- Factory : __________TMVA Version 3.9.5, Aug 09, 2008
--- Factory :
!!!! TopTreeSig TopTreeBkg
--- Factory : Preparing trees for training and testing...
--- DataSet : Parsing option string:
--- DataSet : "NSigTrain=113:NBkgTrain=1121::NSigTest=2:NBkgTest=2:SplitMode=Alternate:NormMode=NumEvents:!V"
--- DataSet : The following options are set:
--- DataSet : - By User:
--- DataSet : SplitMode: "Alternate" [Method for selecting training and testing events (default: random)]
--- DataSet : NormMode: "NumEvents" [Overall renormalisation of event-by-event weights (NumEvents: average weight of 1 per event, independently for signal and background; EqualNumEvents: average weight of 1 per event for signal, and sum of weights for background equal to sum of weights for signal)]
--- DataSet : NSigTrain: "113" [Number of signal training events (default: 0 - all)]
--- DataSet : NBkgTrain: "1121" [Number of background training events (default: 0 - all)]
--- DataSet : NSigTest: "2" [Number of signal testing events (default: 0 - all)]
--- DataSet : NBkgTest: "2" [Number of background testing events (default: 0 - all)]
--- DataSet : V: "False" [Verbose mode]
--- DataSet : - Default:
--- DataSet : SplitSeed: "100" [Seed for random event shuffling]
--- DataSet : Create training and testing trees: looping over signal events ...
--- DataSet : Create training and testing trees: looping over background events ...
--- DataSet : Prepare training and test samples:
--- DataSet : Signal tree - number of events : 115
--- DataSet : Background tree - number of events : 1123
--- DataSet : Preselection:
--- DataSet : No preselection cuts applied on signal or background
--- DataSet : Weight renormalisation mode: "NumEvents": renormalise signal and background
--- DataSet : weights independently so that Sum[i=1..N_j]{w_i} = N_j, j=signal, background
--- DataSet : (note that N_j is the sum of training and test events)
--- DataSet : Event weights scaling factor:
--- DataSet : signal : 3.74073
--- DataSet : background : 5.68761
--- DataSet : Pick alternating training and test events from input tree for signal
--- DataSet : Pick alternating training and test events from input tree for background
--- DataSet : Create training tree
--- DataSet : Create testing tree
--- DataSet : Collected:
--- DataSet : - Training signal entries : 113
--- DataSet : - Training background entries : 1121
--- DataSet : - Testing signal entries : 2
--- DataSet : - Testing background entries : 2
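The SplitMode=Alternate assignment and the NormMode=NumEvents weight renormalisation reported above can be sketched in standalone Python. These are illustrative helpers, not TMVA code; TMVA performs the equivalent bookkeeping internally:

```python
def alternate_split(events, n_train, n_test):
    """Assign events alternately to training and testing until each
    sample reaches its requested size; any leftovers go to training
    (a simplified stand-in for SplitMode=Alternate)."""
    train, test = [], []
    toggle = True
    for ev in events:
        if toggle and len(train) < n_train:
            train.append(ev)
        elif len(test) < n_test:
            test.append(ev)
        else:
            train.append(ev)
        toggle = not toggle
    return train, test

def numevents_scale(weights):
    """NormMode=NumEvents: scale factor so the average event weight is 1,
    i.e. sum(w_i) = N, computed independently per class."""
    return len(weights) / sum(weights)
```

For the 115 signal events above, 113 land in training and 2 in testing; the printed scaling factors (3.74073 and 5.68761) are just N divided by the original weight sum of each class.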
--- DataSet : Compute correlation matrices on tree: TrainingTree
--- DataSet :
--- DataSet : Correlation matrix (signal):
--- DataSet : ---------------------------------------------------------------
--- DataSet : HT Jet1Pt DeltaRJet1Jet2 WTransverseMass
--- DataSet : HT: +1.000 +0.932 +0.058 +0.126
--- DataSet : Jet1Pt: +0.932 +1.000 +0.140 +0.031
--- DataSet : DeltaRJet1Jet2: +0.058 +0.140 +1.000 -0.281
--- DataSet : WTransverseMass: +0.126 +0.031 -0.281 +1.000
--- DataSet : ---------------------------------------------------------------
--- DataSet :
--- DataSet : Correlation matrix (background):
--- DataSet : ---------------------------------------------------------------
--- DataSet : HT Jet1Pt DeltaRJet1Jet2 WTransverseMass
--- DataSet : HT: +1.000 +0.940 +0.482 +0.172
--- DataSet : Jet1Pt: +0.940 +1.000 +0.419 +0.108
--- DataSet : DeltaRJet1Jet2: +0.482 +0.419 +1.000 +0.001
--- DataSet : WTransverseMass: +0.172 +0.108 +0.001 +1.000
--- DataSet : ---------------------------------------------------------------
--- DataSet :
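The correlation matrices printed above are ordinary Pearson coefficients computed per class on the training sample. A minimal stdlib-only sketch:

```python
from math import sqrt

def correlation_matrix(columns):
    """Pearson correlation matrix of equal-length variable columns,
    as TMVA prints for the signal and background training samples."""
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        vx = sum((a - mx) ** 2 for a in x)
        vy = sum((b - my) ** 2 for b in y)
        return cov / sqrt(vx * vy)
    return [[corr(c1, c2) for c2 in columns] for c1 in columns]
```

The strong HT/Jet1Pt entries (+0.932 signal, +0.940 background) are expected, since the leading-jet pT is a large component of HT.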
--- DataSet : New variable Transformation NoTransform requested and created.
--- TransBase : Create scatter and profile plots in target-file directory:
--- TransBase : TMVAout.1405006.root:/InputVariables_NoTransform/CorrelationPlots
--- TransBase : Ranking input variables...
--- NoTransform : Ranking result (top variable is best ranked)
--- NoTransform : ----------------------------------------------------------------
--- NoTransform : Rank : Variable : Separation
--- NoTransform : ----------------------------------------------------------------
--- NoTransform : 1 : DeltaRJet1Jet2 : 4.326e-01
--- NoTransform : 2 : HT : 3.501e-01
--- NoTransform : 3 : WTransverseMass : 2.935e-01
--- NoTransform : 4 : Jet1Pt : 2.776e-01
--- NoTransform : ----------------------------------------------------------------
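The "Separation" column follows TMVA's definition <S^2> = 1/2 * integral (y_S - y_B)^2 / (y_S + y_B) dy over the unit-normalised signal and background PDFs; a binned sketch (uniform bin widths assumed):

```python
def separation(sig_hist, bkg_hist):
    """Binned TMVA-style separation: 0 for identical shapes,
    1 for fully disjoint ones."""
    ns, nb = sum(sig_hist), sum(bkg_hist)
    sep = 0.0
    for s, b in zip(sig_hist, bkg_hist):
        s, b = s / ns, b / nb          # normalise to unit area
        if s + b > 0:
            sep += 0.5 * (s - b) ** 2 / (s + b)
    return sep
```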
--- Cuts : Parsing option string:
--- Cuts : "H:!V:FitMethod=GA:EffSel:Steps=30:Cycles=3:PopSize=100:SC_steps=10:SC_rate=5:SC_factor=0.95:VarProp=FSmart"
--- Cuts : The following options are set:
--- Cuts : - By User:
--- Cuts : V: "False" [Verbose mode]
--- Cuts : H: "True" [Print classifier-specific help message]
--- Cuts : FitMethod: "GA" [Minimization Method (GA, SA, and MC are the primary methods to be used; the others have been introduced for testing purposes and are deprecated)]
--- Cuts : EffMethod: "EffSel" [Selection Method]
--- Cuts : - Default:
--- Cuts : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- Cuts : Normalise: "False" [Normalise input variables]
--- Cuts : VarTransform: "None" [Variable transformation method]
--- Cuts : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- Cuts : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- Cuts : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- Cuts : VerboseLevel: "Default" [Verbosity level]
--- Cuts : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- Cuts : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- Cuts : CutRangeMin[0]: "-1" [Minimum of allowed cut range (set per variable)]
--- Cuts : CutRangeMin[1]: "-1"
--- Cuts : CutRangeMin[2]: "-1"
--- Cuts : CutRangeMin[3]: "-1"
--- Cuts : CutRangeMax[0]: "-1" [Maximum of allowed cut range (set per variable)]
--- Cuts : CutRangeMax[1]: "-1"
--- Cuts : CutRangeMax[2]: "-1"
--- Cuts : CutRangeMax[3]: "-1"
--- Cuts : VarProp[0]: "FSmart" [Categorisation of cuts]
--- Cuts : VarProp[1]: "FSmart"
--- Cuts : VarProp[2]: "FSmart"
--- Cuts : VarProp[3]: "FSmart"
--- Cuts : Use optimization method: 'Genetic Algorithm'
--- Cuts : Use efficiency computation method: 'Event Selection'
--- Cuts : Use "FSmart" cuts for variable: 'HT'
--- Cuts : Use "FSmart" cuts for variable: 'Jet1Pt'
--- Cuts : Use "FSmart" cuts for variable: 'DeltaRJet1Jet2'
--- Cuts : Use "FSmart" cuts for variable: 'WTransverseMass'
--- Factory : Booking method: CutsGA
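The EffSel method quoted above counts events passing a set of rectangular cut windows; the genetic algorithm then searches window positions that maximise signal efficiency at fixed background rejection. The efficiency evaluation itself is simple (illustrative helper, not TMVA code):

```python
def cut_efficiency(events, windows):
    """Fraction of events inside all rectangular cut windows,
    one (lo, hi) window per input variable."""
    def passes(ev):
        return all(lo <= x <= hi for x, (lo, hi) in zip(ev, windows))
    return sum(passes(ev) for ev in events) / len(events)
```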
--- Likelihood : Parsing option string:
--- Likelihood : "!H:!V:!TransformOutput:PDFInterpol=Spline2:NSmoothSig[0]=10:NSmoothBkg[0]=10:NSmoothBkg[1]=10:NSmooth=10:NAvEvtPerBin=50"
--- Likelihood : The following options are set:
--- Likelihood : - By User:
--- Likelihood : V: "False" [Verbose mode]
--- Likelihood : H: "False" [Print classifier-specific help message]
--- Likelihood : NSmooth: "10" [Number of smoothing iterations for the input histograms]
--- Likelihood : NAvEvtPerBin: "50" [Average number of events per PDF bin]
--- Likelihood : TransformOutput: "False" [Transform likelihood output by inverse sigmoid function]
--- Likelihood : - Default:
--- Likelihood : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- Likelihood : Normalise: "False" [Normalise input variables]
--- Likelihood : VarTransform: "None" [Variable transformation method]
--- Likelihood : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- Likelihood : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- Likelihood : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- Likelihood : VerboseLevel: "Default" [Verbosity level]
--- Likelihood : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- Likelihood : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- Likelihood : NSmoothSig[0]: "10" [Number of smoothing iterations for the input histograms]
--- Likelihood : NSmoothSig[1]: "-1"
--- Likelihood : NSmoothSig[2]: "-1"
--- Likelihood : NSmoothSig[3]: "-1"
--- Likelihood : NSmoothBkg[0]: "10" [Number of smoothing iterations for the input histograms]
--- Likelihood : NSmoothBkg[1]: "10"
--- Likelihood : NSmoothBkg[2]: "-1"
--- Likelihood : NSmoothBkg[3]: "-1"
--- Likelihood : NAvEvtPerBinSig[0]: "-1" [Average num of events per PDF bin and variable (signal)]
--- Likelihood : NAvEvtPerBinSig[1]: "-1"
--- Likelihood : NAvEvtPerBinSig[2]: "-1"
--- Likelihood : NAvEvtPerBinSig[3]: "-1"
--- Likelihood : NAvEvtPerBinBkg[0]: "-1" [Average num of events per PDF bin and variable (background)]
--- Likelihood : NAvEvtPerBinBkg[1]: "-1"
--- Likelihood : NAvEvtPerBinBkg[2]: "-1"
--- Likelihood : NAvEvtPerBinBkg[3]: "-1"
--- Likelihood : PDFInterpol[0]: "Spline2" [Method of interpolating reference histograms (e.g. Spline2 or KDE)]
--- Likelihood : PDFInterpol[1]: "Spline2"
--- Likelihood : PDFInterpol[2]: "Spline2"
--- Likelihood : PDFInterpol[3]: "Spline2"
--- Likelihood : KDEtype: "Gauss" [KDE kernel type (1=Gauss)]
--- Likelihood : KDEiter: "Nonadaptive" [Number of iterations (1=non-adaptive, 2=adaptive)]
--- Likelihood : KDEFineFactor: "1" [Fine-tuning factor for adaptive KDE: factor to multiply the width of the kernel]
--- Likelihood : KDEborder: "None" [Border effects treatment (1=no treatment, 2=kernel renormalization, 3=sample mirroring)]
--- Factory : Booking method: Likelihood
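The projective likelihood classifier booked here multiplies per-variable PDF values and ignores correlations; its output is the ratio y = L_S / (L_S + L_B). A sketch with the per-variable PDFs passed in as callables (hypothetical interface):

```python
def likelihood_ratio(x, sig_pdfs, bkg_pdfs):
    """Projective likelihood output: product of per-variable signal
    PDFs over the signal-plus-background product (no TransformOutput)."""
    ls = lb = 1.0
    for xi, ps, pb in zip(x, sig_pdfs, bkg_pdfs):
        ls *= ps(xi)
        lb *= pb(xi)
    return ls / (ls + lb)
```

The NSmooth and NAvEvtPerBin options above control how those reference PDFs are built from the training histograms.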
--- Likelihood : Parsing option string:
--- Likelihood : "!H:!V:!TransformOutput:PDFInterpol=Spline2:NSmoothSig[0]=10:NSmoothBkg[0]=10:NSmooth=5:NAvEvtPerBin=50:VarTransform=PCA"
--- Likelihood : The following options are set:
--- Likelihood : - By User:
--- Likelihood : VarTransform: "PCA" [Variable transformation method]
--- Likelihood : V: "False" [Verbose mode]
--- Likelihood : H: "False" [Print classifier-specific help message]
--- Likelihood : NSmooth: "5" [Number of smoothing iterations for the input histograms]
--- Likelihood : NAvEvtPerBin: "50" [Average number of events per PDF bin]
--- Likelihood : TransformOutput: "False" [Transform likelihood output by inverse sigmoid function]
--- Likelihood : - Default:
--- Likelihood : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- Likelihood : Normalise: "False" [Normalise input variables]
--- Likelihood : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- Likelihood : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- Likelihood : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- Likelihood : VerboseLevel: "Default" [Verbosity level]
--- Likelihood : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- Likelihood : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- Likelihood : NSmoothSig[0]: "10" [Number of smoothing iterations for the input histograms]
--- Likelihood : NSmoothSig[1]: "-1"
--- Likelihood : NSmoothSig[2]: "-1"
--- Likelihood : NSmoothSig[3]: "-1"
--- Likelihood : NSmoothBkg[0]: "10" [Number of smoothing iterations for the input histograms]
--- Likelihood : NSmoothBkg[1]: "-1"
--- Likelihood : NSmoothBkg[2]: "-1"
--- Likelihood : NSmoothBkg[3]: "-1"
--- Likelihood : NAvEvtPerBinSig[0]: "-1" [Average num of events per PDF bin and variable (signal)]
--- Likelihood : NAvEvtPerBinSig[1]: "-1"
--- Likelihood : NAvEvtPerBinSig[2]: "-1"
--- Likelihood : NAvEvtPerBinSig[3]: "-1"
--- Likelihood : NAvEvtPerBinBkg[0]: "-1" [Average num of events per PDF bin and variable (background)]
--- Likelihood : NAvEvtPerBinBkg[1]: "-1"
--- Likelihood : NAvEvtPerBinBkg[2]: "-1"
--- Likelihood : NAvEvtPerBinBkg[3]: "-1"
--- Likelihood : PDFInterpol[0]: "Spline2" [Method of interpolating reference histograms (e.g. Spline2 or KDE)]
--- Likelihood : PDFInterpol[1]: "Spline2"
--- Likelihood : PDFInterpol[2]: "Spline2"
--- Likelihood : PDFInterpol[3]: "Spline2"
--- Likelihood : KDEtype: "Gauss" [KDE kernel type (1=Gauss)]
--- Likelihood : KDEiter: "Nonadaptive" [Number of iterations (1=non-adaptive, 2=adaptive)]
--- Likelihood : KDEFineFactor: "1" [Fine-tuning factor for adaptive KDE: factor to multiply the width of the kernel]
--- Likelihood : KDEborder: "None" [Border effects treatment (1=no treatment, 2=kernel renormalization, 3=sample mirroring)]
--- DataSet : New variable Transformation PCATransform requested and created.
--- TransBase : Create scatter and profile plots in target-file directory:
--- TransBase : TMVAout.1405006.root:/InputVariables_PCATransform/CorrelationPlots
--- TransBase : Ranking input variables...
--- PCATransform : Ranking result (top variable is best ranked)
--- PCATransform : ----------------------------------------------------------------
--- PCATransform : Rank : Variable : Separation
--- PCATransform : ----------------------------------------------------------------
--- PCATransform : 1 : WTransverseMass : 4.334e-01
--- PCATransform : 2 : DeltaRJet1Jet2 : 3.530e-01
--- PCATransform : 3 : HT : 3.526e-01
--- PCATransform : 4 : Jet1Pt : 2.118e-01
--- PCATransform : ----------------------------------------------------------------
--- Likelihood : Use principal component transformation
--- Factory : Booking method: LikelihoodPCA
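VarTransform=PCA rotates the inputs into their principal axes before the PDFs are built. A 2-D sketch using the closed-form rotation angle (illustrative only; TMVA diagonalises the full covariance matrix):

```python
from math import atan2, cos, sin

def pca_2d(points):
    """Centre 2-D points and rotate them into their principal axes."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    vx = sum((p[0] - mx) ** 2 for p in points) / n
    vy = sum((p[1] - my) ** 2 for p in points) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    theta = 0.5 * atan2(2 * cxy, vx - vy)   # principal-axis angle
    c, s = cos(theta), sin(theta)
    return [((p[0] - mx) * c + (p[1] - my) * s,
             -(p[0] - mx) * s + (p[1] - my) * c)
            for p in points]
```

Note that the ranking after the transformation is quoted against the original variable names, although the ranked quantities are the rotated components.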
--- PDERS : Parsing option string:
--- PDERS : "!H:!V:NormTree=T:VolumeRangeMode=Adaptive:KernelEstimator=Gauss:GaussSigma=0.3:NEventsMin=400:NEventsMax=600"
--- PDERS : The following options are set:
--- PDERS : - By User:
--- PDERS : V: "False" [Verbose mode]
--- PDERS : H: "False" [Print classifier-specific help message]
--- PDERS : VolumeRangeMode: "Adaptive" [Method to determine volume size]
--- PDERS : KernelEstimator: "Gauss" [Kernel estimation function]
--- PDERS : NEventsMin: "400" [nEventsMin for adaptive volume range]
--- PDERS : NEventsMax: "600" [nEventsMax for adaptive volume range]
--- PDERS : GaussSigma: "0.3" [Width (wrt volume size) of Gaussian kernel estimator]
--- PDERS : NormTree: "True" [Normalize binary search tree]
--- PDERS : - Default:
--- PDERS : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- PDERS : Normalise: "False" [Normalise input variables]
--- PDERS : VarTransform: "None" [Variable transformation method]
--- PDERS : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- PDERS : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- PDERS : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- PDERS : VerboseLevel: "Default" [Verbosity level]
--- PDERS : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- PDERS : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- PDERS : DeltaFrac: "3" [nEventsMin/Max for minmax and rms volume range]
--- PDERS : MaxVIterations: "150" [MaxVIterations for adaptive volume range]
--- PDERS : InitialScale: "0.99" [InitialScale for adaptive volume range]
--- Factory : Booking method: PDERS
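PDERS estimates local signal and background densities around the test point; with KernelEstimator=Gauss each training event contributes a Gaussian weight. A 1-D sketch of the idea (one common kernel convention; TMVA additionally adapts the volume so it contains NEventsMin..NEventsMax events):

```python
from math import exp

def pders_discriminant(x, sig_pts, bkg_pts, sigma=0.3):
    """Gaussian-kernel local density ratio around test point x."""
    def density(points):
        return sum(exp(-0.5 * ((x - p) / sigma) ** 2) for p in points)
    ds, db = density(sig_pts), density(bkg_pts)
    return ds / (ds + db) if ds + db > 0 else 0.5
```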
--- HMatrix : Parsing option string:
--- HMatrix : "!H:!V"
--- HMatrix : The following options are set:
--- HMatrix : - By User:
--- HMatrix : V: "False" [Verbose mode]
--- HMatrix : H: "False" [Print classifier-specific help message]
--- HMatrix : - Default:
--- HMatrix : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- HMatrix : Normalise: "True" [Normalise input variables]
--- HMatrix : VarTransform: "None" [Variable transformation method]
--- HMatrix : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- HMatrix : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- HMatrix : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- HMatrix : VerboseLevel: "Default" [Verbosity level]
--- HMatrix : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- HMatrix : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- Factory : Booking method: HMatrix
--- Fisher : Parsing option string:
--- Fisher : "H:!V:!Normalise:CreateMVAPdfs:Fisher:NbinsMVAPdf=50:NsmoothMVAPdf=1"
--- Fisher : The following options are set:
--- Fisher : - By User:
--- Fisher : Normalise: "False" [Normalise input variables]
--- Fisher : NbinsMVAPdf: "50" [Number of bins used for the PDFs of classifier outputs]
--- Fisher : NsmoothMVAPdf: "1" [Number of smoothing iterations for classifier PDFs]
--- Fisher : V: "False" [Verbose mode]
--- Fisher : H: "True" [Print classifier-specific help message]
--- Fisher : CreateMVAPdfs: "True" [Create PDFs for classifier outputs (signal and background)]
--- Fisher : Method: "Fisher" [Discrimination method]
--- Fisher : - Default:
--- Fisher : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- Fisher : VarTransform: "None" [Variable transformation method]
--- Fisher : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- Fisher : VerboseLevel: "Default" [Verbosity level]
--- Fisher : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- Factory : Booking method: Fisher
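The Fisher discriminant projects events onto the axis w = W^-1 (mu_S - mu_B), where W sums the within-class covariance matrices. A 2-variable sketch with an explicit 2x2 inverse (illustrative; TMVA handles all four inputs):

```python
def fisher_coefficients(sig, bkg):
    """Fisher axis for two input variables."""
    def mean_cov(pts):
        n = len(pts)
        mx = sum(p[0] for p in pts) / n
        my = sum(p[1] for p in pts) / n
        vxx = sum((p[0] - mx) ** 2 for p in pts) / n
        vyy = sum((p[1] - my) ** 2 for p in pts) / n
        vxy = sum((p[0] - mx) * (p[1] - my) for p in pts) / n
        return (mx, my), (vxx, vxy, vyy)
    (msx, msy), (sa, sb, sc) = mean_cov(sig)
    (mbx, mby), (ba, bb, bc) = mean_cov(bkg)
    a, b, c = sa + ba, sb + bb, sc + bc      # W = C_S + C_B
    det = a * c - b * b
    dx, dy = msx - mbx, msy - mby
    return ((c * dx - b * dy) / det, (a * dy - b * dx) / det)
```

The CreateMVAPdfs option above additionally histograms the resulting Fisher output into 50-bin PDFs.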
--- FDA : Parsing option string:
--- FDA : "H:!V:Formula=(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3:ParRanges=(-1,1);(-10,10);(-10,10);(-10,10);(-10,10):FitMethod=MINUIT:ErrorLevel=1:PrintLevel=-1:FitStrategy=2:UseImprove:UseMinos:SetBatch"
--- FDA : The following options are set:
--- FDA : - By User:
--- FDA : V: "False" [Verbose mode]
--- FDA : H: "True" [Print classifier-specific help message]
--- FDA : Formula: "(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3" [The discrimination formula]
--- FDA : ParRanges: "(-1,1);(-10,10);(-10,10);(-10,10);(-10,10)" [Parameter ranges]
--- FDA : FitMethod: "MINUIT" [Optimisation Method]
--- FDA : - Default:
--- FDA : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- FDA : Normalise: "False" [Normalise input variables]
--- FDA : VarTransform: "None" [Variable transformation method]
--- FDA : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- FDA : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- FDA : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- FDA : VerboseLevel: "Default" [Verbosity level]
--- FDA : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- FDA : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- FDA : Converger: "None" [FitMethod uses Converger to improve result]
--- FDA : User-defined formula string : "(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3"
--- FDA : TFormula-compatible formula string: "[0]+[1]*[5]+[2]*[6]+[3]*[7]+[4]*[8]"
--- FDA : Creating and compiling formula
--- FDA_Fitter_M...: Parsing option string:
--- FDA_Fitter_M...: "!H:!V:!Formula=(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3:!ParRanges=(-1,1);(-10,10);(-10,10);(-10,10);(-10,10):!FitMethod=MINUIT:ErrorLevel=1:PrintLevel=-1:FitStrategy=2:UseImprove:UseMinos:SetBatch"
--- FDA_Fitter_M...: The following options are set:
--- FDA_Fitter_M...: - By User:
--- FDA_Fitter_M...: ErrorLevel: "1" [TMinuit: error level: 0.5=logL fit, 1=chi-squared fit]
--- FDA_Fitter_M...: PrintLevel: "-1" [TMinuit: output level: -1=least, 0, +1=all garbage]
--- FDA_Fitter_M...: FitStrategy: "2" [TMinuit: fit strategy: 2=best]
--- FDA_Fitter_M...: UseImprove: "True" [TMinuit: use IMPROVE]
--- FDA_Fitter_M...: UseMinos: "True" [TMinuit: use MINOS]
--- FDA_Fitter_M...: SetBatch: "True" [TMinuit: use batch mode]
--- FDA_Fitter_M...: - Default:
--- FDA_Fitter_M...: PrintWarnings: "False" [TMinuit: suppress warnings]
--- FDA_Fitter_M...: MaxCalls: "1000" [TMinuit: approximate maximum number of function calls]
--- FDA_Fitter_M...: Tolerance: "0.1" [TMinuit: tolerance to the function value at the minimum]
--- Factory : Booking method: FDA_MT
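FDA fits the user formula so that signal events map towards 1 and background towards 0, here with MINUIT. For a purely linear formula and a squared-deviation estimator the fit has a closed form; a one-variable sketch (illustrative stand-in for the MINUIT minimisation):

```python
def fda_linear_fit(events, targets):
    """Least-squares fit of F(x) = p0 + p1*x to class targets
    (1 for signal, 0 for background)."""
    n = len(events)
    mx = sum(events) / n
    mt = sum(targets) / n
    sxx = sum((x - mx) ** 2 for x in events)
    sxt = sum((x - mx) * (t - mt) for x, t in zip(events, targets))
    p1 = sxt / sxx
    p0 = mt - p1 * mx
    return p0, p1
```

The log also shows the translation step: parameter placeholders "(i)" in the user string become TFormula parameters "[i]", and the variables x0..x3 become additional formula slots.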
--- MLP : Parsing option string:
--- MLP : "H:!V:!Normalise:NeuronType=tanh:NCycles=200:HiddenLayers=N+1,N:TestRate=5"
--- MLP : The following options are set:
--- MLP : - By User:
--- MLP : Normalise: "False" [Normalise input variables]
--- MLP : V: "False" [Verbose mode]
--- MLP : H: "True" [Print classifier-specific help message]
--- MLP : NCycles: "200" [Number of training cycles]
--- MLP : HiddenLayers: "N+1,N" [Specification of hidden layer architecture (N stands for number of variables; any integers may also be used)]
--- MLP : NeuronType: "tanh" [Neuron activation function type]
--- MLP : TestRate: "5" [Test for overtraining performed every #th epoch]
--- MLP : - Default:
--- MLP : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- MLP : VarTransform: "None" [Variable transformation method]
--- MLP : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- MLP : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- MLP : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- MLP : VerboseLevel: "Default" [Verbosity level]
--- MLP : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- MLP : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- MLP : NeuronInputType: "sum" [Neuron input function type]
--- MLP : TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
--- MLP : LearningRate: "0.02" [ANN learning rate parameter]
--- MLP : DecayRate: "0.01" [Decay rate for learning parameter]
--- MLP : BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
--- MLP : BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
--- MLP : Building Network
--- MLP : Initializing weights
--- Factory : Booking method: MLP
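The HiddenLayers spec "N+1,N" expands with N = number of input variables, so this network gets hidden layers of 5 and 4 neurons for the four inputs. A sketch of the expansion (hypothetical helper; the restricted eval only sees simple arithmetic):

```python
def hidden_layer_sizes(spec, n_vars):
    """Expand a TMVA-style HiddenLayers spec such as "N+1,N" into
    concrete layer sizes, substituting the number of input variables."""
    return [eval(field.replace("N", str(n_vars)), {"__builtins__": {}})
            for field in spec.split(",")]
```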
--- SVM : Parsing option string:
--- SVM : "Sigma=2:C=1:Tol=0.001:Kernel=Gauss"
--- SVM : The following options are set:
--- SVM : - By User:
--- SVM : C: "1" [C parameter]
--- SVM : Tol: "0.001" [Tolerance parameter]
--- SVM : Sigma: "2" [Kernel parameter: sigma]
--- SVM : Kernel: "Gauss" [Uses kernel function]
--- SVM : - Default:
--- SVM : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- SVM : Normalise: "True" [Normalise input variables]
--- SVM : VarTransform: "None" [Variable transformation method]
--- SVM : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- SVM : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- SVM : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- SVM : V: "False" [Verbose mode]
--- SVM : VerboseLevel: "Default" [Verbosity level]
--- SVM : H: "False" [Print classifier-specific help message]
--- SVM : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- SVM : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- SVM : MaxIter: "1000" [Maximum number of training loops]
--- SVM : Order: "3" [Polynomial Kernel parameter: polynomial order]
--- SVM : Theta: "1" [Sigmoid Kernel parameter: theta]
--- SVM : Kappa: "1" [Sigmoid Kernel parameter: kappa]
--- Factory : Booking method: SVM_Gauss
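The Gaussian SVM kernel with the Sigma=2 width from the option string, in one common normalisation convention (a sketch, not the exact TMVA implementation):

```python
from math import exp

def gauss_kernel(x, y, sigma=2.0):
    """Gaussian kernel K(x, y) = exp(-|x - y|^2 / (2 sigma^2))."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return exp(-d2 / (2.0 * sigma ** 2))
```

Note that Normalise defaults to True for the SVM, so the kernel operates on rescaled inputs.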
--- BDT : Parsing option string:
--- BDT : "!H:!V:NTrees=400:BoostType=AdaBoost:SeparationType=GiniIndex:nCuts=20:PruneMethod=CostComplexity:PruneStrength=1.5"
--- BDT : The following options are set:
--- BDT : - By User:
--- BDT : V: "False" [Verbose mode]
--- BDT : H: "False" [Print classifier-specific help message]
--- BDT : NTrees: "400" [Number of trees in the forest]
--- BDT : BoostType: "AdaBoost" [Boosting type for the trees in the forest]
--- BDT : SeparationType: "GiniIndex" [Separation criterion for node splitting]
--- BDT : nCuts: "20" [Number of steps during node cut optimisation]
--- BDT : PruneStrength: "1.5" [Pruning strength]
--- BDT : PruneMethod: "CostComplexity" [Method used for pruning (removal) of statistically insignificant branches]
--- BDT : - Default:
--- BDT : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- BDT : Normalise: "False" [Normalise input variables]
--- BDT : VarTransform: "None" [Variable transformation method]
--- BDT : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- BDT : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- BDT : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- BDT : VerboseLevel: "Default" [Verbosity level]
--- BDT : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- BDT : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- BDT : AdaBoostBeta: "1" [Parameter for AdaBoost algorithm]
--- BDT : UseRandomisedTrees: "False" [Choose at each node splitting a random set of variables]
--- BDT : UseNvars: "4" [Number of variables used if randomised Tree option is chosen]
--- BDT : UseWeightedTrees: "True" [Use weighted trees or simple average in classification from the forest]
--- BDT : UseYesNoLeaf: "True" [Use Sig or Bkg categories, or the purity=S/(S+B) as classification of the leaf node]
--- BDT : NodePurityLimit: "0.5" [In boosting/pruning, nodes with purity > NodePurityLimit are signal; background otherwise.]
--- BDT : nEventsMin: "20" [Minimum number of events required in a leaf node (default: max(20, N_train/(Nvar^2)/10) ) ]
--- BDT : PruneBeforeBoost: "False" [Flag to prune the tree before applying boosting algorithm]
--- BDT : NoNegWeightsInTraining: "False" [Ignore negative event weights in the training process]
--- BDT : Events with negative event weights are ignored during the BDT training (option NoNegWeightsInTraining=0)
--- BDT : <InitEventSample> Internally I use 1234 for Training and 0 for Validation
--- Factory : Booking method: BDT
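One AdaBoost iteration of the booked forest reweights the events the previous tree misclassified; with AdaBoostBeta=1 the boost weight is alpha = ln((1 - err)/err). A sketch of a single reweighting step (illustrative, not TMVA code):

```python
from math import log, exp

def adaboost_step(weights, correct, beta=1.0):
    """Boost misclassified events by exp(alpha), then renormalise the
    total weight; returns (new_weights, alpha)."""
    total = sum(weights)
    err = sum(w for w, ok in zip(weights, correct) if not ok) / total
    alpha = beta * log((1.0 - err) / err)
    new = [w if ok else w * exp(alpha) for w, ok in zip(weights, correct)]
    norm = total / sum(new)
    return [w * norm for w in new], alpha
```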
--- BDT : Parsing option string:
--- BDT : "!H:!V:NTrees=400:BoostType=AdaBoost:SeparationType=GiniIndex:nCuts=20:PruneMethod=CostComplexity:PruneStrength=1.5:VarTransform=Decorrelate"
--- BDT : The following options are set:
--- BDT : - By User:
--- BDT : VarTransform: "Decorrelate" [Variable transformation method]
--- BDT : V: "False" [Verbose mode]
--- BDT : H: "False" [Print classifier-specific help message]
--- BDT : NTrees: "400" [Number of trees in the forest]
--- BDT : BoostType: "AdaBoost" [Boosting type for the trees in the forest]
--- BDT : SeparationType: "GiniIndex" [Separation criterion for node splitting]
--- BDT : nCuts: "20" [Number of steps during node cut optimisation]
--- BDT : PruneStrength: "1.5" [Pruning strength]
--- BDT : PruneMethod: "CostComplexity" [Method used for pruning (removal) of statistically insignificant branches]
--- BDT : - Default:
--- BDT : D: "False" [Use-decorrelated-variables flag (deprecated)]
--- BDT : Normalise: "False" [Normalise input variables]
--- BDT : VarTransformType: "Signal" [Use signal or background events to derive the variable transformation (the transformation is applied to both event types, of course)]
--- BDT : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- BDT : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- BDT : VerboseLevel: "Default" [Verbosity level]
--- BDT : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- BDT : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- BDT : AdaBoostBeta: "1" [Parameter for AdaBoost algorithm]
--- BDT : UseRandomisedTrees: "False" [Choose at each node splitting a random set of variables]
--- BDT : UseNvars: "4" [Number of variables used if randomised Tree option is chosen]
--- BDT : UseWeightedTrees: "True" [Use weighted trees or simple average in classification from the forest]
--- BDT : UseYesNoLeaf: "True" [Use Sig or Bkg categories, or the purity=S/(S+B) as classification of the leaf node]
--- BDT : NodePurityLimit: "0.5" [In boosting/pruning, nodes with purity > NodePurityLimit are signal; background otherwise.]
--- BDT : nEventsMin: "20" [Minimum number of events required in a leaf node (default: max(20, N_train/(Nvar^2)/10) ) ]
--- BDT : PruneBeforeBoost: "False" [Flag to prune the tree before applying boosting algorithm]
--- BDT : NoNegWeightsInTraining: "False" [Ignore negative event weights in the training process]
--- DataSet : New variable Transformation DecorrTransform requested and created.
--- TransBase : Create scatter and profile plots in target-file directory:
--- TransBase : TMVAout.1405006.root:/InputVariables_DecorrTransform/CorrelationPlots
--- TransBase : Ranking input variables...
--- DecorrTransform: Ranking result (top variable is best ranked)
--- DecorrTransform: ----------------------------------------------------------------
--- DecorrTransform: Rank : Variable : Separation
--- DecorrTransform: ----------------------------------------------------------------
--- DecorrTransform: 1 : DeltaRJet1Jet2 : 4.357e-01
--- DecorrTransform: 2 : HT : 3.270e-01
--- DecorrTransform: 3 : WTransverseMass : 2.748e-01
--- DecorrTransform: 4 : Jet1Pt : 2.644e-01
--- DecorrTransform: ----------------------------------------------------------------
--- BDT : Events with negative event weights are ignored during the BDT training (option NoNegWeightsInTraining=0)
--- BDT : <InitEventSample> Internally I use 1234 for Training and 0 for Validation
--- Factory : Booking method: BDTD
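BDTD differs from the previous BDT only in VarTransform=Decorrelate, which linearly transforms the inputs so their covariance becomes the identity. A 2-D sketch using the inverse Cholesky factor (TMVA uses the symmetric square root of the covariance matrix instead, but either choice decorrelates):

```python
from math import sqrt

def decorrelate_2d(points):
    """Map 2-D points to coordinates with identity covariance via
    y = L^-1 (x - mu), where C = L L^T is the Cholesky factorisation."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    cxx = sum((p[0] - mx) ** 2 for p in points) / n
    cyy = sum((p[1] - my) ** 2 for p in points) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    l11 = sqrt(cxx)
    l21 = cxy / l11
    l22 = sqrt(cyy - l21 * l21)
    # forward substitution for L^-1 (x - mu)
    return [((p[0] - mx) / l11,
             ((p[1] - my) - l21 * (p[0] - mx) / l11) / l22)
            for p in points]
```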
--- RuleFit : Parsing option string:
--- RuleFit : "H:!V:RuleFitModule=RFTMVA:Model=ModRuleLinear:MinImp=0.001:RuleMinDist=0.001:NTrees=20:fEventsMin=0.01:fEventsMax=0.5:GDTau=-1.0:GDTauPrec=0.01:GDStep=0.01:GDNSteps=10000:GDErrScale=1.02"
--- RuleFit : The following options are set:
--- RuleFit : - By User:
--- RuleFit : V: "False" [Verbose mode]
--- RuleFit : H: "True" [Print classifier-specific help message]
--- RuleFit : GDTau: "-1" [Gradient-directed path: default fit cut-off]
--- RuleFit : GDTauPrec: "0.01" [Gradient-directed path: precision of tau]
--- RuleFit : GDStep: "0.01" [Gradient-directed path: step size]
--- RuleFit : GDNSteps: "10000" [Gradient-directed path: number of steps]
--- RuleFit : GDErrScale: "1.02" [Stop scan when error>scale*errmin]
--- RuleFit : fEventsMin: "0.01" [Minimum fraction of events in a splittable node]
--- RuleFit : fEventsMax: "0.5" [Maximum fraction of events in a splittable node]
--- RuleFit : nTrees: "20" [Number of trees in forest.]
--- RuleFit : RuleMinDist: "0.001" [Minimum distance between rules]
--- RuleFit : MinImp: "0.001" [Minimum rule importance accepted]
--- RuleFit : Model: "ModRuleLinear" [Model to be used]
--- RuleFit : RuleFitModule: "RFTMVA" [Which RuleFit module to use]
--- RuleFit : - Default:
--- RuleFit : D: "False" [Use-decorrelated-variables flag (depreciated)]
--- RuleFit : Normalise: "False" [Normalise input variables]
--- RuleFit : VarTransform: "None" [Variable transformation method]
--- RuleFit : VarTransformType: "Signal" [Use signal or background events to derive for variable transformation (the transformation is applied on both types of, course)]
--- RuleFit : NbinsMVAPdf: "60" [Number of bins used for the PDFs of classifier outputs]
--- RuleFit : NsmoothMVAPdf: "2" [Number of smoothing iterations for classifier PDFs]
--- RuleFit : VerboseLevel: "Default" [Verbosity level]
--- RuleFit : CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
--- RuleFit : TxtWeightFilesOnly: "True" [If True: write all training results (weights) as text files (False: some are written in ROOT format)]
--- RuleFit : GDPathEveFrac: "0.5" [Fraction of events used for the path search]
--- RuleFit : GDValidEveFrac: "0.5" [Fraction of events used for the validation]
--- RuleFit : ForestType: "AdaBoost" [Method to use for forest generation]
--- RuleFit : RFWorkDir: "./rulefit" [Friedmans RuleFit module: working dir]
--- RuleFit : RFNrules: "2000" [Friedmans RuleFit module: maximum number of rules]
--- RuleFit : RFNendnodes: "4" [Friedmans RuleFit module: average number of end nodes]
--- Factory : Booking method: RuleFit
--- Factory : Training all methods...
--- Factory : Train method: CutsGA
--- Cuts : Option for variable: HT: 'ForceSmart' (#: 3)
--- Cuts : Option for variable: Jet1Pt: 'ForceSmart' (#: 3)
--- Cuts : Option for variable: DeltaRJet1Jet2: 'ForceSmart' (#: 3)
--- Cuts : Option for variable: WTransverseMass: 'ForceSmart' (#: 3)
--- Cuts :
--- Cuts : [1m================================================================[0m
--- Cuts : [1mH e l p f o r c l a s s i f i e r [ Cuts ] :[0m
--- Cuts :
--- Cuts : [1m--- Short description:[0m
--- Cuts :
--- Cuts : The optimisation of rectangular cuts performed by TMVA maximises
--- Cuts : the background rejection at given signal efficiency, and scans
--- Cuts : over the full range of the latter quantity. Three optimisation
--- Cuts : methods are optional: Monte Carlo sampling (MC), a Genetics
--- Cuts : Algorithm (GA), and Simulated Annealing (SA). GA and SA are
--- Cuts : expected to perform best.
--- Cuts :
--- Cuts : The difficulty to find the optimal cuts strongly increases with
--- Cuts : the dimensionality (number of input variables) of the problem.
--- Cuts : This behavior is due to the non-uniqueness of the solution space.
--- Cuts :
--- Cuts : [1m--- Performance optimisation:[0m
--- Cuts :
--- Cuts : If the dimensionality exceeds, say, 4 input variables, it is
--- Cuts : advisable to scrutinize the separation power of the variables,
--- Cuts : and to remove the weakest ones. If some among the input variables
--- Cuts : can be described by a single cut (e.g., because signal tends to be
--- Cuts : larger than background), this can be indicated to MethodCuts via
--- Cuts : the "Fsmart" options (see option string). Choosing this option
--- Cuts : reduces the number of requirements for the variable from 2 (min/max)
--- Cuts : to a single one (TMVA finds out whether it is to be interpreted as
--- Cuts : min or max).
--- Cuts :
--- Cuts : [1m--- Performance tuning via configuration options:[0m
--- Cuts :
--- Cuts : Monte Carlo sampling:
--- Cuts :
--- Cuts : Apart form the "Fsmart" option for the variables, the only way
--- Cuts : to improve the MC sampling is to increase the sampling rate. This
--- Cuts : is done via the configuration option "MC_NRandCuts". The execution
--- Cuts : time scales linearly with the sampling rate.
--- Cuts :
--- Cuts : Genetic Algorithm:
--- Cuts :
--- Cuts : The algorithm terminates if no significant fitness increase has
--- Cuts : been achieved within the last "nsteps" steps of the calculation.
--- Cuts : Wiggles in the ROC curve or constant background rejection of 1
--- Cuts : indicate that the GA failed to always converge at the true maximum
--- Cuts : fitness. In such a case, it is recommended to broaden the search
--- Cuts : by increasing the population size ("popSize") and to give the GA
--- Cuts : more time to find improvements by increasing the number of steps
--- Cuts : ("nsteps")
--- Cuts : -> increase "popSize" (at least >10 * number of variables)
--- Cuts : -> increase "nsteps"
--- Cuts :
--- Cuts : Simulated Annealing (SA) algorithm:
--- Cuts :
--- Cuts : "Increasing Adaptive" approach:
--- Cuts :
--- Cuts : The algorithm seeks local minima and explores their neighborhood, while
--- Cuts : changing the ambient temperature depending on the number of failures
--- Cuts : in the previous steps. The performance can be improved by increasing
--- Cuts : the number of iteration steps ("MaxCalls"), or by adjusting the
--- Cuts : minimal temperature ("MinTemperature"). Manual adjustments of the
--- Cuts : speed of the temperature increase ("TemperatureScale" and "AdaptiveSpeed")
--- Cuts : to individual data sets should also help. Summary:
--- Cuts : -> increase "MaxCalls"
--- Cuts : -> adjust "MinTemperature"
--- Cuts : -> adjust "TemperatureScale"
--- Cuts : -> adjust "AdaptiveSpeed"
--- Cuts :
--- Cuts : "Decreasing Adaptive" approach:
--- Cuts :
--- Cuts : The algorithm calculates the initial temperature (based on the effect-
--- Cuts : iveness of large steps) and the multiplier that ensures to reach the
--- Cuts : minimal temperature with the requested number of iteration steps.
--- Cuts : The performance can be improved by adjusting the minimal temperature
--- Cuts : ("MinTemperature") and by increasing number of steps ("MaxCalls"):
--- Cuts : -> increase "MaxCalls"
--- Cuts : -> adjust "MinTemperature"
--- Cuts :
--- Cuts : Other kernels:
--- Cuts :
--- Cuts : Alternative ways of counting the temperature change are implemented.
--- Cuts : Each of them starts with the maximum temperature ("MaxTemperature")
--- Cuts : and descreases while changing the temperature according to a given
--- Cuts : prescription:
--- Cuts : CurrentTemperature =
--- Cuts : - Sqrt: InitialTemperature / Sqrt(StepNumber+2) * TemperatureScale
--- Cuts : - Log: InitialTemperature / Log(StepNumber+2) * TemperatureScale
--- Cuts : - Homo: InitialTemperature / (StepNumber+2) * TemperatureScale
--- Cuts : - Sin: ( Sin( StepNumber / TemperatureScale ) + 1 ) / (StepNumber + 1) * InitialTemperature + Eps
--- Cuts : - Geo: CurrentTemperature * TemperatureScale
--- Cuts :
--- Cuts : Their performance can be improved by adjusting initial temperature
--- Cuts : ("InitialTemperature"), the number of iteration steps ("MaxCalls"),
--- Cuts : and the multiplier that scales the termperature descrease
--- Cuts : ("TemperatureScale")
--- Cuts : -> increase "MaxCalls"
--- Cuts : -> adjust "InitialTemperature"
--- Cuts : -> adjust "TemperatureScale"
--- Cuts : -> adjust "KernelTemperature"
--- Cuts :
--- Cuts : <Suppress this message by specifying "!H" in the booking option>
--- Cuts : [1m================================================================[0m
--- Cuts :
--- CutsFitter_GA : Parsing option string:
--- CutsFitter_GA : "!H:!V:!FitMethod=GA:!EffSel:Steps=30:Cycles=3:PopSize=100:SC_steps=10:SC_rate=5:SC_factor=0.95:!VarProp=FSmart"
--- CutsFitter_GA : The following options are set:
--- CutsFitter_GA : - By User:
--- CutsFitter_GA : PopSize: "100" [Population size for GA]
--- CutsFitter_GA : Steps: "30" [Number of steps for convergence]
--- CutsFitter_GA : Cycles: "3" [Independent cycles of GA fitting]
--- CutsFitter_GA : SC_steps: "10" [Spread control, steps]
--- CutsFitter_GA : SC_rate: "5" [Spread control, rate: factor is changed depending on the rate]
--- CutsFitter_GA : SC_factor: "0.95" [Spread control, factor]
--- CutsFitter_GA : - Default:
--- CutsFitter_GA : ConvCrit: "0.001" [Convergence criteria]
--- CutsFitter_GA : SaveBestGen: "1" [Saves the best n results from each generation; these are included in the last cycle]
--- CutsFitter_GA : SaveBestCycle: "10" [Saves the best n results from each cycle; these are included in the last cycle]
--- CutsFitter_GA : Trim: "False" [Trim the population to PopSize after assessing the fitness of each individual]
--- CutsFitter_GA : Seed: "100" [Set seed of random generator (0 gives random seeds)]
--- CutsFitter_GA : <GeneticFitter> Optimisation, please be patient ... (note: inaccurate progress timing for GA)
--- CutsFitter_GA : Elapsed time: [1;31m4.07 sec[0m
--- Cuts : -----------------------------------------------------
--- Cuts : Cut values for requested signal efficiency: 0.1
--- Cuts : Corresponding background efficiency : 0.00732256)
--- Cuts : -----------------------------------------------------
--- Cuts : Cut[ 0]: 135.6 < [HT] <= 1e+30
--- Cuts : Cut[ 1]: 42.8952 < [Jet1Pt] <= 1e+30
--- Cuts : Cut[ 2]: 3.81004 < [DeltaRJet1Jet2] <= 1e+30
--- Cuts : Cut[ 3]: -1e+30 < [WTransverseMass] <= 89.332
--- Cuts : -----------------------------------------------------
--- Cuts : -----------------------------------------------------
--- Cuts : Cut values for requested signal efficiency: 0.2
--- Cuts : Corresponding background efficiency : 0.0141207)
--- Cuts : -----------------------------------------------------
--- Cuts : Cut[ 0]: 127.312 < [HT] <= 1e+30
--- Cuts : Cut[ 1]: 42.8952 < [Jet1Pt] <= 1e+30
--- Cuts : Cut[ 2]: 3.49335 < [DeltaRJet1Jet2] <= 1e+30
--- Cuts : Cut[ 3]: -1e+30 < [WTransverseMass] <= 80.3864
--- Cuts : -----------------------------------------------------
--- Cuts : -----------------------------------------------------
--- Cuts : Cut values for requested signal efficiency: 0.3
--- Cuts : Corresponding background efficiency : 0.070704)
--- Cuts : -----------------------------------------------------
--- Cuts : Cut[ 0]: 160.795 < [HT] <= 1e+30
--- Cuts : Cut[ 1]: 40.9539 < [Jet1Pt] <= 1e+30
--- Cuts : Cut[ 2]: 3.26098 < [DeltaRJet1Jet2] <= 1e+30
--- Cuts : Cut[ 3]: -1e+30 < [WTransverseMass] <= 113.589
--- Cuts : -----------------------------------------------------
--- Cuts : -----------------------------------------------------
--- Cuts : Cut values for requested signal efficiency: 0.4
--- Cuts : Corresponding background efficiency : 0.105358)
--- Cuts : -----------------------------------------------------
--- Cuts : Cut[ 0]: 195.593 < [HT] <= 1e+30
--- Cuts : Cut[ 1]: 52.7386 < [Jet1Pt] <= 1e+30
--- Cuts : Cut[ 2]: 2.65079 < [DeltaRJet1Jet2] <= 1e+30
--- Cuts : Cut[ 3]: -1e+30 < [WTransverseMass] <= 79.7597
--- Cuts : -----------------------------------------------------
--- Cuts : -----------------------------------------------------
--- Cuts : Cut values for requested signal efficiency: 0.5
--- Cuts : Corresponding background efficiency : 0.157221)
--- Cuts : -----------------------------------------------------
--- Cuts : Cut[ 0]: 153.781 < [HT] <= 1e+30
--- Cuts : Cut[ 1]: 43.7053 < [Jet1Pt] <= 1e+30
--- Cuts : Cut[ 2]: 2.19232 < [DeltaRJet1Jet2] <= 1e+30
--- Cuts : Cut[ 3]: -1e+30 < [WTransverseMass] <= 71.3056
--- Cuts : -----------------------------------------------------
--- Cuts : -----------------------------------------------------
--- Cuts : Cut values for requested signal efficiency: 0.6
--- Cuts : Corresponding background efficiency : 0.176855)
--- Cuts : -----------------------------------------------------
--- Cuts : Cut[ 0]: 160.795 < [HT] <= 1e+30
--- Cuts : Cut[ 1]: 43.478 < [Jet1Pt] <= 1e+30
--- Cuts : Cut[ 2]: 2.38514 < [DeltaRJet1Jet2] <= 1e+30
--- Cuts : Cut[ 3]: -1e+30 < [WTransverseMass] <= 79.99
--- Cuts : -----------------------------------------------------
--- Cuts : -----------------------------------------------------
--- Cuts : Cut values for requested signal efficiency: 0.7
--- Cuts : Corresponding background efficiency : 0.23809)
--- Cuts : -----------------------------------------------------
--- Cuts : Cut[ 0]: 144.802 < [HT] <= 1e+30
--- Cuts : Cut[ 1]: 43.7053 < [Jet1Pt] <= 1e+30
--- Cuts : Cut[ 2]: 2.07507 < [DeltaRJet1Jet2] <= 1e+30
--- Cuts : Cut[ 3]: -1e+30 < [WTransverseMass] <= 84.3156
--- Cuts : -----------------------------------------------------
--- Cuts : -----------------------------------------------------
--- Cuts : Cut values for requested signal efficiency: 0.8
--- Cuts : Corresponding background efficiency : 0.307613)
--- Cuts : -----------------------------------------------------
--- Cuts : Cut[ 0]: 210.679 < [HT] <= 1e+30
--- Cuts : Cut[ 1]: 45.0753 < [Jet1Pt] <= 1e+30
--- Cuts : Cut[ 2]: 1.50482 < [DeltaRJet1Jet2] <= 1e+30
--- Cuts : Cut[ 3]: -1e+30 < [WTransverseMass] <= 101.042
--- Cuts : -----------------------------------------------------
--- Cuts : -----------------------------------------------------
--- Cuts : Cut values for requested signal efficiency: 0.9
--- Cuts : Corresponding background efficiency : 0.360652)
--- Cuts : -----------------------------------------------------
--- Cuts : Cut[ 0]: 158.587 < [HT] <= 1e+30
--- Cuts : Cut[ 1]: 41.3157 < [Jet1Pt] <= 1e+30
--- Cuts : Cut[ 2]: 1.87085 < [DeltaRJet1Jet2] <= 1e+30
--- Cuts : Cut[ 3]: -1e+30 < [WTransverseMass] <= 149.694
--- Cuts : -----------------------------------------------------
--- Cuts : Creating weight file in text format: [1;34mweights/TMVAnalysis_CutsGA.weights.txt[0m
--- Cuts : Creating standalone response class : [1;34mweights/TMVAnalysis_CutsGA.class.C[0m
--- Cuts : write monitoring histograms to file: TMVAout.1405006.root:/Method_Cuts/CutsGA
--- Factory : Train method: Likelihood
--- Likelihood : Filling reference histograms
[1;31m--- <WARNING> Likelihood : Histogram "HT_sig_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given signal statistics[0m
--- PDF : Validation result for PDF "HT signal training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "Jet1Pt_sig_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given signal statistics[0m
--- PDF : Validation result for PDF "Jet1Pt signal training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "DeltaRJet1Jet2_sig_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given signal statistics[0m
--- PDF : Validation result for PDF "DeltaRJet1Jet2 signal training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "WTransverseMass_sig_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given signal statistics[0m
--- PDF : Validation result for PDF "WTransverseMass signal training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "HT_bgd_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given background statistics[0m
--- PDF : Validation result for PDF "HT background training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "Jet1Pt_bgd_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given background statistics[0m
--- PDF : Validation result for PDF "Jet1Pt background training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "DeltaRJet1Jet2_bgd_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given background statistics[0m
--- PDF : Validation result for PDF "DeltaRJet1Jet2 background training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "WTransverseMass_bgd_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given background statistics[0m
--- PDF : Validation result for PDF "WTransverseMass background training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
--- Likelihood : Creating weight file in text format: [1;34mweights/TMVAnalysis_Likelihood.weights.txt[0m
--- Likelihood : Creating standalone response class : [1;34mweights/TMVAnalysis_Likelihood.class.C[0m
--- Likelihood : Write monitoring histograms to file: TMVAout.1405006.root:/Method_Likelihood/Likelihood
--- Factory : Train method: LikelihoodPCA
--- Likelihood : Filling reference histograms
[1;31m--- <WARNING> Likelihood : Histogram "HT_sig_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given signal statistics[0m
--- PDF : Validation result for PDF "HT signal training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "Jet1Pt_sig_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given signal statistics[0m
--- PDF : Validation result for PDF "Jet1Pt signal training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "DeltaRJet1Jet2_sig_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given signal statistics[0m
--- PDF : Validation result for PDF "DeltaRJet1Jet2 signal training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "WTransverseMass_sig_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given signal statistics[0m
--- PDF : Validation result for PDF "WTransverseMass signal training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "HT_bgd_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given background statistics[0m
--- PDF : Validation result for PDF "HT background training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "Jet1Pt_bgd_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given background statistics[0m
--- PDF : Validation result for PDF "Jet1Pt background training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "DeltaRJet1Jet2_bgd_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given background statistics[0m
--- PDF : Validation result for PDF "DeltaRJet1Jet2 background training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
[1;31m--- <WARNING> Likelihood : Histogram "WTransverseMass_bgd_smooth" has too few bins (2) for smoothing[0m
[1;31m--- <WARNING> Likelihood : -> use histogram[0m
[1;31m--- <WARNING> Likelihood : Please check if the option variable "NAvEvtPerBin" requires too many events per bin for the given background statistics[0m
--- PDF : Validation result for PDF "WTransverseMass background training":
--- PDF : chi2/ndof(!=0) = 0.0/2 = 0.00 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [0(0),0(0),0(0),0(0)]
--- Likelihood : Creating weight file in text format: [1;34mweights/TMVAnalysis_LikelihoodPCA.weights.txt[0m
--- Likelihood : Creating standalone response class : [1;34mweights/TMVAnalysis_LikelihoodPCA.class.C[0m
--- Likelihood : Write monitoring histograms to file: TMVAout.1405006.root:/Method_Likelihood/LikelihoodPCA
--- Factory : Train method: PDERS
--- PDERS : Creating weight file in text format: [1;34mweights/TMVAnalysis_PDERS.weights.txt[0m
--- PDERS : Creating standalone response class : [1;34mweights/TMVAnalysis_PDERS.class.C[0m
--- PDERS : No monitoring histograms written
--- Factory : Train method: HMatrix
--- HMatrix : Creating weight file in text format: [1;34mweights/TMVAnalysis_HMatrix.weights.txt[0m
--- HMatrix : Creating standalone response class : [1;34mweights/TMVAnalysis_HMatrix.class.C[0m
--- HMatrix : No monitoring histograms written
--- Factory : Train method: Fisher
--- Fisher :
--- Fisher : [1m================================================================[0m
--- Fisher : [1mH e l p f o r c l a s s i f i e r [ Fisher ] :[0m
--- Fisher :
--- Fisher : [1m--- Short description:[0m
--- Fisher :
--- Fisher : Fisher discriminants select events by distinguishing the mean
--- Fisher : values of the signal and background distributions in a trans-
--- Fisher : formed variable space where linear correlations are removed.
--- Fisher :
--- Fisher : (More precisely: the "linear discriminator" determines
--- Fisher : an axis in the (correlated) hyperspace of the input
--- Fisher : variables such that, when projecting the output classes
--- Fisher : (signal and background) upon this axis, they are pushed
--- Fisher : as far as possible away from each other, while events
--- Fisher : of a same class are confined in a close vicinity. The
--- Fisher : linearity property of this classifier is reflected in the
--- Fisher : metric with which "far apart" and "close vicinity" are
--- Fisher : determined: the covariance matrix of the discriminating
--- Fisher : variable space.)
--- Fisher :
--- Fisher : [1m--- Performance optimisation:[0m
--- Fisher :
--- Fisher : Optimal performance for Fisher discriminants is obtained for
--- Fisher : linearly correlated Gaussian-distributed variables. Any deviation
--- Fisher : from this ideal reduces the achievable separation power. In
--- Fisher : particular, no discrimination at all is achieved for a variable
--- Fisher : that has the same sample mean for signal and background, even if
--- Fisher : the shapes of the distributions are very different. Thus, Fisher
--- Fisher : discriminants often benefit from suitable transformations of the
--- Fisher : input variables. For example, if a variable x in [-1,1] has a
--- Fisher : a parabolic signal distributions, and a uniform background
--- Fisher : distributions, their mean value is zero in both cases, leading
--- Fisher : to no separation. The simple transformation x -> |x| renders this
--- Fisher : variable powerful for the use in a Fisher discriminant.
--- Fisher :
--- Fisher : [1m--- Performance tuning via configuration options:[0m
--- Fisher :
--- Fisher : None
--- Fisher :
--- Fisher : <Suppress this message by specifying "!H" in the booking option>
--- Fisher : [1m================================================================[0m
--- Fisher :
--- Fisher : Results for Fisher coefficients:
--- Fisher : ---------------------------------
--- Fisher : Variable: Coefficient:
--- Fisher : ---------------------------------
--- Fisher : HT: -0.000
--- Fisher : Jet1Pt: +0.001
--- Fisher : DeltaRJet1Jet2: +0.169
--- Fisher : WTransverseMass: -0.002
--- Fisher : (offset): -0.379
--- Fisher : ---------------------------------
--- Fisher : <CreateMVAPdfs> Using 50 bins and smooth 1 times
--- PDF : Validation result for PDF "Fisher_tr_S":
--- PDF : chi2/ndof(!=0) = 12.7/25 = 0.51 (Prob = 0.98)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [4(7),0(1),0(0),0(0)]
--- PDF : Validation result for PDF "Fisher_tr_B":
--- PDF : chi2/ndof(!=0) = 12.2/42 = 0.29 (Prob = 1.00)
--- PDF : #bins-found(#expected-bins) deviating > [1,2,3,6] sigmas: [3(13),0(1),0(0),0(0)]
--- Fisher : <CreateMVAPdfs> Separation from histogram (PDF): 0.122 (0.050)
--- Fisher : Creating weight file in text format: [1;34mweights/TMVAnalysis_Fisher.weights.txt[0m
--- Fisher : Creating standalone response class : [1;34mweights/TMVAnalysis_Fisher.class.C[0m
--- Fisher : No monitoring histograms written
--- Factory : Train method: FDA_MT
--- FDA :
--- FDA : [1m================================================================[0m
--- FDA : [1mH e l p f o r c l a s s i f i e r [ FDA ] :[0m
--- FDA :
--- FDA : [1m--- Short description:[0m
--- FDA :
--- FDA : The function discriminant analysis (FDA) is a classifier suitable
--- FDA : to solve linear or simple nonlinear discrimination problems.
--- FDA :
--- FDA : The user provides the desired function with adjustable parameters
--- FDA : via the configuration option string, and FDA fits the parameters to
--- FDA : it, requiring the signal (background) function value to be as close
--- FDA : as possible to 1 (0). Its advantage over the more involved and
--- FDA : automatic nonlinear discriminators is the simplicity and transparency
--- FDA : of the discrimination expression. A shortcoming is that FDA will
--- FDA : underperform for involved problems with complicated, phase space
--- FDA : dependent nonlinear correlations.
--- FDA :
--- FDA : Please consult the Users Guide for the format of the formula string
--- FDA : and the allowed parameter ranges:
--- FDA : http://tmva.sourceforge.net/docu/TMVAUsersGuide.pdf
--- FDA :
--- FDA : [1m--- Performance optimisation:[0m
--- FDA :
--- FDA : The FDA performance depends on the complexity and fidelity of the
--- FDA : user-defined discriminator function. As a general rule, it should
--- FDA : be able to reproduce the discrimination power of any linear
--- FDA : discriminant analysis. To reach into the nonlinear domain, it is
--- FDA : useful to inspect the correlation profiles of the input variables,
--- FDA : and add quadratic and higher polynomial terms between variables as
--- FDA : necessary. Comparison with more involved nonlinear classifiers can
--- FDA : be used as a guide.
--- FDA :
--- FDA : [1m--- Performance tuning via configuration options:[0m
--- FDA :
--- FDA : Depending on the function used, the choice of "FitMethod" is
--- FDA : crucial for getting valuable solutions with FDA. As a guideline it
--- FDA : is recommended to start with "FitMethod=MINUIT". When more complex
--- FDA : functions are used where MINUIT does not converge to reasonable
--- FDA : results, the user should switch to non-gradient FitMethods such
--- FDA : as GeneticAlgorithm (GA) or Monte Carlo (MC). It might prove to be
--- FDA : useful to combine GA (or MC) with MINUIT by setting the option
--- FDA : "Converger=MINUIT". GA (MC) will then set the starting parameters
--- FDA : for MINUIT such that the basic quality of GA (MC) of finding global
--- FDA : minima is combined with the efficacy of MINUIT of finding local
--- FDA : minima.
--- FDA :
--- FDA : <Suppress this message by specifying "!H" in the booking option>
--- FDA : [1m================================================================[0m
--- FDA :
--- FDA : Results for parameter fit using "MINUIT" fitter:
--- FDA : -----------------------
--- FDA : Parameter: Fit result:
--- FDA : -----------------------
--- FDA : Par(0): -0.0344668
--- FDA : Par(1): 0.000934737
--- FDA : Par(2):-0.000992232
--- FDA : Par(3): 0.174053
--- FDA : Par(4): -0.00153156
--- FDA : -----------------------
--- FDA : Discriminator expression: "(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3"
--- FDA : Value of estimator at minimum: 0.415947
--- FDA : Creating weight file in text format: [1;34mweights/TMVAnalysis_FDA_MT.weights.txt[0m
--- FDA : Creating standalone response class : [1;34mweights/TMVAnalysis_FDA_MT.class.C[0m
--- FDA : No monitoring histograms written
--- Factory : Train method: MLP
--- MLP :
--- MLP : [1m================================================================[0m
--- MLP : [1mH e l p f o r c l a s s i f i e r [ MLP ] :[0m
--- MLP :
--- MLP : [1m--- Short description:[0m
--- MLP :
--- MLP : The MLP artificial neural network (ANN) is a traditional feed-
--- MLP : forward multilayer perceptron impementation. The MLP has a user-
--- MLP : defined hidden layer architecture, while the number of input (output)
--- MLP : nodes is determined by the input variables (output classes, i.e.,
--- MLP : signal and one background).
--- MLP :
--- MLP : [1m--- Performance optimisation:[0m
--- MLP :
--- MLP : Neural networks are stable and performing for a large variety of
--- MLP : linear and non-linear classification problems. However, in contrast
--- MLP : to (e.g.) boosted decision trees, the user is advised to reduce the
--- MLP : number of input variables that have only little discrimination power.
--- MLP :
--- MLP : In the tests we have carried out so far, the MLP and ROOT networks
--- MLP : (TMlpANN, interfaced via TMVA) performed equally well, with however
--- MLP : a clear speed advantage for the MLP. The Clermont-Ferrand neural
--- MLP : net (CFMlpANN) exhibited worse classification performance in these
--- MLP : tests, which is partly due to the slow convergence of its training
--- MLP : (at least 10k training cycles are required to achieve approximately
--- MLP : competitive results).
--- MLP :
--- MLP : [1mOvertraining: [0monly the TMlpANN performs an explicit separation of the
--- MLP : full training sample into independent training and validation samples.
--- MLP : We have found that in most high-energy physics applications the
--- MLP : available degrees of freedom (training events) are sufficient to
--- MLP : constrain the weights of the relatively simple architectures required
--- MLP : to achieve good performance. Hence no overtraining should occur, and
--- MLP : the use of validation samples would only reduce the available training
--- MLP : information. However, if the performance on the training sample is
--- MLP : found to be significantly better than that found with the inde-
--- MLP : pendent test sample, caution is needed. The results for these samples
--- MLP : are printed to standard output at the end of each training job.
--- MLP :
--- MLP : [1m--- Performance tuning via configuration options:[0m
--- MLP :
--- MLP : The hidden layer architecture for all ANNs is defined by the option
--- MLP : "HiddenLayers=N+1,N,...", where here the first hidden layer has N+1
--- MLP : neurons and the second N neurons (and so on), and where N is the number
--- MLP : of input variables. Excessive numbers of hidden layers should be avoided,
--- MLP : in favour of more neurons in the first hidden layer.
--- MLP :
--- MLP : The number of cycles should be above 500. As said, if the number of
--- MLP : adjustable weights is small compared to the training sample size,
--- MLP : using a large number of training cycles should not lead to overtraining.
--- MLP :
--- MLP : <Suppress this message by specifying "!H" in the booking option>
--- MLP : [1m================================================================[0m
--- MLP :
--- MLP : Training Network
--- MLP : Train: elapsed time: [1;31m4.01 sec[0m
--- MLP : Creating weight file in text format: [1;34mweights/TMVAnalysis_MLP.weights.txt[0m
--- MLP : Creating standalone response class : [1;34mweights/TMVAnalysis_MLP.class.C[0m
--- MLP : Write special histos to file: TMVAout.1405006.root:/Method_MLP/MLP
--- Factory : Train method: SVM_Gauss
--- SVM : Sorry, no computing time forecast available for SVM, please wait ...
--- SVM : <Train> elapsed time: [1;31m0.0262 sec[0m
--- SVM : <Train> number of iterations: 3
--- SVM : Results:
--- SVM : - number of support vectors: 17 (1%)
--- SVM : - b: 0.02119
--- SVM : All support vectors stored properly
--- SVM : Creating weight file in text format: [1;34mweights/TMVAnalysis_SVM_Gauss.weights.txt[0m
--- SVM : Creating standalone response class : [1;34mweights/TMVAnalysis_SVM_Gauss.class.C[0m
--- SVM : No monitoring histograms written
--- Factory : Train method: BDT
--- BDT : Training 400 Decision Trees ... patience please
--- BDT : <Train> elapsed time: [1;31m25.2 sec[0m
--- BDT : <Train> average number of nodes before/after pruning : 89 / 49
--- BDT : Creating weight file in text format: [1;34mweights/TMVAnalysis_BDT.weights.txt[0m
--- BDT : Creating standalone response class : [1;34mweights/TMVAnalysis_BDT.class.C[0m
--- BDT : Write monitoring histograms to file: TMVAout.1405006.root:/Method_BDT/BDT
--- Factory : Train method: BDTD
--- BDT : Training 400 Decision Trees ... patience please
--- BDT : <Train> elapsed time: [1;31m25.9 sec[0m
--- BDT : <Train> average number of nodes before/after pruning : 88 / 54
--- BDT : Creating weight file in text format: [1;34mweights/TMVAnalysis_BDTD.weights.txt[0m
--- BDT : Creating standalone response class : [1;34mweights/TMVAnalysis_BDTD.class.C[0m
--- BDT : Write monitoring histograms to file: TMVAout.1405006.root:/Method_BDT/BDTD
--- Factory : Train method: RuleFit
--- RuleFit :
--- RuleFit : [1m================================================================[0m
--- RuleFit : [1mH e l p f o r c l a s s i f i e r [ RuleFit ] :[0m
--- RuleFit :
--- RuleFit : [1m--- Short description:[0m
--- RuleFit :
--- RuleFit : This method uses a collection of so-called rules to create a
--- RuleFit : discriminating scoring function. Each rule consists of a series
--- RuleFit : of cuts in parameter space. The ensemble of rules is created
--- RuleFit : from a forest of decision trees, trained using the training data.
--- RuleFit : Each node (apart from the root) corresponds to one rule.
--- RuleFit : The scoring function is then obtained by linearly combining
--- RuleFit : the rules. A fitting procedure is applied to find the optimum
--- RuleFit : set of coefficients. The goal is to find a model with few rules
--- RuleFit : but with a strong discriminating power.
--- RuleFit :
--- RuleFit : [1m--- Performance optimisation:[0m
--- RuleFit :
--- RuleFit : There are two important considerations to make when optimising:
--- RuleFit :
--- RuleFit : 1. Topology of the decision tree forest
--- RuleFit : 2. Fitting of the coefficients
--- RuleFit :
--- RuleFit : The maximum complexity of the rules is defined by the size of
--- RuleFit : the trees. Large trees will yield many complex rules and capture
--- RuleFit : higher order correlations. On the other hand, small trees will
--- RuleFit : lead to a smaller ensemble with simple rules, only capable of
--- RuleFit : modeling simple structures.
--- RuleFit : Several parameters exist for controlling the complexity of the
--- RuleFit : rule ensemble.
--- RuleFit :
--- RuleFit : The fitting procedure searches for a minimum using a gradient
--- RuleFit : directed path. Apart from step size and number of steps, the
--- RuleFit : evolution of the path is defined by a cut-off parameter, tau.
--- RuleFit : This parameter is unknown and depends on the training data.
--- RuleFit : A large value will tend to give large weights to a few rules.
--- RuleFit : Similarly, a small value will lead to a large set of rules
--- RuleFit : with similar weights.
--- RuleFit :
--- RuleFit : A final point is the model used: rules and/or linear terms.
--- RuleFit : For a given training sample, the result may improve by adding
--- RuleFit : linear terms. If best performance is obtained using only linear
--- RuleFit : terms, it is very likely that the Fisher discriminant would be
--- RuleFit : a better choice. Ideally the fitting procedure should be able to
--- RuleFit : make this choice by giving appropriate weights to either type of term.
--- RuleFit :
--- RuleFit : [1m--- Performance tuning via configuration options:[0m
--- RuleFit :
--- RuleFit : I. TUNING OF RULE ENSEMBLE:
--- RuleFit :
--- RuleFit : [1mForestType [0m: It is recommended to use the default "AdaBoost".
--- RuleFit : [1mnTrees [0m: More trees lead to more rules but also slower
--- RuleFit : performance. With too few trees the risk is
--- RuleFit : that the rule ensemble becomes too simple.
--- RuleFit : [1mfEventsMin [0m
--- RuleFit : [1mfEventsMax [0m: With a lower min, more large trees will be
--- RuleFit : generated, leading to more complex rules.
--- RuleFit : With a higher max, more small trees will be
--- RuleFit : generated, leading to simpler rules.
--- RuleFit : By changing this range, the average complexity
--- RuleFit : of the rule ensemble can be controlled.
--- RuleFit : [1mRuleMinDist [0m: By increasing the minimum distance between
--- RuleFit : rules, fewer and more diverse rules will remain.
--- RuleFit : Initially it is a good idea to keep this small
--- RuleFit : or zero and let the fitting do the selection of
--- RuleFit : rules. In order to reduce the ensemble size,
--- RuleFit : the value can then be increased.
--- RuleFit :
--- RuleFit : II. TUNING OF THE FITTING:
--- RuleFit :
--- RuleFit : [1mGDPathEveFrac [0m: fraction of events used in the path evaluation
--- RuleFit : Increasing this fraction will improve the path
--- RuleFit : finding. However, too high a value will leave few
--- RuleFit : unique events available for error estimation.
--- RuleFit : It is recommended to use the default = 0.5.
--- RuleFit : [1mGDTau [0m: cutoff parameter tau
--- RuleFit : By default this value is set to -1.0.
--- RuleFit : This means that the cut off parameter is
--- RuleFit : automatically estimated. In most cases
--- RuleFit : this should be fine. However, you may want
--- RuleFit : to fix this value if you already know it
--- RuleFit : and want to reduce the training time.
--- RuleFit : [1mGDTauPrec [0m: precision of estimated tau
--- RuleFit : Increase this precision to find a cut-off
--- RuleFit : parameter closer to the optimum.
--- RuleFit : [1mGDNStep [0m: number of steps in path search
--- RuleFit : If the number of steps is too small, then
--- RuleFit : the program will give a warning message.
--- RuleFit :
--- RuleFit : III. WARNING MESSAGES
--- RuleFit :
--- RuleFit : [1mRisk(i+1)>=Risk(i) in path[0m
--- RuleFit : [1mChaotic behaviour of risk evolution.[0m
--- RuleFit : By construction the Risk should always decrease.
--- RuleFit : However, if the training sample is too small or
--- RuleFit : the model is overtrained, such warnings can
--- RuleFit : occur.
--- RuleFit : The warnings can safely be ignored if only a
--- RuleFit : few (<3) occur. If more warnings are generated,
--- RuleFit : the fitting fails.
--- RuleFit : A remedy may be to increase the value
--- RuleFit : [1mGDValidEveFrac[0m to 1.0 (or a larger value).
--- RuleFit : In addition, if [1mGDPathEveFrac[0m is too high
--- RuleFit : the same warnings may occur since the events
--- RuleFit : used for error estimation are also used for
--- RuleFit : path estimation.
--- RuleFit : Another possibility is to modify the model -
--- RuleFit : See above on tuning the rule ensemble.
--- RuleFit :
--- RuleFit : [1mThe error rate was still decreasing at the end of the path[0m
--- RuleFit : Too few steps in path! Increase [1mGDNSteps[0m.
--- RuleFit :
--- RuleFit : [1mReached minimum early in the search[0m
--- RuleFit : Minimum was found early in the fitting. This
--- RuleFit : may indicate that the used step size [1mGDStep[0m
--- RuleFit : was too large. Reduce it and rerun.
--- RuleFit : If the results still are not OK, modify the
--- RuleFit : model either by modifying the rule ensemble
--- RuleFit : or by adding/removing linear terms.
--- RuleFit :
--- RuleFit : <Suppress this message by specifying "!H" in the booking option>
--- RuleFit : [1m================================================================[0m
--- RuleFit :
--- RuleFit : -------------------RULE ENSEMBLE SUMMARY------------------------
--- RuleFit : Tree training method : AdaBoost
--- RuleFit : Number of events per tree : 1234
--- RuleFit : Number of trees : 20
--- RuleFit : Number of generated rules : 242
--- RuleFit : Idem, after cleanup : 177
--- RuleFit : Average number of cuts per rule : 3.37
--- RuleFit : Spread in number of cuts per rule : 1.66
--- RuleFit : ----------------------------------------------------------------
--- RuleFit :
--- RuleFit : GD path scan - the scan stops when the max num. of steps is reached or a min is found
--- RuleFit : Estimating the cutoff parameter tau. The estimated time is a pessimistic maximum.
--- RuleFit : Best path found with tau = 1.0000 after [1;31m0.943 sec[0m
--- RuleFit : Fitting model...
--- RuleFit : Minimization elapsed time : [1;31m1.63 sec[0m
--- RuleFit : ----------------------------------------------------------------
--- RuleFit : Found minimum at step 5700 with error = 0.380611
--- RuleFit : Reason for ending loop: end of loop reached
--- RuleFit : ----------------------------------------------------------------
--- RuleFit : Elapsed time: [1;31m3.02 sec[0m
--- RuleFit : Removed 174 out of a total of 177 rules with importance < 0.001
--- RuleFit :
--- RuleFit : ================================================================
--- RuleFit : M o d e l
--- RuleFit : ================================================================
--- RuleFit : Offset (a0) = -0.317228
--- RuleFit : ------------------------------------
--- RuleFit : Linear model (weights unnormalised)
--- RuleFit : ------------------------------------
--- RuleFit : Variable : Weights : Importance
--- RuleFit : ------------------------------------
--- RuleFit : HT-> importance below threshold = 0.000
--- RuleFit : Jet1Pt-> importance below threshold = 0.000
--- RuleFit : DeltaRJet1Jet2 : 1.214e-01 : 0.091
--- RuleFit : WTransverseMass : -4.797e-02 : 1.000
--- RuleFit : ------------------------------------
--- RuleFit : Number of rules = 3
--- RuleFit : Printing the first 3 rules, ordered by importance.
--- RuleFit : Rule 1 : Importance = 0.1568
--- RuleFit : Cut 1 : HT < 365
--- RuleFit : Rule 2 : Importance = 0.1000
--- RuleFit : Cut 1 : Jet1Pt < 143
--- RuleFit : Cut 2 : 3 < DeltaRJet1Jet2
--- RuleFit : Rule 3 : Importance = 0.0978
--- RuleFit : Cut 1 : 3.11 < DeltaRJet1Jet2
--- RuleFit : All rules printed
--- RuleFit : ================================================================
--- RuleFit :
--- RuleFit : Creating weight file in text format: [1;34mweights/TMVAnalysis_RuleFit.weights.txt[0m
--- RuleFit : Creating standalone response class : [1;34mweights/TMVAnalysis_RuleFit.class.C[0m
--- RuleFit : write monitoring ntuple to file: TMVAout.1405006.root:/Method_RuleFit/RuleFit
--- Factory :
--- Factory : Begin ranking of input variables...
--- Factory : No variable ranking supplied by classifier: CutsGA
--- Likelihood : Ranking result (top variable is best ranked)
--- Likelihood : ----------------------------------------------------------------
--- Likelihood : Rank : Variable : Delta Separation
--- Likelihood : ----------------------------------------------------------------
--- Likelihood : 1 : DeltaRJet1Jet2 : 4.024e-02
--- Likelihood : 2 : HT : 1.096e-03
--- Likelihood : 3 : WTransverseMass : -3.801e-03
--- Likelihood : 4 : Jet1Pt : -1.021e-02
--- Likelihood : ----------------------------------------------------------------
--- Likelihood : Ranking result (top variable is best ranked)
--- Likelihood : ----------------------------------------------------------------
--- Likelihood : Rank : Variable : Delta Separation
--- Likelihood : ----------------------------------------------------------------
--- Likelihood : 1 : WTransverseMass : 2.048e-02
--- Likelihood : 2 : Jet1Pt : -1.442e-02
--- Likelihood : 3 : HT : -1.676e-02
--- Likelihood : 4 : DeltaRJet1Jet2 : -1.855e-02
--- Likelihood : ----------------------------------------------------------------
--- Factory : No variable ranking supplied by classifier: PDERS
--- Factory : No variable ranking supplied by classifier: HMatrix
--- Fisher : Ranking result (top variable is best ranked)
--- Fisher : ----------------------------------------------------------------
--- Fisher : Rank : Variable : Discr. power
--- Fisher : ----------------------------------------------------------------
--- Fisher : 1 : DeltaRJet1Jet2 : 3.714e-02
--- Fisher : 2 : Jet1Pt : 1.637e-02
--- Fisher : 3 : HT : 1.611e-02
--- Fisher : 4 : WTransverseMass : 5.409e-03
--- Fisher : ----------------------------------------------------------------
--- Factory : No variable ranking supplied by classifier: FDA_MT
--- MLP : Ranking result (top variable is best ranked)
--- MLP : ----------------------------------------------------------------
--- MLP : Rank : Variable : Importance
--- MLP : ----------------------------------------------------------------
--- MLP : 1 : HT : 1.991e+00
--- MLP : 2 : Jet1Pt : 7.900e-01
--- MLP : 3 : DeltaRJet1Jet2 : 6.607e-01
--- MLP : 4 : WTransverseMass : 5.032e-04
--- MLP : ----------------------------------------------------------------
--- Factory : No variable ranking supplied by classifier: SVM_Gauss
--- BDT : Ranking result (top variable is best ranked)
--- BDT : ----------------------------------------------------------------
--- BDT : Rank : Variable : Variable Importance
--- BDT : ----------------------------------------------------------------
--- BDT : 1 : DeltaRJet1Jet2 : 3.755e-01
--- BDT : 2 : HT : 2.663e-01
--- BDT : 3 : WTransverseMass : 1.936e-01
--- BDT : 4 : Jet1Pt : 1.646e-01
--- BDT : ----------------------------------------------------------------
--- BDT : Ranking result (top variable is best ranked)
--- BDT : ----------------------------------------------------------------
--- BDT : Rank : Variable : Variable Importance
--- BDT : ----------------------------------------------------------------
--- BDT : 1 : HT : 3.074e-01
--- BDT : 2 : DeltaRJet1Jet2 : 2.853e-01
--- BDT : 3 : WTransverseMass : 2.282e-01
--- BDT : 4 : Jet1Pt : 1.790e-01
--- BDT : ----------------------------------------------------------------
--- RuleFit : Ranking result (top variable is best ranked)
--- RuleFit : ----------------------------------------------------------------
--- RuleFit : Rank : Variable : Importance
--- RuleFit : ----------------------------------------------------------------
--- RuleFit : 1 : WTransverseMass : 1.000e+00
--- RuleFit : 2 : DeltaRJet1Jet2 : 2.382e-01
--- RuleFit : 3 : HT : 1.568e-01
--- RuleFit : 4 : Jet1Pt : 4.998e-02
--- RuleFit : ----------------------------------------------------------------
--- Factory :
--- Factory : Testing all classifiers...
--- Factory : Test method: CutsGA
--- Cuts : Reading weight file: [1;34mweights/TMVAnalysis_CutsGA.weights.txt[0m
--- Cuts : Read method with name <Cuts> and title <CutsGA>
--- Cuts : Classifier was trained with TMVA Version: 3.9.5
--- Cuts : Classifier was trained with ROOT Version: 5.20/00
--- Cuts : Create VariableTransformation "None"
--- Cuts : Use optimization method: 'Genetic Algorithm'
--- Cuts : Use efficiency computation method: 'Event Selection'
--- Cuts : Use "FSmart" cuts for variable: 'HT'
--- Cuts : Use "FSmart" cuts for variable: 'Jet1Pt'
--- Cuts : Use "FSmart" cuts for variable: 'DeltaRJet1Jet2'
--- Cuts : Use "FSmart" cuts for variable: 'WTransverseMass'
--- Cuts : Option for variable: HT: 'ForceSmart' (#: 3)
--- Cuts : Option for variable: Jet1Pt: 'ForceSmart' (#: 3)
--- Cuts : Option for variable: DeltaRJet1Jet2: 'ForceSmart' (#: 3)
--- Cuts : Option for variable: WTransverseMass: 'ForceSmart' (#: 3)
--- Cuts : Read cuts optimised using Genetic Algorithm
--- Cuts : in 100 signal efficiency bins and for 4 variables
--- Cuts : Preparing evaluation tree...
--- Cuts : Elapsed time for evaluation of 4 events: [1;31m0.213 sec[0m
--- Factory : Test method: Likelihood
--- Likelihood : Reading weight file: [1;34mweights/TMVAnalysis_Likelihood.weights.txt[0m
--- Likelihood : Read method with name <Likelihood> and title <Likelihood>
--- Likelihood : Classifier was trained with TMVA Version: 3.9.5
--- Likelihood : Classifier was trained with ROOT Version: 5.20/00
--- Likelihood : Create VariableTransformation "None"
--- Likelihood : Preparing evaluation tree...
--- Likelihood : Elapsed time for evaluation of 4 events: [1;31m0.133 sec[0m
--- Factory : Test method: LikelihoodPCA
--- Likelihood : Reading weight file: [1;34mweights/TMVAnalysis_LikelihoodPCA.weights.txt[0m
--- Likelihood : Read method with name <Likelihood> and title <LikelihoodPCA>
--- Likelihood : Classifier was trained with TMVA Version: 3.9.5
--- Likelihood : Classifier was trained with ROOT Version: 5.20/00
--- Likelihood : Create VariableTransformation "PCA"
--- Likelihood : Use principal component transformation
--- Likelihood : Preparing evaluation tree...
--- Likelihood : Elapsed time for evaluation of 4 events: [1;31m0.113 sec[0m
--- Factory : Test method: PDERS
--- PDERS : Reading weight file: [1;34mweights/TMVAnalysis_PDERS.weights.txt[0m
--- PDERS : Read method with name <PDERS> and title <PDERS>
--- PDERS : Classifier was trained with TMVA Version: 3.9.5
--- PDERS : Classifier was trained with ROOT Version: 5.20/00
--- PDERS : Create VariableTransformation "None"
--- PDERS : Preparing evaluation tree...
--- PDERS : Elapsed time for evaluation of 4 events: [1;31m0.111 sec[0m
--- Factory : Test method: HMatrix
--- HMatrix : Reading weight file: [1;34mweights/TMVAnalysis_HMatrix.weights.txt[0m
--- HMatrix : Read method with name <HMatrix> and title <HMatrix>
--- HMatrix : Classifier was trained with TMVA Version: 3.9.5
--- HMatrix : Classifier was trained with ROOT Version: 5.20/00
--- HMatrix : Create VariableTransformation "None"
--- HMatrix : Preparing evaluation tree...
--- HMatrix : Elapsed time for evaluation of 4 events: [1;31m0.121 sec[0m
--- Factory : Test method: Fisher
--- Fisher : Reading weight file: [1;34mweights/TMVAnalysis_Fisher.weights.txt[0m
--- Fisher : Read method with name <Fisher> and title <Fisher>
--- Fisher : Classifier was trained with TMVA Version: 3.9.5
--- Fisher : Classifier was trained with ROOT Version: 5.20/00
--- Fisher : Create VariableTransformation "None"
--- Fisher : Preparing evaluation tree...
--- Fisher : Elapsed time for evaluation of 4 events: [1;31m0.109 sec[0m
--- Factory : Test method: FDA_MT
--- FDA : Reading weight file: [1;34mweights/TMVAnalysis_FDA_MT.weights.txt[0m
--- FDA : Read method with name <FDA> and title <FDA_MT>
--- FDA : Classifier was trained with TMVA Version: 3.9.5
--- FDA : Classifier was trained with ROOT Version: 5.20/00
--- FDA : Create VariableTransformation "None"
--- FDA : User-defined formula string : "(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3"
--- FDA : TFormula-compatible formula string: "[0]+[1]*[5]+[2]*[6]+[3]*[7]+[4]*[8]"
--- FDA : Creating and compiling formula
--- FDA_Fitter_M...: Parsing option string:
--- FDA_Fitter_M...: "!V=False:!H=True:!Formula=(0)+(1)*x0+(2)*x1+(3)*x2+(4)*x3:!ParRanges=(-1,1);(-10,10);(-10,10);(-10,10);(-10,10):!FitMethod=MINUIT:!D=False:!Normalise=False:!VarTransform=None:!VarTransformType=Signal:!NbinsMVAPdf=60:!NsmoothMVAPdf=2:!VerboseLevel=Default:!CreateMVAPdfs=False:!TxtWeightFilesOnly=True:!Converger=None"
--- FDA_Fitter_M...: The following options are set:
--- FDA_Fitter_M...: - By User:
--- FDA_Fitter_M...: <none>
--- FDA_Fitter_M...: - Default:
--- FDA_Fitter_M...: ErrorLevel: "1" [TMinuit: error level: 0.5=logL fit, 1=chi-squared fit]
--- FDA_Fitter_M...: PrintLevel: "-1" [TMinuit: output level: -1=least, 0, +1=all garbage]
--- FDA_Fitter_M...: FitStrategy: "2" [TMinuit: fit strategy: 2=best]
--- FDA_Fitter_M...: PrintWarnings: "False" [TMinuit: suppress warnings]
--- FDA_Fitter_M...: UseImprove: "True" [TMinuit: use IMPROVE]
--- FDA_Fitter_M...: UseMinos: "True" [TMinuit: use MINOS]
--- FDA_Fitter_M...: SetBatch: "False" [TMinuit: use batch mode]
--- FDA_Fitter_M...: MaxCalls: "1000" [TMinuit: approximate maximum number of function calls]
--- FDA_Fitter_M...: Tolerance: "0.1" [TMinuit: tolerance to the function value at the minimum]
--- FDA_Fitter_M...: <MinuitFitter> Init
--- FDA : Preparing evaluation tree...
--- FDA : Elapsed time for evaluation of 4 events: [1;31m0.116 sec[0m
--- Factory : Test method: MLP
--- MLP : Reading weight file: [1;34mweights/TMVAnalysis_MLP.weights.txt[0m
--- MLP : Read method with name <MLP> and title <MLP>
--- MLP : Classifier was trained with TMVA Version: 3.9.5
--- MLP : Classifier was trained with ROOT Version: 5.20/00
--- MLP : Create VariableTransformation "None"
--- MLP : Building Network
--- MLP : Initializing weights
--- MLP : Forcing weights
--- MLP : Preparing evaluation tree...
--- MLP : Elapsed time for evaluation of 4 events: [1;31m0.139 sec[0m
--- Factory : Test method: SVM_Gauss
--- SVM : Reading weight file: [1;34mweights/TMVAnalysis_SVM_Gauss.weights.txt[0m
--- SVM : Read method with name <SVM> and title <SVM_Gauss>
--- SVM : Classifier was trained with TMVA Version: 3.9.5
--- SVM : Classifier was trained with ROOT Version: 5.20/00
--- SVM : Create VariableTransformation "None"
--- SVM : Preparing evaluation tree...
--- SVM : Elapsed time for evaluation of 4 events: [1;31m0.103 sec[0m
--- Factory : Test method: BDT
--- BDT : Reading weight file: [1;34mweights/TMVAnalysis_BDT.weights.txt[0m
--- BDT : Read method with name <BDT> and title <BDT>
--- BDT : Classifier was trained with TMVA Version: 3.9.5
--- BDT : Classifier was trained with ROOT Version: 5.20/00
--- BDT : Create VariableTransformation "None"
--- BDT : Read 400 Decision trees
--- BDT : Preparing evaluation tree...
--- BDT : Elapsed time for evaluation of 4 events: [1;31m0.0999 sec[0m
--- Factory : Test method: BDTD
--- BDT : Reading weight file: [1;34mweights/TMVAnalysis_BDTD.weights.txt[0m
--- BDT : Read method with name <BDT> and title <BDTD>
--- BDT : Classifier was trained with TMVA Version: 3.9.5
--- BDT : Classifier was trained with ROOT Version: 5.20/00
--- BDT : Create VariableTransformation "Decorrelate"
--- BDT : Read 400 Decision trees
--- BDT : Preparing evaluation tree...
--- BDT : Elapsed time for evaluation of 4 events: [1;31m0.105 sec[0m
--- Factory : Test method: RuleFit
--- RuleFit : Reading weight file: [1;34mweights/TMVAnalysis_RuleFit.weights.txt[0m
--- RuleFit : Read method with name <RuleFit> and title <RuleFit>
--- RuleFit : Classifier was trained with TMVA Version: 3.9.5
--- RuleFit : Classifier was trained with ROOT Version: 5.20/00
--- RuleFit : Create VariableTransformation "None"
--- RuleFit : Preparing evaluation tree...
--- RuleFit : Elapsed time for evaluation of 4 events: [1;31m0.105 sec[0m
--- Factory : Evaluating all classifiers...
--- Factory : Evaluate classifier: CutsGA
--- Factory : Evaluate classifier: Likelihood
--- Likelihood : Loop over test events and fill histograms with classifier response ...
--- Factory : Evaluate classifier: LikelihoodPCA
--- Likelihood : Loop over test events and fill histograms with classifier response ...
--- Factory : Evaluate classifier: PDERS
--- PDERS : Loop over test events and fill histograms with classifier response ...
--- Factory : Evaluate classifier: HMatrix
--- HMatrix : Loop over test events and fill histograms with classifier response ...
--- Factory : Evaluate classifier: Fisher
--- Fisher : Loop over test events and fill histograms with classifier response ...
--- Fisher : Also filling probability and rarity histograms (on request) ...
--- Factory : Evaluate classifier: FDA_MT
--- FDA : Loop over test events and fill histograms with classifier response ...
--- Factory : Evaluate classifier: MLP
--- MLP : Loop over test events and fill histograms with classifier response ...
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.005, fb=-0.005), refValue = 0.005[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.015, fb=-0.015), refValue = 0.015[0m
[... 46 further RootFinder warnings omitted; the same message repeats for refValue = 0.025 through 0.475 ...]
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.485, fb=-0.485), refValue = 0.485[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.495, fb=-0.495), refValue = 0.495[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.505, fb=-0.505), refValue = 0.505[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.515, fb=-0.515), refValue = 0.515[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.525, fb=-0.525), refValue = 0.525[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.535, fb=-0.535), refValue = 0.535[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.545, fb=-0.545), refValue = 0.545[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.555, fb=-0.555), refValue = 0.555[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.565, fb=-0.565), refValue = 0.565[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.575, fb=-0.575), refValue = 0.575[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.585, fb=-0.585), refValue = 0.585[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.595, fb=-0.595), refValue = 0.595[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.605, fb=-0.605), refValue = 0.605[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.615, fb=-0.615), refValue = 0.615[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.625, fb=-0.625), refValue = 0.625[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.635, fb=-0.635), refValue = 0.635[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.645, fb=-0.645), refValue = 0.645[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.655, fb=-0.655), refValue = 0.655[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.665, fb=-0.665), refValue = 0.665[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.675, fb=-0.675), refValue = 0.675[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.685, fb=-0.685), refValue = 0.685[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.695, fb=-0.695), refValue = 0.695[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.705, fb=-0.705), refValue = 0.705[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.715, fb=-0.715), refValue = 0.715[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.725, fb=-0.725), refValue = 0.725[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.735, fb=-0.735), refValue = 0.735[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.745, fb=-0.745), refValue = 0.745[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.755, fb=-0.755), refValue = 0.755[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.765, fb=-0.765), refValue = 0.765[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.775, fb=-0.775), refValue = 0.775[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.785, fb=-0.785), refValue = 0.785[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.795, fb=-0.795), refValue = 0.795[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.805, fb=-0.805), refValue = 0.805[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.815, fb=-0.815), refValue = 0.815[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.825, fb=-0.825), refValue = 0.825[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.835, fb=-0.835), refValue = 0.835[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.845, fb=-0.845), refValue = 0.845[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.855, fb=-0.855), refValue = 0.855[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.865, fb=-0.865), refValue = 0.865[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.875, fb=-0.875), refValue = 0.875[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.885, fb=-0.885), refValue = 0.885[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.895, fb=-0.895), refValue = 0.895[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.905, fb=-0.905), refValue = 0.905[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.915, fb=-0.915), refValue = 0.915[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.925, fb=-0.925), refValue = 0.925[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.935, fb=-0.935), refValue = 0.935[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.945, fb=-0.945), refValue = 0.945[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.955, fb=-0.955), refValue = 0.955[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.965, fb=-0.965), refValue = 0.965[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.975, fb=-0.975), refValue = 0.975[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.985, fb=-0.985), refValue = 0.985[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.995, fb=-0.995), refValue = 0.995[0m
[1;31m--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.9999, fb=-0.9999), refValue = 0.9999[0m
--- <WARNING> RootFinder : <Root> initial interval w/o root: (a=-0.752496, b=-0.752496), (Eff_a=0, Eff_b=0), (fa=-0.005, fb=-0.005), refValue = 0.005
[... same warning repeated for refValue = 0.015 through 0.995 in steps of 0.010; in every case a = b = -0.752496, Eff_a = Eff_b = 0, and fa = fb = -refValue ...]
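The RootFinder warnings come from inverting an efficiency-vs-cut curve: for each requested efficiency refValue, TMVA searches for the cut x at which eff(x) = refValue, but here the search interval has collapsed to a single point (a = b) where the efficiency is identically zero, so f(x) = eff(x) - refValue has the same negative value at both endpoints and no root can be bracketed. A minimal Python sketch of the bracketing check (an assumption about the logic behind the warning, not TMVA's actual C++ code; all names are illustrative):

```python
# Sketch: why a bracketing root finder warns on a degenerate interval.
# TMVA solves f(x) = eff(x) - refValue = 0; a bracketing method needs
# distinct endpoints and a sign change between f(a) and f(b).

def can_bracket(f, a, b):
    """True if [a, b] is a usable bracket: distinct endpoints, sign change."""
    fa, fb = f(a), f(b)
    return a != b and fa * fb < 0

# Degenerate case matching the log: a == b, eff is 0 on the interval,
# so fa == fb == -refValue and no root can ever be bracketed.
eff = lambda x: 0.0
ref_value = 0.225
f = lambda x: eff(x) - ref_value

a = b = -0.752496
if not can_bracket(f, a, b):
    print(f"initial interval w/o root: (a={a}, b={b}), "
          f"(fa={f(a)}, fb={f(b)}), refValue = {ref_value}")
```

With only four test events (NSigTest=2, NBkgTest=2 in the header), the classifier response never reaches the probed region, which is why the same warning fires for every requested efficiency point.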
--- Factory : Evaluate classifier: SVM_Gauss
--- SVM : Loop over test events and fill histograms with classifier response ...
--- Factory : Evaluate classifier: BDT
--- BDT : Loop over test events and fill histograms with classifier response ...
--- Factory : Evaluate classifier: BDTD
--- BDT : Loop over test events and fill histograms with classifier response ...
--- Factory : Evaluate classifier: RuleFit
--- RuleFit : Loop over test events and fill histograms with classifier response ...
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (0, 6)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (1, 6)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (2, 6)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (3, 6)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (4, 6)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (5, 6)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 0)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 1)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 2)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 3)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 4)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 5)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 7)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 8)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 9)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (6, 10)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (7, 6)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (8, 6)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (9, 6)
--- <WARNING> Tools : <GetCorrelationMatrix> zero variances for variables (10, 6)
[... the same block of 20 warnings is emitted a second time ...]
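Every zero-variance warning involves variable index 6, which by the column order of the matrices that follow is apparently the MLP: its output is constant over the tiny test sample, so its correlation with anything is undefined and is reported as +0.000. With only two test events per class (NSigTest=2, NBkgTest=2), any pair of non-constant outputs is perfectly (anti-)correlated, which is why every other entry is exactly ±1.000. A Python sketch of the zero-variance guard (illustrative of the behaviour seen in the log, not TMVA's actual <GetCorrelationMatrix> implementation; the sample values are hypothetical):

```python
import math

def correlation(x, y):
    """Pearson correlation with a zero-variance guard: warn and return 0
    instead of dividing by zero."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((xi - mx) ** 2 for xi in x)
    vy = sum((yi - my) ** 2 for yi in y)
    if vx == 0.0 or vy == 0.0:
        print("WARNING: zero variance, correlation undefined -> set to 0")
        return 0.0
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return cov / math.sqrt(vx * vy)

# Hypothetical 4-event test sample: a constant output (like variable 6
# here) has zero variance, so every pairing with it triggers the warning.
mlp = [0.5, 0.5, 0.5, 0.5]    # constant response -> zero variance
bdt = [0.1, 0.4, 0.2, 0.9]
print(correlation(mlp, bdt))  # prints the warning, then 0.0
```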
--- Factory :
--- Factory : Inter-MVA correlation matrix (signal):
--- Factory : -----------------------------------------------------------------------------------------------------------------
--- Factory : Likelihood LikelihoodPCA PDERS HMatrix Fisher FDA_MT MLP SVM_Gauss BDT BDTD RuleFit
--- Factory : Likelihood: +1.000 +1.000 -1.000 +1.000 +1.000 +1.000 +0.000 -1.000 -1.000 +1.000 +1.000
--- Factory : LikelihoodPCA: +1.000 +1.000 -1.000 +1.000 +1.000 +1.000 +0.000 -1.000 -1.000 +1.000 +1.000
--- Factory : PDERS: -1.000 -1.000 +1.000 -1.000 -1.000 -1.000 +0.000 +1.000 +1.000 -1.000 -1.000
--- Factory : HMatrix: +1.000 +1.000 -1.000 +1.000 +1.000 +1.000 +0.000 -1.000 -1.000 +1.000 +1.000
--- Factory : Fisher: +1.000 +1.000 -1.000 +1.000 +1.000 +1.000 +0.000 -1.000 -1.000 +1.000 +1.000
--- Factory : FDA_MT: +1.000 +1.000 -1.000 +1.000 +1.000 +1.000 +0.000 -1.000 -1.000 +1.000 +1.000
--- Factory : MLP: +0.000 +0.000 +0.000 +0.000 +0.000 +0.000 +1.000 +0.000 +0.000 +0.000 +0.000
--- Factory : SVM_Gauss: -1.000 -1.000 +1.000 -1.000 -1.000 -1.000 +0.000 +1.000 +1.000 -1.000 -1.000
--- Factory : BDT: -1.000 -1.000 +1.000 -1.000 -1.000 -1.000 +0.000 +1.000 +1.000 -1.000 -1.000
--- Factory : BDTD: +1.000 +1.000 -1.000 +1.000 +1.000 +1.000 +0.000 -1.000 -1.000 +1.000 +1.000
--- Factory : RuleFit: +1.000 +1.000 -1.000 +1.000 +1.000 +1.000 +0.000 -1.000 -1.000 +1.000 +1.000
--- Factory : -----------------------------------------------------------------------------------------------------------------
--- Factory :
--- Factory : Inter-MVA correlation matrix (background):
--- Factory : -----------------------------------------------------------------------------------------------------------------
--- Factory : Likelihood LikelihoodPCA PDERS HMatrix Fisher FDA_MT MLP SVM_Gauss BDT BDTD RuleFit
--- Factory : Likelihood: +1.000 +1.000 -1.000 -1.000 +1.000 -1.000 +0.000 -1.000 -1.000 -1.000 -1.000
--- Factory : LikelihoodPCA: +1.000 +1.000 -1.000 -1.000 +1.000 -1.000 +0.000 -1.000 -1.000 -1.000 -1.000
--- Factory : PDERS: -1.000 -1.000 +1.000 +1.000 -1.000 +1.000 +0.000 +1.000 +1.000 +1.000 +1.000
--- Factory : HMatrix: -1.000 -1.000 +1.000 +1.000 -1.000 +1.000 +0.000 +1.000 +1.000 +1.000 +1.000
--- Factory : Fisher: +1.000 +1.000 -1.000 -1.000 +1.000 -1.000 +0.000 -1.000 -1.000 -1.000 -1.000
--- Factory : FDA_MT: -1.000 -1.000 +1.000 +1.000 -1.000 +1.000 +0.000 +1.000 +1.000 +1.000 +1.000
--- Factory : MLP: +0.000 +0.000 +0.000 +0.000 +0.000 +0.000 +1.000 +0.000 +0.000 +0.000 +0.000
--- Factory : SVM_Gauss: -1.000 -1.000 +1.000 +1.000 -1.000 +1.000 +0.000 +1.000 +1.000 +1.000 +1.000
--- Factory : BDT: -1.000 -1.000 +1.000 +1.000 -1.000 +1.000 +0.000 +1.000 +1.000 +1.000 +1.000
--- Factory : BDTD: -1.000 -1.000 +1.000 +1.000 -1.000 +1.000 +0.000 +1.000 +1.000 +1.000 +1.000
--- Factory : RuleFit: -1.000 -1.000 +1.000 +1.000 -1.000 +1.000 +0.000 +1.000 +1.000 +1.000 +1.000
--- Factory : -----------------------------------------------------------------------------------------------------------------
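A note on the ±1 entries above: with only two signal and two background test events (NSigTest=2, NBkgTest=2 in the options at the top), the sample Pearson correlation between any two non-constant classifier outputs is forced to +1 or -1, and a constant output has zero variance (hence the warnings, and the all-zero MLP rows). A minimal sketch of the quantity being tabulated:

```python
# Pearson correlation between the outputs of two MVA methods, as
# tabulated in the inter-MVA correlation matrices. With two test
# events, r can only be +1, -1, or undefined (zero variance, shown
# here as 0.0 to match the matrix entries for the constant MLP output).
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    if vx == 0 or vy == 0:
        return 0.0  # zero variance: correlation undefined
    return cov / sqrt(vx * vy)

# Two test events: any non-constant pair is perfectly (anti)correlated.
print(pearson([0.2, 0.8], [0.1, 0.9]))   # +1.0
print(pearson([0.2, 0.8], [0.9, 0.1]))   # -1.0
print(pearson([0.5, 0.5], [0.1, 0.9]))   # 0.0 (zero variance)
```

So the ±1 pattern says nothing about the classifiers themselves; it is an artefact of the tiny test sample.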
--- Factory :
--- Factory : The following "overlap" matrices contain the fraction of events for which
--- Factory : the MVAs 'i' and 'j' have returned consistent answers about "signal-likeness"
--- Factory : An event is signal-like, if its MVA output exceeds the following value:
--- Factory : -----------------------------
--- Factory : Method: Cut value:
--- Factory : -----------------------------
--- Factory : Likelihood: +0.470
--- Factory : LikelihoodPCA: +0.483
--- Factory : PDERS: +0.736
--- Factory : HMatrix: +0.031
--- Factory : Fisher: +0.036
--- Factory : FDA_MT: +0.570
--- Factory : MLP: +1.000
--- Factory : SVM_Gauss: +0.525
--- Factory : BDT: -0.233
--- Factory : BDTD: -0.266
--- Factory : RuleFit: -3.266
--- Factory : -----------------------------
--- Factory : which correspond to the working point: eff(signal) = 1 - eff(background)
--- Factory : Note: no correlations and overlap with cut method are provided at present
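For orientation, each cut value listed above is the threshold where eff(signal) = 1 - eff(background) on the test sample. One way to locate such a working point is a brute-force scan over the output range; this is only an illustration with made-up output values, not TMVA's actual procedure:

```python
# Find the cut where eff(signal) = 1 - eff(background) by scanning.
# The sig/bkg output values below are hypothetical, for illustration.

def efficiencies(cut, sig, bkg):
    eff_s = sum(v > cut for v in sig) / len(sig)
    eff_b = sum(v > cut for v in bkg) / len(bkg)
    return eff_s, eff_b

def working_point(sig, bkg, steps=1000):
    lo, hi = min(sig + bkg), max(sig + bkg)
    best, best_gap = lo, float("inf")
    for i in range(steps + 1):
        cut = lo + (hi - lo) * i / steps
        eff_s, eff_b = efficiencies(cut, sig, bkg)
        gap = abs(eff_s - (1.0 - eff_b))
        if gap < best_gap:
            best, best_gap = cut, gap
    return best

sig = [0.9, 0.8, 0.7, 0.6]   # hypothetical signal outputs
bkg = [0.4, 0.3, 0.2, 0.1]   # hypothetical background outputs
cut = working_point(sig, bkg)  # lands between the two populations
```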
--- Factory :
--- Factory : Inter-MVA overlap matrix (signal):
--- Factory : -----------------------------------------------------------------------------------------------------------------
--- Factory : Likelihood LikelihoodPCA PDERS HMatrix Fisher FDA_MT MLP SVM_Gauss BDT BDTD RuleFit
--- Factory : Likelihood: +1.000 +0.500 +0.500 +0.500 +0.500 +1.000 +1.000 +0.500 +0.500 +0.500 +1.000
--- Factory : LikelihoodPCA: +0.500 +1.000 +0.000 +1.000 +1.000 +0.500 +0.500 +0.000 +0.000 +1.000 +0.500
--- Factory : PDERS: +0.500 +0.000 +1.000 +0.000 +0.000 +0.500 +0.500 +1.000 +1.000 +0.000 +0.500
--- Factory : HMatrix: +0.500 +1.000 +0.000 +1.000 +1.000 +0.500 +0.500 +0.000 +0.000 +1.000 +0.500
--- Factory : Fisher: +0.500 +1.000 +0.000 +1.000 +1.000 +0.500 +0.500 +0.000 +0.000 +1.000 +0.500
--- Factory : FDA_MT: +1.000 +0.500 +0.500 +0.500 +0.500 +1.000 +1.000 +0.500 +0.500 +0.500 +1.000
--- Factory : MLP: +1.000 +0.500 +0.500 +0.500 +0.500 +1.000 +1.000 +0.500 +0.500 +0.500 +1.000
--- Factory : SVM_Gauss: +0.500 +0.000 +1.000 +0.000 +0.000 +0.500 +0.500 +1.000 +1.000 +0.000 +0.500
--- Factory : BDT: +0.500 +0.000 +1.000 +0.000 +0.000 +0.500 +0.500 +1.000 +1.000 +0.000 +0.500
--- Factory : BDTD: +0.500 +1.000 +0.000 +1.000 +1.000 +0.500 +0.500 +0.000 +0.000 +1.000 +0.500
--- Factory : RuleFit: +1.000 +0.500 +0.500 +0.500 +0.500 +1.000 +1.000 +0.500 +0.500 +0.500 +1.000
--- Factory : -----------------------------------------------------------------------------------------------------------------
--- Factory :
--- Factory : Inter-MVA overlap matrix (background):
--- Factory : -----------------------------------------------------------------------------------------------------------------
--- Factory : Likelihood LikelihoodPCA PDERS HMatrix Fisher FDA_MT MLP SVM_Gauss BDT BDTD RuleFit
--- Factory : Likelihood: +1.000 +1.000 +0.500 +0.500 +0.500 +0.500 +0.500 +0.000 +0.500 +0.000 +0.000
--- Factory : LikelihoodPCA: +1.000 +1.000 +0.500 +0.500 +0.500 +0.500 +0.500 +0.000 +0.500 +0.000 +0.000
--- Factory : PDERS: +0.500 +0.500 +1.000 +1.000 +0.000 +0.000 +1.000 +0.500 +1.000 +0.500 +0.500
--- Factory : HMatrix: +0.500 +0.500 +1.000 +1.000 +0.000 +0.000 +1.000 +0.500 +1.000 +0.500 +0.500
--- Factory : Fisher: +0.500 +0.500 +0.000 +0.000 +1.000 +1.000 +0.000 +0.500 +0.000 +0.500 +0.500
--- Factory : FDA_MT: +0.500 +0.500 +0.000 +0.000 +1.000 +1.000 +0.000 +0.500 +0.000 +0.500 +0.500
--- Factory : MLP: +0.500 +0.500 +1.000 +1.000 +0.000 +0.000 +1.000 +0.500 +1.000 +0.500 +0.500
--- Factory : SVM_Gauss: +0.000 +0.000 +0.500 +0.500 +0.500 +0.500 +0.500 +1.000 +0.500 +1.000 +1.000
--- Factory : BDT: +0.500 +0.500 +1.000 +1.000 +0.000 +0.000 +1.000 +0.500 +1.000 +0.500 +0.500
--- Factory : BDTD: +0.000 +0.000 +0.500 +0.500 +0.500 +0.500 +0.500 +1.000 +0.500 +1.000 +1.000
--- Factory : RuleFit: +0.000 +0.000 +0.500 +0.500 +0.500 +0.500 +0.500 +1.000 +0.500 +1.000 +1.000
--- Factory : -----------------------------------------------------------------------------------------------------------------
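The 0/0.5/1 pattern in the overlap matrices is again a two-event artefact: with two test events per class, the fraction of events on which a pair of methods agree can only take those three values. A sketch of what one overlap entry measures (outputs and cut values are illustrative, not from this run):

```python
# Overlap entry for methods i and j: fraction of events on which both
# classifiers give the same signal/background verdict, each relative
# to its own cut value. Illustrative numbers only.

def overlap(out_i, cut_i, out_j, cut_j):
    agree = sum((a > cut_i) == (b > cut_j) for a, b in zip(out_i, out_j))
    return agree / len(out_i)

# With two test events the possible fractions are 0.0, 0.5, 1.0.
print(overlap([0.6, 0.2], 0.47, [0.7, 0.1], 0.48))  # 1.0 (agree twice)
print(overlap([0.6, 0.2], 0.47, [0.1, 0.7], 0.48))  # 0.0 (disagree twice)
```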
--- Factory :
--- Factory : Evaluation results ranked by best signal efficiency and purity (area)
--- Factory : -----------------------------------------------------------------------------
--- Factory : MVA Signal efficiency at bkg eff. (error): | Sepa- Signifi-
--- Factory : Methods: @B=0.01 @B=0.10 @B=0.30 Area | ration: cance:
--- Factory : -----------------------------------------------------------------------------
--- Factory : PDERS : 1.000(12) 1.000(12) 1.000(12) 1.000 | 0.902 3.247
--- Factory : FDA_MT : 1.000(12) 1.000(12) 1.000(12) 1.000 | 0.912 3.558
--- Factory : BDT : 1.000(12) 1.000(12) 1.000(12) 1.000 | 0.902 1.728
--- Factory : Likelihood : 0.495(281) 0.500(281) 1.000(12) 0.892 | 0.902 0.534
--- Factory : BDTD : 0.495(281) 0.500(281) 1.000(12) 0.892 | 0.902 1.107
--- Factory : LikelihoodPCA : 0.495(281) 0.496(281) 0.499(281) 0.608 | 0.912 0.840
--- Factory : CutsGA : 0.325(331) 0.326(331) 0.329(332) 0.600 | 0.000 0.000
--- Factory : HMatrix : 0.495(281) 0.496(281) 0.498(281) 0.500 | 0.926 0.205
--- Factory : Fisher : 0.495(281) 0.496(281) 0.498(281) 0.500 | 0.902 1.305
--- Factory : SVM_Gauss : 0.000(08) 0.000(08) 0.000(08) 0.215 | 0.889 0.249
--- Factory : MLP : 0.000(08) 0.000(08) 0.000(08) 0.000 | 0.000 0.000
--- Factory : RuleFit : 0.000(08) 0.000(08) 0.000(08) 0.000 | 0.908 0.769
--- Factory : -----------------------------------------------------------------------------
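The "Separation" column is, as commonly defined in TMVA, <S²> = ½ Σ_bins (ŷ_S - ŷ_B)² / (ŷ_S + ŷ_B) over the normalised classifier output distributions: 0 for identical shapes, 1 for fully disjoint ones. A sketch under that assumed definition (bin contents are illustrative):

```python
# Separation <S^2> between normalised signal and background output
# histograms; assumes the standard TMVA definition quoted above.

def separation(hist_s, hist_b):
    ns, nb = sum(hist_s), sum(hist_b)
    sep = 0.0
    for s, b in zip(hist_s, hist_b):
        ps, pb = s / ns, b / nb   # normalise each distribution to 1
        if ps + pb > 0:
            sep += 0.5 * (ps - pb) ** 2 / (ps + pb)
    return sep

print(separation([0, 0, 1, 1], [1, 1, 0, 0]))  # 1.0: disjoint shapes
print(separation([1, 1, 0, 0], [1, 1, 0, 0]))  # 0.0: identical shapes
```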
--- Factory :
--- Factory : Testing efficiency compared to training efficiency (overtraining check)
--- Factory : -----------------------------------------------------------------------------
--- Factory : MVA              Signal efficiency: from test sample (from training sample)
--- Factory : Methods: @B=0.01 @B=0.10 @B=0.30
--- Factory : -----------------------------------------------------------------------------
--- Factory : PDERS : 1.000 (0.499) 1.000 (1.000) 1.000 (1.000)
--- Factory : FDA_MT : 1.000 (0.495) 1.000 (0.497) 1.000 (0.501)
--- Factory : BDT : 1.000 (0.499) 1.000 (1.000) 1.000 (1.000)
--- Factory : Likelihood : 0.495 (0.495) 0.500 (0.497) 1.000 (0.500)
--- Factory : BDTD : 0.495 (0.495) 0.500 (0.497) 1.000 (0.500)
--- Factory : LikelihoodPCA : 0.495 (0.496) 0.496 (1.000) 0.499 (1.000)
--- Factory : CutsGA : 0.325 (0.149) 0.326 (0.331) 0.329 (0.810)
--- Factory : HMatrix : 0.495 (0.495) 0.496 (0.497) 0.498 (0.502)
--- Factory : Fisher : 0.495 (0.495) 0.496 (0.496) 0.498 (0.499)
--- Factory : SVM_Gauss : 0.000 (0.000) 0.000 (0.496) 0.000 (0.499)
--- Factory : MLP : 0.000 (0.000) 0.000 (0.000) 0.000 (0.000)
--- Factory : RuleFit : 0.000 (0.000) 0.000 (1.000) 0.000 (1.000)
--- Factory : -----------------------------------------------------------------------------
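Reading the table: a training efficiency far above the matching test efficiency (e.g. LikelihoodPCA at B=0.10, test 0.496 vs train 1.000, or RuleFit) is the overtraining signature this check looks for, though with only two test events the numbers here carry little weight. A trivial illustration of the comparison (the tolerance is an arbitrary choice for this sketch, not TMVA's criterion):

```python
# Flag a method when its training efficiency exceeds its test
# efficiency by more than a tolerance; tolerance value is arbitrary.

def overtrained(eff_test, eff_train, tol=0.1):
    return (eff_train - eff_test) > tol

print(overtrained(0.496, 1.000))  # True  (LikelihoodPCA-like gap)
print(overtrained(0.495, 0.495))  # False (test and train agree)
```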
--- Factory :
--- Factory : Write Test Tree 'TestTree' to file
=== wrote root file TMVAout.1405006.root
=== TMVAnalysis is done!
--- Launch TMVA GUI to view input file: TMVAout.1405006.root
--- Reading keys ...
=== Note: inactive buttons indicate that the corresponding classifiers were not trained ===
--
PatRyan - 13 Mar 2009