tracking-parametrisation-tuner/outputs_nn/output_D_res.txt
2023-12-19 13:00:59 +01:00

: Parsing option string:
: ... "V:!Silent:Color:DrawProgressBar:AnalysisType=Classification"
: The following options are set:
: - By User:
: V: "True" [Verbose flag]
: Color: "True" [Flag for coloured screen output (default: True, if in batch mode: False)]
: Silent: "False" [Batch mode: boolean silent flag inhibiting any output from TMVA after the creation of the factory class object (default: False)]
: DrawProgressBar: "True" [Draw progress bar to display training, testing and evaluation schedule (default: True)]
: AnalysisType: "Classification" [Set the analysis type (Classification, Regression, Multiclass, Auto) (default: Auto)]
: - Default:
: VerboseLevel: "Info" [VerboseLevel (Debug/Verbose/Info)]
: Transformations: "I" [List of transformations to test; formatting example: "Transformations=I;D;P;U;G,D", for identity, decorrelation, PCA, Uniform and Gaussianisation followed by decorrelation transformations]
: Correlations: "False" [boolean to show correlation in output]
: ROC: "True" [boolean to show ROC in output]
: ModelPersistence: "True" [Option to save the trained model in xml file or using serialization]
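
The option string above is the one handed to the TMVA::Factory constructor. A minimal sketch of that setup, assuming a C++ training macro: the job name TMVAClassification is inferred from the weight-file name later in this log, and the output file name is taken from the "Write special histos" line further down.

    // Sketch: constructing the Factory with the option string parsed above.
    #include "TFile.h"
    #include "TMVA/Factory.h"

    void make_factory() {
       TFile* outFile = TFile::Open("matching_ghost_mlp_training.root", "RECREATE");
       TMVA::Factory factory("TMVAClassification", outFile,
                             "V:!Silent:Color:DrawProgressBar:AnalysisType=Classification");
       // ... declare variables, book methods, train, test, evaluate ...
       outFile->Close();
    }
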
DataSetInfo : [MatchNNDataSet] : Added class "Signal"
: Add Tree Signal of type Signal with 8286 events
DataSetInfo : [MatchNNDataSet] : Added class "Background"
: Add Tree Bkg of type Background with 12762964 events
: Dataset[MatchNNDataSet] : Class index : 0 name : Signal
: Dataset[MatchNNDataSet] : Class index : 1 name : Background
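
A sketch of how these trees would be registered on the MatchNNDataSet DataLoader. The input file name and the per-event weights are assumptions; only the dataset name, the tree names (Signal, Bkg) and the event counts come from the log.

    // Sketch: registering the signal and background trees on the DataLoader.
    #include "TFile.h"
    #include "TTree.h"
    #include "TMVA/DataLoader.h"

    TMVA::DataLoader* make_dataloader() {
       auto* loader = new TMVA::DataLoader("MatchNNDataSet");
       TFile* inFile = TFile::Open("matching_training_data.root");       // assumed file name
       auto* sigTree = static_cast<TTree*>(inFile->Get("Signal"));       // 8286 events in the log
       auto* bkgTree = static_cast<TTree*>(inFile->Get("Bkg"));          // 12762964 events in the log
       loader->AddSignalTree(sigTree, 1.0);        // per-event weight 1.0 (assumed)
       loader->AddBackgroundTree(bkgTree, 1.0);
       return loader;
    }
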
Factory : Booking method: matching_mlp
:
: Parsing option string:
: ... "!H:V:TrainingMethod=BP:NeuronType=ReLU:EstimatorType=CE:VarTransform=Norm:NCycles=700:HiddenLayers=N+2,N:TestRate=50:Sampling=1.0:SamplingImportance=1.0:LearningRate=0.02:DecayRate=0.01:!UseRegulator"
: The following options are set:
: - By User:
: <none>
: - Default:
: Boost_num: "0" [Number of times the classifier will be boosted]
: Parsing option string:
: ... "!H:V:TrainingMethod=BP:NeuronType=ReLU:EstimatorType=CE:VarTransform=Norm:NCycles=700:HiddenLayers=N+2,N:TestRate=50:Sampling=1.0:SamplingImportance=1.0:LearningRate=0.02:DecayRate=0.01:!UseRegulator"
: The following options are set:
: - By User:
: NCycles: "700" [Number of training cycles]
: HiddenLayers: "N+2,N" [Specification of hidden layer architecture]
: NeuronType: "ReLU" [Neuron activation function type]
: EstimatorType: "CE" [MSE (Mean Square Estimator) for Gaussian Likelihood or CE(Cross-Entropy) for Bernoulli Likelihood]
: V: "True" [Verbose output (short form of "VerbosityLevel" below - overrides the latter one)]
: VarTransform: "Norm" [List of variable transformations performed before training, e.g., "D_Background,P_Signal,G,N_AllClasses" for: "Decorrelation, PCA-transformation, Gaussianisation, Normalisation, each for the given class of events ('AllClasses' denotes all events of all classes, if no class indication is given, 'All' is assumed)"]
: H: "False" [Print method-specific help message]
: TrainingMethod: "BP" [Train with Back-Propagation (BP), BFGS Algorithm (BFGS), or Genetic Algorithm (GA - slower and worse)]
: LearningRate: "2.000000e-02" [ANN learning rate parameter]
: DecayRate: "1.000000e-02" [Decay rate for learning parameter]
: TestRate: "50" [Test for overtraining performed at each #th epochs]
: Sampling: "1.000000e+00" [Only 'Sampling' (randomly selected) events are trained each epoch]
: SamplingImportance: "1.000000e+00" [The sampling weights of events in epochs which were successful (worse estimator than before) are multiplied by SamplingImportance, else they are divided.]
: UseRegulator: "False" [Use regulator to avoid over-training]
: - Default:
: RandomSeed: "1" [Random seed for initial synapse weights (0 means unique seed for each run; default value '1')]
: NeuronInputType: "sum" [Neuron input function type]
: VerbosityLevel: "Default" [Verbosity level]
: CreateMVAPdfs: "False" [Create PDFs for classifier outputs (signal and background)]
: IgnoreNegWeightsInTraining: "False" [Events with negative weights are ignored in the training (but are included for testing and performance evaluation)]
: EpochMonitoring: "False" [Provide epoch-wise monitoring plots according to TestRate (caution: causes big ROOT output file!)]
: SamplingEpoch: "1.000000e+00" [Sampling is used for the first 'SamplingEpoch' epochs, afterwards, all events are taken for training]
: SamplingTraining: "True" [The training sample is sampled]
: SamplingTesting: "False" [The testing sample is sampled]
: ResetStep: "50" [How often BFGS should reset history]
: Tau: "3.000000e+00" [LineSearch "size step"]
: BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
: BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
: ConvergenceImprove: "1.000000e-30" [Minimum improvement which counts as improvement (<0 means automatic convergence check is turned off)]
: ConvergenceTests: "-1" [Number of steps (without improvement) required for convergence (<0 means automatic convergence check is turned off)]
: UpdateLimit: "10000" [Maximum times of regulator update]
: CalculateErrors: "False" [Calculates inverse Hessian matrix at the end of the training to be able to calculate the uncertainties of an MVA value]
: WeightRange: "1.000000e+00" [Take the events for the estimator calculations from small deviations from the desired value to large deviations only over the weight range]
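
This is the option string given to Factory::BookMethod when booking the MLP. A sketch, assuming the factory and loader objects from the sketches above; the method type kMLP matches the Method_MLP directory written to the output file later in this log.

    // Sketch: booking the MLP with exactly the option string parsed above.
    #include "TMVA/Factory.h"
    #include "TMVA/DataLoader.h"
    #include "TMVA/Types.h"

    void book_mlp(TMVA::Factory& factory, TMVA::DataLoader* loader) {
       factory.BookMethod(loader, TMVA::Types::kMLP, "matching_mlp",
          "!H:V:TrainingMethod=BP:NeuronType=ReLU:EstimatorType=CE:VarTransform=Norm:"
          "NCycles=700:HiddenLayers=N+2,N:TestRate=50:Sampling=1.0:SamplingImportance=1.0:"
          "LearningRate=0.02:DecayRate=0.01:!UseRegulator");
    }
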
matching_mlp : [MatchNNDataSet] : Create Transformation "Norm" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'chi2' <---> Output : variable 'chi2'
: Input : variable 'teta2' <---> Output : variable 'teta2'
: Input : variable 'distX' <---> Output : variable 'distX'
: Input : variable 'distY' <---> Output : variable 'distY'
: Input : variable 'dSlope' <---> Output : variable 'dSlope'
: Input : variable 'dSlopeY' <---> Output : variable 'dSlopeY'
matching_mlp : Building Network.
: Initializing weights
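
The six inputs listed above are the variables declared on the DataLoader; the "Norm" transformation itself comes from VarTransform=Norm in the method options. A sketch of the declarations, with the float type 'F' as an assumption (the log only gives the variable names).

    // Sketch: declaring the six input variables on the DataLoader.
    #include "TMVA/DataLoader.h"

    void add_variables(TMVA::DataLoader& loader) {
       loader.AddVariable("chi2",    'F');
       loader.AddVariable("teta2",   'F');
       loader.AddVariable("distX",   'F');
       loader.AddVariable("distY",   'F');
       loader.AddVariable("dSlope",  'F');
       loader.AddVariable("dSlopeY", 'F');
    }
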
Factory : Train all methods
: Rebuilding Dataset MatchNNDataSet
: Parsing option string:
: ... "SplitMode=random:V:nTrain_Signal=0:nTrain_Background=20000.0:nTest_Signal=1000.0:nTest_Background=5000.0"
: The following options are set:
: - By User:
: SplitMode: "Random" [Method of picking training and testing events (default: random)]
: nTrain_Signal: "0" [Number of training events of class Signal (default: 0 = all)]
: nTest_Signal: "1000" [Number of test events of class Signal (default: 0 = all)]
: nTrain_Background: "20000" [Number of training events of class Background (default: 0 = all)]
: nTest_Background: "5000" [Number of test events of class Background (default: 0 = all)]
: V: "True" [Verbosity (default: true)]
: - Default:
: MixMode: "SameAsSplitMode" [Method of mixing events of different classes into one dataset (default: SameAsSplitMode)]
: SplitSeed: "100" [Seed for random event shuffling]
: NormMode: "EqualNumEvents" [Overall renormalisation of event-by-event weights used in the training (NumEvents: average weight of 1 per event, independently for signal and background; EqualNumEvents: average weight of 1 per event for signal, and sum of weights for background equal to sum of weights for signal)]
: ScaleWithPreselEff: "False" [Scale the number of requested events by the eff. of the preselection cuts (or not)]
: TrainTestSplit_Signal: "0.000000e+00" [Number of test events of class Signal (default: 0 = all)]
: TrainTestSplit_Background: "0.000000e+00" [Number of test events of class Background (default: 0 = all)]
: VerboseLevel: "Info" [VerboseLevel (Debug/Verbose/Info)]
: Correlations: "True" [Boolean to show correlation output (Default: true)]
: CalcCorrelations: "True" [Compute correlations and also some variable statistics, e.g. min/max (Default: true )]
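
The split options above correspond to DataLoader::PrepareTrainingAndTestTree, and the remainder of this log is produced by the factory's train/test/evaluate calls. A sketch, assuming no preselection cut is applied:

    // Sketch: the dataset split matching the option string above, followed
    // by the calls whose output fills the rest of this log.
    #include "TCut.h"
    #include "TMVA/Factory.h"
    #include "TMVA/DataLoader.h"

    void run_training(TMVA::Factory& factory, TMVA::DataLoader& loader) {
       loader.PrepareTrainingAndTestTree(TCut(""),
          "SplitMode=random:V:nTrain_Signal=0:nTrain_Background=20000.0:"
          "nTest_Signal=1000.0:nTest_Background=5000.0");
       factory.TrainAllMethods();     // "Factory : Train all methods"
       factory.TestAllMethods();      // "Factory : Test all methods"
       factory.EvaluateAllMethods();  // "Factory : Evaluate all methods"
    }
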
: Building event vectors for type 2 Signal
: Dataset[MatchNNDataSet] : create input formulas for tree Signal
: Building event vectors for type 2 Background
: Dataset[MatchNNDataSet] : create input formulas for tree Bkg
DataSetFactory : [MatchNNDataSet] : Number of events in input trees
:
:
: Dataset[MatchNNDataSet] : Weight renormalisation mode: "EqualNumEvents": renormalises all event classes ...
: Dataset[MatchNNDataSet] : such that the effective (weighted) number of events in each class is the same
: Dataset[MatchNNDataSet] : (and equals the number of events (entries) given for class=0 )
: Dataset[MatchNNDataSet] : ... i.e. such that Sum[i=1..N_j]{w_i} = N_classA, j=classA, classB, ...
: Dataset[MatchNNDataSet] : ... (note that N_j is the sum of TRAINING events
: Dataset[MatchNNDataSet] : ..... Testing events are not renormalised nor included in the renormalisation factor!)
: Number of training and testing events
: ---------------------------------------------------------------------------
: Signal -- training events : 7286
: Signal -- testing events : 1000
: Signal -- training and testing events: 8286
: Background -- training events : 20000
: Background -- testing events : 5000
: Background -- training and testing events: 25000
:
DataSetInfo : Correlation matrix (Signal):
: --------------------------------------------------------
: chi2 teta2 distX distY dSlope dSlopeY
: chi2: +1.000 -0.090 +0.190 +0.270 +0.150 +0.032
: teta2: -0.090 +1.000 +0.022 +0.557 +0.231 +0.681
: distX: +0.190 +0.022 +1.000 -0.243 +0.667 +0.066
: distY: +0.270 +0.557 -0.243 +1.000 +0.299 +0.491
: dSlope: +0.150 +0.231 +0.667 +0.299 +1.000 +0.343
: dSlopeY: +0.032 +0.681 +0.066 +0.491 +0.343 +1.000
: --------------------------------------------------------
DataSetInfo : Correlation matrix (Background):
: --------------------------------------------------------
: chi2 teta2 distX distY dSlope dSlopeY
: chi2: +1.000 -0.032 +0.249 +0.208 +0.048 +0.047
: teta2: -0.032 +1.000 +0.256 +0.643 +0.377 +0.464
: distX: +0.249 +0.256 +1.000 +0.027 +0.771 +0.192
: distY: +0.208 +0.643 +0.027 +1.000 +0.323 +0.556
: dSlope: +0.048 +0.377 +0.771 +0.323 +1.000 +0.394
: dSlopeY: +0.047 +0.464 +0.192 +0.556 +0.394 +1.000
: --------------------------------------------------------
DataSetFactory : [MatchNNDataSet] :
:
Factory : [MatchNNDataSet] : Create Transformation "I" with events from all classes.
:
: Transformation, Variable selection :
: Input : variable 'chi2' <---> Output : variable 'chi2'
: Input : variable 'teta2' <---> Output : variable 'teta2'
: Input : variable 'distX' <---> Output : variable 'distX'
: Input : variable 'distY' <---> Output : variable 'distY'
: Input : variable 'dSlope' <---> Output : variable 'dSlope'
: Input : variable 'dSlopeY' <---> Output : variable 'dSlopeY'
TFHandler_Factory : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: chi2: 15.110 7.5957 [ 0.25759 29.998 ]
: teta2: 0.0049007 0.015613 [ 1.1810e-05 0.34609 ]
: distX: 77.540 64.030 [ 0.00059319 494.45 ]
: distY: 35.596 43.128 [ 0.0016556 497.11 ]
: dSlope: 0.37313 0.24282 [ 0.00012810 1.2803 ]
: dSlopeY: 0.0071048 0.011434 [ 4.9639e-07 0.14679 ]
: -----------------------------------------------------------
: Ranking input variables (method unspecific)...
IdTransformation : Ranking result (top variable is best ranked)
: --------------------------------
: Rank : Variable : Separation
: --------------------------------
: 1 : chi2 : 8.701e-02
: 2 : distY : 7.455e-02
: 3 : dSlope : 6.957e-02
: 4 : teta2 : 4.316e-02
: 5 : dSlopeY : 2.562e-02
: 6 : distX : 1.371e-02
: --------------------------------
Factory : Train method: matching_mlp for Classification
:
TFHandler_matching_mlp : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: chi2: -0.0011851 0.51079 [ -1.0000 1.0000 ]
: teta2: -0.97175 0.090226 [ -1.0000 1.0000 ]
: distX: -0.68636 0.25900 [ -1.0000 1.0000 ]
: distY: -0.85679 0.17352 [ -1.0000 1.0000 ]
: dSlope: -0.41728 0.37935 [ -1.0000 1.0000 ]
: dSlopeY: -0.90320 0.15579 [ -1.0000 1.0000 ]
: -----------------------------------------------------------
: Training Network
:
: Elapsed time for training with 27286 events: 59.2 sec
matching_mlp : [MatchNNDataSet] : Evaluation of matching_mlp on training sample (27286 events)
: Elapsed time for evaluation of 27286 events: 0.0331 sec
: Creating xml weight file: MatchNNDataSet/weights/TMVAClassification_matching_mlp.weights.xml
: Creating standalone class: MatchNNDataSet/weights/TMVAClassification_matching_mlp.class.C
: Write special histos to file: matching_ghost_mlp_training.root:/MatchNNDataSet/Method_MLP/matching_mlp
Factory : Training finished
:
: Ranking input variables (method specific)...
matching_mlp : Ranking result (top variable is best ranked)
: --------------------------------
: Rank : Variable : Importance
: --------------------------------
: 1 : distY : 1.487e+02
: 2 : distX : 9.251e+01
: 3 : dSlopeY : 5.612e+01
: 4 : teta2 : 3.951e+01
: 5 : dSlope : 1.219e+01
: 6 : chi2 : 1.428e+00
: --------------------------------
Factory : === Destroy and recreate all methods via weight files for testing ===
:
: Reading weight file: MatchNNDataSet/weights/TMVAClassification_matching_mlp.weights.xml
matching_mlp : Building Network.
: Initializing weights
Factory : Test all methods
Factory : Test method: matching_mlp for Classification performance
:
matching_mlp : [MatchNNDataSet] : Evaluation of matching_mlp on testing sample (6000 events)
: Elapsed time for evaluation of 6000 events: 0.0113 sec
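
To apply the trained network outside the factory, the XML weight file written above can be booked in a TMVA::Reader. A minimal sketch; rebooking the reader on every call is kept only for brevity (in real use it is booked once), and the input values here are dummies.

    // Sketch: evaluating one candidate with the saved weight file.
    #include "TMVA/Reader.h"

    double evaluate_candidate(float chi2, float teta2, float distX,
                              float distY, float dSlope, float dSlopeY) {
       // The reader keeps pointers to the input values, so they are bound
       // to local storage that stays valid until EvaluateMVA is called.
       float v[6] = {chi2, teta2, distX, distY, dSlope, dSlopeY};
       TMVA::Reader reader("!Color:Silent");
       reader.AddVariable("chi2",    &v[0]);
       reader.AddVariable("teta2",   &v[1]);
       reader.AddVariable("distX",   &v[2]);
       reader.AddVariable("distY",   &v[3]);
       reader.AddVariable("dSlope",  &v[4]);
       reader.AddVariable("dSlopeY", &v[5]);
       reader.BookMVA("matching_mlp",
                      "MatchNNDataSet/weights/TMVAClassification_matching_mlp.weights.xml");
       return reader.EvaluateMVA("matching_mlp");
    }
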
Factory : Evaluate all methods
Factory : Evaluate classifier: matching_mlp
:
TFHandler_matching_mlp : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: chi2: 0.10129 0.51080 [ -0.98564 0.99991 ]
: teta2: -0.96473 0.096760 [ -0.99997 0.43123 ]
: distX: -0.68127 0.26859 [ -0.99983 0.92711 ]
: distY: -0.83124 0.20417 [ -0.99994 1.0115 ]
: dSlope: -0.45660 0.39080 [ -0.99695 0.96415 ]
: dSlopeY: -0.89629 0.16201 [ -0.99999 1.0015 ]
: -----------------------------------------------------------
matching_mlp : [MatchNNDataSet] : Loop over test events and fill histograms with classifier response...
:
TFHandler_matching_mlp : Variable Mean RMS [ Min Max ]
: -----------------------------------------------------------
: chi2: 0.10129 0.51080 [ -0.98564 0.99991 ]
: teta2: -0.96473 0.096760 [ -0.99997 0.43123 ]
: distX: -0.68127 0.26859 [ -0.99983 0.92711 ]
: distY: -0.83124 0.20417 [ -0.99994 1.0115 ]
: dSlope: -0.45660 0.39080 [ -0.99695 0.96415 ]
: dSlopeY: -0.89629 0.16201 [ -0.99999 1.0015 ]
: -----------------------------------------------------------
:
: Evaluation results ranked by best signal efficiency and purity (area)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA
: Name: Method: ROC-integ
: MatchNNDataSet matching_mlp : 0.854
: -------------------------------------------------------------------------------------------------------------------
:
: Testing efficiency compared to training efficiency (overtraining check)
: -------------------------------------------------------------------------------------------------------------------
: DataSet MVA Signal efficiency: from test sample (from training sample)
: Name: Method: @B=0.01 @B=0.10 @B=0.30
: -------------------------------------------------------------------------------------------------------------------
: MatchNNDataSet matching_mlp : 0.091 (0.089) 0.501 (0.494) 0.851 (0.854)
: -------------------------------------------------------------------------------------------------------------------
:
Dataset:MatchNNDataSet : Created tree 'TestTree' with 6000 events
:
Dataset:MatchNNDataSet : Created tree 'TrainTree' with 27286 events
:
Factory : Thank you for using TMVA!
: For citation information, please visit: http://tmva.sf.net/citeTMVA.html
Transforming nn_electron_training/result/MatchNNDataSet/weights/TMVAClassification_matching_mlp.class.C ...
Found minimum and maximum values for 6 variables.
Found 3 matrices:
1. fWeightMatrix0to1 with 7 columns and 8 rows
2. fWeightMatrix1to2 with 9 columns and 6 rows
3. fWeightMatrix2to3 with 7 columns and 1 row
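
The three matrix shapes match the HiddenLayers=N+2,N architecture for N=6 inputs, i.e. 6 -> 8 -> 6 -> 1, with one extra column per matrix holding the bias weight of each neuron. A sketch of the forward pass those matrices imply; the ReLU hidden activation follows NeuronType=ReLU above, while the sigmoid on the single output is an assumption (typical for EstimatorType=CE), not something stated in this log.

    // Sketch: forward pass implied by the three weight matrices listed above.
    #include <array>
    #include <cmath>

    double mlp_response(const std::array<double, 6>& x,   // normalised inputs
                        const double w01[8][7],           // fWeightMatrix0to1
                        const double w12[6][9],           // fWeightMatrix1to2
                        const double w23[1][7]) {         // fWeightMatrix2to3
       auto relu = [](double v) { return v > 0.0 ? v : 0.0; };

       std::array<double, 8> h1{};
       for (int i = 0; i < 8; ++i) {
          double s = w01[i][6];                           // last column: bias
          for (int j = 0; j < 6; ++j) s += w01[i][j] * x[j];
          h1[i] = relu(s);
       }

       std::array<double, 6> h2{};
       for (int i = 0; i < 6; ++i) {
          double s = w12[i][8];                           // last column: bias
          for (int j = 0; j < 8; ++j) s += w12[i][j] * h1[j];
          h2[i] = relu(s);
       }

       double out = w23[0][6];                            // last column: bias
       for (int j = 0; j < 6; ++j) out += w23[0][j] * h2[j];
       return 1.0 / (1.0 + std::exp(-out));               // assumed sigmoid output
    }
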