// Class: ReadH6AONN5MEMLP // Automatically generated by MethodBase::MakeClass // /* configuration options ===================================================== #GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*- Method : MLP::H6AONN5MEMLP TMVA Release : 3.8.6 [198662] ROOT Release : 5.26/00 [334336] Creator : stdenis Date : Sun Jan 1 10:33:58 2012 Host : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux Dir : /data/cdf04/stdenis/batch/run20572/job416 Training events: 29926 #OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*- # Set by User: V: "False" [Verbose mode] NCycles: "1000" [Number of training cycles] HiddenLayers: "N+1,N" [Specification of hidden layer architecture] # Default: D: "False" [use-decorrelated-variables flag (depreciated)] Normalise: "True" [Normalise input variables] VarTransform: "None" [Variable transformation method] VarTransformType: "Signal" [Use signal or background events for var transform] NbinsMVAPdf: "60" [Number of bins used to create MVA PDF] NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF] VerboseLevel: "Info" [Verbosity level] H: "False" [Print classifier-specific help message] CreateMVAPdfs: "False" [Create PDFs for classifier outputs] TxtWeightFilesOnly: "True" [if True, write all weights as text files] NeuronType: "sigmoid" [Neuron activation function type] NeuronInputType: "sum" [Neuron input function type] RandomSeed: "1" [Random Number Seed for TRandom3] RandomFile: "None" [Random Number input file for TRandom3] TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)] LearningRate: "0.02" [ANN learning rate parameter] DecayRate: "0.01" [Decay rate for learning parameter] TestRate: "10" [Test for overtraining performed at each #th epochs] BPMode: "sequential" [Back-propagation learning mode: sequential or batch] BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events] ## #VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*- NVar 13 Ht Ht 'F' [49.7499961853,711.081665039] LepAPt LepAPt 'F' [20.0021438599,189.268966675] LepBPt LepBPt 'F' [10.0008592606,74.385925293] MetSigLeptonsJets MetSigLeptonsJets 'F' [0.951345682144,19.6063156128] MetSpec MetSpec 'F' [15.0248270035,290.76965332] SumEtLeptonsJets SumEtLeptonsJets 'F' [30.2123203278,440.607177734] VSumJetLeptonsPt VSumJetLeptonsPt 'F' [0.888904690742,300.372680664] addEt addEt 'F' [48.2450904846,354.966430664] dPhiLepSumMet dPhiLepSumMet 'F' [0.160927712917,3.14158320427] dPhiLeptons dPhiLeptons 'F' [2.93254852295e-05,1.04818320274] dRLeptons dRLeptons 'F' [0.200063899159,1.09498274326] lep1_E lep1_E 'F' [20.0025482178,202.459274292] lep2_E lep2_E 'F' [10.0034532547,115.518630981] ============================================================================ */ #include #include #include #include #ifndef IClassifierReader__def #define IClassifierReader__def class IClassifierReader { public: // constructor IClassifierReader() {} virtual ~IClassifierReader() {} // return classifier response virtual double GetMvaValue( const std::vector& inputValues ) const = 0; }; #endif class ReadH6AONN5MEMLP : public IClassifierReader { public: // constructor ReadH6AONN5MEMLP( std::vector& theInputVars ) : IClassifierReader(), fClassName( "ReadH6AONN5MEMLP" ), fStatusIsClean( true ), fNvars( 13 ), fIsNormalised( true ) { // the training input variables const char* inputVars[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 49.7499961853027; fVmax[0] = 711.081665039062; fVmin[1] = 20.0021438598633; fVmax[1] = 189.268966674805; fVmin[2] = 10.0008592605591; fVmax[2] = 74.3859252929688; fVmin[3] = 0.951345682144165; fVmax[3] = 19.606315612793; fVmin[4] = 15.024827003479; fVmax[4] = 290.769653320312; fVmin[5] = 30.2123203277588; fVmax[5] = 440.607177734375; fVmin[6] = 0.888904690742493; fVmax[6] = 300.372680664062; fVmin[7] = 48.2450904846191; fVmax[7] = 354.966430664062; fVmin[8] = 0.160927712917328; fVmax[8] = 3.14158320426941; fVmin[9] = 2.93254852294922e-05; fVmax[9] = 1.04818320274353; fVmin[10] = 0.200063899159431; fVmax[10] = 1.09498274326324; fVmin[11] = 20.0025482177734; fVmax[11] = 202.459274291992; fVmin[12] = 10.0034532546997; fVmax[12] = 115.518630981445; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void Initialize(); double GetMvaValue__( const std::vector& inputValues ) const; // private members (method specific) double ActivationFnc(double x) const; int fLayers; int fLayerSize[4]; double fWeightMatrix0to1[15][14]; // weight matrix from layer 0 to 1 double fWeightMatrix1to2[14][15]; // weight matrix from layer 1 to 2 double fWeightMatrix2to3[1][14]; // weight matrix from layer 2 to 3 double * fWeights[4]; }; inline void ReadH6AONN5MEMLP::Initialize() { // build network structure fLayers = 4; fLayerSize[0] = 14; fWeights[0] = new double[14]; fLayerSize[1] = 15; fWeights[1] = new double[15]; fLayerSize[2] = 14; fWeights[2] = new double[14]; fLayerSize[3] = 1; fWeights[3] = new double[1]; // weight matrix from layer 0 to 1 fWeightMatrix0to1[0][0] = -0.243924447852646; fWeightMatrix0to1[1][0] = 2.28160033328484; fWeightMatrix0to1[2][0] = 1.01637933246159; fWeightMatrix0to1[3][0] = 1.86663133595625; fWeightMatrix0to1[4][0] = -1.84386849604761; fWeightMatrix0to1[5][0] = -1.63185973695249; fWeightMatrix0to1[6][0] = -1.0312156629348; fWeightMatrix0to1[7][0] = 2.01869762840658; fWeightMatrix0to1[8][0] = -1.28681844774521; fWeightMatrix0to1[9][0] = -0.339566031246458; fWeightMatrix0to1[10][0] = -1.18151571657004; fWeightMatrix0to1[11][0] = -0.189498095614139; fWeightMatrix0to1[12][0] = -1.67876661284167; fWeightMatrix0to1[13][0] = -0.778149457835293; fWeightMatrix0to1[0][1] = -0.587833675200515; fWeightMatrix0to1[1][1] = 0.783734155327491; fWeightMatrix0to1[2][1] = -0.293301399940047; fWeightMatrix0to1[3][1] = 1.86357838240848; fWeightMatrix0to1[4][1] = -0.127250531928449; fWeightMatrix0to1[5][1] = 1.32611872475849; fWeightMatrix0to1[6][1] = -0.636394728047283; fWeightMatrix0to1[7][1] = -0.457810124236711; fWeightMatrix0to1[8][1] = 0.845744353165209; fWeightMatrix0to1[9][1] = 0.702185100510655; fWeightMatrix0to1[10][1] = -0.0970896881149856; fWeightMatrix0to1[11][1] = -0.031935125190339; fWeightMatrix0to1[12][1] = 1.82611213913231; fWeightMatrix0to1[13][1] = -1.46177357622925; fWeightMatrix0to1[0][2] = -1.84985580092606; fWeightMatrix0to1[1][2] = 0.788932511388185; fWeightMatrix0to1[2][2] = 0.61997014608447; fWeightMatrix0to1[3][2] = 1.69860264473055; fWeightMatrix0to1[4][2] = -0.18509073659769; fWeightMatrix0to1[5][2] = -0.386007651877983; fWeightMatrix0to1[6][2] = -3.00517858323584; fWeightMatrix0to1[7][2] = 0.659546706808658; fWeightMatrix0to1[8][2] = -1.45177620019804; fWeightMatrix0to1[9][2] = 1.93043119298598; fWeightMatrix0to1[10][2] = -1.63308616275503; fWeightMatrix0to1[11][2] = 1.26960630138034; fWeightMatrix0to1[12][2] = -0.420854485948467; fWeightMatrix0to1[13][2] = 0.860805039827536; fWeightMatrix0to1[0][3] = 1.93481123425534; fWeightMatrix0to1[1][3] = 1.40073557239993; fWeightMatrix0to1[2][3] = -0.690828625433421; fWeightMatrix0to1[3][3] = -1.53607724729392; fWeightMatrix0to1[4][3] = 1.00077265815864; fWeightMatrix0to1[5][3] = 0.400766507578748; fWeightMatrix0to1[6][3] = 4.68624211677187; fWeightMatrix0to1[7][3] = 2.18947477995142; fWeightMatrix0to1[8][3] = 1.65289886411961; fWeightMatrix0to1[9][3] = 2.2987486279253; fWeightMatrix0to1[10][3] = -1.21063984648083; fWeightMatrix0to1[11][3] = 1.64268602769114; fWeightMatrix0to1[12][3] = -3.53865539842002; fWeightMatrix0to1[13][3] = -1.13238146696387; fWeightMatrix0to1[0][4] = -1.25067109770457; fWeightMatrix0to1[1][4] = -1.61847640530405; fWeightMatrix0to1[2][4] = 1.64555337310755; fWeightMatrix0to1[3][4] = 0.886695350592254; fWeightMatrix0to1[4][4] = -1.44120894334861; fWeightMatrix0to1[5][4] = 0.578349014691624; fWeightMatrix0to1[6][4] = 1.05075608888394; fWeightMatrix0to1[7][4] = 0.699408688914702; fWeightMatrix0to1[8][4] = 1.96071379896492; fWeightMatrix0to1[9][4] = 0.313238572899438; fWeightMatrix0to1[10][4] = 0.760247475558394; fWeightMatrix0to1[11][4] = -0.839697217693266; fWeightMatrix0to1[12][4] = 0.338587867715091; fWeightMatrix0to1[13][4] = -1.26779208122167; fWeightMatrix0to1[0][5] = -0.650195281953911; fWeightMatrix0to1[1][5] = -1.23351561641018; fWeightMatrix0to1[2][5] = 0.887355423094652; fWeightMatrix0to1[3][5] = 1.26087219017021; fWeightMatrix0to1[4][5] = 1.57739476619659; fWeightMatrix0to1[5][5] = -0.569385952421056; fWeightMatrix0to1[6][5] = -2.51076679125427; fWeightMatrix0to1[7][5] = -2.47084319933539; fWeightMatrix0to1[8][5] = 1.11899083517664; fWeightMatrix0to1[9][5] = 0.928704814628416; fWeightMatrix0to1[10][5] = 2.27643055430444; fWeightMatrix0to1[11][5] = 0.77776978707168; fWeightMatrix0to1[12][5] = 1.16298946412707; fWeightMatrix0to1[13][5] = -1.13056245489102; fWeightMatrix0to1[0][6] = -0.815183930899835; fWeightMatrix0to1[1][6] = 0.112675476702737; fWeightMatrix0to1[2][6] = 1.27353157731099; fWeightMatrix0to1[3][6] = -0.967615126170451; fWeightMatrix0to1[4][6] = -1.54783318233055; fWeightMatrix0to1[5][6] = -1.76695198361787; fWeightMatrix0to1[6][6] = -1.47224362785832; fWeightMatrix0to1[7][6] = 0.892794240292222; fWeightMatrix0to1[8][6] = 1.76618338788923; fWeightMatrix0to1[9][6] = -0.727707796967364; fWeightMatrix0to1[10][6] = -0.0302257632133804; fWeightMatrix0to1[11][6] = 2.01099000584848; fWeightMatrix0to1[12][6] = -1.60453459774906; fWeightMatrix0to1[13][6] = -1.83008847714402; fWeightMatrix0to1[0][7] = -1.40092711306166; fWeightMatrix0to1[1][7] = 1.60517153543799; fWeightMatrix0to1[2][7] = -1.92585214896351; fWeightMatrix0to1[3][7] = -1.70721706385866; fWeightMatrix0to1[4][7] = 0.0660609404330079; fWeightMatrix0to1[5][7] = 1.08635567244665; fWeightMatrix0to1[6][7] = -6.32817807576524; fWeightMatrix0to1[7][7] = 6.59734172199718; fWeightMatrix0to1[8][7] = -0.865110905993968; fWeightMatrix0to1[9][7] = 1.58449112227415; fWeightMatrix0to1[10][7] = 1.01193127591297; fWeightMatrix0to1[11][7] = 0.575800592207619; fWeightMatrix0to1[12][7] = -6.81704730545695; fWeightMatrix0to1[13][7] = 0.449451055925819; fWeightMatrix0to1[0][8] = 0.205341670092366; fWeightMatrix0to1[1][8] = 0.523703585242847; fWeightMatrix0to1[2][8] = -1.69725195683309; fWeightMatrix0to1[3][8] = 1.43603157213776; fWeightMatrix0to1[4][8] = 0.443308332514654; fWeightMatrix0to1[5][8] = -1.22819892549451; fWeightMatrix0to1[6][8] = 1.41288937127963; fWeightMatrix0to1[7][8] = -0.0179676585706473; fWeightMatrix0to1[8][8] = -1.58303167239377; fWeightMatrix0to1[9][8] = 0.357915652883983; fWeightMatrix0to1[10][8] = -0.275392307888235; fWeightMatrix0to1[11][8] = 1.83416548227555; fWeightMatrix0to1[12][8] = 0.947887726122778; fWeightMatrix0to1[13][8] = 0.348595286622099; fWeightMatrix0to1[0][9] = -0.320308704769567; fWeightMatrix0to1[1][9] = 0.275726510447433; fWeightMatrix0to1[2][9] = -1.7366295401651; fWeightMatrix0to1[3][9] = 1.29633681282869; fWeightMatrix0to1[4][9] = 0.130399808677321; fWeightMatrix0to1[5][9] = -1.67670971263231; fWeightMatrix0to1[6][9] = 0.231369012837127; fWeightMatrix0to1[7][9] = -0.839827122724099; fWeightMatrix0to1[8][9] = 0.00833698749713488; fWeightMatrix0to1[9][9] = 0.335568357140095; fWeightMatrix0to1[10][9] = 1.65031021548865; fWeightMatrix0to1[11][9] = -1.43307526644773; fWeightMatrix0to1[12][9] = -0.867797539579012; fWeightMatrix0to1[13][9] = 1.39366480976003; fWeightMatrix0to1[0][10] = 1.63278311930413; fWeightMatrix0to1[1][10] = -0.318566354952273; fWeightMatrix0to1[2][10] = -1.29033997138273; fWeightMatrix0to1[3][10] = 0.118006228985594; fWeightMatrix0to1[4][10] = -0.906539106928623; fWeightMatrix0to1[5][10] = 1.47523125074097; fWeightMatrix0to1[6][10] = -1.24301030402138; fWeightMatrix0to1[7][10] = 0.801721424608507; fWeightMatrix0to1[8][10] = -0.433466153604142; fWeightMatrix0to1[9][10] = -0.4542866427595; fWeightMatrix0to1[10][10] = -0.729338037186283; fWeightMatrix0to1[11][10] = 1.01718698193432; fWeightMatrix0to1[12][10] = 0.909515570114262; fWeightMatrix0to1[13][10] = -0.594131525274601; fWeightMatrix0to1[0][11] = -0.58306217730421; fWeightMatrix0to1[1][11] = 0.229220706105523; fWeightMatrix0to1[2][11] = 1.12883816728362; fWeightMatrix0to1[3][11] = 0.765838304310492; fWeightMatrix0to1[4][11] = 1.2493947838481; fWeightMatrix0to1[5][11] = -1.48160944349388; fWeightMatrix0to1[6][11] = 1.87719709659965; fWeightMatrix0to1[7][11] = 0.0486152508520241; fWeightMatrix0to1[8][11] = 0.591148860924962; fWeightMatrix0to1[9][11] = 0.910871896600218; fWeightMatrix0to1[10][11] = 1.82524614931785; fWeightMatrix0to1[11][11] = -0.102731662469376; fWeightMatrix0to1[12][11] = 1.2340319140195; fWeightMatrix0to1[13][11] = 0.542704761875138; fWeightMatrix0to1[0][12] = -0.887979578946781; fWeightMatrix0to1[1][12] = -0.243493411742818; fWeightMatrix0to1[2][12] = 1.61307007555572; fWeightMatrix0to1[3][12] = 1.2527388807377; fWeightMatrix0to1[4][12] = 0.1551613672947; fWeightMatrix0to1[5][12] = 1.71998979599093; fWeightMatrix0to1[6][12] = 1.10792365303984; fWeightMatrix0to1[7][12] = -2.11888518646672; fWeightMatrix0to1[8][12] = 0.642391295692462; fWeightMatrix0to1[9][12] = -0.100738785967977; fWeightMatrix0to1[10][12] = 0.221845115025422; fWeightMatrix0to1[11][12] = -1.20563280833172; fWeightMatrix0to1[12][12] = -2.02354388442231; fWeightMatrix0to1[13][12] = 0.668990634909273; fWeightMatrix0to1[0][13] = 1.76670107505098; fWeightMatrix0to1[1][13] = -0.137640626627834; fWeightMatrix0to1[2][13] = -0.469696571163241; fWeightMatrix0to1[3][13] = -0.988454870363874; fWeightMatrix0to1[4][13] = -0.604214217483157; fWeightMatrix0to1[5][13] = 0.941542816079766; fWeightMatrix0to1[6][13] = -8.15944811317272; fWeightMatrix0to1[7][13] = 5.8724472373575; fWeightMatrix0to1[8][13] = -1.21497154621965; fWeightMatrix0to1[9][13] = 2.84277764011448; fWeightMatrix0to1[10][13] = 1.60881142582793; fWeightMatrix0to1[11][13] = 0.739410623057269; fWeightMatrix0to1[12][13] = -6.02984445782473; fWeightMatrix0to1[13][13] = 0.349385237380432; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -3.10627560780416; fWeightMatrix1to2[1][0] = -0.166634458729502; fWeightMatrix1to2[2][0] = -0.091276766315153; fWeightMatrix1to2[3][0] = -1.63796678480323; fWeightMatrix1to2[4][0] = -0.694367071618321; fWeightMatrix1to2[5][0] = -0.655419392327571; fWeightMatrix1to2[6][0] = -0.533320224263592; fWeightMatrix1to2[7][0] = -2.30786340318138; fWeightMatrix1to2[8][0] = 0.732310172823305; fWeightMatrix1to2[9][0] = 1.40610795669379; fWeightMatrix1to2[10][0] = -1.25380291464578; fWeightMatrix1to2[11][0] = -0.325900182141926; fWeightMatrix1to2[12][0] = -0.0253156996022293; fWeightMatrix1to2[0][1] = 1.19454903038668; fWeightMatrix1to2[1][1] = 0.186629342529534; fWeightMatrix1to2[2][1] = -0.194302300073229; fWeightMatrix1to2[3][1] = -2.07187842377253; fWeightMatrix1to2[4][1] = -0.922841009464415; fWeightMatrix1to2[5][1] = 1.37355352085783; fWeightMatrix1to2[6][1] = 1.11986070177941; fWeightMatrix1to2[7][1] = 0.659440937689285; fWeightMatrix1to2[8][1] = 0.0894878542488058; fWeightMatrix1to2[9][1] = 2.79763373618991; fWeightMatrix1to2[10][1] = -1.80367892953002; fWeightMatrix1to2[11][1] = -1.2902710607335; fWeightMatrix1to2[12][1] = -0.315053811273184; fWeightMatrix1to2[0][2] = -1.50896798240165; fWeightMatrix1to2[1][2] = 0.841857779778997; fWeightMatrix1to2[2][2] = 1.67459916415689; fWeightMatrix1to2[3][2] = -1.28241122788787; fWeightMatrix1to2[4][2] = 0.785753602443922; fWeightMatrix1to2[5][2] = -0.096091970906831; fWeightMatrix1to2[6][2] = -1.78644906460329; fWeightMatrix1to2[7][2] = -1.11193891651445; fWeightMatrix1to2[8][2] = 0.963295272696027; fWeightMatrix1to2[9][2] = -0.106045689873081; fWeightMatrix1to2[10][2] = 0.876803024080962; fWeightMatrix1to2[11][2] = -1.15105368250964; fWeightMatrix1to2[12][2] = 1.62042667766698; fWeightMatrix1to2[0][3] = -1.74461624170666; fWeightMatrix1to2[1][3] = 0.682620479101955; fWeightMatrix1to2[2][3] = 0.0572696247893098; fWeightMatrix1to2[3][3] = -1.55422815745142; fWeightMatrix1to2[4][3] = -1.31026638086208; fWeightMatrix1to2[5][3] = -1.8742076755879; fWeightMatrix1to2[6][3] = -0.3811479230925; fWeightMatrix1to2[7][3] = -1.88661470179286; fWeightMatrix1to2[8][3] = -1.62311339911851; fWeightMatrix1to2[9][3] = -1.72015439528033; fWeightMatrix1to2[10][3] = 0.143591406028686; fWeightMatrix1to2[11][3] = -0.983133921516612; fWeightMatrix1to2[12][3] = -1.60844728415827; fWeightMatrix1to2[0][4] = 0.190786327418679; fWeightMatrix1to2[1][4] = -1.55119408202062; fWeightMatrix1to2[2][4] = -0.343132490945559; fWeightMatrix1to2[3][4] = -1.28078917256178; fWeightMatrix1to2[4][4] = 0.218658596363254; fWeightMatrix1to2[5][4] = 1.02826483707203; fWeightMatrix1to2[6][4] = 0.765421027637003; fWeightMatrix1to2[7][4] = -1.11726194071087; fWeightMatrix1to2[8][4] = -2.08320358064464; fWeightMatrix1to2[9][4] = -1.067993848029; fWeightMatrix1to2[10][4] = -1.45348107750568; fWeightMatrix1to2[11][4] = -1.51872427708187; fWeightMatrix1to2[12][4] = -0.896637170249612; fWeightMatrix1to2[0][5] = -0.707972988797471; fWeightMatrix1to2[1][5] = 0.660715066989334; fWeightMatrix1to2[2][5] = -0.828741309944517; fWeightMatrix1to2[3][5] = 0.012593893957648; fWeightMatrix1to2[4][5] = -0.522603463402475; fWeightMatrix1to2[5][5] = -2.89451644185691; fWeightMatrix1to2[6][5] = 0.794692572315666; fWeightMatrix1to2[7][5] = 0.904883601695916; fWeightMatrix1to2[8][5] = -1.61543751933231; fWeightMatrix1to2[9][5] = -0.859260905818097; fWeightMatrix1to2[10][5] = 0.312184836816919; fWeightMatrix1to2[11][5] = 0.962899232772109; fWeightMatrix1to2[12][5] = -1.10418419978026; fWeightMatrix1to2[0][6] = -1.57342820466169; fWeightMatrix1to2[1][6] = -0.14278369124706; fWeightMatrix1to2[2][6] = 1.18695199293427; fWeightMatrix1to2[3][6] = 1.49740979375705; fWeightMatrix1to2[4][6] = 1.02781838597874; fWeightMatrix1to2[5][6] = -2.05015712391698; fWeightMatrix1to2[6][6] = 0.246626484193078; fWeightMatrix1to2[7][6] = -0.202761573266097; fWeightMatrix1to2[8][6] = -2.06044444242267; fWeightMatrix1to2[9][6] = -5.05750785600888; fWeightMatrix1to2[10][6] = -2.06525846138573; fWeightMatrix1to2[11][6] = -1.62385686885905; fWeightMatrix1to2[12][6] = -2.20500818923813; fWeightMatrix1to2[0][7] = 1.61242196404721; fWeightMatrix1to2[1][7] = -2.96233026050562; fWeightMatrix1to2[2][7] = 1.4216088472479; fWeightMatrix1to2[3][7] = -2.31448053636407; fWeightMatrix1to2[4][7] = 1.13147991065701; fWeightMatrix1to2[5][7] = -0.782424536124387; fWeightMatrix1to2[6][7] = 0.174157202487603; fWeightMatrix1to2[7][7] = 0.384363028602759; fWeightMatrix1to2[8][7] = -2.148615188582; fWeightMatrix1to2[9][7] = 4.80375497315194; fWeightMatrix1to2[10][7] = -0.251631886546574; fWeightMatrix1to2[11][7] = -1.85704751646422; fWeightMatrix1to2[12][7] = -2.11427126537063; fWeightMatrix1to2[0][8] = -1.73365246778725; fWeightMatrix1to2[1][8] = -0.841904114129791; fWeightMatrix1to2[2][8] = 1.85387261670693; fWeightMatrix1to2[3][8] = -0.583473409913468; fWeightMatrix1to2[4][8] = 0.27003416856283; fWeightMatrix1to2[5][8] = -0.917787458502261; fWeightMatrix1to2[6][8] = -1.21440113238258; fWeightMatrix1to2[7][8] = -1.74568143348922; fWeightMatrix1to2[8][8] = -0.998123245540387; fWeightMatrix1to2[9][8] = -1.25726295988592; fWeightMatrix1to2[10][8] = 0.81516923106252; fWeightMatrix1to2[11][8] = -1.36309213808659; fWeightMatrix1to2[12][8] = -1.21909448964456; fWeightMatrix1to2[0][9] = 1.89199724946487; fWeightMatrix1to2[1][9] = -0.836402389587974; fWeightMatrix1to2[2][9] = -0.720680775657927; fWeightMatrix1to2[3][9] = 1.32653237919382; fWeightMatrix1to2[4][9] = -1.7621634171376; fWeightMatrix1to2[5][9] = 1.55813657809999; fWeightMatrix1to2[6][9] = 1.12354661906439; fWeightMatrix1to2[7][9] = -0.948529350574689; fWeightMatrix1to2[8][9] = -1.26419244230579; fWeightMatrix1to2[9][9] = 2.56198960371361; fWeightMatrix1to2[10][9] = -0.761664454689076; fWeightMatrix1to2[11][9] = 0.815755343252925; fWeightMatrix1to2[12][9] = 1.16772640105067; fWeightMatrix1to2[0][10] = 0.127294744655809; fWeightMatrix1to2[1][10] = 0.721324587599663; fWeightMatrix1to2[2][10] = -1.75196923194707; fWeightMatrix1to2[3][10] = 1.07200533199872; fWeightMatrix1to2[4][10] = -1.93727254308726; fWeightMatrix1to2[5][10] = -0.103466939947555; fWeightMatrix1to2[6][10] = -2.22092492891763; fWeightMatrix1to2[7][10] = -1.9809194896259; fWeightMatrix1to2[8][10] = -0.719131860065193; fWeightMatrix1to2[9][10] = 1.31944300965691; fWeightMatrix1to2[10][10] = -0.00582646484595008; fWeightMatrix1to2[11][10] = 0.885284705385833; fWeightMatrix1to2[12][10] = -0.336226110274363; fWeightMatrix1to2[0][11] = -1.13489190110261; fWeightMatrix1to2[1][11] = -1.92912527895627; fWeightMatrix1to2[2][11] = -0.453546689722132; fWeightMatrix1to2[3][11] = 1.41077860979778; fWeightMatrix1to2[4][11] = -1.57320947808629; fWeightMatrix1to2[5][11] = -0.666355153097004; fWeightMatrix1to2[6][11] = -2.06776092169749; fWeightMatrix1to2[7][11] = -0.859799564950984; fWeightMatrix1to2[8][11] = -0.731732626993293; fWeightMatrix1to2[9][11] = 1.05069261173303; fWeightMatrix1to2[10][11] = -1.85883792949011; fWeightMatrix1to2[11][11] = 0.903462197366655; fWeightMatrix1to2[12][11] = 0.116452460482027; fWeightMatrix1to2[0][12] = -0.42299413031173; fWeightMatrix1to2[1][12] = -0.502722851506614; fWeightMatrix1to2[2][12] = -1.24608880732198; fWeightMatrix1to2[3][12] = -0.898950142217729; fWeightMatrix1to2[4][12] = -1.77046872279162; fWeightMatrix1to2[5][12] = -0.184556727841078; fWeightMatrix1to2[6][12] = -1.27920263937844; fWeightMatrix1to2[7][12] = -1.17435040890578; fWeightMatrix1to2[8][12] = 0.189738572066856; fWeightMatrix1to2[9][12] = -6.24134213873258; fWeightMatrix1to2[10][12] = -0.614742658213285; fWeightMatrix1to2[11][12] = -1.95915467351771; fWeightMatrix1to2[12][12] = 0.293332841646203; fWeightMatrix1to2[0][13] = -1.94570282952709; fWeightMatrix1to2[1][13] = -2.48768598272321; fWeightMatrix1to2[2][13] = -0.92945854721881; fWeightMatrix1to2[3][13] = -0.939979070649916; fWeightMatrix1to2[4][13] = -1.59807748799076; fWeightMatrix1to2[5][13] = 0.591013772166534; fWeightMatrix1to2[6][13] = -1.56518117050441; fWeightMatrix1to2[7][13] = -1.35318864963869; fWeightMatrix1to2[8][13] = 1.15744990077861; fWeightMatrix1to2[9][13] = 1.01655787507361; fWeightMatrix1to2[10][13] = 0.193802793037939; fWeightMatrix1to2[11][13] = 0.00358679203257185; fWeightMatrix1to2[12][13] = -0.310328333735411; fWeightMatrix1to2[0][14] = 0.594961286344004; fWeightMatrix1to2[1][14] = -0.779396964712319; fWeightMatrix1to2[2][14] = -1.5173939198773; fWeightMatrix1to2[3][14] = -0.959920219230902; fWeightMatrix1to2[4][14] = -1.73780773519672; fWeightMatrix1to2[5][14] = -0.141827945899992; fWeightMatrix1to2[6][14] = 0.241729601378008; fWeightMatrix1to2[7][14] = -0.270691889494659; fWeightMatrix1to2[8][14] = 0.26221802501706; fWeightMatrix1to2[9][14] = 1.46471594940908; fWeightMatrix1to2[10][14] = 0.876480000974422; fWeightMatrix1to2[11][14] = -1.55782248681099; fWeightMatrix1to2[12][14] = 0.0481827861142911; // weight matrix from layer 2 to 3 fWeightMatrix2to3[0][0] = -0.436908905307327; fWeightMatrix2to3[0][1] = 0.611619716038239; fWeightMatrix2to3[0][2] = 0.32057647353563; fWeightMatrix2to3[0][3] = -0.804572091341853; fWeightMatrix2to3[0][4] = 0.913064282581505; fWeightMatrix2to3[0][5] = 0.498736961012823; fWeightMatrix2to3[0][6] = 0.645987605937985; fWeightMatrix2to3[0][7] = 1.57548381875926; fWeightMatrix2to3[0][8] = 0.631722616002673; fWeightMatrix2to3[0][9] = 0.824869930169458; fWeightMatrix2to3[0][10] = 0.137241420906595; fWeightMatrix2to3[0][11] = 0.510796874242758; fWeightMatrix2to3[0][12] = 0.404013473837409; fWeightMatrix2to3[0][13] = -0.0709615679918301; } inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector& inputValues ) const { if (inputValues.size() != (unsigned int)fLayerSize[0]-1) { std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl; return 0; } for (int l=0; l