// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6  [198662]
ROOT Release   : 5.26/00  [334336]
Creator        : stdenis
Date           : Sun Jan 1 10:42:30 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run20572/job414
Training events: 39628

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V:                  "False"      [Verbose mode]
NCycles:            "1000"       [Number of training cycles]
HiddenLayers:       "N+1,N"      [Specification of hidden layer architecture]
# Default:
D:                  "False"      [use-decorrelated-variables flag (deprecated)]
Normalise:          "True"       [Normalise input variables]
VarTransform:       "None"       [Variable transformation method]
VarTransformType:   "Signal"     [Use signal or background events for var transform]
NbinsMVAPdf:        "60"         [Number of bins used to create MVA PDF]
NsmoothMVAPdf:      "2"          [Number of smoothing iterations for MVA PDF]
VerboseLevel:       "Info"       [Verbosity level]
H:                  "False"      [Print classifier-specific help message]
CreateMVAPdfs:      "False"      [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True"       [if True, write all weights as text files]
NeuronType:         "sigmoid"    [Neuron activation function type]
NeuronInputType:    "sum"        [Neuron input function type]
RandomSeed:         "1"          [Random Number Seed for TRandom3]
RandomFile:         "None"       [Random Number input file for TRandom3]
TrainingMethod:     "BP"         [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate:       "0.02"       [ANN learning rate parameter]
DecayRate:          "0.01"       [Decay rate for learning parameter]
TestRate:           "10"         [Test for overtraining performed at each #th epochs]
BPMode:             "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize:          "-1"         [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                 Ht                 'F'  [48.899105072,787.769287109]
LepAPt             LepAPt             'F'  [20.0004863739,162.429443359]
LepBPt             LepBPt             'F'  [10.0008592606,71.1104202271]
MetSigLeptonsJets  MetSigLeptonsJets  'F'  [1.06155312061,21.7385215759]
MetSpec            MetSpec            'F'  [15.0050735474,274.839141846]
SumEtLeptonsJets   SumEtLeptonsJets   'F'  [30.1484737396,469.726501465]
VSumJetLeptonsPt   VSumJetLeptonsPt   'F'  [1.09349226952,329.840057373]
addEt              addEt              'F'  [48.899105072,391.964660645]
dPhiLepSumMet      dPhiLepSumMet      'F'  [0.005448944401,3.14159226418]
dPhiLeptons        dPhiLeptons        'F'  [1.13248825073e-05,1.10251176357]
dRLeptons          dRLeptons          'F'  [0.200022801757,1.11529541016]
lep1_E             lep1_E             'F'  [20.0075798035,206.098724365]
lep2_E             lep2_E             'F'  [10.0028533936,107.376251221]

============================================================================ */

#include <vector>
#include <string>
#include <cmath>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {

 public:

   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif
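// The reader below is normally driven from separate user code. A minimal
// usage sketch is given here for orientation; the main() driver is an
// illustration only (it is not part of the generated class), and the event
// values would come from the analysis ntuple. Note that GetMvaValue()
// expects the raw, un-normalised inputs: the class normalises internally
// because Normalise = "True" above.
//
//    #include <string>
//    #include <vector>
//    #include <iostream>
//
//    int main() {
//       // variable names, in exactly the order used for training
//       std::vector<std::string> names;
//       names.push_back("Ht");
//       names.push_back("LepAPt");
//       names.push_back("LepBPt");
//       names.push_back("MetSigLeptonsJets");
//       names.push_back("MetSpec");
//       names.push_back("SumEtLeptonsJets");
//       names.push_back("VSumJetLeptonsPt");
//       names.push_back("addEt");
//       names.push_back("dPhiLepSumMet");
//       names.push_back("dPhiLeptons");
//       names.push_back("dRLeptons");
//       names.push_back("lep1_E");
//       names.push_back("lep2_E");
//
//       ReadH6AONN5MEMLP reader( names );
//
//       std::vector<double> event( 13, 0.0 );   // fill with one event's values
//       std::cout << "MLP response: " << reader.GetMvaValue( event ) << std::endl;
//       return 0;
//    }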
"MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 48.8991050720215; fVmax[0] = 787.769287109375; fVmin[1] = 20.0004863739014; fVmax[1] = 162.429443359375; fVmin[2] = 10.0008592605591; fVmax[2] = 71.1104202270508; fVmin[3] = 1.0615531206131; fVmax[3] = 21.7385215759277; fVmin[4] = 15.0050735473633; fVmax[4] = 274.839141845703; fVmin[5] = 30.148473739624; fVmax[5] = 469.726501464844; fVmin[6] = 1.09349226951599; fVmax[6] = 329.840057373047; fVmin[7] = 48.8991050720215; fVmax[7] = 391.964660644531; fVmin[8] = 0.00544894440099597; fVmax[8] = 3.14159226417542; fVmin[9] = 1.13248825073242e-05; fVmax[9] = 1.10251176357269; fVmin[10] = 0.200022801756859; fVmax[10] = 1.11529541015625; fVmin[11] = 20.0075798034668; fVmax[11] = 206.098724365234; fVmin[12] = 10.0028533935547; fVmax[12] = 107.376251220703; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void Initialize(); double 
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.152439552186457;
   fWeightMatrix0to1[1][0] = 2.18079479256545;
   fWeightMatrix0to1[2][0] = 0.791031296210526;
   fWeightMatrix0to1[3][0] = 1.46557475747692;
   fWeightMatrix0to1[4][0] = -1.56184225756949;
   fWeightMatrix0to1[5][0] = -1.36514829178838;
   fWeightMatrix0to1[6][0] = -0.434384388537805;
   fWeightMatrix0to1[7][0] = 1.79764536075083;
   fWeightMatrix0to1[8][0] = -1.54499140286548;
   fWeightMatrix0to1[9][0] = -0.867949716497269;
   fWeightMatrix0to1[10][0] = -1.14285343910601;
   fWeightMatrix0to1[11][0] = -0.147458000735048;
   fWeightMatrix0to1[12][0] = -1.524937602283;
   fWeightMatrix0to1[13][0] = -1.05962628148263;
   fWeightMatrix0to1[0][1] = -0.421231563026552;
   fWeightMatrix0to1[1][1] = 0.745915450725967;
   fWeightMatrix0to1[2][1] = -0.522997381089249;
   fWeightMatrix0to1[3][1] = 0.91034841757329;
   fWeightMatrix0to1[4][1] = 2.56389724284252;
   fWeightMatrix0to1[5][1] = 1.30912852419215;
   fWeightMatrix0to1[6][1] = -2.29715685206532;
   fWeightMatrix0to1[7][1] = -0.0161364893516969;
   fWeightMatrix0to1[8][1] = 0.638030931636167;
   fWeightMatrix0to1[9][1] = 2.29306287773082;
   fWeightMatrix0to1[10][1] = -1.26799364033589;
   fWeightMatrix0to1[11][1] = 0.244624150884799;
   fWeightMatrix0to1[12][1] = 1.14462411076747;
   fWeightMatrix0to1[13][1] = -1.95297734183742;
   fWeightMatrix0to1[0][2] = -1.72338750288555;
   fWeightMatrix0to1[1][2] = 0.396278330103262;
   fWeightMatrix0to1[2][2] = 0.565420035200165;
   fWeightMatrix0to1[3][2] = -0.565301323053809;
   fWeightMatrix0to1[4][2] = 0.959195660460204;
   fWeightMatrix0to1[5][2] = -1.02410097160426;
   fWeightMatrix0to1[6][2] = -2.40359641517643;
   fWeightMatrix0to1[7][2] = 0.404877500469581;
   fWeightMatrix0to1[8][2] = -1.83697766479691;
   fWeightMatrix0to1[9][2] = 4.14628852145469;
   fWeightMatrix0to1[10][2] = -1.97280115688359;
   fWeightMatrix0to1[11][2] = 1.31834245759115;
   fWeightMatrix0to1[12][2] = 0.875252057090126;
   fWeightMatrix0to1[13][2] = -0.212961560873203;
   fWeightMatrix0to1[0][3] = 2.02710240093432;
   fWeightMatrix0to1[1][3] = 1.36548018703085;
   fWeightMatrix0to1[2][3] = -0.83209618862926;
   fWeightMatrix0to1[3][3] = -2.6604658796241;
   fWeightMatrix0to1[4][3] = 0.329036264748641;
   fWeightMatrix0to1[5][3] = 0.60402380993694;
   fWeightMatrix0to1[6][3] = 3.68731375103314;
   fWeightMatrix0to1[7][3] = 1.71852218405228;
   fWeightMatrix0to1[8][3] = 1.32203025789325;
   fWeightMatrix0to1[9][3] = 1.2538734331792;
   fWeightMatrix0to1[10][3] = -1.23642671192143;
   fWeightMatrix0to1[11][3] = 1.12482065180133;
   fWeightMatrix0to1[12][3] = -2.00011525042135;
   fWeightMatrix0to1[13][3] = -1.39485196752492;
   fWeightMatrix0to1[0][4] = -1.08399405480056;
   fWeightMatrix0to1[1][4] = -1.63835045794743;
   fWeightMatrix0to1[2][4] = 1.46289364821765;
   fWeightMatrix0to1[3][4] = -0.391122460996191;
   fWeightMatrix0to1[4][4] = -0.856835274306764;
   fWeightMatrix0to1[5][4] = 1.05525779503417;
   fWeightMatrix0to1[6][4] = -0.348228631032701;
fWeightMatrix0to1[7][4] = 1.75980960535051; fWeightMatrix0to1[8][4] = 1.71576631625087; fWeightMatrix0to1[9][4] = 0.531182438950925; fWeightMatrix0to1[10][4] = 0.468047729462358; fWeightMatrix0to1[11][4] = -1.04085380708566; fWeightMatrix0to1[12][4] = 0.540648654304222; fWeightMatrix0to1[13][4] = -1.58861116605653; fWeightMatrix0to1[0][5] = -0.577279953026339; fWeightMatrix0to1[1][5] = -1.23110990289471; fWeightMatrix0to1[2][5] = 0.642618992655834; fWeightMatrix0to1[3][5] = 1.11814375788264; fWeightMatrix0to1[4][5] = 1.89188116012266; fWeightMatrix0to1[5][5] = -0.392109057805278; fWeightMatrix0to1[6][5] = -1.89647150539494; fWeightMatrix0to1[7][5] = -2.27986339467037; fWeightMatrix0to1[8][5] = 0.888120654554429; fWeightMatrix0to1[9][5] = 0.655929797546082; fWeightMatrix0to1[10][5] = 2.42084478887018; fWeightMatrix0to1[11][5] = 0.981563334828013; fWeightMatrix0to1[12][5] = 0.706525580422921; fWeightMatrix0to1[13][5] = -1.39840378768272; fWeightMatrix0to1[0][6] = -0.692437992502043; fWeightMatrix0to1[1][6] = -0.121219267424442; fWeightMatrix0to1[2][6] = 1.17035128664446; fWeightMatrix0to1[3][6] = -2.89733847201603; fWeightMatrix0to1[4][6] = 0.605638166003461; fWeightMatrix0to1[5][6] = -1.60850818210668; fWeightMatrix0to1[6][6] = -2.60835446491966; fWeightMatrix0to1[7][6] = 1.53355766201092; fWeightMatrix0to1[8][6] = 1.67370035568222; fWeightMatrix0to1[9][6] = 0.891145118085795; fWeightMatrix0to1[10][6] = -0.890094471764891; fWeightMatrix0to1[11][6] = 2.23163352236111; fWeightMatrix0to1[12][6] = -1.03917433107152; fWeightMatrix0to1[13][6] = -2.10102996541098; fWeightMatrix0to1[0][7] = -1.25128925355959; fWeightMatrix0to1[1][7] = 0.248473936383797; fWeightMatrix0to1[2][7] = -2.05000239437913; fWeightMatrix0to1[3][7] = -5.32196449879343; fWeightMatrix0to1[4][7] = 4.26972461291065; fWeightMatrix0to1[5][7] = 1.42301795908168; fWeightMatrix0to1[6][7] = -5.93073333958047; fWeightMatrix0to1[7][7] = 4.40938288346109; fWeightMatrix0to1[8][7] = -0.93543531015328; fWeightMatrix0to1[9][7] = 3.98740619140761; fWeightMatrix0to1[10][7] = -0.669908055838929; fWeightMatrix0to1[11][7] = 0.909766126197712; fWeightMatrix0to1[12][7] = -1.88135509937643; fWeightMatrix0to1[13][7] = -0.802353838741631; fWeightMatrix0to1[0][8] = 0.159728006170076; fWeightMatrix0to1[1][8] = 0.167516927227825; fWeightMatrix0to1[2][8] = -1.48656495325727; fWeightMatrix0to1[3][8] = 3.48367558649619; fWeightMatrix0to1[4][8] = 1.32057903311509; fWeightMatrix0to1[5][8] = -1.09712328279773; fWeightMatrix0to1[6][8] = 2.04961770691039; fWeightMatrix0to1[7][8] = -1.19799681530899; fWeightMatrix0to1[8][8] = -1.59898367027062; fWeightMatrix0to1[9][8] = 1.99675326566539; fWeightMatrix0to1[10][8] = -1.07360209806507; fWeightMatrix0to1[11][8] = 1.62873424330093; fWeightMatrix0to1[12][8] = 0.964687191828404; fWeightMatrix0to1[13][8] = 0.20813776594297; fWeightMatrix0to1[0][9] = -0.354327582631219; fWeightMatrix0to1[1][9] = 0.451534923571479; fWeightMatrix0to1[2][9] = -2.07668588058219; fWeightMatrix0to1[3][9] = 0.726115411129646; fWeightMatrix0to1[4][9] = 1.12689499268904; fWeightMatrix0to1[5][9] = -1.51254034799266; fWeightMatrix0to1[6][9] = -0.825534259541042; fWeightMatrix0to1[7][9] = 0.308197951206449; fWeightMatrix0to1[8][9] = -0.235588186345832; fWeightMatrix0to1[9][9] = -0.207943773419894; fWeightMatrix0to1[10][9] = 1.9574536094828; fWeightMatrix0to1[11][9] = -2.5088919246989; fWeightMatrix0to1[12][9] = -0.183803755005573; fWeightMatrix0to1[13][9] = 0.870204698651245; fWeightMatrix0to1[0][10] = 1.56882504140597; fWeightMatrix0to1[1][10] = 
0.335660933550778;
   fWeightMatrix0to1[2][10] = -1.58525478085447;
   fWeightMatrix0to1[3][10] = -0.553090948832041;
   fWeightMatrix0to1[4][10] = -0.562892309286924;
   fWeightMatrix0to1[5][10] = 1.73007518172328;
   fWeightMatrix0to1[6][10] = -0.969222308900282;
   fWeightMatrix0to1[7][10] = -1.49348247158269;
   fWeightMatrix0to1[8][10] = -0.622736058603812;
   fWeightMatrix0to1[9][10] = 0.299510384338769;
   fWeightMatrix0to1[10][10] = -1.36645004246045;
   fWeightMatrix0to1[11][10] = 0.40303900849332;
   fWeightMatrix0to1[12][10] = 1.26156043865059;
   fWeightMatrix0to1[13][10] = -0.791872999488422;
   fWeightMatrix0to1[0][11] = -0.448008349028369;
   fWeightMatrix0to1[1][11] = 0.3111201468093;
   fWeightMatrix0to1[2][11] = 0.83753519022392;
   fWeightMatrix0to1[3][11] = 0.829649441819839;
   fWeightMatrix0to1[4][11] = 1.56577790446854;
   fWeightMatrix0to1[5][11] = -1.18818821807556;
   fWeightMatrix0to1[6][11] = 3.4363298509452;
   fWeightMatrix0to1[7][11] = 0.724436934831439;
   fWeightMatrix0to1[8][11] = 0.444014908573085;
   fWeightMatrix0to1[9][11] = 0.815193168694489;
   fWeightMatrix0to1[10][11] = 1.57202496972573;
   fWeightMatrix0to1[11][11] = -0.359442887830592;
   fWeightMatrix0to1[12][11] = -0.984448254863713;
   fWeightMatrix0to1[13][11] = 0.424308080751598;
   fWeightMatrix0to1[0][12] = -0.774305338621929;
   fWeightMatrix0to1[1][12] = -0.494628660504796;
   fWeightMatrix0to1[2][12] = 1.7267196273359;
   fWeightMatrix0to1[3][12] = 0.0920769090076305;
   fWeightMatrix0to1[4][12] = -0.10860034877007;
   fWeightMatrix0to1[5][12] = 1.75027080091799;
   fWeightMatrix0to1[6][12] = 1.81331418049058;
   fWeightMatrix0to1[7][12] = -1.38345415484846;
   fWeightMatrix0to1[8][12] = 0.37048172728514;
   fWeightMatrix0to1[9][12] = 1.81046071247136;
   fWeightMatrix0to1[10][12] = 0.161899747511968;
   fWeightMatrix0to1[11][12] = -1.50460457133028;
   fWeightMatrix0to1[12][12] = -1.99897129156981;
   fWeightMatrix0to1[13][12] = 0.435809996538409;
   fWeightMatrix0to1[0][13] = 1.66149465233425;
   fWeightMatrix0to1[1][13] = -1.91889855383697;
   fWeightMatrix0to1[2][13] = -0.198394740329347;
   fWeightMatrix0to1[3][13] = -5.66618108104114;
   fWeightMatrix0to1[4][13] = 4.81185328394036;
   fWeightMatrix0to1[5][13] = 0.952940763031861;
   fWeightMatrix0to1[6][13] = -9.43763957567097;
   fWeightMatrix0to1[7][13] = 4.33057790544463;
   fWeightMatrix0to1[8][13] = -0.500528500476801;
   fWeightMatrix0to1[9][13] = 8.05470402822451;
   fWeightMatrix0to1[10][13] = -0.362464033111804;
   fWeightMatrix0to1[11][13] = 1.51942640567297;
   fWeightMatrix0to1[12][13] = 1.00703699373798;
   fWeightMatrix0to1[13][13] = -0.69772231190871;
   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -3.12787285409358;
   fWeightMatrix1to2[1][0] = 0.1367107200382;
   fWeightMatrix1to2[2][0] = -0.0326713996690597;
   fWeightMatrix1to2[3][0] = -1.64430288181875;
   fWeightMatrix1to2[4][0] = -0.691756049548955;
   fWeightMatrix1to2[5][0] = -0.462788758038252;
   fWeightMatrix1to2[6][0] = -0.664400022898371;
   fWeightMatrix1to2[7][0] = -1.43051674251916;
   fWeightMatrix1to2[8][0] = 0.606740541716986;
   fWeightMatrix1to2[9][0] = 0.148005897541379;
   fWeightMatrix1to2[10][0] = -1.21328924284122;
   fWeightMatrix1to2[11][0] = -0.436006352044658;
   fWeightMatrix1to2[12][0] = -0.122495989882759;
   fWeightMatrix1to2[0][1] = 1.19422170027039;
   fWeightMatrix1to2[1][1] = -0.0266831781922975;
   fWeightMatrix1to2[2][1] = -0.220965648227146;
   fWeightMatrix1to2[3][1] = -1.9163441848335;
   fWeightMatrix1to2[4][1] = -0.923571586218113;
   fWeightMatrix1to2[5][1] = 1.79720893978085;
   fWeightMatrix1to2[6][1] = 1.30808449081963;
   fWeightMatrix1to2[7][1] = 0.779862150879022;
   fWeightMatrix1to2[8][1] = 0.161328989209591;
fWeightMatrix1to2[9][1] = 1.91373704356273; fWeightMatrix1to2[10][1] = -1.6521741697889; fWeightMatrix1to2[11][1] = -1.26403636348076; fWeightMatrix1to2[12][1] = -0.188676042653716; fWeightMatrix1to2[0][2] = -1.22352661883923; fWeightMatrix1to2[1][2] = 1.06894700859972; fWeightMatrix1to2[2][2] = 1.67341943194195; fWeightMatrix1to2[3][2] = -1.23400657685263; fWeightMatrix1to2[4][2] = 0.789710107157542; fWeightMatrix1to2[5][2] = -0.34611154089812; fWeightMatrix1to2[6][2] = -1.77959037518899; fWeightMatrix1to2[7][2] = -1.35194219176202; fWeightMatrix1to2[8][2] = 1.09924799991188; fWeightMatrix1to2[9][2] = -0.137200808010733; fWeightMatrix1to2[10][2] = 0.901787476483927; fWeightMatrix1to2[11][2] = -1.14430371327042; fWeightMatrix1to2[12][2] = 1.67246455333558; fWeightMatrix1to2[0][3] = -1.75587395997867; fWeightMatrix1to2[1][3] = 4.38854732247668; fWeightMatrix1to2[2][3] = 0.925558178785458; fWeightMatrix1to2[3][3] = -1.6662526162909; fWeightMatrix1to2[4][3] = -1.31350863202838; fWeightMatrix1to2[5][3] = -1.57072282859189; fWeightMatrix1to2[6][3] = -0.76348616516508; fWeightMatrix1to2[7][3] = -1.86227033517865; fWeightMatrix1to2[8][3] = -1.79873077457926; fWeightMatrix1to2[9][3] = -1.89343146101035; fWeightMatrix1to2[10][3] = -0.154470020263966; fWeightMatrix1to2[11][3] = -1.22222556613966; fWeightMatrix1to2[12][3] = -2.73581927908241; fWeightMatrix1to2[0][4] = 0.949611281449322; fWeightMatrix1to2[1][4] = -4.91239685126414; fWeightMatrix1to2[2][4] = -0.431005803479942; fWeightMatrix1to2[3][4] = -1.24088981635954; fWeightMatrix1to2[4][4] = 0.214821747887121; fWeightMatrix1to2[5][4] = 0.943070175328502; fWeightMatrix1to2[6][4] = 0.941076540884255; fWeightMatrix1to2[7][4] = -0.918896849012684; fWeightMatrix1to2[8][4] = -2.0147193437971; fWeightMatrix1to2[9][4] = -1.54982157665389; fWeightMatrix1to2[10][4] = -0.596714323350211; fWeightMatrix1to2[11][4] = -1.49971924535204; fWeightMatrix1to2[12][4] = -0.376538862277041; fWeightMatrix1to2[0][5] = -0.68814479442689; fWeightMatrix1to2[1][5] = 0.919127097168252; fWeightMatrix1to2[2][5] = -0.767052343127794; fWeightMatrix1to2[3][5] = 0.0321185368632067; fWeightMatrix1to2[4][5] = -0.529497414276476; fWeightMatrix1to2[5][5] = -2.85926308405785; fWeightMatrix1to2[6][5] = 0.518828309351065; fWeightMatrix1to2[7][5] = 1.47649989481706; fWeightMatrix1to2[8][5] = -1.63755408839567; fWeightMatrix1to2[9][5] = -1.75976182227058; fWeightMatrix1to2[10][5] = 0.214857648710998; fWeightMatrix1to2[11][5] = 0.937949429038982; fWeightMatrix1to2[12][5] = -1.18521404355327; fWeightMatrix1to2[0][6] = -2.35131556481023; fWeightMatrix1to2[1][6] = 5.07061517652297; fWeightMatrix1to2[2][6] = 1.82659706133386; fWeightMatrix1to2[3][6] = 0.72261603971234; fWeightMatrix1to2[4][6] = 0.986682328864341; fWeightMatrix1to2[5][6] = -2.03375623449036; fWeightMatrix1to2[6][6] = -0.580755174172081; fWeightMatrix1to2[7][6] = 0.281998580127071; fWeightMatrix1to2[8][6] = -2.24157497853121; fWeightMatrix1to2[9][6] = -0.840628396791638; fWeightMatrix1to2[10][6] = -2.9116851261697; fWeightMatrix1to2[11][6] = -1.98541193271157; fWeightMatrix1to2[12][6] = -2.2107033110766; fWeightMatrix1to2[0][7] = 1.87992490718644; fWeightMatrix1to2[1][7] = -4.83523912340383; fWeightMatrix1to2[2][7] = 0.649028372767886; fWeightMatrix1to2[3][7] = -1.41238957022888; fWeightMatrix1to2[4][7] = 1.13252984214362; fWeightMatrix1to2[5][7] = -1.24457953465269; fWeightMatrix1to2[6][7] = 0.343359058458807; fWeightMatrix1to2[7][7] = 1.00345039727837; fWeightMatrix1to2[8][7] = -1.87948606945966; fWeightMatrix1to2[9][7] = 
-0.000682289145044944; fWeightMatrix1to2[10][7] = 0.764562179398619; fWeightMatrix1to2[11][7] = -1.76869269901281; fWeightMatrix1to2[12][7] = -2.10421280649458; fWeightMatrix1to2[0][8] = -1.66066303411733; fWeightMatrix1to2[1][8] = -1.40466592133981; fWeightMatrix1to2[2][8] = 1.63899913340403; fWeightMatrix1to2[3][8] = -0.545977042185538; fWeightMatrix1to2[4][8] = 0.275081750948815; fWeightMatrix1to2[5][8] = -1.25225647561285; fWeightMatrix1to2[6][8] = -1.17410723145719; fWeightMatrix1to2[7][8] = -1.8300735819148; fWeightMatrix1to2[8][8] = -1.00199890547339; fWeightMatrix1to2[9][8] = -1.44075311664262; fWeightMatrix1to2[10][8] = 0.871801699858399; fWeightMatrix1to2[11][8] = -1.35657542840781; fWeightMatrix1to2[12][8] = -1.23911862455263; fWeightMatrix1to2[0][9] = 3.03293211946455; fWeightMatrix1to2[1][9] = -5.61165441299675; fWeightMatrix1to2[2][9] = -1.38687002287624; fWeightMatrix1to2[3][9] = 1.76779883318484; fWeightMatrix1to2[4][9] = -1.79221837632117; fWeightMatrix1to2[5][9] = 2.12916015069966; fWeightMatrix1to2[6][9] = 0.884329180532474; fWeightMatrix1to2[7][9] = -0.693170532218105; fWeightMatrix1to2[8][9] = -1.25597211579543; fWeightMatrix1to2[9][9] = -0.0408636561765163; fWeightMatrix1to2[10][9] = 0.343899423093176; fWeightMatrix1to2[11][9] = 1.12906033733577; fWeightMatrix1to2[12][9] = 0.874579809305211; fWeightMatrix1to2[0][10] = -0.0383860468490221; fWeightMatrix1to2[1][10] = 1.42826062446321; fWeightMatrix1to2[2][10] = -1.79531741106538; fWeightMatrix1to2[3][10] = 0.756776546438759; fWeightMatrix1to2[4][10] = -1.92675831110257; fWeightMatrix1to2[5][10] = -0.397738562954895; fWeightMatrix1to2[6][10] = -2.17651228495509; fWeightMatrix1to2[7][10] = -1.68809565810555; fWeightMatrix1to2[8][10] = -1.02485142803126; fWeightMatrix1to2[9][10] = 0.555136242242051; fWeightMatrix1to2[10][10] = -0.652813836086494; fWeightMatrix1to2[11][10] = 0.618791709321964; fWeightMatrix1to2[12][10] = -1.08707674646477; fWeightMatrix1to2[0][11] = -1.11685561258707; fWeightMatrix1to2[1][11] = -1.54865725661029; fWeightMatrix1to2[2][11] = -0.0775181091216244; fWeightMatrix1to2[3][11] = 1.69894932274186; fWeightMatrix1to2[4][11] = -1.56072491581661; fWeightMatrix1to2[5][11] = -0.668566383126374; fWeightMatrix1to2[6][11] = -2.13144438536211; fWeightMatrix1to2[7][11] = -0.250918736714533; fWeightMatrix1to2[8][11] = -0.519042721400405; fWeightMatrix1to2[9][11] = -0.48598193698788; fWeightMatrix1to2[10][11] = -1.76747173669327; fWeightMatrix1to2[11][11] = 0.676449902156844; fWeightMatrix1to2[12][11] = 0.0159448970948474; fWeightMatrix1to2[0][12] = -0.625697564245071; fWeightMatrix1to2[1][12] = -0.667765839682345; fWeightMatrix1to2[2][12] = -1.29048659884632; fWeightMatrix1to2[3][12] = -1.80454065984463; fWeightMatrix1to2[4][12] = -1.73627945951494; fWeightMatrix1to2[5][12] = 0.0266291632543861; fWeightMatrix1to2[6][12] = -1.17574330367879; fWeightMatrix1to2[7][12] = -0.728235365602283; fWeightMatrix1to2[8][12] = -0.481284681936726; fWeightMatrix1to2[9][12] = -2.17417920174992; fWeightMatrix1to2[10][12] = -1.74462562045105; fWeightMatrix1to2[11][12] = -1.9674743343258; fWeightMatrix1to2[12][12] = -0.688154947579063; fWeightMatrix1to2[0][13] = -2.71283421360704; fWeightMatrix1to2[1][13] = -2.42148425030618; fWeightMatrix1to2[2][13] = -0.881820164673091; fWeightMatrix1to2[3][13] = -1.11908626120378; fWeightMatrix1to2[4][13] = -1.62012919785461; fWeightMatrix1to2[5][13] = 0.561715969863393; fWeightMatrix1to2[6][13] = -1.94354502784573; fWeightMatrix1to2[7][13] = -0.709873179299158; fWeightMatrix1to2[8][13] = 
0.889248183674207;
   fWeightMatrix1to2[9][13] = 0.260897699518087;
   fWeightMatrix1to2[10][13] = -0.272194456505819;
   fWeightMatrix1to2[11][13] = -0.238000658240805;
   fWeightMatrix1to2[12][13] = -0.360964816532286;
   fWeightMatrix1to2[0][14] = 0.640478970292694;
   fWeightMatrix1to2[1][14] = -0.640800174044537;
   fWeightMatrix1to2[2][14] = -1.47642218612492;
   fWeightMatrix1to2[3][14] = -0.966609436303077;
   fWeightMatrix1to2[4][14] = -1.73501602068977;
   fWeightMatrix1to2[5][14] = -0.175302917823886;
   fWeightMatrix1to2[6][14] = 0.175026800469408;
   fWeightMatrix1to2[7][14] = 0.392290644934597;
   fWeightMatrix1to2[8][14] = 0.151615083289458;
   fWeightMatrix1to2[9][14] = 0.241362941352202;
   fWeightMatrix1to2[10][14] = 0.916812297990016;
   fWeightMatrix1to2[11][14] = -1.64646031777023;
   fWeightMatrix1to2[12][14] = -0.0304071589248944;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.23076796572028;
   fWeightMatrix2to3[0][1] = -0.756424535604865;
   fWeightMatrix2to3[0][2] = -0.668155150359988;
   fWeightMatrix2to3[0][3] = 0.430613958947046;
   fWeightMatrix2to3[0][4] = 0.915495079083844;
   fWeightMatrix2to3[0][5] = 0.498551942159678;
   fWeightMatrix2to3[0][6] = -0.322401366188262;
   fWeightMatrix2to3[0][7] = 1.9343581635561;
   fWeightMatrix2to3[0][8] = -0.431822255300582;
   fWeightMatrix2to3[0][9] = 1.01893650925179;
   fWeightMatrix2to3[0][10] = 0.387043925160743;
   fWeightMatrix2to3[0][11] = 0.427990952559679;
   fWeightMatrix2to3[0][12] = -0.262190225993011;
   fWeightMatrix2to3[0][13] = 0.765485847903963;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l