// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6  [198662]
ROOT Release   : 5.26/00  [334336]
Creator        : stdenis
Date           : Sun Jan 1 09:50:36 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run20572/job400
Training events: 46092

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
RandomSeed: "1" [Random Number Seed for TRandom3]
RandomFile: "None" [Random Number input file for TRandom3]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                  Ht                  'F'  [48.2055778503,708.240539551]
LepAPt              LepAPt              'F'  [20.0001125336,183.946380615]
LepBPt              LepBPt              'F'  [10.0002183914,70.5175170898]
MetSigLeptonsJets   MetSigLeptonsJets   'F'  [1.13823473454,17.7251911163]
MetSpec             MetSpec             'F'  [15.0000524521,214.544067383]
SumEtLeptonsJets    SumEtLeptonsJets    'F'  [30.1484737396,511.516204834]
VSumJetLeptonsPt    VSumJetLeptonsPt    'F'  [1.29604113102,257.11505127]
addEt               addEt               'F'  [48.1082038879,354.9012146]
dPhiLepSumMet       dPhiLepSumMet       'F'  [0.000599204504397,3.14159059525]
dPhiLeptons         dPhiLeptons         'F'  [3.91006469727e-05,1.14321351051]
dRLeptons           dRLeptons           'F'  [0.200031235814,1.14349746704]
lep1_E              lep1_E              'F'  [20.0144271851,232.717926025]
lep2_E              lep2_E              'F'  [10.0051202774,122.221923828]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 13 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets", "MetSpec",
                                  "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet",
                                  "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str()
                      << " != " << inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0]  = 48.2055778503418;
      fVmax[0]  = 708.240539550781;
      fVmin[1]  = 20.0001125335693;
      fVmax[1]  = 183.946380615234;
      fVmin[2]  = 10.0002183914185;
      fVmax[2]  = 70.5175170898438;
      fVmin[3]  = 1.13823473453522;
      fVmax[3]  = 17.725191116333;
      fVmin[4]  = 15.0000524520874;
      fVmax[4]  = 214.544067382812;
      fVmin[5]  = 30.148473739624;
      fVmax[5]  = 511.516204833984;
      fVmin[6]  = 1.29604113101959;
      fVmax[6]  = 257.115051269531;
      fVmin[7]  = 48.1082038879395;
      fVmax[7]  = 354.901214599609;
      fVmin[8]  = 0.000599204504396766;
      fVmax[8]  = 3.14159059524536;
      fVmin[9]  = 3.91006469726562e-05;
      fVmax[9]  = 1.14321351051331;
      fVmin[10] = 0.200031235814095;
      fVmax[10] = 1.14349746704102;
      fVmin[11] = 20.0144271850586;
      fVmax[11] = 232.717926025391;
      fVmin[12] = 10.0051202774048;
      fVmax[12] = 122.221923828125;

      // initialize input variable types
      fType[0]  = 'F';
      fType[1]  = 'F';
      fType[2]  = 'F';
      fType[3]  = 'F';
      fType[4]  = 'F';
      fType[5]  = 'F';
      fType[6]  = 'F';
      fType[7]  = 'F';
      fType[8]  = 'F';
      fType[9]  = 'F';
      fType[10] = 'F';
      fType[11] = 'F';
      fType[12] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }
   double fVmin[13];
   double fVmax[13];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[13];

   // initialize internal variables
   void Initialize();

   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.393668996168818;
   fWeightMatrix0to1[1][0] = 2.31538211746849;
   fWeightMatrix0to1[2][0] = -0.362663893693422;
   fWeightMatrix0to1[3][0] = 1.61619886866628;
   fWeightMatrix0to1[4][0] = -1.20194788642762;
   fWeightMatrix0to1[5][0] = -0.956009956386112;
   fWeightMatrix0to1[6][0] = -0.753372623549554;
   fWeightMatrix0to1[7][0] = 1.38003580491104;
   fWeightMatrix0to1[8][0] = -1.68939484264793;
   fWeightMatrix0to1[9][0] = -0.779029979686507;
   fWeightMatrix0to1[10][0] = -0.969720087194179;
   fWeightMatrix0to1[11][0] = 0.10591926569969;
   fWeightMatrix0to1[12][0] = -1.64760113436515;
   fWeightMatrix0to1[13][0] = 0.0816835943247058;
   fWeightMatrix0to1[0][1] = -0.596876440303206;
   fWeightMatrix0to1[1][1] = 0.93731988892423;
   fWeightMatrix0to1[2][1] = -4.08209589341596;
   fWeightMatrix0to1[3][1] = 0.980677038666716;
   fWeightMatrix0to1[4][1] = 0.473194113403086;
   fWeightMatrix0to1[5][1] = 1.97890764275888;
   fWeightMatrix0to1[6][1] = -1.48320298951325;
   fWeightMatrix0to1[7][1] = 5.29244478441589;
   fWeightMatrix0to1[8][1] = 0.649318290835994;
   fWeightMatrix0to1[9][1] = 0.387088963566608;
   fWeightMatrix0to1[10][1] = -1.48849877155543;
   fWeightMatrix0to1[11][1] = 2.21506043992563;
   fWeightMatrix0to1[12][1] = 0.653452216714729;
   fWeightMatrix0to1[13][1] = -0.255709179174727;
   fWeightMatrix0to1[0][2] = -1.98082125553081;
   fWeightMatrix0to1[1][2] = 0.402759904419471;
   fWeightMatrix0to1[2][2] = -1.13423559879746;
   fWeightMatrix0to1[3][2] = 0.983911028052074;
   fWeightMatrix0to1[4][2] = 0.423838798755388;
   fWeightMatrix0to1[5][2] = 1.29559279216357;
   fWeightMatrix0to1[6][2] = -0.675577181609799;
   fWeightMatrix0to1[7][2] = 2.80463821947088;
   fWeightMatrix0to1[8][2] = -1.3562665557437;
   fWeightMatrix0to1[9][2] = 2.22518902049681;
   fWeightMatrix0to1[10][2] = -2.91540814874624;
   fWeightMatrix0to1[11][2] = 4.43748099871435;
   fWeightMatrix0to1[12][2] = 0.641863871937198;
   fWeightMatrix0to1[13][2] = 2.02675030603532;
   fWeightMatrix0to1[0][3] = 1.98326497561394;
   fWeightMatrix0to1[1][3] = 1.62504221052196;
   fWeightMatrix0to1[2][3] = 2.18826499909822;
   fWeightMatrix0to1[3][3] = -2.23644705265848;
   fWeightMatrix0to1[4][3] = 0.530383826611452;
   fWeightMatrix0to1[5][3] = -0.279883076332335;
   fWeightMatrix0to1[6][3] = 4.2179313265465;
   fWeightMatrix0to1[7][3] = 0.589250706229577;
   fWeightMatrix0to1[8][3] = 1.23423708613142;
   fWeightMatrix0to1[9][3] = 2.09367219389432;
   fWeightMatrix0to1[10][3] = -1.8114825438494;
   fWeightMatrix0to1[11][3] = 2.24076184869705;
   fWeightMatrix0to1[12][3] = -1.90385321853856;
   fWeightMatrix0to1[13][3] = -0.959700351404197;
   fWeightMatrix0to1[0][4] = -1.21045259738659;
   fWeightMatrix0to1[1][4] = -1.3040727723372;
   fWeightMatrix0to1[2][4] = 1.38660474516187;
   fWeightMatrix0to1[3][4] = -0.233632197710521;
   fWeightMatrix0to1[4][4] = -0.58578518586368;
   fWeightMatrix0to1[5][4] = 0.471220518572243;
   fWeightMatrix0to1[6][4] =
0.766531683768821; fWeightMatrix0to1[7][4] = 1.2860388419788; fWeightMatrix0to1[8][4] = 1.65949595872327; fWeightMatrix0to1[9][4] = 0.479888851823384; fWeightMatrix0to1[10][4] = -0.0252396075779889; fWeightMatrix0to1[11][4] = 0.553508148260977; fWeightMatrix0to1[12][4] = 0.450980645358948; fWeightMatrix0to1[13][4] = -0.407925874673347; fWeightMatrix0to1[0][5] = -0.864118700702786; fWeightMatrix0to1[1][5] = -1.23294330807031; fWeightMatrix0to1[2][5] = 0.00216811884499258; fWeightMatrix0to1[3][5] = 1.43108230520134; fWeightMatrix0to1[4][5] = 1.98262097118475; fWeightMatrix0to1[5][5] = -0.00670009175167116; fWeightMatrix0to1[6][5] = -1.88266734067005; fWeightMatrix0to1[7][5] = -4.72013353197976; fWeightMatrix0to1[8][5] = 0.742139492110559; fWeightMatrix0to1[9][5] = 0.549721828278226; fWeightMatrix0to1[10][5] = 3.03735434875986; fWeightMatrix0to1[11][5] = -0.033644549591912; fWeightMatrix0to1[12][5] = 0.435402683867225; fWeightMatrix0to1[13][5] = -0.368250718925979; fWeightMatrix0to1[0][6] = -0.84013885747519; fWeightMatrix0to1[1][6] = 0.238089137143131; fWeightMatrix0to1[2][6] = -0.876416475654751; fWeightMatrix0to1[3][6] = -1.89116960857065; fWeightMatrix0to1[4][6] = -0.212208226438224; fWeightMatrix0to1[5][6] = -1.24955078987415; fWeightMatrix0to1[6][6] = -0.181599925282504; fWeightMatrix0to1[7][6] = 1.68513063798522; fWeightMatrix0to1[8][6] = 1.32902695587081; fWeightMatrix0to1[9][6] = -1.02600026935776; fWeightMatrix0to1[10][6] = -0.0421643508266767; fWeightMatrix0to1[11][6] = 1.95187014668484; fWeightMatrix0to1[12][6] = -0.964501448272923; fWeightMatrix0to1[13][6] = -1.03497306065334; fWeightMatrix0to1[0][7] = -1.36762822209908; fWeightMatrix0to1[1][7] = 0.711820360274937; fWeightMatrix0to1[2][7] = -7.74217650508782; fWeightMatrix0to1[3][7] = -3.6166065491817; fWeightMatrix0to1[4][7] = 1.90661971476466; fWeightMatrix0to1[5][7] = 3.09737556908895; fWeightMatrix0to1[6][7] = -3.35597137735707; fWeightMatrix0to1[7][7] = 13.9860869778397; fWeightMatrix0to1[8][7] = -0.998065421583785; fWeightMatrix0to1[9][7] = -0.0317114147129067; fWeightMatrix0to1[10][7] = -1.73163926876021; fWeightMatrix0to1[11][7] = 6.60537952753263; fWeightMatrix0to1[12][7] = -1.97254197902727; fWeightMatrix0to1[13][7] = 1.63768230643227; fWeightMatrix0to1[0][8] = 0.515449458977777; fWeightMatrix0to1[1][8] = 0.448507999183764; fWeightMatrix0to1[2][8] = 3.57712772025954; fWeightMatrix0to1[3][8] = 2.35557072997175; fWeightMatrix0to1[4][8] = -0.789014097014929; fWeightMatrix0to1[5][8] = -3.10538545387907; fWeightMatrix0to1[6][8] = 2.62008217984983; fWeightMatrix0to1[7][8] = -0.787158094007552; fWeightMatrix0to1[8][8] = -1.84057807404318; fWeightMatrix0to1[9][8] = 0.674103810154384; fWeightMatrix0to1[10][8] = -1.50669059959657; fWeightMatrix0to1[11][8] = 1.93392810724457; fWeightMatrix0to1[12][8] = 1.22299631701278; fWeightMatrix0to1[13][8] = -0.476908179885553; fWeightMatrix0to1[0][9] = -0.49748535558122; fWeightMatrix0to1[1][9] = 0.155501124550373; fWeightMatrix0to1[2][9] = -0.137608401750506; fWeightMatrix0to1[3][9] = 0.702128717859933; fWeightMatrix0to1[4][9] = 0.0352437769436797; fWeightMatrix0to1[5][9] = -0.444300322772822; fWeightMatrix0to1[6][9] = 0.260355456331578; fWeightMatrix0to1[7][9] = 1.23943568601778; fWeightMatrix0to1[8][9] = 0.26523437079606; fWeightMatrix0to1[9][9] = 1.30443458368064; fWeightMatrix0to1[10][9] = 3.03949701153651; fWeightMatrix0to1[11][9] = -0.058609555123427; fWeightMatrix0to1[12][9] = -0.0487944829615738; fWeightMatrix0to1[13][9] = 0.814192490783975; fWeightMatrix0to1[0][10] = 
1.46462567885475; fWeightMatrix0to1[1][10] = -0.236005641851429; fWeightMatrix0to1[2][10] = -1.63711331657551; fWeightMatrix0to1[3][10] = -0.21960581059329; fWeightMatrix0to1[4][10] = -3.19942295812903; fWeightMatrix0to1[5][10] = 0.572697677729271; fWeightMatrix0to1[6][10] = 0.338231794949744; fWeightMatrix0to1[7][10] = -0.580094195943779; fWeightMatrix0to1[8][10] = 0.471787273300134; fWeightMatrix0to1[9][10] = -0.0905411667047881; fWeightMatrix0to1[10][10] = -1.89167603024686; fWeightMatrix0to1[11][10] = 0.359296657104059; fWeightMatrix0to1[12][10] = 0.942275996449024; fWeightMatrix0to1[13][10] = -0.789968013768123; fWeightMatrix0to1[0][11] = -0.50731932953712; fWeightMatrix0to1[1][11] = 0.206057769776147; fWeightMatrix0to1[2][11] = 3.13408767771697; fWeightMatrix0to1[3][11] = 0.452700235277093; fWeightMatrix0to1[4][11] = 1.11203581452044; fWeightMatrix0to1[5][11] = -3.84981610633321; fWeightMatrix0to1[6][11] = 3.04762344314724; fWeightMatrix0to1[7][11] = 2.31908257700307; fWeightMatrix0to1[8][11] = 1.02854973490036; fWeightMatrix0to1[9][11] = 0.8294616270969; fWeightMatrix0to1[10][11] = 0.747325945180224; fWeightMatrix0to1[11][11] = -0.654746324221459; fWeightMatrix0to1[12][11] = -2.29028636594385; fWeightMatrix0to1[13][11] = 1.52840270271519; fWeightMatrix0to1[0][12] = -0.952637453229601; fWeightMatrix0to1[1][12] = -0.612292860411442; fWeightMatrix0to1[2][12] = 6.25893257879699; fWeightMatrix0to1[3][12] = 0.605225565345359; fWeightMatrix0to1[4][12] = 0.257599984298838; fWeightMatrix0to1[5][12] = 0.564708318375407; fWeightMatrix0to1[6][12] = 4.2341077730442; fWeightMatrix0to1[7][12] = -1.87142021981351; fWeightMatrix0to1[8][12] = 1.35481965314107; fWeightMatrix0to1[9][12] = 0.379010830724756; fWeightMatrix0to1[10][12] = 0.73714874800514; fWeightMatrix0to1[11][12] = -1.13428647896616; fWeightMatrix0to1[12][12] = -2.90336321051282; fWeightMatrix0to1[13][12] = 1.68645750041536; fWeightMatrix0to1[0][13] = 1.96949067515663; fWeightMatrix0to1[1][13] = -1.64455517452379; fWeightMatrix0to1[2][13] = -6.54904774167373; fWeightMatrix0to1[3][13] = -2.80713557562759; fWeightMatrix0to1[4][13] = 0.846648732164488; fWeightMatrix0to1[5][13] = 2.46327829094171; fWeightMatrix0to1[6][13] = -4.87502454963484; fWeightMatrix0to1[7][13] = 16.5813300492426; fWeightMatrix0to1[8][13] = -0.536550737929164; fWeightMatrix0to1[9][13] = 1.09863911897978; fWeightMatrix0to1[10][13] = -1.32488351889897; fWeightMatrix0to1[11][13] = 7.87269270888994; fWeightMatrix0to1[12][13] = 1.42359882387545; fWeightMatrix0to1[13][13] = 0.00177264159860801; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -3.30144304511489; fWeightMatrix1to2[1][0] = -0.794773772595184; fWeightMatrix1to2[2][0] = -0.891992644402911; fWeightMatrix1to2[3][0] = -1.50247248795266; fWeightMatrix1to2[4][0] = -0.738063948733524; fWeightMatrix1to2[5][0] = -1.35372484846167; fWeightMatrix1to2[6][0] = -0.82097179362675; fWeightMatrix1to2[7][0] = -2.46219366748594; fWeightMatrix1to2[8][0] = 0.180987940882679; fWeightMatrix1to2[9][0] = -0.491100802062979; fWeightMatrix1to2[10][0] = -1.39024516966096; fWeightMatrix1to2[11][0] = -0.164624496855477; fWeightMatrix1to2[12][0] = 0.534076551087223; fWeightMatrix1to2[0][1] = 1.10839862225373; fWeightMatrix1to2[1][1] = -0.0462447853293143; fWeightMatrix1to2[2][1] = -0.861855453463894; fWeightMatrix1to2[3][1] = -1.98268945544444; fWeightMatrix1to2[4][1] = -0.906717171056723; fWeightMatrix1to2[5][1] = 1.64457976467896; fWeightMatrix1to2[6][1] = 1.37335159970298; fWeightMatrix1to2[7][1] = 0.819999246497921; 
fWeightMatrix1to2[8][1] = 0.172120353288658; fWeightMatrix1to2[9][1] = 1.93719859062324; fWeightMatrix1to2[10][1] = -1.70274718764078; fWeightMatrix1to2[11][1] = -1.30252808586892; fWeightMatrix1to2[12][1] = -0.322939031620834; fWeightMatrix1to2[0][2] = -1.65530654136442; fWeightMatrix1to2[1][2] = 2.29679758488098; fWeightMatrix1to2[2][2] = 0.915175841940668; fWeightMatrix1to2[3][2] = -1.05856727761521; fWeightMatrix1to2[4][2] = 0.104454810025184; fWeightMatrix1to2[5][2] = -0.663161128895012; fWeightMatrix1to2[6][2] = -2.16361592873279; fWeightMatrix1to2[7][2] = -1.38381797068121; fWeightMatrix1to2[8][2] = -0.187060834900688; fWeightMatrix1to2[9][2] = -0.424336646219595; fWeightMatrix1to2[10][2] = -0.0602290778725912; fWeightMatrix1to2[11][2] = -1.27209325063146; fWeightMatrix1to2[12][2] = 3.62027620504772; fWeightMatrix1to2[0][3] = -1.70485554686013; fWeightMatrix1to2[1][3] = 2.08096906318497; fWeightMatrix1to2[2][3] = 0.220818852302584; fWeightMatrix1to2[3][3] = -1.44880632928097; fWeightMatrix1to2[4][3] = -1.31840143692739; fWeightMatrix1to2[5][3] = -2.10712243491194; fWeightMatrix1to2[6][3] = -0.620316454862815; fWeightMatrix1to2[7][3] = -1.86893510215876; fWeightMatrix1to2[8][3] = -1.7614886423705; fWeightMatrix1to2[9][3] = -1.91374131921907; fWeightMatrix1to2[10][3] = 0.574119420891097; fWeightMatrix1to2[11][3] = -0.818095896992939; fWeightMatrix1to2[12][3] = -1.82852739022145; fWeightMatrix1to2[0][4] = 0.463703069850637; fWeightMatrix1to2[1][4] = -1.86625142713935; fWeightMatrix1to2[2][4] = -2.04477560208186; fWeightMatrix1to2[3][4] = -1.23893059749783; fWeightMatrix1to2[4][4] = 0.274193531425756; fWeightMatrix1to2[5][4] = 0.138526870894344; fWeightMatrix1to2[6][4] = 0.741815317635203; fWeightMatrix1to2[7][4] = -0.765019966422366; fWeightMatrix1to2[8][4] = -3.24803415270133; fWeightMatrix1to2[9][4] = -1.7400055933031; fWeightMatrix1to2[10][4] = -1.87645748171315; fWeightMatrix1to2[11][4] = -1.90352949648566; fWeightMatrix1to2[12][4] = 1.77362458380791; fWeightMatrix1to2[0][5] = -1.09362349286045; fWeightMatrix1to2[1][5] = -0.426865429616894; fWeightMatrix1to2[2][5] = -0.309457221604289; fWeightMatrix1to2[3][5] = 0.0331154975240098; fWeightMatrix1to2[4][5] = -0.775210228162975; fWeightMatrix1to2[5][5] = -3.05453077594706; fWeightMatrix1to2[6][5] = 0.373197919291743; fWeightMatrix1to2[7][5] = 0.905179212314185; fWeightMatrix1to2[8][5] = -1.74400439056483; fWeightMatrix1to2[9][5] = -1.5147850969033; fWeightMatrix1to2[10][5] = 0.218484282910158; fWeightMatrix1to2[11][5] = 1.29741303094726; fWeightMatrix1to2[12][5] = -2.74578334388545; fWeightMatrix1to2[0][6] = -1.69355855835261; fWeightMatrix1to2[1][6] = 2.84677666494093; fWeightMatrix1to2[2][6] = 1.65153930657829; fWeightMatrix1to2[3][6] = 1.06512858331359; fWeightMatrix1to2[4][6] = 0.81651724319652; fWeightMatrix1to2[5][6] = -0.767010359361784; fWeightMatrix1to2[6][6] = -0.712254117199236; fWeightMatrix1to2[7][6] = -0.436173857665085; fWeightMatrix1to2[8][6] = -1.32358794678436; fWeightMatrix1to2[9][6] = -1.42690595899608; fWeightMatrix1to2[10][6] = -2.80017087308328; fWeightMatrix1to2[11][6] = -1.5946846266082; fWeightMatrix1to2[12][6] = -2.40080809924442; fWeightMatrix1to2[0][7] = 1.61975012577731; fWeightMatrix1to2[1][7] = -7.64064770829106; fWeightMatrix1to2[2][7] = 3.84160248363092; fWeightMatrix1to2[3][7] = -1.63748209879418; fWeightMatrix1to2[4][7] = 1.62264741499246; fWeightMatrix1to2[5][7] = -2.31028571201281; fWeightMatrix1to2[6][7] = 0.632525699691477; fWeightMatrix1to2[7][7] = 0.584104448128504; fWeightMatrix1to2[8][7] = 
-2.85308280101568; fWeightMatrix1to2[9][7] = 0.757743733675772; fWeightMatrix1to2[10][7] = 0.314194718308571; fWeightMatrix1to2[11][7] = -2.11386101246244; fWeightMatrix1to2[12][7] = -8.69843677822614; fWeightMatrix1to2[0][8] = -1.88787720465055; fWeightMatrix1to2[1][8] = -1.47066870275416; fWeightMatrix1to2[2][8] = 0.893454286553759; fWeightMatrix1to2[3][8] = -0.558095335399848; fWeightMatrix1to2[4][8] = 0.286861030262749; fWeightMatrix1to2[5][8] = -0.937124775206943; fWeightMatrix1to2[6][8] = -1.22169499913621; fWeightMatrix1to2[7][8] = -1.74582743108598; fWeightMatrix1to2[8][8] = -0.989696852287099; fWeightMatrix1to2[9][8] = -1.5106606267158; fWeightMatrix1to2[10][8] = 0.73274912504963; fWeightMatrix1to2[11][8] = -1.34180999304295; fWeightMatrix1to2[12][8] = -1.43943194183508; fWeightMatrix1to2[0][9] = 1.62446670931869; fWeightMatrix1to2[1][9] = -0.271839315178891; fWeightMatrix1to2[2][9] = -1.49036779661226; fWeightMatrix1to2[3][9] = 1.73490339828169; fWeightMatrix1to2[4][9] = -1.84517769258636; fWeightMatrix1to2[5][9] = 1.27048081158875; fWeightMatrix1to2[6][9] = 0.885195944243891; fWeightMatrix1to2[7][9] = -0.995665692796682; fWeightMatrix1to2[8][9] = -0.931524030407868; fWeightMatrix1to2[9][9] = -0.222915599560796; fWeightMatrix1to2[10][9] = -1.25778313222846; fWeightMatrix1to2[11][9] = 0.397070055803978; fWeightMatrix1to2[12][9] = 1.80445644215195; fWeightMatrix1to2[0][10] = -0.0934292859642727; fWeightMatrix1to2[1][10] = 0.965301429529877; fWeightMatrix1to2[2][10] = -2.73173628731525; fWeightMatrix1to2[3][10] = 0.779682928062457; fWeightMatrix1to2[4][10] = -1.85216033799918; fWeightMatrix1to2[5][10] = -0.696857794086288; fWeightMatrix1to2[6][10] = -2.48953005676853; fWeightMatrix1to2[7][10] = -1.80771506522609; fWeightMatrix1to2[8][10] = -2.18517477564911; fWeightMatrix1to2[9][10] = 0.518116495306831; fWeightMatrix1to2[10][10] = -0.0882766903678553; fWeightMatrix1to2[11][10] = 1.26077750631163; fWeightMatrix1to2[12][10] = 1.84794472063792; fWeightMatrix1to2[0][11] = -1.00135214962131; fWeightMatrix1to2[1][11] = -4.9940345566496; fWeightMatrix1to2[2][11] = 3.94318573776618; fWeightMatrix1to2[3][11] = 1.77217292111127; fWeightMatrix1to2[4][11] = -1.49501879646813; fWeightMatrix1to2[5][11] = -1.31258881587754; fWeightMatrix1to2[6][11] = -1.73787524009052; fWeightMatrix1to2[7][11] = -0.748070985349337; fWeightMatrix1to2[8][11] = -1.2245827631216; fWeightMatrix1to2[9][11] = -0.059297608692315; fWeightMatrix1to2[10][11] = -1.66546902722928; fWeightMatrix1to2[11][11] = 0.797634297438196; fWeightMatrix1to2[12][11] = -3.92876536784029; fWeightMatrix1to2[0][12] = -0.324335937114326; fWeightMatrix1to2[1][12] = -1.90469728460596; fWeightMatrix1to2[2][12] = -3.07528189158928; fWeightMatrix1to2[3][12] = -1.65444011412318; fWeightMatrix1to2[4][12] = -1.70642889771284; fWeightMatrix1to2[5][12] = -0.55429672591064; fWeightMatrix1to2[6][12] = -1.22424082453751; fWeightMatrix1to2[7][12] = -1.50949411056611; fWeightMatrix1to2[8][12] = -0.836657854508148; fWeightMatrix1to2[9][12] = -2.66633450284678; fWeightMatrix1to2[10][12] = -1.48806532156586; fWeightMatrix1to2[11][12] = -1.66335253082389; fWeightMatrix1to2[12][12] = -0.251250615247164; fWeightMatrix1to2[0][13] = -1.6195476875716; fWeightMatrix1to2[1][13] = -2.42868131597235; fWeightMatrix1to2[2][13] = -0.748045802022971; fWeightMatrix1to2[3][13] = -0.905513849720732; fWeightMatrix1to2[4][13] = -1.69600353553704; fWeightMatrix1to2[5][13] = 0.695630809766367; fWeightMatrix1to2[6][13] = -1.68477610153552; fWeightMatrix1to2[7][13] = -1.01493761052038; 
   fWeightMatrix1to2[8][13] = 1.36607179416625;
   fWeightMatrix1to2[9][13] = 0.190517651309713;
   fWeightMatrix1to2[10][13] = -0.0419253793561937;
   fWeightMatrix1to2[11][13] = -0.0288897098252667;
   fWeightMatrix1to2[12][13] = 0.532296866428927;
   fWeightMatrix1to2[0][14] = 0.398353762472191;
   fWeightMatrix1to2[1][14] = -1.40712466589736;
   fWeightMatrix1to2[2][14] = -2.76244830097371;
   fWeightMatrix1to2[3][14] = -0.837502211244635;
   fWeightMatrix1to2[4][14] = -1.78560908192768;
   fWeightMatrix1to2[5][14] = -0.762351103594297;
   fWeightMatrix1to2[6][14] = 0.0236995928951709;
   fWeightMatrix1to2[7][14] = -0.302821588579824;
   fWeightMatrix1to2[8][14] = -0.281195343224177;
   fWeightMatrix1to2[9][14] = -0.363919259333822;
   fWeightMatrix1to2[10][14] = 0.762511572506654;
   fWeightMatrix1to2[11][14] = -1.37008975995906;
   fWeightMatrix1to2[12][14] = 0.668430023254252;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = 0.87480204165932;
   fWeightMatrix2to3[0][1] = 0.0624636928736754;
   fWeightMatrix2to3[0][2] = -1.4393419834102;
   fWeightMatrix2to3[0][3] = -0.871203481230533;
   fWeightMatrix2to3[0][4] = 0.95012463875937;
   fWeightMatrix2to3[0][5] = -1.0054399758322;
   fWeightMatrix2to3[0][6] = 0.960936821249894;
   fWeightMatrix2to3[0][7] = 1.58592289764837;
   fWeightMatrix2to3[0][8] = -1.54266213013056;
   fWeightMatrix2to3[0][9] = 0.619302686304196;
   fWeightMatrix2to3[0][10] = 1.06733421905966;
   fWeightMatrix2to3[0][11] = 0.785863827355706;
   fWeightMatrix2to3[0][12] = -0.796094231112534;
   fWeightMatrix2to3[0][13] = 0.750287056272859;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l
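
// ---------------------------------------------------------------------------
// Usage sketch: a minimal, hypothetical example of how a standalone TMVA
// reader class like this one is typically driven. The variable names passed
// to the constructor must match the training variables in the same order;
// GetMvaValue() is then called with the corresponding values for each event.
// The numeric values below are placeholders, not real event data, so this is
// kept inside a comment rather than compiled with the class.
/*
#include <string>
#include <vector>
#include <iostream>

int main()
{
   // training variables, in the order expected by the constructor
   const char* names[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets",
                           "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt",
                           "addEt", "dPhiLepSumMet", "dPhiLeptons",
                           "dRLeptons", "lep1_E", "lep2_E" };
   std::vector<std::string> inputVars( names, names + 13 );

   ReadH6AONN5MEMLP reader( inputVars );

   // one event's input values (placeholder numbers within the training ranges)
   double vals[] = { 120.0, 45.0, 20.0, 3.5, 40.0, 90.0, 60.0,
                     80.0, 1.2, 0.4, 0.8, 50.0, 25.0 };
   std::vector<double> event( vals, vals + 13 );

   std::cout << "MVA response: " << reader.GetMvaValue( event ) << std::endl;
   return 0;
}
*/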