// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6 [198662]
ROOT Release   : 5.10/00 [330240]
Creator        : thompson
Date           : Tue Jan 22 16:28:56 2008
Host           : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events: 22190

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables
*-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW          LRHWW          'F' [0,1]
LRWW           LRWW           'F' [0,1.001734972]
LRWg           LRWg           'F' [0,0.999952673912]
LRWj           LRWj           'F' [-0.0757067129016,1]
LRZZ           LRZZ           'F' [0,1]
Met            Met            'F' [15.1351919174,274.802154541]
MetDelPhi      MetDelPhi      'F' [0.143834546208,3.13630962372]
MetSpec        MetSpec        'F' [15.0015153885,188.142364502]
dPhiLeptons    dPhiLeptons    'F' [7.39097595215e-05,3.14087033272]
dRLeptons      dRLeptons      'F' [0.401803344488,4.43655967712]
dimass         dimass         'F' [16.0089683533,573.807128906]

============================================================================ */

// headers required by the generated reader
#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ", "Met", "MetDelPhi", "MetSpec", "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " <<
inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;
      fVmax[0] = 1;
      fVmin[1] = 0;
      fVmax[1] = 1.00173497200012;
      fVmin[2] = 0;
      fVmax[2] = 0.999952673912048;
      fVmin[3] = -0.0757067129015923;
      fVmax[3] = 1;
      fVmin[4] = 0;
      fVmax[4] = 1;
      fVmin[5] = 15.1351919174194;
      fVmax[5] = 274.802154541016;
      fVmin[6] = 0.143834546208382;
      fVmax[6] = 3.13630962371826;
      fVmin[7] = 15.0015153884888;
      fVmax[7] = 188.142364501953;
      fVmin[8] = 7.39097595214844e-05;
      fVmax[8] = 3.1408703327179;
      fVmin[9] = 0.401803344488144;
      fVmax[9] = 4.43655967712402;
      fVmin[10] = 16.0089683532715;
      fVmax[10] = 573.80712890625;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType(
int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }

   double fVmin[11];
   double fVmax[11];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
   fLayerSize[2] = 12; fWeights[2] = new double[12];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = 1.3655238820722;
   fWeightMatrix0to1[1][0] = 1.69674168523753;
   fWeightMatrix0to1[2][0] = 0.574523376341876;
   fWeightMatrix0to1[3][0] = 1.7584225618094;
   fWeightMatrix0to1[4][0] = 0.340262230979231;
   fWeightMatrix0to1[5][0] = 1.90168818385564;
   fWeightMatrix0to1[6][0] = 1.52921832876048;
   fWeightMatrix0to1[7][0] = -0.421399993971939;
   fWeightMatrix0to1[8][0] = 0.5377607992608;
   fWeightMatrix0to1[9][0] = 1.44905843446165;
   fWeightMatrix0to1[10][0] = 1.09605170741562;
   fWeightMatrix0to1[11][0] = 0.0780494267384143;
   fWeightMatrix0to1[0][1] = 1.34151155715329;
   fWeightMatrix0to1[1][1] = 0.79717932603909;
   fWeightMatrix0to1[2][1] = 0.960358150129187;
   fWeightMatrix0to1[3][1] = -1.79789989475666;
   fWeightMatrix0to1[4][1] = 1.1311882825473;
   fWeightMatrix0to1[5][1] = -0.166368454074899;
   fWeightMatrix0to1[6][1] =
-0.998597360632153;
   fWeightMatrix0to1[7][1] = 1.70647728035585;
   fWeightMatrix0to1[8][1] = 1.8383418470337;
   fWeightMatrix0to1[9][1] = 0.230557206714046;
   fWeightMatrix0to1[10][1] = 0.32893348323043;
   fWeightMatrix0to1[11][1] = -0.312993104801475;
   fWeightMatrix0to1[0][2] = 0.138874981911961;
   fWeightMatrix0to1[1][2] = 0.383123162624494;
   fWeightMatrix0to1[2][2] = 0.404325978769622;
   fWeightMatrix0to1[3][2] = -1.84881921229344;
   fWeightMatrix0to1[4][2] = 1.87478591338588;
   fWeightMatrix0to1[5][2] = -0.0321805134880711;
   fWeightMatrix0to1[6][2] = -0.173751544028926;
   fWeightMatrix0to1[7][2] = -0.23110400676349;
   fWeightMatrix0to1[8][2] = -0.496148250576307;
   fWeightMatrix0to1[9][2] = -0.602898503717735;
   fWeightMatrix0to1[10][2] = -0.660964431729048;
   fWeightMatrix0to1[11][2] = 0.930134168061604;
   fWeightMatrix0to1[0][3] = 0.860048175098089;
   fWeightMatrix0to1[1][3] = 0.536965391710769;
   fWeightMatrix0to1[2][3] = 0.812020265523432;
   fWeightMatrix0to1[3][3] = 1.83202372036783;
   fWeightMatrix0to1[4][3] = -0.998552632759102;
   fWeightMatrix0to1[5][3] = -0.170008062752894;
   fWeightMatrix0to1[6][3] = -0.379479603847301;
   fWeightMatrix0to1[7][3] = -2.11983226740107;
   fWeightMatrix0to1[8][3] = -0.892143316678764;
   fWeightMatrix0to1[9][3] = -1.13969739444068;
   fWeightMatrix0to1[10][3] = -0.351686402437062;
   fWeightMatrix0to1[11][3] = 0.431462180819865;
   fWeightMatrix0to1[0][4] = 0.138772183614904;
   fWeightMatrix0to1[1][4] = -0.0178651948325183;
   fWeightMatrix0to1[2][4] = 0.9301751277476;
   fWeightMatrix0to1[3][4] = -1.04017518118214;
   fWeightMatrix0to1[4][4] = -0.248549277170069;
   fWeightMatrix0to1[5][4] = -1.54543663240122;
   fWeightMatrix0to1[6][4] = -0.801716373179114;
   fWeightMatrix0to1[7][4] = 0.165240972253523;
   fWeightMatrix0to1[8][4] = -0.197443849704718;
   fWeightMatrix0to1[9][4] = 0.436441964719583;
   fWeightMatrix0to1[10][4] = -0.0321947258042463;
   fWeightMatrix0to1[11][4] = -0.514197420109861;
   fWeightMatrix0to1[0][5] = -0.618402654853365;
   fWeightMatrix0to1[1][5] = -0.313965049058966;
   fWeightMatrix0to1[2][5] = -2.53094669320195;
   fWeightMatrix0to1[3][5] = 0.69650253275276;
   fWeightMatrix0to1[4][5] = 0.710023069531411;
   fWeightMatrix0to1[5][5] = -1.8388172129349;
   fWeightMatrix0to1[6][5] = 1.20220708099028;
   fWeightMatrix0to1[7][5] = -0.582620464652825;
   fWeightMatrix0to1[8][5] = -0.172734966018485;
   fWeightMatrix0to1[9][5] = 1.04341364481047;
   fWeightMatrix0to1[10][5] = 0.171946552089645;
   fWeightMatrix0to1[11][5] = -0.978900993053251;
   fWeightMatrix0to1[0][6] = -1.41998577162993;
   fWeightMatrix0to1[1][6] = -0.865324896141606;
   fWeightMatrix0to1[2][6] = 1.49049458743851;
   fWeightMatrix0to1[3][6] = -0.58205653664835;
   fWeightMatrix0to1[4][6] = 0.00999447841616979;
   fWeightMatrix0to1[5][6] = -0.359986808590419;
   fWeightMatrix0to1[6][6] = -1.97764708062606;
   fWeightMatrix0to1[7][6] = 0.91997130876799;
   fWeightMatrix0to1[8][6] = 0.712223047462948;
   fWeightMatrix0to1[9][6] = 0.0546039711004311;
   fWeightMatrix0to1[10][6] = 2.06638947791944;
   fWeightMatrix0to1[11][6] = 0.641655852162619;
   fWeightMatrix0to1[0][7] = -0.321155806937596;
   fWeightMatrix0to1[1][7] = 0.17632242379608;
   fWeightMatrix0to1[2][7] = 1.33941984576778;
   fWeightMatrix0to1[3][7] = -0.379832480999721;
   fWeightMatrix0to1[4][7] = 1.11651118101258;
   fWeightMatrix0to1[5][7] = 0.0880201627195484;
   fWeightMatrix0to1[6][7] = 0.0465777921662186;
   fWeightMatrix0to1[7][7] = -0.808235103040409;
   fWeightMatrix0to1[8][7] = -0.379403188885857;
   fWeightMatrix0to1[9][7] = 1.87839296845768;
   fWeightMatrix0to1[10][7] = -0.656797467955032;
   fWeightMatrix0to1[11][7] = 2.2240940079751;
   fWeightMatrix0to1[0][8] = -0.180720710274282;
   fWeightMatrix0to1[1][8] = 0.131797993714753;
   fWeightMatrix0to1[2][8] = 2.47891134092949;
   fWeightMatrix0to1[3][8] = -2.08195701845977;
   fWeightMatrix0to1[4][8] = 0.302827653097163;
   fWeightMatrix0to1[5][8] = 0.433480248788263;
   fWeightMatrix0to1[6][8] = -0.284398050716946;
   fWeightMatrix0to1[7][8] = -0.234147524778565;
   fWeightMatrix0to1[8][8] = 2.10391883249321;
   fWeightMatrix0to1[9][8] = 0.441851800538043;
   fWeightMatrix0to1[10][8] = -0.160553231941192;
   fWeightMatrix0to1[11][8] = -1.30970323773329;
   fWeightMatrix0to1[0][9] = 0.324526752861528;
   fWeightMatrix0to1[1][9] = -2.46570322976323;
   fWeightMatrix0to1[2][9] = 2.15582104440723;
   fWeightMatrix0to1[3][9] = -1.16921072483682;
   fWeightMatrix0to1[4][9] = 0.511947399730409;
   fWeightMatrix0to1[5][9] = 0.288408301080512;
   fWeightMatrix0to1[6][9] = -1.50267898796417;
   fWeightMatrix0to1[7][9] = 0.726628749049921;
   fWeightMatrix0to1[8][9] = -0.983841116282634;
   fWeightMatrix0to1[9][9] = -2.36786964597525;
   fWeightMatrix0to1[10][9] = -1.48674707613788;
   fWeightMatrix0to1[11][9] = -0.213231803941346;
   fWeightMatrix0to1[0][10] = -0.267096577873847;
   fWeightMatrix0to1[1][10] = 1.6917044067354;
   fWeightMatrix0to1[2][10] = 1.11822619120366;
   fWeightMatrix0to1[3][10] = -1.83482498591703;
   fWeightMatrix0to1[4][10] = 0.418076236794588;
   fWeightMatrix0to1[5][10] = -1.87822126020697;
   fWeightMatrix0to1[6][10] = -1.84051367827486;
   fWeightMatrix0to1[7][10] = 0.84904046822622;
   fWeightMatrix0to1[8][10] = -0.407534480883594;
   fWeightMatrix0to1[9][10] = -0.250307503013924;
   fWeightMatrix0to1[10][10] = -1.27168982643953;
   fWeightMatrix0to1[11][10] = -1.30294960688802;
   fWeightMatrix0to1[0][11] = -1.34525557967546;
   fWeightMatrix0to1[1][11] = -1.29616161150223;
   fWeightMatrix0to1[2][11] = 0.013108073336852;
   fWeightMatrix0to1[3][11] = 1.12257273805554;
   fWeightMatrix0to1[4][11] = 1.71793663196078;
   fWeightMatrix0to1[5][11] = 1.60764601891807;
   fWeightMatrix0to1[6][11] = -0.869352492260657;
   fWeightMatrix0to1[7][11] = -1.44811057245234;
   fWeightMatrix0to1[8][11] = 2.00173352201059;
   fWeightMatrix0to1[9][11] = -2.44564475465293;
   fWeightMatrix0to1[10][11] = 0.0281534915388234;
   fWeightMatrix0to1[11][11] = -0.348513226009141;
   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -1.48568389190751;
   fWeightMatrix1to2[1][0] = 0.0865544745501835;
   fWeightMatrix1to2[2][0] = -0.877474644086871;
   fWeightMatrix1to2[3][0] = -0.898426729878528;
   fWeightMatrix1to2[4][0] =
-1.95564877365745;
   fWeightMatrix1to2[5][0] = 0.78342770934655;
   fWeightMatrix1to2[6][0] = 1.86858333599301;
   fWeightMatrix1to2[7][0] = 1.35587653357704;
   fWeightMatrix1to2[8][0] = 1.53430383102352;
   fWeightMatrix1to2[9][0] = -1.96812477941741;
   fWeightMatrix1to2[10][0] = -0.265968664824526;
   fWeightMatrix1to2[0][1] = 0.851693195063922;
   fWeightMatrix1to2[1][1] = -1.56421591381463;
   fWeightMatrix1to2[2][1] = -0.899739814142478;
   fWeightMatrix1to2[3][1] = -0.0766905274442043;
   fWeightMatrix1to2[4][1] = 1.26653893575492;
   fWeightMatrix1to2[5][1] = -2.47447551863817;
   fWeightMatrix1to2[6][1] = -0.712217185837311;
   fWeightMatrix1to2[7][1] = -0.212985876698859;
   fWeightMatrix1to2[8][1] = -1.38574963444086;
   fWeightMatrix1to2[9][1] = -1.24954209962979;
   fWeightMatrix1to2[10][1] = 0.0120689399094671;
   fWeightMatrix1to2[0][2] = 0.511833531590644;
   fWeightMatrix1to2[1][2] = -0.337118890700066;
   fWeightMatrix1to2[2][2] = -1.1355073334991;
   fWeightMatrix1to2[3][2] = -0.235659888111727;
   fWeightMatrix1to2[4][2] = 1.68108514679904;
   fWeightMatrix1to2[5][2] = -0.685131612587853;
   fWeightMatrix1to2[6][2] = 0.989522139998497;
   fWeightMatrix1to2[7][2] = -2.11464913975215;
   fWeightMatrix1to2[8][2] = 0.32762501876918;
   fWeightMatrix1to2[9][2] = -0.209525451175814;
   fWeightMatrix1to2[10][2] = -0.256493758071642;
   fWeightMatrix1to2[0][3] = -1.32462517053823;
   fWeightMatrix1to2[1][3] = 0.0105184998115805;
   fWeightMatrix1to2[2][3] = 1.10550640173464;
   fWeightMatrix1to2[3][3] = -2.07452287860907;
   fWeightMatrix1to2[4][3] = 0.636392753924104;
   fWeightMatrix1to2[5][3] = -0.137937602301336;
   fWeightMatrix1to2[6][3] = 1.29024911635958;
   fWeightMatrix1to2[7][3] = 0.31403168898666;
   fWeightMatrix1to2[8][3] = -1.82582336868445;
   fWeightMatrix1to2[9][3] = -2.06536618800439;
   fWeightMatrix1to2[10][3] = -0.302286695851031;
   fWeightMatrix1to2[0][4] = 0.360799606172474;
   fWeightMatrix1to2[1][4] = -0.249007203734692;
   fWeightMatrix1to2[2][4] = 0.170929509802295;
   fWeightMatrix1to2[3][4] = -0.975306072262131;
   fWeightMatrix1to2[4][4] =
0.21123068406585;
   fWeightMatrix1to2[5][4] = 0.989426941448279;
   fWeightMatrix1to2[6][4] = -0.948160032953359;
   fWeightMatrix1to2[7][4] = 0.134684055242846;
   fWeightMatrix1to2[8][4] = 1.05091553995128;
   fWeightMatrix1to2[9][4] = 0.648993793989391;
   fWeightMatrix1to2[10][4] = -0.494979060374095;
   fWeightMatrix1to2[0][5] = -0.613657304185895;
   fWeightMatrix1to2[1][5] = -0.12077137134161;
   fWeightMatrix1to2[2][5] = -0.383623548692438;
   fWeightMatrix1to2[3][5] = -0.671492344977729;
   fWeightMatrix1to2[4][5] = -2.06859581624932;
   fWeightMatrix1to2[5][5] = -0.839262100706651;
   fWeightMatrix1to2[6][5] = -0.745149115611615;
   fWeightMatrix1to2[7][5] = 0.348424187907204;
   fWeightMatrix1to2[8][5] = -1.95106337880307;
   fWeightMatrix1to2[9][5] = 1.4405306284749;
   fWeightMatrix1to2[10][5] = -0.747329329705143;
   fWeightMatrix1to2[0][6] = -1.25719549440212;
   fWeightMatrix1to2[1][6] = 1.00447433855651;
   fWeightMatrix1to2[2][6] = -1.24489786966564;
   fWeightMatrix1to2[3][6] = -1.88259474250087;
   fWeightMatrix1to2[4][6] = 0.985556911799408;
   fWeightMatrix1to2[5][6] = -1.58866248230843;
   fWeightMatrix1to2[6][6] = 0.441907421941218;
   fWeightMatrix1to2[7][6] = -2.48224768821585;
   fWeightMatrix1to2[8][6] = -0.974776358435205;
   fWeightMatrix1to2[9][6] = 0.726437930424975;
   fWeightMatrix1to2[10][6] = -3.27134664001373;
   fWeightMatrix1to2[0][7] = 1.0271017921172;
   fWeightMatrix1to2[1][7] = 0.185376051038649;
   fWeightMatrix1to2[2][7] = -0.97933705913719;
   fWeightMatrix1to2[3][7] = -1.28645262772079;
   fWeightMatrix1to2[4][7] = -1.12363161020693;
   fWeightMatrix1to2[5][7] = -0.611439671727062;
   fWeightMatrix1to2[6][7] = -0.637483014412254;
   fWeightMatrix1to2[7][7] = -1.88384273826102;
   fWeightMatrix1to2[8][7] = 0.0549023605341747;
   fWeightMatrix1to2[9][7] = -1.22245632386051;
   fWeightMatrix1to2[10][7] = 1.07744539174648;
   fWeightMatrix1to2[0][8] = -1.59037795151872;
   fWeightMatrix1to2[1][8] = -0.512840531677879;
   fWeightMatrix1to2[2][8] = 0.886872356665446;
   fWeightMatrix1to2[3][8] = 0.758036086015656;
   fWeightMatrix1to2[4][8] =
-1.31927003465093;
   fWeightMatrix1to2[5][8] = 1.8531962739199;
   fWeightMatrix1to2[6][8] = -2.37532792755869;
   fWeightMatrix1to2[7][8] = 0.322238960680337;
   fWeightMatrix1to2[8][8] = -0.526957596928218;
   fWeightMatrix1to2[9][8] = -1.95243451648882;
   fWeightMatrix1to2[10][8] = 0.928884873891484;
   fWeightMatrix1to2[0][9] = -0.425378642135783;
   fWeightMatrix1to2[1][9] = 1.20448784102091;
   fWeightMatrix1to2[2][9] = 1.24984503155885;
   fWeightMatrix1to2[3][9] = 0.078640504793885;
   fWeightMatrix1to2[4][9] = 1.01586778421514;
   fWeightMatrix1to2[5][9] = -1.28728503974155;
   fWeightMatrix1to2[6][9] = -1.27757732970517;
   fWeightMatrix1to2[7][9] = -0.815975065633659;
   fWeightMatrix1to2[8][9] = -0.562986186316554;
   fWeightMatrix1to2[9][9] = -1.24612244441362;
   fWeightMatrix1to2[10][9] = -0.910625292796999;
   fWeightMatrix1to2[0][10] = 0.905851765550299;
   fWeightMatrix1to2[1][10] = -0.852211692603032;
   fWeightMatrix1to2[2][10] = -2.04891225387201;
   fWeightMatrix1to2[3][10] = 1.26918858505613;
   fWeightMatrix1to2[4][10] = -1.22893394909703;
   fWeightMatrix1to2[5][10] = -0.702831998745524;
   fWeightMatrix1to2[6][10] = -1.30774254146412;
   fWeightMatrix1to2[7][10] = 0.353463195244114;
   fWeightMatrix1to2[8][10] = -0.587015526459941;
   fWeightMatrix1to2[9][10] = -0.813047989015593;
   fWeightMatrix1to2[10][10] = -0.971124875775387;
   fWeightMatrix1to2[0][11] = 0.754543255432467;
   fWeightMatrix1to2[1][11] = -0.585133462206116;
   fWeightMatrix1to2[2][11] = 0.0410131032182505;
   fWeightMatrix1to2[3][11] = -1.27358789154762;
   fWeightMatrix1to2[4][11] = -0.727638359901824;
   fWeightMatrix1to2[5][11] = -1.28468235065272;
   fWeightMatrix1to2[6][11] = 0.852150500629808;
   fWeightMatrix1to2[7][11] = 0.914264911276255;
   fWeightMatrix1to2[8][11] = -0.376043585358862;
   fWeightMatrix1to2[9][11] = 0.656867356127227;
   fWeightMatrix1to2[10][11] = 0.290470960479303;
   fWeightMatrix1to2[0][12] = -0.182680106855064;
   fWeightMatrix1to2[1][12] = -0.264682192087415;
   fWeightMatrix1to2[2][12] = -0.679372363764981;
   fWeightMatrix1to2[3][12] = -0.450895416293371;
   fWeightMatrix1to2[4][12] = 1.64341621899653;
   fWeightMatrix1to2[5][12] = 1.22829044875902;
   fWeightMatrix1to2[6][12] = -2.05618031618682;
   fWeightMatrix1to2[7][12] = -2.02590474668683;
   fWeightMatrix1to2[8][12] = 0.141629160320324;
   fWeightMatrix1to2[9][12] = -0.317248232563357;
   fWeightMatrix1to2[10][12] = 0.719446326715711;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.182045731064515;
   fWeightMatrix2to3[0][1] = -0.0521319959629289;
   fWeightMatrix2to3[0][2] = -0.938551651985426;
   fWeightMatrix2to3[0][3] = -0.452732730868741;
   fWeightMatrix2to3[0][4] = 0.519046062363469;
   fWeightMatrix2to3[0][5] = -0.97587865260127;
   fWeightMatrix2to3[0][6] = -1.45935159332204;
   fWeightMatrix2to3[0][7] = -0.60776438646775;
   fWeightMatrix2to3[0][8] = 0.801844125872954;
   fWeightMatrix2to3[0][9] = -0.235771319981548;
   fWeightMatrix2to3[0][10] = -0.457231031216493;
   fWeightMatrix2to3[0][11] = 1.0561226675435;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   // NOTE: the original file was truncated at this point; the remainder of
   // this function, ActivationFnc and Clear are reconstructed following the
   // standard TMVA 3.8 MakeClass output for a sigmoid MLP of this layout.
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i]=0;

   // the last node in each non-output layer is the bias node, fixed to 1
   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1]=1;

   for (int i=0; i<fLayerSize[0]-1; i++)
      fWeights[0][i]=inputValues[i];

   // layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++) {
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      }
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++) {
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      }
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // layer 2 to 3 (linear output node)
   for (int i=0; i<fLayerSize[2]; i++) {
      fWeights[3][0] += fWeightMatrix2to3[0][i] * fWeights[2][i];
   }

   return fWeights[3][0];
}

double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   // sigmoid (NeuronType: "sigmoid")
   return 1.0/(1.0+exp(-x));
}

// Clean up
inline void ReadH6AONN5MEMLP::Clear()
{
   // delete the layer buffers allocated in Initialize()
   for (int l = 0; l < 4; l++) {
      delete[] fWeights[l];
   }
}
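// ---------------------------------------------------------------------------
// Example usage (a minimal sketch, NOT part of the TMVA-generated code).
// The demo guard READH6AONN5MEMLP_DEMO and the event values below are
// hypothetical placeholders; real inputs must come from the analysis ntuple,
// in the training variable order and within the ranges listed in the header
// comment. Compile with -DREADH6AONN5MEMLP_DEMO to build the demo main().
#ifdef READH6AONN5MEMLP_DEMO
int main()
{
   // variable names must match the training order exactly
   const char* names[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ",
                           "Met", "MetDelPhi", "MetSpec",
                           "dPhiLeptons", "dRLeptons", "dimass" };
   std::vector<std::string> inputVars( names, names + 11 );

   ReadH6AONN5MEMLP reader( inputVars );

   // one hypothetical event, values chosen inside the training ranges
   double event[] = { 0.5, 0.5, 0.5, 0.5, 0.5, 60.0, 1.5, 60.0, 1.0, 2.0, 100.0 };
   std::vector<double> inputValues( event, event + 11 );

   std::cout << "MVA response: " << reader.GetMvaValue( inputValues ) << std::endl;
   return 0;
}
#endif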