// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6 [198662]
ROOT Release   : 5.10/00 [330240]
Creator        : thompson
Date           : Tue Jan 22 16:28:17 2008
Host           : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events: 21776

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##
#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables
*-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW        LRHWW        'F' [0,1]
LRWW         LRWW         'F' [0,1.001734972]
LRWg         LRWg         'F' [0,0.999911367893]
LRWj         LRWj         'F' [-0.0757067129016,1]
LRZZ         LRZZ         'F' [0,1]
Met          Met          'F' [15.0942249298,324.415008545]
MetDelPhi    MetDelPhi    'F' [0.118904195726,3.13655138016]
MetSpec      MetSpec      'F' [15.0046358109,192.753829956]
dPhiLeptons  dPhiLeptons  'F' [0.000216960906982,3.14020013809]
dRLeptons    dRLeptons    'F' [0.401400476694,4.23354482651]
dimass       dimass       'F' [16.0013523102,531.617553711]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ", "Met", "MetDelPhi", "MetSpec", "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() == 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " <<
                      inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;
      fVmax[0] = 1;
      fVmin[1] = 0;
      fVmax[1] = 1.00173497200012;
      fVmin[2] = 0;
      fVmax[2] = 0.999911367893219;
      fVmin[3] = -0.0757067129015923;
      fVmax[3] = 1;
      fVmin[4] = 0;
      fVmax[4] = 1;
      fVmin[5] = 15.0942249298096;
      fVmax[5] = 324.415008544922;
      fVmin[6] = 0.118904195725918;
      fVmax[6] = 3.13655138015747;
      fVmin[7] = 15.0046358108521;
      fVmax[7] = 192.753829956055;
      fVmin[8] = 0.000216960906982422;
      fVmax[8] = 3.14020013809204;
      fVmin[9] = 0.401400476694107;
      fVmax[9] = 4.23354482650757;
      fVmin[10] = 16.0013523101807;
      fVmax[10] = 531.617553710938;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName
                   << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char
   GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }
   double fVmin[11];
   double fVmax[11];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
   fLayerSize[2] = 12; fWeights[2] = new double[12];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = 2.37342813496263;
   fWeightMatrix0to1[1][0] = 3.58993918085292;
   fWeightMatrix0to1[2][0] = 0.338664205692022;
   fWeightMatrix0to1[3][0] = 1.5894497319466;
   fWeightMatrix0to1[4][0] = 0.668360583702111;
   fWeightMatrix0to1[5][0] = 2.74446200956365;
   fWeightMatrix0to1[6][0] = 1.64177651342004;
   fWeightMatrix0to1[7][0] = 0.53382632371225;
   fWeightMatrix0to1[8][0] = 0.729933862909135;
   fWeightMatrix0to1[9][0] = 1.28430429627484;
   fWeightMatrix0to1[10][0] = 0.249260878375066;
   fWeightMatrix0to1[11][0] = 0.658386129428282;
   fWeightMatrix0to1[0][1] = 1.4535247447796;
   fWeightMatrix0to1[1][1] = 0.955612797752677;
   fWeightMatrix0to1[2][1] = 1.26845267706466;
   fWeightMatrix0to1[3][1] = -1.54474278034346;
   fWeightMatrix0to1[4][1] = 0.999522415356529;
   fWeightMatrix0to1[5][1] = 0.473613229718409;
   fWeightMatrix0to1[6][1] =
   -1.05181181085893;
   fWeightMatrix0to1[7][1] = 1.58832128687388;
   fWeightMatrix0to1[8][1] = 1.91383606001448;
   fWeightMatrix0to1[9][1] = 0.491696548463063;
   fWeightMatrix0to1[10][1] = 0.368673066186752;
   fWeightMatrix0to1[11][1] = -0.117089588459658;
   fWeightMatrix0to1[0][2] = 0.14279247667158;
   fWeightMatrix0to1[1][2] = 1.20816822611078;
   fWeightMatrix0to1[2][2] = 0.0435613786836967;
   fWeightMatrix0to1[3][2] = -1.92487236881959;
   fWeightMatrix0to1[4][2] = 2.57037735145034;
   fWeightMatrix0to1[5][2] = 0.000181045571687351;
   fWeightMatrix0to1[6][2] = 0.0728604367474736;
   fWeightMatrix0to1[7][2] = -0.679800570123907;
   fWeightMatrix0to1[8][2] = -0.271024159270371;
   fWeightMatrix0to1[9][2] = -0.794952409622495;
   fWeightMatrix0to1[10][2] = -0.806341721218475;
   fWeightMatrix0to1[11][2] = 0.321036613101015;
   fWeightMatrix0to1[0][3] = 0.634282748420431;
   fWeightMatrix0to1[1][3] = 0.626717832356846;
   fWeightMatrix0to1[2][3] = 1.03515176868901;
   fWeightMatrix0to1[3][3] = 1.52404995400191;
   fWeightMatrix0to1[4][3] = -0.878561356591494;
   fWeightMatrix0to1[5][3] = -0.249710941056184;
   fWeightMatrix0to1[6][3] = -0.677749320752645;
   fWeightMatrix0to1[7][3] = -1.86255513288927;
   fWeightMatrix0to1[8][3] = -0.80080942251502;
   fWeightMatrix0to1[9][3] = -1.24658164734053;
   fWeightMatrix0to1[10][3] = -1.00288419760382;
   fWeightMatrix0to1[11][3] = 0.433838395214099;
   fWeightMatrix0to1[0][4] = 0.572030859317365;
   fWeightMatrix0to1[1][4] = -0.33090266688426;
   fWeightMatrix0to1[2][4] = 1.6110714559879;
   fWeightMatrix0to1[3][4] = -1.13374833578558;
   fWeightMatrix0to1[4][4] = 0.606742894253393;
   fWeightMatrix0to1[5][4] = -0.825578387733247;
   fWeightMatrix0to1[6][4] = -0.878062358721886;
   fWeightMatrix0to1[7][4] = 0.237231393325083;
   fWeightMatrix0to1[8][4] = -0.25109895723629;
   fWeightMatrix0to1[9][4] = 0.852132197651493;
   fWeightMatrix0to1[10][4] = -0.111701965898716;
   fWeightMatrix0to1[11][4] = -0.480683603108649;
   fWeightMatrix0to1[0][5] = -0.789209059874959;
   fWeightMatrix0to1[1][5] = -0.00384461962492863;
   fWeightMatrix0to1[2][5] =
   -2.36881688491174;
   fWeightMatrix0to1[3][5] = 0.845766256584591;
   fWeightMatrix0to1[4][5] = 0.933251017479739;
   fWeightMatrix0to1[5][5] = -1.71930820959073;
   fWeightMatrix0to1[6][5] = 1.27013692722248;
   fWeightMatrix0to1[7][5] = -0.682692315979242;
   fWeightMatrix0to1[8][5] = -0.239250233913297;
   fWeightMatrix0to1[9][5] = 1.17438369831945;
   fWeightMatrix0to1[10][5] = -0.272908144086502;
   fWeightMatrix0to1[11][5] = -0.830760058457481;
   fWeightMatrix0to1[0][6] = -0.403690871406826;
   fWeightMatrix0to1[1][6] = -0.110025802776709;
   fWeightMatrix0to1[2][6] = 1.47577470024046;
   fWeightMatrix0to1[3][6] = -0.649737859328097;
   fWeightMatrix0to1[4][6] = 0.676288464602928;
   fWeightMatrix0to1[5][6] = -0.59737116919764;
   fWeightMatrix0to1[6][6] = -1.56738454381102;
   fWeightMatrix0to1[7][6] = 1.26937277056895;
   fWeightMatrix0to1[8][6] = 0.329380779118393;
   fWeightMatrix0to1[9][6] = 0.697537532351828;
   fWeightMatrix0to1[10][6] = 2.15369960740655;
   fWeightMatrix0to1[11][6] = 0.895331700889566;
   fWeightMatrix0to1[0][7] = -0.057832595072384;
   fWeightMatrix0to1[1][7] = 0.539522987390425;
   fWeightMatrix0to1[2][7] = 1.44512679898939;
   fWeightMatrix0to1[3][7] = -0.236980033882674;
   fWeightMatrix0to1[4][7] = 1.14485177351153;
   fWeightMatrix0to1[5][7] = 0.113295761308753;
   fWeightMatrix0to1[6][7] = -0.232461483934473;
   fWeightMatrix0to1[7][7] = -0.920850220473219;
   fWeightMatrix0to1[8][7] = -0.330506239719541;
   fWeightMatrix0to1[9][7] = 2.39304332911694;
   fWeightMatrix0to1[10][7] = -1.1730846660626;
   fWeightMatrix0to1[11][7] = 2.29177963289619;
   fWeightMatrix0to1[0][8] = -0.137505838697257;
   fWeightMatrix0to1[1][8] = 0.445315803707747;
   fWeightMatrix0to1[2][8] = 2.54338131234168;
   fWeightMatrix0to1[3][8] = -2.43635367623589;
   fWeightMatrix0to1[4][8] = 0.0642346059748409;
   fWeightMatrix0to1[5][8] = 0.571479789301522;
   fWeightMatrix0to1[6][8] = -0.0762305169308716;
   fWeightMatrix0to1[7][8] = -0.875696633491537;
   fWeightMatrix0to1[8][8] = 1.8555665654387;
   fWeightMatrix0to1[9][8] = 0.357099891657498;
   fWeightMatrix0to1[10][8] =
   0.706528539806963;
   fWeightMatrix0to1[11][8] = -1.58443149108788;
   fWeightMatrix0to1[0][9] = 0.0809955553940687;
   fWeightMatrix0to1[1][9] = -1.41589913331379;
   fWeightMatrix0to1[2][9] = 1.78725132640883;
   fWeightMatrix0to1[3][9] = -1.58759114316406;
   fWeightMatrix0to1[4][9] = -0.345193061025767;
   fWeightMatrix0to1[5][9] = 0.540727299720033;
   fWeightMatrix0to1[6][9] = -1.51487012740436;
   fWeightMatrix0to1[7][9] = 0.633381274725067;
   fWeightMatrix0to1[8][9] = -1.28797628807151;
   fWeightMatrix0to1[9][9] = -1.67604811955909;
   fWeightMatrix0to1[10][9] = -1.19750449797741;
   fWeightMatrix0to1[11][9] = -0.984659159587844;
   fWeightMatrix0to1[0][10] = -0.476600181041174;
   fWeightMatrix0to1[1][10] = 1.88356660744512;
   fWeightMatrix0to1[2][10] = 1.44650848846528;
   fWeightMatrix0to1[3][10] = -1.82486303993091;
   fWeightMatrix0to1[4][10] = 0.709998207620988;
   fWeightMatrix0to1[5][10] = -1.37942519729816;
   fWeightMatrix0to1[6][10] = -1.9345598445008;
   fWeightMatrix0to1[7][10] = 0.774129767115561;
   fWeightMatrix0to1[8][10] = -0.499314372797675;
   fWeightMatrix0to1[9][10] = -0.21242119877639;
   fWeightMatrix0to1[10][10] = -1.29931506922598;
   fWeightMatrix0to1[11][10] = -1.58849718066343;
   fWeightMatrix0to1[0][11] = -1.29128858697511;
   fWeightMatrix0to1[1][11] = -1.49869133411648;
   fWeightMatrix0to1[2][11] = -0.0639760120993544;
   fWeightMatrix0to1[3][11] = 1.05464683628148;
   fWeightMatrix0to1[4][11] = 1.36872590722294;
   fWeightMatrix0to1[5][11] = 1.17361601767406;
   fWeightMatrix0to1[6][11] = -0.967745962643704;
   fWeightMatrix0to1[7][11] = -1.31384768686912;
   fWeightMatrix0to1[8][11] = 2.52225548022997;
   fWeightMatrix0to1[9][11] = -2.35357757019181;
   fWeightMatrix0to1[10][11] = 0.299878667986015;
   fWeightMatrix0to1[11][11] = -0.503252511646577;
   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -1.51623806756891;
   fWeightMatrix1to2[1][0] = 0.184827950699611;
   fWeightMatrix1to2[2][0] = -0.739662578296967;
   fWeightMatrix1to2[3][0] = -0.991772198084169;
   fWeightMatrix1to2[4][0] = -2.08776246136573;
   fWeightMatrix1to2[5][0] = 1.17493660850959;
   fWeightMatrix1to2[6][0] = 1.62607131694562;
   fWeightMatrix1to2[7][0] = 1.28733009874815;
   fWeightMatrix1to2[8][0] = 1.59374401250413;
   fWeightMatrix1to2[9][0] = -2.05212442909586;
   fWeightMatrix1to2[10][0] = 0.0889102917508986;
   fWeightMatrix1to2[0][1] = 0.982862219433751;
   fWeightMatrix1to2[1][1] = -1.54618326495054;
   fWeightMatrix1to2[2][1] = -0.960900002525102;
   fWeightMatrix1to2[3][1] = -0.0740076871864858;
   fWeightMatrix1to2[4][1] = 1.16607283982194;
   fWeightMatrix1to2[5][1] = -2.80222872980793;
   fWeightMatrix1to2[6][1] = -0.811421778399519;
   fWeightMatrix1to2[7][1] = -0.189692953877165;
   fWeightMatrix1to2[8][1] = -1.3878508749117;
   fWeightMatrix1to2[9][1] = -1.45748358167352;
   fWeightMatrix1to2[10][1] = 0.120387160317765;
   fWeightMatrix1to2[0][2] = 0.55199078980152;
   fWeightMatrix1to2[1][2] = -0.231942956688617;
   fWeightMatrix1to2[2][2] = -1.66051995910664;
   fWeightMatrix1to2[3][2] = -0.428763044980828;
   fWeightMatrix1to2[4][2] = 1.51284765574483;
   fWeightMatrix1to2[5][2] = -0.415472717080269;
   fWeightMatrix1to2[6][2] = 0.870408046251903;
   fWeightMatrix1to2[7][2] = -2.33301081911643;
   fWeightMatrix1to2[8][2] = 0.359612918728997;
   fWeightMatrix1to2[9][2] = 0.0934837796269547;
   fWeightMatrix1to2[10][2] = -0.820930786373556;
   fWeightMatrix1to2[0][3] = -1.34361963783484;
   fWeightMatrix1to2[1][3] = 0.118480035898314;
   fWeightMatrix1to2[2][3] = 1.11258806954146;
   fWeightMatrix1to2[3][3] = -2.01729743801878;
   fWeightMatrix1to2[4][3] = 0.550166679178709;
   fWeightMatrix1to2[5][3] = -0.29666004905899;
   fWeightMatrix1to2[6][3] = 1.17574599913594;
   fWeightMatrix1to2[7][3] = 0.272622751622419;
   fWeightMatrix1to2[8][3] = -1.65049179047423;
   fWeightMatrix1to2[9][3] = -2.0222419718269;
   fWeightMatrix1to2[10][3] = -0.58273252888926;
   fWeightMatrix1to2[0][4] = 0.426872327625924;
   fWeightMatrix1to2[1][4] = 0.0680136032799156;
   fWeightMatrix1to2[2][4] = 0.533118303956917;
   fWeightMatrix1to2[3][4] = -1.02162053598183;
   fWeightMatrix1to2[4][4] = 0.307347708334249;
   fWeightMatrix1to2[5][4] = 1.97609839795398;
   fWeightMatrix1to2[6][4] = -1.34626907411128;
   fWeightMatrix1to2[7][4] = 0.316316296687998;
   fWeightMatrix1to2[8][4] = 0.708298447933848;
   fWeightMatrix1to2[9][4] = 0.0962865161583868;
   fWeightMatrix1to2[10][4] = -0.35815587308585;
   fWeightMatrix1to2[0][5] = -0.589508269953756;
   fWeightMatrix1to2[1][5] = -0.0215864733513187;
   fWeightMatrix1to2[2][5] = -0.629755476313241;
   fWeightMatrix1to2[3][5] = -0.613060308596014;
   fWeightMatrix1to2[4][5] = -2.15589983405348;
   fWeightMatrix1to2[5][5] = -0.827136920615695;
   fWeightMatrix1to2[6][5] = -0.951808182130076;
   fWeightMatrix1to2[7][5] = 0.207929726441392;
   fWeightMatrix1to2[8][5] = -2.1214871690763;
   fWeightMatrix1to2[9][5] = 1.68060407924898;
   fWeightMatrix1to2[10][5] = -0.879562272263614;
   fWeightMatrix1to2[0][6] = -1.03725079963359;
   fWeightMatrix1to2[1][6] = 0.970352367615285;
   fWeightMatrix1to2[2][6] = -1.16104399522579;
   fWeightMatrix1to2[3][6] = -1.96422353013351;
   fWeightMatrix1to2[4][6] = 0.650940360905191;
   fWeightMatrix1to2[5][6] = -1.85927423266037;
   fWeightMatrix1to2[6][6] = 0.437792155920744;
   fWeightMatrix1to2[7][6] = -2.66323336249917;
   fWeightMatrix1to2[8][6] = -0.99805863823304;
   fWeightMatrix1to2[9][6] = 0.772959083260401;
   fWeightMatrix1to2[10][6] = -3.12178061948167;
   fWeightMatrix1to2[0][7] = 1.11018236782103;
   fWeightMatrix1to2[1][7] = 0.150169655820026;
   fWeightMatrix1to2[2][7] = -1.11994436469963;
   fWeightMatrix1to2[3][7] = -1.35181482549632;
   fWeightMatrix1to2[4][7] = -1.36290910770776;
   fWeightMatrix1to2[5][7] = -0.24225165754938;
   fWeightMatrix1to2[6][7] = -0.66306921723194;
   fWeightMatrix1to2[7][7] = -1.81134770945556;
   fWeightMatrix1to2[8][7] = -0.0671893504544516;
   fWeightMatrix1to2[9][7] = -1.0528538541127;
   fWeightMatrix1to2[10][7] = 1.22259755277792;
   fWeightMatrix1to2[0][8] = -1.65982170511128;
   fWeightMatrix1to2[1][8] = -0.462625300471405;
   fWeightMatrix1to2[2][8] = 0.80199120448955;
   fWeightMatrix1to2[3][8] = 0.73296759293051;
   fWeightMatrix1to2[4][8] = -1.45953540524464;
   fWeightMatrix1to2[5][8] = 2.23840952910728;
   fWeightMatrix1to2[6][8] = -2.31270617057378;
   fWeightMatrix1to2[7][8] = 0.457452792139321;
   fWeightMatrix1to2[8][8] = -0.673148254416909;
   fWeightMatrix1to2[9][8] = -1.85203869473412;
   fWeightMatrix1to2[10][8] = 0.788119383818947;
   fWeightMatrix1to2[0][9] = -0.226969155275253;
   fWeightMatrix1to2[1][9] = 1.21090272563256;
   fWeightMatrix1to2[2][9] = 1.29586074326756;
   fWeightMatrix1to2[3][9] = 0.0983214237234235;
   fWeightMatrix1to2[4][9] = 0.866245236015715;
   fWeightMatrix1to2[5][9] = -1.43412875531251;
   fWeightMatrix1to2[6][9] = -1.19677515635306;
   fWeightMatrix1to2[7][9] = -0.641892810826205;
   fWeightMatrix1to2[8][9] = -0.53462618540411;
   fWeightMatrix1to2[9][9] = -1.28944776722575;
   fWeightMatrix1to2[10][9] = -0.691542083006963;
   fWeightMatrix1to2[0][10] = 0.729241595351513;
   fWeightMatrix1to2[1][10] = -0.932794828904791;
   fWeightMatrix1to2[2][10] = -2.12734274961761;
   fWeightMatrix1to2[3][10] = 1.45080319533799;
   fWeightMatrix1to2[4][10] = -1.50713053655621;
   fWeightMatrix1to2[5][10] = -0.768887032596649;
   fWeightMatrix1to2[6][10] = -1.35039608494726;
   fWeightMatrix1to2[7][10] = 0.473874968798669;
   fWeightMatrix1to2[8][10] = -0.379285826337384;
   fWeightMatrix1to2[9][10] = -0.576582087022701;
   fWeightMatrix1to2[10][10] = -0.941092376715386;
   fWeightMatrix1to2[0][11] = 0.824442863477308;
   fWeightMatrix1to2[1][11] = -0.649459754462398;
   fWeightMatrix1to2[2][11] = 0.106005521107572;
   fWeightMatrix1to2[3][11] = -1.24667287880825;
   fWeightMatrix1to2[4][11] = -0.805899139712339;
   fWeightMatrix1to2[5][11] = -2.00174678870456;
   fWeightMatrix1to2[6][11] = 0.772449704534127;
   fWeightMatrix1to2[7][11] = 0.884557295466503;
   fWeightMatrix1to2[8][11] = -0.422005679078032;
   fWeightMatrix1to2[9][11] = 0.840330760423436;
   fWeightMatrix1to2[10][11] = 0.494024920312217;
   fWeightMatrix1to2[0][12] = -0.209767300746941;
   fWeightMatrix1to2[1][12] = -0.206192068102747;
   fWeightMatrix1to2[2][12] = -0.662063839275546;
   fWeightMatrix1to2[3][12] = -0.436615019269631;
   fWeightMatrix1to2[4][12] = 1.54664746015801;
   fWeightMatrix1to2[5][12] = 1.67472673084579;
   fWeightMatrix1to2[6][12] = -2.16136160611941;
   fWeightMatrix1to2[7][12] = -1.99356020503007;
   fWeightMatrix1to2[8][12] = 0.019303486004646;
   fWeightMatrix1to2[9][12] = -0.149977793234266;
   fWeightMatrix1to2[10][12] = 0.570472802725687;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = 0.393774631577515;
   fWeightMatrix2to3[0][1] = -0.192762564581658;
   fWeightMatrix2to3[0][2] = -1.23847719644697;
   fWeightMatrix2to3[0][3] = -0.663362130846739;
   fWeightMatrix2to3[0][4] = 0.107918263496116;
   fWeightMatrix2to3[0][5] = -0.927010140935173;
   fWeightMatrix2to3[0][6] = -1.33472750471752;
   fWeightMatrix2to3[0][7] = -0.784439077770128;
   fWeightMatrix2to3[0][8] = 0.529613750210866;
   fWeightMatrix2to3[0][9] = 0.554857120531663;
   fWeightMatrix2to3[0][10] = -0.682076672947492;
   fWeightMatrix2to3[0][11] = 1.15788311617804;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   // NOTE: the source listing is truncated from this point on; the remainder
   // is reconstructed following the standard TMVA MakeClass forward-propagation code.
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i]=0;

   // set the bias node of each non-output layer to 1
   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1]=1;

   // feed the input variables into layer 0
   for (int ivar=0; ivar<fLayerSize[0]-1; ivar++)
      fWeights[0][ivar]=inputValues[ivar];

   // layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++) {
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      }
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }
   // layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++) {
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      }
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }
   // layer 2 to 3 (linear output node)
   for (int o=0; o<fLayerSize[3]; o++) {
      for (int i=0; i<fLayerSize[2]; i++) {
         fWeights[3][o] += fWeightMatrix2to3[o][i] * fWeights[2][i];
      }
   }

   return fWeights[3][0];
}

inline double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   // sigmoid
   return 1.0/(1.0+exp(-x));
}

// method-specific clean-up (also reconstructed): free the arrays
// allocated in Initialize()
inline void ReadH6AONN5MEMLP::Clear()
{
   for (int lIdx = 0; lIdx < 4; lIdx++) {
      delete[] fWeights[lIdx];
   }
}