// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method          : MLP::H6AONN5MEMLP
TMVA Release    : 3.8.6 [198662]
ROOT Release    : 5.26/00 [334336]
Creator         : stdenis
Date            : Sun Jan 1 10:43:55 2012
Host            : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir             : /data/cdf04/stdenis/batch/run20572/job407
Training events : 49528

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
RandomSeed: "1" [Random Number Seed for TRandom3]
RandomFile: "None" [Random Number input file for TRandom3]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                  Ht                  'F' [47.6823577881,732.116455078]
LepAPt              LepAPt              'F' [20.0003395081,161.648132324]
LepBPt              LepBPt              'F' [10.0010261536,69.9040908813]
MetSigLeptonsJets   MetSigLeptonsJets   'F' [1.16472387314,19.2874412537]
MetSpec             MetSpec             'F' [15.0119848251,241.141921997]
SumEtLeptonsJets    SumEtLeptonsJets    'F' [30.1484737396,493.95300293]
VSumJetLeptonsPt    VSumJetLeptonsPt    'F' [0.678276479244,293.303070068]
addEt               addEt               'F' [47.6823577881,370.165649414]
dPhiLepSumMet       dPhiLepSumMet       'F' [0.00143814424518,3.14159059525]
dPhiLeptons         dPhiLeptons         'F' [2.52577920037e-05,1.1460981369]
dRLeptons           dRLeptons           'F' [0.200028494,1.17128431797]
lep1_E              lep1_E              'F' [20.0069885254,232.717926025]
lep2_E              lep2_E              'F' [10.0076971054,122.221923828]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {

 public:

   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif
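// ---------------------------------------------------------------------------
// Illustrative sketch, not part of the generated TMVA output.
// The header above documents NeuronInputType "sum" and NeuronType "sigmoid":
// each neuron forms a weighted sum of its inputs plus a bias term and passes
// it through a logistic sigmoid.  The helpers below only demonstrate that
// convention; the names ExampleSigmoid and ExampleNeuronResponse are
// hypothetical and are not used anywhere by ReadH6AONN5MEMLP itself.
// ---------------------------------------------------------------------------
inline double ExampleSigmoid( double x )
{
   // logistic activation, the usual form for NeuronType: "sigmoid"
   return 1.0/(1.0 + std::exp(-x));
}

inline double ExampleNeuronResponse( const std::vector<double>& inputs,
                                     const std::vector<double>& weights,
                                     double biasWeight )
{
   // NeuronInputType "sum": weighted sum of the inputs plus the bias weight,
   // then the sigmoid activation
   double sum = biasWeight;
   for (size_t i = 0; i < inputs.size() && i < weights.size(); i++)
      sum += weights[i]*inputs[i];
   return ExampleSigmoid( sum );
}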
"MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 47.6823577880859; fVmax[0] = 732.116455078125; fVmin[1] = 20.0003395080566; fVmax[1] = 161.648132324219; fVmin[2] = 10.0010261535645; fVmax[2] = 69.9040908813477; fVmin[3] = 1.16472387313843; fVmax[3] = 19.2874412536621; fVmin[4] = 15.0119848251343; fVmax[4] = 241.14192199707; fVmin[5] = 30.148473739624; fVmax[5] = 493.953002929688; fVmin[6] = 0.678276479244232; fVmax[6] = 293.303070068359; fVmin[7] = 47.6823577880859; fVmax[7] = 370.165649414062; fVmin[8] = 0.00143814424518496; fVmax[8] = 3.14159059524536; fVmin[9] = 2.52577920036856e-05; fVmax[9] = 1.14609813690186; fVmin[10] = 0.200028494000435; fVmax[10] = 1.17128431797028; fVmin[11] = 20.0069885253906; fVmax[11] = 232.717926025391; fVmin[12] = 10.0076971054077; fVmax[12] = 122.221923828125; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void Initialize(); double 
 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }

   double fVmin[13];
   double fVmax[13];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[13];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double* fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.418123075795464;
   fWeightMatrix0to1[1][0] = 1.8755796140874;
   fWeightMatrix0to1[2][0] = 1.00791977944453;
   fWeightMatrix0to1[3][0] = 2.62156792932643;
   fWeightMatrix0to1[4][0] = -0.773112333263616;
   fWeightMatrix0to1[5][0] = -0.4446315338965;
   fWeightMatrix0to1[6][0] = -1.52346996167579;
   fWeightMatrix0to1[7][0] = 2.70038692591917;
   fWeightMatrix0to1[8][0] = -1.66869566800072;
   fWeightMatrix0to1[9][0] = -0.835639642141709;
   fWeightMatrix0to1[10][0] = -1.46359244671537;
   fWeightMatrix0to1[11][0] = 0.281260270843264;
   fWeightMatrix0to1[12][0] = -0.367573951514058;
   fWeightMatrix0to1[13][0] = -0.684270752521892;
   fWeightMatrix0to1[0][1] = -0.39065417186827;
   fWeightMatrix0to1[1][1] = 1.08987085575721;
   fWeightMatrix0to1[2][1] = 0.646563323992563;
   fWeightMatrix0to1[3][1] = 4.90878188192663;
   fWeightMatrix0to1[4][1] = 1.60768255356222;
   fWeightMatrix0to1[5][1] = 4.24106061005085;
   fWeightMatrix0to1[6][1] = -1.0433733734668;
   fWeightMatrix0to1[7][1] = -1.72638646091517;
   fWeightMatrix0to1[8][1] = 0.350721094093152;
   fWeightMatrix0to1[9][1] = -0.58661352168654;
   fWeightMatrix0to1[10][1] = 1.23490769893328;
   fWeightMatrix0to1[11][1] = -0.255164214326158;
   fWeightMatrix0to1[12][1] = 1.2506957767729;
   fWeightMatrix0to1[13][1] = -3.34684052167006;
   fWeightMatrix0to1[0][2] = -2.02843607795687;
   fWeightMatrix0to1[1][2] = -0.0801882043509578;
   fWeightMatrix0to1[2][2] = 0.291485222085862;
   fWeightMatrix0to1[3][2] = 3.79987658831311;
   fWeightMatrix0to1[4][2] = 2.12568649481925;
   fWeightMatrix0to1[5][2] = 2.83910045601914;
   fWeightMatrix0to1[6][2] = -0.605731950128436;
   fWeightMatrix0to1[7][2] = 3.98727652284368;
   fWeightMatrix0to1[8][2] = -1.5359658106282;
   fWeightMatrix0to1[9][2] = 3.14759169246806;
   fWeightMatrix0to1[10][2] = -2.64679755021727;
   fWeightMatrix0to1[11][2] = 2.09709246589833;
   fWeightMatrix0to1[12][2] = 1.51610735851625;
   fWeightMatrix0to1[13][2] = -1.44368473062834;
   fWeightMatrix0to1[0][3] = 2.02098635087868;
   fWeightMatrix0to1[1][3] = 1.32314370225116;
   fWeightMatrix0to1[2][3] = -0.293269575969205;
   fWeightMatrix0to1[3][3] = -2.03398026094402;
   fWeightMatrix0to1[4][3] = 1.13269404546648;
   fWeightMatrix0to1[5][3] = -0.965441516240803;
   fWeightMatrix0to1[6][3] = 1.0909023957206;
   fWeightMatrix0to1[7][3] = 4.17936300979112;
   fWeightMatrix0to1[8][3] = 1.83822106163656;
   fWeightMatrix0to1[9][3] = 0.676834891553997;
   fWeightMatrix0to1[10][3] = -0.210691771166779;
   fWeightMatrix0to1[11][3] = 1.00623069000433;
   fWeightMatrix0to1[12][3] = -3.73743727227431;
   fWeightMatrix0to1[13][3] = -0.640125160864426;
   fWeightMatrix0to1[0][4] = -1.1747517100665;
   fWeightMatrix0to1[1][4] = -1.55546170352791;
   fWeightMatrix0to1[2][4] = 2.07864257046572;
   fWeightMatrix0to1[3][4] = 0.667460567347831;
   fWeightMatrix0to1[4][4] = -0.835084939389933;
   fWeightMatrix0to1[5][4] = 0.55485252795825;
   fWeightMatrix0to1[6][4] = -0.970475735087562;
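   // (illustrative note, not from the generated file) In these tables the
   // first index appears to run over the receiving neurons of the next layer
   // and the second over the sending nodes of the current layer, with the
   // highest second index (13 for layer 0) acting as that layer's bias node,
   // consistent with the "sum plus bias" neuron input documented in the header.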
fWeightMatrix0to1[7][4] = 2.85395944241129; fWeightMatrix0to1[8][4] = 1.81778737011704; fWeightMatrix0to1[9][4] = -0.566984373146208; fWeightMatrix0to1[10][4] = 1.23938839899039; fWeightMatrix0to1[11][4] = -1.11074601634399; fWeightMatrix0to1[12][4] = -0.0975991333412447; fWeightMatrix0to1[13][4] = -1.95240928874957; fWeightMatrix0to1[0][5] = -0.901485725028981; fWeightMatrix0to1[1][5] = -1.58758939124143; fWeightMatrix0to1[2][5] = 0.77974914741068; fWeightMatrix0to1[3][5] = 1.28727113717381; fWeightMatrix0to1[4][5] = 2.13437533556138; fWeightMatrix0to1[5][5] = -0.0694473051347165; fWeightMatrix0to1[6][5] = -2.72918979399024; fWeightMatrix0to1[7][5] = -2.91598269932164; fWeightMatrix0to1[8][5] = 0.481477019016386; fWeightMatrix0to1[9][5] = 0.693473978219376; fWeightMatrix0to1[10][5] = 1.59637040873684; fWeightMatrix0to1[11][5] = 1.51646471496187; fWeightMatrix0to1[12][5] = 2.31730941109122; fWeightMatrix0to1[13][5] = -0.22967842693216; fWeightMatrix0to1[0][6] = -0.864684871005284; fWeightMatrix0to1[1][6] = -0.257629940021649; fWeightMatrix0to1[2][6] = 1.41326503318161; fWeightMatrix0to1[3][6] = -0.937864532668702; fWeightMatrix0to1[4][6] = -0.53510761418473; fWeightMatrix0to1[5][6] = -0.371267136796981; fWeightMatrix0to1[6][6] = -0.819273598786; fWeightMatrix0to1[7][6] = 0.912688921482141; fWeightMatrix0to1[8][6] = 1.68104379424418; fWeightMatrix0to1[9][6] = -1.45677505154291; fWeightMatrix0to1[10][6] = -0.26746047056463; fWeightMatrix0to1[11][6] = 2.04779893167072; fWeightMatrix0to1[12][6] = -0.560575359121817; fWeightMatrix0to1[13][6] = -2.93496102408767; fWeightMatrix0to1[0][7] = -1.30867606521108; fWeightMatrix0to1[1][7] = 0.2392684082829; fWeightMatrix0to1[2][7] = -1.48285201000155; fWeightMatrix0to1[3][7] = 3.82642526903192; fWeightMatrix0to1[4][7] = 4.14833495424181; fWeightMatrix0to1[5][7] = 6.85861663066686; fWeightMatrix0to1[6][7] = -1.64905762305794; fWeightMatrix0to1[7][7] = 8.39722295062915; fWeightMatrix0to1[8][7] = -0.497596570026448; fWeightMatrix0to1[9][7] = -0.543088433197203; fWeightMatrix0to1[10][7] = 1.78978033023384; fWeightMatrix0to1[11][7] = 0.548503331042828; fWeightMatrix0to1[12][7] = -1.88894243568991; fWeightMatrix0to1[13][7] = -4.10440911873156; fWeightMatrix0to1[0][8] = 0.626073623718308; fWeightMatrix0to1[1][8] = 1.04692454998667; fWeightMatrix0to1[2][8] = -1.08961111415588; fWeightMatrix0to1[3][8] = 3.79697598806754; fWeightMatrix0to1[4][8] = -0.0198176151254347; fWeightMatrix0to1[5][8] = -1.68701273952441; fWeightMatrix0to1[6][8] = 1.46002397840752; fWeightMatrix0to1[7][8] = -2.78754459889617; fWeightMatrix0to1[8][8] = -1.41676413882751; fWeightMatrix0to1[9][8] = 0.714201825754954; fWeightMatrix0to1[10][8] = 0.503846659460847; fWeightMatrix0to1[11][8] = 0.274206777589196; fWeightMatrix0to1[12][8] = -3.2578469413699; fWeightMatrix0to1[13][8] = -0.912372343335098; fWeightMatrix0to1[0][9] = -0.561037788095847; fWeightMatrix0to1[1][9] = 0.239262602733667; fWeightMatrix0to1[2][9] = -1.78751197197444; fWeightMatrix0to1[3][9] = -1.02883467320616; fWeightMatrix0to1[4][9] = 0.92550641538645; fWeightMatrix0to1[5][9] = 0.11604496614711; fWeightMatrix0to1[6][9] = -0.0455118477339268; fWeightMatrix0to1[7][9] = 0.421671289263613; fWeightMatrix0to1[8][9] = 0.436280994938697; fWeightMatrix0to1[9][9] = 2.70845566272358; fWeightMatrix0to1[10][9] = 0.108389628379018; fWeightMatrix0to1[11][9] = -1.91884678672945; fWeightMatrix0to1[12][9] = -0.924710623071421; fWeightMatrix0to1[13][9] = -0.263134016753535; fWeightMatrix0to1[0][10] = 1.49008619259588; fWeightMatrix0to1[1][10] = 
0.600863165765859; fWeightMatrix0to1[2][10] = -1.1038867510429; fWeightMatrix0to1[3][10] = 0.107982890333015; fWeightMatrix0to1[4][10] = -0.21761367910201; fWeightMatrix0to1[5][10] = 1.54719669066481; fWeightMatrix0to1[6][10] = 0.522426444580529; fWeightMatrix0to1[7][10] = 0.497894271579836; fWeightMatrix0to1[8][10] = 0.0936365220122254; fWeightMatrix0to1[9][10] = 1.32645235495919; fWeightMatrix0to1[10][10] = -0.281005011430433; fWeightMatrix0to1[11][10] = 0.000200593689938705; fWeightMatrix0to1[12][10] = -0.131140462989321; fWeightMatrix0to1[13][10] = -0.198412229913122; fWeightMatrix0to1[0][11] = -0.20695989376211; fWeightMatrix0to1[1][11] = 0.190327291476647; fWeightMatrix0to1[2][11] = 2.50955807653624; fWeightMatrix0to1[3][11] = 0.550445328280083; fWeightMatrix0to1[4][11] = 0.262750531076024; fWeightMatrix0to1[5][11] = -4.61677646683115; fWeightMatrix0to1[6][11] = 0.894788912919222; fWeightMatrix0to1[7][11] = -2.31671280232046; fWeightMatrix0to1[8][11] = -0.223406689418309; fWeightMatrix0to1[9][11] = -1.14883743638968; fWeightMatrix0to1[10][11] = 3.50624034696198; fWeightMatrix0to1[11][11] = -0.376786694900849; fWeightMatrix0to1[12][11] = -1.3384165995452; fWeightMatrix0to1[13][11] = 1.29295232123321; fWeightMatrix0to1[0][12] = -0.839845486431886; fWeightMatrix0to1[1][12] = -1.15812930204474; fWeightMatrix0to1[2][12] = 1.99405278053335; fWeightMatrix0to1[3][12] = -0.835016096291749; fWeightMatrix0to1[4][12] = -0.860919545442909; fWeightMatrix0to1[5][12] = -1.71104250074613; fWeightMatrix0to1[6][12] = 1.02772192487685; fWeightMatrix0to1[7][12] = 0.243116903806791; fWeightMatrix0to1[8][12] = 0.531227814018569; fWeightMatrix0to1[9][12] = -0.46697552312891; fWeightMatrix0to1[10][12] = 0.228574845135253; fWeightMatrix0to1[11][12] = -0.661315254492352; fWeightMatrix0to1[12][12] = -1.33212543530646; fWeightMatrix0to1[13][12] = 1.74246181309502; fWeightMatrix0to1[0][13] = 1.94343370974186; fWeightMatrix0to1[1][13] = -1.69259350094562; fWeightMatrix0to1[2][13] = -0.762841651909115; fWeightMatrix0to1[3][13] = 6.01508228152125; fWeightMatrix0to1[4][13] = 3.16981491057278; fWeightMatrix0to1[5][13] = 6.85265165351458; fWeightMatrix0to1[6][13] = 0.859533200468065; fWeightMatrix0to1[7][13] = 10.5004923081298; fWeightMatrix0to1[8][13] = 0.329664338479198; fWeightMatrix0to1[9][13] = 1.44257663880819; fWeightMatrix0to1[10][13] = 2.12661474264176; fWeightMatrix0to1[11][13] = 0.511995299286146; fWeightMatrix0to1[12][13] = 0.217486341300104; fWeightMatrix0to1[13][13] = -6.03348941958589; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -3.1712678705469; fWeightMatrix1to2[1][0] = 0.114576313685598; fWeightMatrix1to2[2][0] = -1.13863731203305; fWeightMatrix1to2[3][0] = -2.11706089557985; fWeightMatrix1to2[4][0] = -0.830359484999573; fWeightMatrix1to2[5][0] = -1.56850056614734; fWeightMatrix1to2[6][0] = -0.97565313859575; fWeightMatrix1to2[7][0] = -2.25185765195552; fWeightMatrix1to2[8][0] = -0.367576735657429; fWeightMatrix1to2[9][0] = -0.470288074734951; fWeightMatrix1to2[10][0] = -1.57586380112308; fWeightMatrix1to2[11][0] = -0.383672060183092; fWeightMatrix1to2[12][0] = 0.775989847220865; fWeightMatrix1to2[0][1] = 0.958618281630641; fWeightMatrix1to2[1][1] = 0.914340224756175; fWeightMatrix1to2[2][1] = -0.180865066957393; fWeightMatrix1to2[3][1] = -1.82190866474809; fWeightMatrix1to2[4][1] = -0.957400065149547; fWeightMatrix1to2[5][1] = 1.71165103422111; fWeightMatrix1to2[6][1] = 1.83677145821215; fWeightMatrix1to2[7][1] = 0.66818249797524; fWeightMatrix1to2[8][1] = -0.0670644051668203; 
fWeightMatrix1to2[9][1] = 1.86582650607092; fWeightMatrix1to2[10][1] = -1.89233513051529; fWeightMatrix1to2[11][1] = -1.11259364891269; fWeightMatrix1to2[12][1] = -0.172207938530172; fWeightMatrix1to2[0][2] = -1.64050233977474; fWeightMatrix1to2[1][2] = 1.55192784226016; fWeightMatrix1to2[2][2] = 1.41036781049325; fWeightMatrix1to2[3][2] = -1.29720575717717; fWeightMatrix1to2[4][2] = 0.762297854005719; fWeightMatrix1to2[5][2] = -0.215429819984903; fWeightMatrix1to2[6][2] = -1.91983206653024; fWeightMatrix1to2[7][2] = -1.12622631175431; fWeightMatrix1to2[8][2] = 0.664758213851082; fWeightMatrix1to2[9][2] = -0.0661254834765783; fWeightMatrix1to2[10][2] = 1.07310750411638; fWeightMatrix1to2[11][2] = -1.16706118353775; fWeightMatrix1to2[12][2] = 1.52932412962488; fWeightMatrix1to2[0][3] = -2.10450889685125; fWeightMatrix1to2[1][3] = 1.7114106192873; fWeightMatrix1to2[2][3] = -0.604252455956109; fWeightMatrix1to2[3][3] = -1.75476936863407; fWeightMatrix1to2[4][3] = -1.33308310525681; fWeightMatrix1to2[5][3] = -1.81867757157561; fWeightMatrix1to2[6][3] = -1.27808764015174; fWeightMatrix1to2[7][3] = -1.25888615670174; fWeightMatrix1to2[8][3] = -1.88123282044016; fWeightMatrix1to2[9][3] = -1.90404227306972; fWeightMatrix1to2[10][3] = 0.291956077561966; fWeightMatrix1to2[11][3] = -1.29634047194899; fWeightMatrix1to2[12][3] = -5.31664026659496; fWeightMatrix1to2[0][4] = 0.198347018701291; fWeightMatrix1to2[1][4] = -2.37448411997077; fWeightMatrix1to2[2][4] = -1.0872093934513; fWeightMatrix1to2[3][4] = -1.62526647305641; fWeightMatrix1to2[4][4] = 0.200201483944103; fWeightMatrix1to2[5][4] = 1.08209440544885; fWeightMatrix1to2[6][4] = 0.0768681421901049; fWeightMatrix1to2[7][4] = -0.639999418282978; fWeightMatrix1to2[8][4] = -1.66958476330583; fWeightMatrix1to2[9][4] = -1.87514743435648; fWeightMatrix1to2[10][4] = -1.21488721489791; fWeightMatrix1to2[11][4] = -1.93306811249657; fWeightMatrix1to2[12][4] = -3.39160228261367; fWeightMatrix1to2[0][5] = -0.919075456687871; fWeightMatrix1to2[1][5] = 1.53347343295137; fWeightMatrix1to2[2][5] = -1.74462418476774; fWeightMatrix1to2[3][5] = -0.334484863565061; fWeightMatrix1to2[4][5] = -0.64006692734162; fWeightMatrix1to2[5][5] = -2.84230157709465; fWeightMatrix1to2[6][5] = 0.994765168168513; fWeightMatrix1to2[7][5] = 1.26805380107169; fWeightMatrix1to2[8][5] = -0.984089962286545; fWeightMatrix1to2[9][5] = -1.65941511919518; fWeightMatrix1to2[10][5] = 0.486408582900145; fWeightMatrix1to2[11][5] = 1.0461246660119; fWeightMatrix1to2[12][5] = -3.7726617879778; fWeightMatrix1to2[0][6] = -1.48467735229289; fWeightMatrix1to2[1][6] = -0.59156837383161; fWeightMatrix1to2[2][6] = 0.147007827930607; fWeightMatrix1to2[3][6] = 0.488705986766293; fWeightMatrix1to2[4][6] = 0.863766828885747; fWeightMatrix1to2[5][6] = -1.98368624653267; fWeightMatrix1to2[6][6] = -0.874216984243514; fWeightMatrix1to2[7][6] = -0.51355175054713; fWeightMatrix1to2[8][6] = -2.50079800567621; fWeightMatrix1to2[9][6] = -1.67364688585139; fWeightMatrix1to2[10][6] = -2.77804678261606; fWeightMatrix1to2[11][6] = -1.68319550048101; fWeightMatrix1to2[12][6] = -2.13154450167044; fWeightMatrix1to2[0][7] = 0.868906914947014; fWeightMatrix1to2[1][7] = -0.497189314810019; fWeightMatrix1to2[2][7] = 0.504801629830879; fWeightMatrix1to2[3][7] = -2.54732446733228; fWeightMatrix1to2[4][7] = 1.10191273460395; fWeightMatrix1to2[5][7] = -1.6494530401613; fWeightMatrix1to2[6][7] = -1.80885214685501; fWeightMatrix1to2[7][7] = 1.29190488113297; fWeightMatrix1to2[8][7] = -3.31264987603999; fWeightMatrix1to2[9][7] = 
-0.138115956408271; fWeightMatrix1to2[10][7] = 0.451833070585806; fWeightMatrix1to2[11][7] = -2.61548513537898; fWeightMatrix1to2[12][7] = -7.19062675570454; fWeightMatrix1to2[0][8] = -1.78265445690645; fWeightMatrix1to2[1][8] = -0.49433960912729; fWeightMatrix1to2[2][8] = 1.52778220511142; fWeightMatrix1to2[3][8] = -0.725466316801466; fWeightMatrix1to2[4][8] = 0.204261789425447; fWeightMatrix1to2[5][8] = -1.02633231282999; fWeightMatrix1to2[6][8] = -1.45313658246732; fWeightMatrix1to2[7][8] = -1.96376260793754; fWeightMatrix1to2[8][8] = -1.0035258554681; fWeightMatrix1to2[9][8] = -1.49081313874246; fWeightMatrix1to2[10][8] = 0.912085509013168; fWeightMatrix1to2[11][8] = -1.50233892687477; fWeightMatrix1to2[12][8] = -1.66840170462931; fWeightMatrix1to2[0][9] = 1.49562689416173; fWeightMatrix1to2[1][9] = -2.41172606896436; fWeightMatrix1to2[2][9] = -1.52430606629837; fWeightMatrix1to2[3][9] = 1.60023741122279; fWeightMatrix1to2[4][9] = -1.67969456478094; fWeightMatrix1to2[5][9] = 0.802765070090396; fWeightMatrix1to2[6][9] = 2.01209214370068; fWeightMatrix1to2[7][9] = -0.497144325234352; fWeightMatrix1to2[8][9] = -3.16035477332096; fWeightMatrix1to2[9][9] = -0.465013487109336; fWeightMatrix1to2[10][9] = -0.993873377086358; fWeightMatrix1to2[11][9] = 0.85381643536699; fWeightMatrix1to2[12][9] = -2.32847198820167; fWeightMatrix1to2[0][10] = 0.34225919419571; fWeightMatrix1to2[1][10] = 0.866123074526545; fWeightMatrix1to2[2][10] = -2.10661861127376; fWeightMatrix1to2[3][10] = 0.161378612695991; fWeightMatrix1to2[4][10] = -1.9504719571122; fWeightMatrix1to2[5][10] = -0.175847507085341; fWeightMatrix1to2[6][10] = -3.08149179426874; fWeightMatrix1to2[7][10] = -1.88278917235939; fWeightMatrix1to2[8][10] = -0.0100365302483326; fWeightMatrix1to2[9][10] = 0.410708973895931; fWeightMatrix1to2[10][10] = -0.071601219198901; fWeightMatrix1to2[11][10] = 0.411685023896657; fWeightMatrix1to2[12][10] = -0.72181930741021; fWeightMatrix1to2[0][11] = -1.45179961565191; fWeightMatrix1to2[1][11] = -1.47205776059378; fWeightMatrix1to2[2][11] = -1.45919731218784; fWeightMatrix1to2[3][11] = 1.39011620103071; fWeightMatrix1to2[4][11] = -1.69064593930023; fWeightMatrix1to2[5][11] = -0.968731062504349; fWeightMatrix1to2[6][11] = -2.79025093387424; fWeightMatrix1to2[7][11] = -0.595640434262227; fWeightMatrix1to2[8][11] = -0.431594636074083; fWeightMatrix1to2[9][11] = -0.987380036168963; fWeightMatrix1to2[10][11] = -1.73045006802283; fWeightMatrix1to2[11][11] = 0.806907035357177; fWeightMatrix1to2[12][11] = 0.45256779163616; fWeightMatrix1to2[0][12] = -0.0640997743377356; fWeightMatrix1to2[1][12] = -2.63163834201899; fWeightMatrix1to2[2][12] = -1.97629126506793; fWeightMatrix1to2[3][12] = -2.07163674873097; fWeightMatrix1to2[4][12] = -1.85209187621147; fWeightMatrix1to2[5][12] = -0.862493962584309; fWeightMatrix1to2[6][12] = -2.07189359727223; fWeightMatrix1to2[7][12] = -1.47906778356574; fWeightMatrix1to2[8][12] = 0.0888324828141507; fWeightMatrix1to2[9][12] = -2.57637493266489; fWeightMatrix1to2[10][12] = -1.70057641959682; fWeightMatrix1to2[11][12] = -1.95277816729843; fWeightMatrix1to2[12][12] = -0.167324855389737; fWeightMatrix1to2[0][13] = -1.79298256635746; fWeightMatrix1to2[1][13] = -4.86312644148848; fWeightMatrix1to2[2][13] = -1.06461438148376; fWeightMatrix1to2[3][13] = -0.76477239425994; fWeightMatrix1to2[4][13] = -1.65938430486798; fWeightMatrix1to2[5][13] = 0.0762547397314349; fWeightMatrix1to2[6][13] = -1.58462350848791; fWeightMatrix1to2[7][13] = -1.51951114604221; fWeightMatrix1to2[8][13] = 
1.02720786320335;
   fWeightMatrix1to2[9][13] = -0.286392385257223;
   fWeightMatrix1to2[10][13] = -0.226185585742172;
   fWeightMatrix1to2[11][13] = 0.478870677040953;
   fWeightMatrix1to2[12][13] = 5.33442219293639;
   fWeightMatrix1to2[0][14] = 0.406823400064893;
   fWeightMatrix1to2[1][14] = -0.914476838406368;
   fWeightMatrix1to2[2][14] = -2.59255622068521;
   fWeightMatrix1to2[3][14] = -1.43228427719021;
   fWeightMatrix1to2[4][14] = -1.87852917739054;
   fWeightMatrix1to2[5][14] = -0.97940681900849;
   fWeightMatrix1to2[6][14] = -0.260844122550086;
   fWeightMatrix1to2[7][14] = -0.203487440377025;
   fWeightMatrix1to2[8][14] = -0.843066178560982;
   fWeightMatrix1to2[9][14] = -0.422474157563874;
   fWeightMatrix1to2[10][14] = 0.663468399067945;
   fWeightMatrix1to2[11][14] = -1.61018914667488;
   fWeightMatrix1to2[12][14] = 0.868261077248381;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.74563129435538;
   fWeightMatrix2to3[0][1] = -1.75921214093588;
   fWeightMatrix2to3[0][2] = -1.81484555033329;
   fWeightMatrix2to3[0][3] = -1.10535586117124;
   fWeightMatrix2to3[0][4] = 0.753073281302953;
   fWeightMatrix2to3[0][5] = -0.735086246090777;
   fWeightMatrix2to3[0][6] = -1.6279008514763;
   fWeightMatrix2to3[0][7] = 1.63033877771303;
   fWeightMatrix2to3[0][8] = -4.68109732941961;
   fWeightMatrix2to3[0][9] = 0.106967456747065;
   fWeightMatrix2to3[0][10] = 0.962969460850749;
   fWeightMatrix2to3[0][11] = -1.13258196606573;
   fWeightMatrix2to3[0][12] = -0.843126608201746;
   fWeightMatrix2to3[0][13] = 0.929364200887679;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l