// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method          : MLP::H6AONN5MEMLP
TMVA Release    : 3.8.6 [198662]
ROOT Release    : 5.26/00 [334336]
Creator         : stdenis
Date            : Sun Jan 1 09:56:39 2012
Host            : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir             : /data/cdf04/stdenis/batch/run20572/job401
Training events: 47762

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
RandomSeed: "1" [Random Number Seed for TRandom3]
RandomFile: "None" [Random Number input file for TRandom3]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                 Ht                 'F' [47.7866363525,750.656311035]
LepAPt             LepAPt             'F' [20.0001049042,178.378601074]
LepBPt             LepBPt             'F' [10.0004081726,68.4152984619]
MetSigLeptonsJets  MetSigLeptonsJets  'F' [0.975870490074,19.2163314819]
MetSpec            MetSpec            'F' [15.0050735474,236.562576294]
SumEtLeptonsJets   SumEtLeptonsJets   'F' [30.1877574921,481.37588501]
VSumJetLeptonsPt   VSumJetLeptonsPt   'F' [0.862621068954,308.832702637]
addEt              addEt              'F' [47.7866363525,378.177581787]
dPhiLepSumMet      dPhiLepSumMet      'F' [0.0177443344146,3.14158248901]
dPhiLeptons        dPhiLeptons        'F' [1.71661376953e-05,1.11657130718]
dRLeptons          dRLeptons          'F' [0.200001657009,1.13453125954]
lep1_E             lep1_E             'F' [20.0121021271,196.200546265]
lep2_E             lep2_E             'F' [10.0116767883,124.551948547]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {

 public:

   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif
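// ---------------------------------------------------------------------------
// Network layout implied by the configuration above: 13 input variables plus
// a bias node (layer size 14), two sigmoid hidden layers with N+1 = 14 and
// N = 13 neurons (15 and 14 nodes including their bias units), and a single
// output neuron. Since Normalise is "True", each input is linearly rescaled
// to [-1, 1] using the per-variable minima and maxima stored in fVmin/fVmax
// before the forward pass.
// ---------------------------------------------------------------------------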
"LepBPt", "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 47.7866363525391; fVmax[0] = 750.656311035156; fVmin[1] = 20.0001049041748; fVmax[1] = 178.378601074219; fVmin[2] = 10.0004081726074; fVmax[2] = 68.4152984619141; fVmin[3] = 0.975870490074158; fVmax[3] = 19.2163314819336; fVmin[4] = 15.0050735473633; fVmax[4] = 236.562576293945; fVmin[5] = 30.1877574920654; fVmax[5] = 481.375885009766; fVmin[6] = 0.862621068954468; fVmax[6] = 308.832702636719; fVmin[7] = 47.7866363525391; fVmax[7] = 378.177581787109; fVmin[8] = 0.0177443344146013; fVmax[8] = 3.14158248901367; fVmin[9] = 1.71661376953125e-05; fVmax[9] = 1.11657130718231; fVmin[10] = 0.200001657009125; fVmax[10] = 1.13453125953674; fVmin[11] = 20.0121021270752; fVmax[11] = 196.200546264648; fVmin[12] = 10.0116767883301; fVmax[12] = 124.551948547363; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void 
inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.156096357678606; fWeightMatrix0to1[1][0] = 2.0834386958498; fWeightMatrix0to1[2][0] = 0.803441213050607; fWeightMatrix0to1[3][0] = 1.31757207410795; fWeightMatrix0to1[4][0] = -2.43898246581683; fWeightMatrix0to1[5][0] = -0.762640809517283; fWeightMatrix0to1[6][0] = -0.474921622376861;
   fWeightMatrix0to1[7][0] = 1.97987374069402; fWeightMatrix0to1[8][0] = -1.63456629653259; fWeightMatrix0to1[9][0] = -0.645641369887991; fWeightMatrix0to1[10][0] = -1.40555183652321; fWeightMatrix0to1[11][0] = -0.566092100220191; fWeightMatrix0to1[12][0] = -1.46998862332917; fWeightMatrix0to1[13][0] = -1.07327609397625;
   fWeightMatrix0to1[0][1] = -0.463763656120233; fWeightMatrix0to1[1][1] = 0.385813892983001; fWeightMatrix0to1[2][1] = -3.52272586214023; fWeightMatrix0to1[3][1] = -1.01018792082769; fWeightMatrix0to1[4][1] = 3.22299550591727; fWeightMatrix0to1[5][1] = 1.42989987976887; fWeightMatrix0to1[6][1] = -4.55658164981245;
   fWeightMatrix0to1[7][1] = 2.1877531486667; fWeightMatrix0to1[8][1] = 0.910174007386775; fWeightMatrix0to1[9][1] = 0.476721482069941; fWeightMatrix0to1[10][1] = -1.54963615800978; fWeightMatrix0to1[11][1] = 1.12801325237175; fWeightMatrix0to1[12][1] = 1.04672768547086; fWeightMatrix0to1[13][1] = -1.81235857600825;
   fWeightMatrix0to1[0][2] = -1.988174171224; fWeightMatrix0to1[1][2] = 0.22931841579179; fWeightMatrix0to1[2][2] = -2.23709417739758; fWeightMatrix0to1[3][2] = 0.446123248571653; fWeightMatrix0to1[4][2] = 1.72142124160485; fWeightMatrix0to1[5][2] = -0.822838713053675; fWeightMatrix0to1[6][2] = -1.4596772222087;
   fWeightMatrix0to1[7][2] = 3.58335408176807; fWeightMatrix0to1[8][2] = -2.085766575334; fWeightMatrix0to1[9][2] = 2.37431057993156; fWeightMatrix0to1[10][2] = -2.63155767485027; fWeightMatrix0to1[11][2] = 2.95102002149471; fWeightMatrix0to1[12][2] = 0.586937080064388; fWeightMatrix0to1[13][2] = 0.137373174318996;
   fWeightMatrix0to1[0][3] = 2.01403062965132; fWeightMatrix0to1[1][3] = 1.66839013362181; fWeightMatrix0to1[2][3] = -0.171365042846369; fWeightMatrix0to1[3][3] = -2.7992566234188; fWeightMatrix0to1[4][3] = 1.2530745655505; fWeightMatrix0to1[5][3] = -0.287950451715537; fWeightMatrix0to1[6][3] = 6.02703425690282;
   fWeightMatrix0to1[7][3] = 0.627581316544028; fWeightMatrix0to1[8][3] = 0.959034848637461; fWeightMatrix0to1[9][3] = 1.63862828889702; fWeightMatrix0to1[10][3] = -2.02394879422628; fWeightMatrix0to1[11][3] = 2.34445300263735; fWeightMatrix0to1[12][3] = -1.88130668139812; fWeightMatrix0to1[13][3] = -1.17821665302851;
   fWeightMatrix0to1[0][4] = -1.17941041501147; fWeightMatrix0to1[1][4] = -1.82357908044775; fWeightMatrix0to1[2][4] = 0.999521560977105; fWeightMatrix0to1[3][4] = -0.679502306544939; fWeightMatrix0to1[4][4] = -0.797263149356104; fWeightMatrix0to1[5][4] = 0.462962106996921; fWeightMatrix0to1[6][4] =
0.53587863853039; fWeightMatrix0to1[7][4] = 3.28240019454823; fWeightMatrix0to1[8][4] = 1.96048077616551; fWeightMatrix0to1[9][4] = -0.394292015239088; fWeightMatrix0to1[10][4] = -2.05630500746119; fWeightMatrix0to1[11][4] = 0.939657671835564; fWeightMatrix0to1[12][4] = 0.410432270395812; fWeightMatrix0to1[13][4] = -1.45305271110088; fWeightMatrix0to1[0][5] = -0.609132825985038; fWeightMatrix0to1[1][5] = -1.40263502512609; fWeightMatrix0to1[2][5] = 1.05854119668451; fWeightMatrix0to1[3][5] = 1.3692155737858; fWeightMatrix0to1[4][5] = 0.293760019213864; fWeightMatrix0to1[5][5] = 0.697662320272819; fWeightMatrix0to1[6][5] = -1.83375921960821; fWeightMatrix0to1[7][5] = -2.76043876682619; fWeightMatrix0to1[8][5] = 0.792777809331565; fWeightMatrix0to1[9][5] = 0.922197909598335; fWeightMatrix0to1[10][5] = 2.74338029554817; fWeightMatrix0to1[11][5] = -0.330828177255766; fWeightMatrix0to1[12][5] = 0.71095243988924; fWeightMatrix0to1[13][5] = -1.48758048586963; fWeightMatrix0to1[0][6] = -0.686513539069062; fWeightMatrix0to1[1][6] = -0.324247148423973; fWeightMatrix0to1[2][6] = 0.609141615932214; fWeightMatrix0to1[3][6] = -2.28333494801789; fWeightMatrix0to1[4][6] = -1.40756192881448; fWeightMatrix0to1[5][6] = -1.25736925998305; fWeightMatrix0to1[6][6] = -0.781090463125631; fWeightMatrix0to1[7][6] = 1.78531468083025; fWeightMatrix0to1[8][6] = 1.77798273376716; fWeightMatrix0to1[9][6] = -1.39815149355333; fWeightMatrix0to1[10][6] = -1.73822837005058; fWeightMatrix0to1[11][6] = 2.77990652578708; fWeightMatrix0to1[12][6] = -1.03522733360965; fWeightMatrix0to1[13][6] = -2.0343126875928; fWeightMatrix0to1[0][7] = -1.1954675724244; fWeightMatrix0to1[1][7] = 0.209770158549048; fWeightMatrix0to1[2][7] = -6.29310969104288; fWeightMatrix0to1[3][7] = -5.99433258035166; fWeightMatrix0to1[4][7] = 4.94543275092723; fWeightMatrix0to1[5][7] = 1.20503930340679; fWeightMatrix0to1[6][7] = -6.97072649144295; fWeightMatrix0to1[7][7] = 10.3960026353672; fWeightMatrix0to1[8][7] = -0.959430150114058; fWeightMatrix0to1[9][7] = -0.238762724421221; fWeightMatrix0to1[10][7] = -2.47899882344187; fWeightMatrix0to1[11][7] = 3.90345967651414; fWeightMatrix0to1[12][7] = -1.72692023995289; fWeightMatrix0to1[13][7] = -0.134418151133854; fWeightMatrix0to1[0][8] = 0.337796822872319; fWeightMatrix0to1[1][8] = -0.0904734272798885; fWeightMatrix0to1[2][8] = 1.19244659943842; fWeightMatrix0to1[3][8] = 1.29954701597435; fWeightMatrix0to1[4][8] = 0.786648906330034; fWeightMatrix0to1[5][8] = -2.72133648378315; fWeightMatrix0to1[6][8] = 5.27244203495657; fWeightMatrix0to1[7][8] = 1.14751939429847; fWeightMatrix0to1[8][8] = 0.194261230686302; fWeightMatrix0to1[9][8] = -0.0437132359283082; fWeightMatrix0to1[10][8] = -2.53699708553425; fWeightMatrix0to1[11][8] = 2.53456568453651; fWeightMatrix0to1[12][8] = 1.03214869261135; fWeightMatrix0to1[13][8] = 0.656993897615082; fWeightMatrix0to1[0][9] = -0.361106349359798; fWeightMatrix0to1[1][9] = 0.298604639103033; fWeightMatrix0to1[2][9] = -1.97176033382587; fWeightMatrix0to1[3][9] = -0.295384116183018; fWeightMatrix0to1[4][9] = -0.459677890160166; fWeightMatrix0to1[5][9] = -2.05218036135235; fWeightMatrix0to1[6][9] = 1.42587715727273; fWeightMatrix0to1[7][9] = 1.24808284968914; fWeightMatrix0to1[8][9] = 2.88288282779888; fWeightMatrix0to1[9][9] = 0.981509035709546; fWeightMatrix0to1[10][9] = 0.605013813560523; fWeightMatrix0to1[11][9] = -3.3498615981695; fWeightMatrix0to1[12][9] = -0.467953719644752; fWeightMatrix0to1[13][9] = 0.9148098791188; fWeightMatrix0to1[0][10] = 1.56755674968768; 
fWeightMatrix0to1[1][10] = 0.237208233880035; fWeightMatrix0to1[2][10] = -1.71510960617312; fWeightMatrix0to1[3][10] = 0.0816700722755253; fWeightMatrix0to1[4][10] = -1.29435193032792; fWeightMatrix0to1[5][10] = 1.63327072150168; fWeightMatrix0to1[6][10] = -0.516183319268487; fWeightMatrix0to1[7][10] = -0.495335861564547; fWeightMatrix0to1[8][10] = 1.24080519381031; fWeightMatrix0to1[9][10] = -0.289872296725706; fWeightMatrix0to1[10][10] = -2.10242572768703; fWeightMatrix0to1[11][10] = 0.933963211979597; fWeightMatrix0to1[12][10] = 0.891505751070679; fWeightMatrix0to1[13][10] = -1.15438872200313; fWeightMatrix0to1[0][11] = -0.474537282775288; fWeightMatrix0to1[1][11] = 0.198386172400518; fWeightMatrix0to1[2][11] = 5.06109013469787; fWeightMatrix0to1[3][11] = -0.509601552752664; fWeightMatrix0to1[4][11] = 3.03625442754538; fWeightMatrix0to1[5][11] = -1.17978490230961; fWeightMatrix0to1[6][11] = 1.54256289762902; fWeightMatrix0to1[7][11] = 0.691033468462516; fWeightMatrix0to1[8][11] = -0.393315163671071; fWeightMatrix0to1[9][11] = 0.91778311796865; fWeightMatrix0to1[10][11] = 1.52488006935509; fWeightMatrix0to1[11][11] = 0.26479149985471; fWeightMatrix0to1[12][11] = -1.06775765585808; fWeightMatrix0to1[13][11] = 0.289846519726335; fWeightMatrix0to1[0][12] = -1.0251980960259; fWeightMatrix0to1[1][12] = -0.708152943551233; fWeightMatrix0to1[2][12] = 2.47746163423798; fWeightMatrix0to1[3][12] = 1.49307306503808; fWeightMatrix0to1[4][12] = 0.748140930313071; fWeightMatrix0to1[5][12] = 1.63238655666899; fWeightMatrix0to1[6][12] = 3.53264522708034; fWeightMatrix0to1[7][12] = -1.23748797734508; fWeightMatrix0to1[8][12] = -0.177268227382992; fWeightMatrix0to1[9][12] = 0.136250645238383; fWeightMatrix0to1[10][12] = -0.019952677227407; fWeightMatrix0to1[11][12] = -0.0351975244947098; fWeightMatrix0to1[12][12] = -2.32220863616807; fWeightMatrix0to1[13][12] = 0.275566062522372; fWeightMatrix0to1[0][13] = 1.86932569180983; fWeightMatrix0to1[1][13] = -1.56886595902614; fWeightMatrix0to1[2][13] = -4.45007186643984; fWeightMatrix0to1[3][13] = -4.40653073751435; fWeightMatrix0to1[4][13] = 4.88358289816929; fWeightMatrix0to1[5][13] = 0.615348969969078; fWeightMatrix0to1[6][13] = -10.0431790090058; fWeightMatrix0to1[7][13] = 12.6532900365085; fWeightMatrix0to1[8][13] = -0.573711539807433; fWeightMatrix0to1[9][13] = 1.0143913148901; fWeightMatrix0to1[10][13] = -1.79387355747273; fWeightMatrix0to1[11][13] = 4.89355967860478; fWeightMatrix0to1[12][13] = 1.40926693790008; fWeightMatrix0to1[13][13] = 0.355453904657791; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -2.77060497893135; fWeightMatrix1to2[1][0] = -0.179898512585445; fWeightMatrix1to2[2][0] = -0.09358441706331; fWeightMatrix1to2[3][0] = -1.32406939580072; fWeightMatrix1to2[4][0] = -0.682406180240691; fWeightMatrix1to2[5][0] = -0.62020450971043; fWeightMatrix1to2[6][0] = -0.309119052547966; fWeightMatrix1to2[7][0] = -1.47920882764997; fWeightMatrix1to2[8][0] = 0.467880332249137; fWeightMatrix1to2[9][0] = 0.17534602308184; fWeightMatrix1to2[10][0] = -1.49681176077928; fWeightMatrix1to2[11][0] = -0.0475754044346001; fWeightMatrix1to2[12][0] = -0.03371027845584; fWeightMatrix1to2[0][1] = 1.05679009156028; fWeightMatrix1to2[1][1] = -0.0826777403520957; fWeightMatrix1to2[2][1] = -0.577386926318947; fWeightMatrix1to2[3][1] = -1.9345875779609; fWeightMatrix1to2[4][1] = -0.922195304299487; fWeightMatrix1to2[5][1] = 1.98760816417254; fWeightMatrix1to2[6][1] = 1.22907876123275; fWeightMatrix1to2[7][1] = 0.731519121073805; fWeightMatrix1to2[8][1] = 
0.183412655927488; fWeightMatrix1to2[9][1] = 1.89311130985596; fWeightMatrix1to2[10][1] = -1.65088930241217; fWeightMatrix1to2[11][1] = -1.25347731454622; fWeightMatrix1to2[12][1] = -0.201536470380368; fWeightMatrix1to2[0][2] = -1.79307031327384; fWeightMatrix1to2[1][2] = 4.58064430882363; fWeightMatrix1to2[2][2] = -0.907801690947097; fWeightMatrix1to2[3][2] = -0.612706711378931; fWeightMatrix1to2[4][2] = 0.795325640888794; fWeightMatrix1to2[5][2] = -0.815047575398103; fWeightMatrix1to2[6][2] = -1.76257290149239; fWeightMatrix1to2[7][2] = -1.05810731165622; fWeightMatrix1to2[8][2] = 0.557769022730671; fWeightMatrix1to2[9][2] = -0.421326154228449; fWeightMatrix1to2[10][2] = -0.264287190197006; fWeightMatrix1to2[11][2] = -1.23692102418376; fWeightMatrix1to2[12][2] = 1.23407719323942; fWeightMatrix1to2[0][3] = -1.69106292413047; fWeightMatrix1to2[1][3] = 3.06758725235441; fWeightMatrix1to2[2][3] = -1.13935278767004; fWeightMatrix1to2[3][3] = -0.712551274860987; fWeightMatrix1to2[4][3] = -1.34395781221754; fWeightMatrix1to2[5][3] = -2.84343825645803; fWeightMatrix1to2[6][3] = -0.63135507903021; fWeightMatrix1to2[7][3] = -2.18508929850707; fWeightMatrix1to2[8][3] = -2.00988600697289; fWeightMatrix1to2[9][3] = -1.62896547686663; fWeightMatrix1to2[10][3] = 0.428151628147463; fWeightMatrix1to2[11][3] = -0.845931805741951; fWeightMatrix1to2[12][3] = -2.25501609089008; fWeightMatrix1to2[0][4] = 0.567390440311759; fWeightMatrix1to2[1][4] = -4.98717203764196; fWeightMatrix1to2[2][4] = 2.65291080196095; fWeightMatrix1to2[3][4] = -1.35485528336128; fWeightMatrix1to2[4][4] = 0.230953160602219; fWeightMatrix1to2[5][4] = 1.13606276953176; fWeightMatrix1to2[6][4] = 1.25396543330499; fWeightMatrix1to2[7][4] = -0.563940750683826; fWeightMatrix1to2[8][4] = -2.11098602627854; fWeightMatrix1to2[9][4] = -1.65167503266712; fWeightMatrix1to2[10][4] = -1.37322271799261; fWeightMatrix1to2[11][4] = -1.63722197949654; fWeightMatrix1to2[12][4] = -0.683561453994447; fWeightMatrix1to2[0][5] = -0.407503115742045; fWeightMatrix1to2[1][5] = -0.280782777758262; fWeightMatrix1to2[2][5] = -1.76433256621874; fWeightMatrix1to2[3][5] = 0.28189664415218; fWeightMatrix1to2[4][5] = -0.513211919652415; fWeightMatrix1to2[5][5] = -2.89196262568544; fWeightMatrix1to2[6][5] = 0.846740556533746; fWeightMatrix1to2[7][5] = 1.32325831089395; fWeightMatrix1to2[8][5] = -1.78660087757399; fWeightMatrix1to2[9][5] = -1.83988637424236; fWeightMatrix1to2[10][5] = -0.0699646788920481; fWeightMatrix1to2[11][5] = 1.13294345436979; fWeightMatrix1to2[12][5] = -1.02930078101321; fWeightMatrix1to2[0][6] = -1.48005882411319; fWeightMatrix1to2[1][6] = 3.36383840556955; fWeightMatrix1to2[2][6] = 0.12554310331439; fWeightMatrix1to2[3][6] = 1.90874989432128; fWeightMatrix1to2[4][6] = 0.962856568111337; fWeightMatrix1to2[5][6] = -0.152865624267587; fWeightMatrix1to2[6][6] = -0.479648976677944; fWeightMatrix1to2[7][6] = 0.315183306588064; fWeightMatrix1to2[8][6] = -2.21125883607005; fWeightMatrix1to2[9][6] = -0.943930684185703; fWeightMatrix1to2[10][6] = -2.90952595656879; fWeightMatrix1to2[11][6] = -1.67627432691509; fWeightMatrix1to2[12][6] = -1.81540571219882; fWeightMatrix1to2[0][7] = 1.68018128526514; fWeightMatrix1to2[1][7] = -9.38375342997956; fWeightMatrix1to2[2][7] = 5.30993081506868; fWeightMatrix1to2[3][7] = -2.0088844739977; fWeightMatrix1to2[4][7] = 1.15041933230823; fWeightMatrix1to2[5][7] = -0.77078737882739; fWeightMatrix1to2[6][7] = 0.423959975559127; fWeightMatrix1to2[7][7] = 1.17665610783343; fWeightMatrix1to2[8][7] = -1.99381113434275; 
fWeightMatrix1to2[9][7] = -0.130326238698539; fWeightMatrix1to2[10][7] = 0.0622985405894562; fWeightMatrix1to2[11][7] = -1.92740297841067; fWeightMatrix1to2[12][7] = -2.5551482947094; fWeightMatrix1to2[0][8] = -1.74801395257835; fWeightMatrix1to2[1][8] = -1.4404133729459; fWeightMatrix1to2[2][8] = 2.36053891980715; fWeightMatrix1to2[3][8] = -0.75220291304117; fWeightMatrix1to2[4][8] = 0.268317878150487; fWeightMatrix1to2[5][8] = -0.679632651337308; fWeightMatrix1to2[6][8] = -1.17363709776841; fWeightMatrix1to2[7][8] = -1.44197242949359; fWeightMatrix1to2[8][8] = -1.03990112196349; fWeightMatrix1to2[9][8] = -1.25915975882779; fWeightMatrix1to2[10][8] = 0.646363440353336; fWeightMatrix1to2[11][8] = -1.33983084098294; fWeightMatrix1to2[12][8] = -1.23632650408226; fWeightMatrix1to2[0][9] = 1.65514367379513; fWeightMatrix1to2[1][9] = -0.218058216266414; fWeightMatrix1to2[2][9] = -0.858068703054999; fWeightMatrix1to2[3][9] = 1.77720954759727; fWeightMatrix1to2[4][9] = -1.79897025947967; fWeightMatrix1to2[5][9] = 1.8567712098992; fWeightMatrix1to2[6][9] = 0.808920440764202; fWeightMatrix1to2[7][9] = -1.02577403270271; fWeightMatrix1to2[8][9] = -0.992941868049912; fWeightMatrix1to2[9][9] = -0.360506139928729; fWeightMatrix1to2[10][9] = -1.04928938483684; fWeightMatrix1to2[11][9] = 0.500735600764524; fWeightMatrix1to2[12][9] = 1.83198633824961; fWeightMatrix1to2[0][10] = 0.440451265520073; fWeightMatrix1to2[1][10] = 0.732045203889547; fWeightMatrix1to2[2][10] = -4.61844623020807; fWeightMatrix1to2[3][10] = 0.542108950005348; fWeightMatrix1to2[4][10] = -1.92118772794875; fWeightMatrix1to2[5][10] = -1.79963923497118; fWeightMatrix1to2[6][10] = -2.12058040605178; fWeightMatrix1to2[7][10] = -1.85424878434778; fWeightMatrix1to2[8][10] = -1.0262516628918; fWeightMatrix1to2[9][10] = 0.506655102574136; fWeightMatrix1to2[10][10] = -0.473514371562282; fWeightMatrix1to2[11][10] = 1.04875441973753; fWeightMatrix1to2[12][10] = -0.841882907144659; fWeightMatrix1to2[0][11] = -0.954096436190935; fWeightMatrix1to2[1][11] = -4.13130214552897; fWeightMatrix1to2[2][11] = 1.56720203193989; fWeightMatrix1to2[3][11] = 2.319413555764; fWeightMatrix1to2[4][11] = -1.54263727814243; fWeightMatrix1to2[5][11] = 0.320805824762046; fWeightMatrix1to2[6][11] = -1.64306991167809; fWeightMatrix1to2[7][11] = 0.262939325252119; fWeightMatrix1to2[8][11] = -0.615362474445361; fWeightMatrix1to2[9][11] = -0.156873807622878; fWeightMatrix1to2[10][11] = -1.86252437653259; fWeightMatrix1to2[11][11] = 1.03253463938658; fWeightMatrix1to2[12][11] = 0.402101689683563; fWeightMatrix1to2[0][12] = -0.0170132299850423; fWeightMatrix1to2[1][12] = -1.06283289586489; fWeightMatrix1to2[2][12] = -1.39540647949185; fWeightMatrix1to2[3][12] = -1.41760352217876; fWeightMatrix1to2[4][12] = -1.71649863154372; fWeightMatrix1to2[5][12] = 0.196881996589887; fWeightMatrix1to2[6][12] = -0.827003555250737; fWeightMatrix1to2[7][12] = -0.553264628973757; fWeightMatrix1to2[8][12] = -0.556150782188443; fWeightMatrix1to2[9][12] = -2.10181329527626; fWeightMatrix1to2[10][12] = -1.67280659492896; fWeightMatrix1to2[11][12] = -1.55549485544233; fWeightMatrix1to2[12][12] = -0.612802508662605; fWeightMatrix1to2[0][13] = -1.90275889966112; fWeightMatrix1to2[1][13] = -1.78849545943202; fWeightMatrix1to2[2][13] = -0.461167129037299; fWeightMatrix1to2[3][13] = -0.635145923572381; fWeightMatrix1to2[4][13] = -1.60476713451827; fWeightMatrix1to2[5][13] = 0.670143183412271; fWeightMatrix1to2[6][13] = -1.61898964295536; fWeightMatrix1to2[7][13] = -0.823155703550498; 
   fWeightMatrix1to2[8][13] = 0.901602887951372; fWeightMatrix1to2[9][13] = 0.121700049955183; fWeightMatrix1to2[10][13] = -0.342469218016467; fWeightMatrix1to2[11][13] = 0.0693521830255072; fWeightMatrix1to2[12][13] = -0.124548114053519;
   fWeightMatrix1to2[0][14] = 0.890655557969695; fWeightMatrix1to2[1][14] = -0.817510842287; fWeightMatrix1to2[2][14] = -1.95211298674488; fWeightMatrix1to2[3][14] = -0.654805048956011; fWeightMatrix1to2[4][14] = -1.7289138396915; fWeightMatrix1to2[5][14] = 0.148055263364597; fWeightMatrix1to2[6][14] = 0.478676188460305;
   fWeightMatrix1to2[7][14] = 0.481848392589416; fWeightMatrix1to2[8][14] = 0.0110325494005804; fWeightMatrix1to2[9][14] = 0.218648559277742; fWeightMatrix1to2[10][14] = 0.664284576778962; fWeightMatrix1to2[11][14] = -1.26099676890029; fWeightMatrix1to2[12][14] = 0.0658710002344749;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.249823485182569; fWeightMatrix2to3[0][1] = -0.792592032173999; fWeightMatrix2to3[0][2] = -1.33250087137519; fWeightMatrix2to3[0][3] = -1.14742917214832; fWeightMatrix2to3[0][4] = 0.922048092345129; fWeightMatrix2to3[0][5] = 1.40844937402654; fWeightMatrix2to3[0][6] = 0.229363331263492;
   fWeightMatrix2to3[0][7] = 1.93391005542539; fWeightMatrix2to3[0][8] = 0.483793210368219; fWeightMatrix2to3[0][9] = 1.04286612175288; fWeightMatrix2to3[0][10] = 0.385130557501929; fWeightMatrix2to3[0][11] = 0.780073489871519; fWeightMatrix2to3[0][12] = -0.0453593494497526; fWeightMatrix2to3[0][13] = 0.824531322475022;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l