// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6         [198662]
ROOT Release   : 5.26/00       [334336]
Creator        : stdenis
Date           : Sun Jan 1 10:37:09 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run20572/job415
Training events: 34306

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (depreciated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
RandomSeed: "1" [Random Number Seed for TRandom3]
RandomFile: "None" [Random Number input file for TRandom3]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                  Ht                  'F'  [49.2191009521,771.463745117]
LepAPt              LepAPt              'F'  [20.0000362396,162.429443359]
LepBPt              LepBPt              'F'  [10.0003643036,65.7519302368]
MetSigLeptonsJets   MetSigLeptonsJets   'F'  [1.13847851753,21.1559963226]
MetSpec             MetSpec             'F'  [15.0119848251,261.761749268]
SumEtLeptonsJets    SumEtLeptonsJets    'F'  [30.1199874878,517.957397461]
VSumJetLeptonsPt    VSumJetLeptonsPt    'F'  [3.44767546654,304.726287842]
addEt               addEt               'F'  [49.2191009521,385.879455566]
dPhiLepSumMet       dPhiLepSumMet       'F'  [0.00602381536737,3.14158987999]
dPhiLeptons         dPhiLeptons         'F'  [2.03847885132e-05,1.13010489941]
dRLeptons           dRLeptons           'F'  [0.200001657009,1.13414525986]
lep1_E              lep1_E              'F'  [20.0563106537,202.024719238]
lep2_E              lep2_E              'F'  [10.0202493668,113.683082581]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {

 public:

   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 13 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "Ht", "LepAPt",
"LepBPt", "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 49.2191009521484; fVmax[0] = 771.463745117188; fVmin[1] = 20.000036239624; fVmax[1] = 162.429443359375; fVmin[2] = 10.0003643035889; fVmax[2] = 65.7519302368164; fVmin[3] = 1.13847851753235; fVmax[3] = 21.1559963226318; fVmin[4] = 15.0119848251343; fVmax[4] = 261.761749267578; fVmin[5] = 30.119987487793; fVmax[5] = 517.957397460938; fVmin[6] = 3.44767546653748; fVmax[6] = 304.726287841797; fVmin[7] = 49.2191009521484; fVmax[7] = 385.879455566406; fVmin[8] = 0.00602381536737084; fVmax[8] = 3.14158987998962; fVmin[9] = 2.03847885131836e-05; fVmax[9] = 1.13010489940643; fVmin[10] = 0.200001657009125; fVmax[10] = 1.13414525985718; fVmin[11] = 20.0563106536865; fVmax[11] = 202.024719238281; fVmin[12] = 10.0202493667603; fVmax[12] = 113.683082580566; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void 
inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.120089457682876;
   fWeightMatrix0to1[1][0] = 2.15254946858806;
   fWeightMatrix0to1[2][0] = 0.85053807904538;
   fWeightMatrix0to1[3][0] = 2.04040171146855;
   fWeightMatrix0to1[4][0] = -1.33285481197514;
   fWeightMatrix0to1[5][0] = -1.28946791245544;
   fWeightMatrix0to1[6][0] = -1.32195252820841;
   fWeightMatrix0to1[7][0] = 2.06792075349895;
   fWeightMatrix0to1[8][0] = -1.28517640152689;
   fWeightMatrix0to1[9][0] = -0.625030619941634;
   fWeightMatrix0to1[10][0] = -1.53108931486036;
   fWeightMatrix0to1[11][0] = -0.117025636996532;
   fWeightMatrix0to1[12][0] = -0.746264649767288;
   fWeightMatrix0to1[13][0] = -0.51786798726546;
   fWeightMatrix0to1[0][1] = -0.454081581075536;
   fWeightMatrix0to1[1][1] = 1.09487992453112;
   fWeightMatrix0to1[2][1] = -0.325144358139095;
   fWeightMatrix0to1[3][1] = 2.00650285735334;
   fWeightMatrix0to1[4][1] = 1.38759062101092;
   fWeightMatrix0to1[5][1] = 1.46296700634381;
   fWeightMatrix0to1[6][1] = -1.86538080928268;
   fWeightMatrix0to1[7][1] = 0.128494419883346;
   fWeightMatrix0to1[8][1] = 0.835032924758334;
   fWeightMatrix0to1[9][1] = 0.856626906095704;
   fWeightMatrix0to1[10][1] = -1.43907521488769;
   fWeightMatrix0to1[11][1] = -0.0448397083447431;
   fWeightMatrix0to1[12][1] = 1.53046403276345;
   fWeightMatrix0to1[13][1] = -1.26953086280974;
   fWeightMatrix0to1[0][2] = -1.76110689235785;
   fWeightMatrix0to1[1][2] = 0.781427052742413;
   fWeightMatrix0to1[2][2] = 0.325678087754487;
   fWeightMatrix0to1[3][2] = 1.70054056553453;
   fWeightMatrix0to1[4][2] = 0.475408418380842;
   fWeightMatrix0to1[5][2] = 0.17675425530021;
   fWeightMatrix0to1[6][2] = -0.702353351075194;
   fWeightMatrix0to1[7][2] = 4.08451864549696;
   fWeightMatrix0to1[8][2] = -1.46093204323702;
   fWeightMatrix0to1[9][2] = 5.33822792076092;
   fWeightMatrix0to1[10][2] = -1.94755320432595;
   fWeightMatrix0to1[11][2] = 1.29727601840514;
   fWeightMatrix0to1[12][2] = 0.286119251102682;
   fWeightMatrix0to1[13][2] = -0.174601242496141;
   fWeightMatrix0to1[0][3] = 1.97416045896562;
   fWeightMatrix0to1[1][3] = 0.778170336560668;
   fWeightMatrix0to1[2][3] = -1.88407870619113;
   fWeightMatrix0to1[3][3] = -1.41135469232445;
   fWeightMatrix0to1[4][3] = 1.02710576091271;
   fWeightMatrix0to1[5][3] = 0.0415447370993914;
   fWeightMatrix0to1[6][3] = 3.08525008065004;
   fWeightMatrix0to1[7][3] = 0.387848475653052;
   fWeightMatrix0to1[8][3] = 1.65283154989757;
   fWeightMatrix0to1[9][3] = 1.03478589476361;
   fWeightMatrix0to1[10][3] = -1.00662246712403;
   fWeightMatrix0to1[11][3] = 1.80447290115223;
   fWeightMatrix0to1[12][3] = -1.55315083546717;
   fWeightMatrix0to1[13][3] = -0.225581985720041;
   fWeightMatrix0to1[0][4] = -1.1254723842463;
   fWeightMatrix0to1[1][4] = -1.91669012098154;
   fWeightMatrix0to1[2][4] = 0.998849286983267;
   fWeightMatrix0to1[3][4] = 0.991817292976486;
   fWeightMatrix0to1[4][4] = -0.915592239987471;
   fWeightMatrix0to1[5][4] = 0.381128360336175;
   fWeightMatrix0to1[6][4]
= -0.123584108253021; fWeightMatrix0to1[7][4] = 1.49166593364946; fWeightMatrix0to1[8][4] = 1.95853066568528; fWeightMatrix0to1[9][4] = 0.325897081311802; fWeightMatrix0to1[10][4] = 0.447788648042333; fWeightMatrix0to1[11][4] = -0.811744453156149; fWeightMatrix0to1[12][4] = 1.19874383617374; fWeightMatrix0to1[13][4] = -0.787278496316424; fWeightMatrix0to1[0][5] = -0.526947833870353; fWeightMatrix0to1[1][5] = -1.31397450801548; fWeightMatrix0to1[2][5] = 1.05339809233393; fWeightMatrix0to1[3][5] = 1.4571651517703; fWeightMatrix0to1[4][5] = 1.58137872927988; fWeightMatrix0to1[5][5] = -0.164367684434395; fWeightMatrix0to1[6][5] = -2.10266852844652; fWeightMatrix0to1[7][5] = -3.15501589658453; fWeightMatrix0to1[8][5] = 1.12011583153039; fWeightMatrix0to1[9][5] = 0.20439405273331; fWeightMatrix0to1[10][5] = 2.1722138029929; fWeightMatrix0to1[11][5] = 0.96774848918592; fWeightMatrix0to1[12][5] = 1.58965363372294; fWeightMatrix0to1[13][5] = -0.731550288412139; fWeightMatrix0to1[0][6] = -0.71244162563691; fWeightMatrix0to1[1][6] = 0.0746342808834646; fWeightMatrix0to1[2][6] = 0.834603253354656; fWeightMatrix0to1[3][6] = -0.851450125452847; fWeightMatrix0to1[4][6] = -0.340806508419963; fWeightMatrix0to1[5][6] = -1.40726448048615; fWeightMatrix0to1[6][6] = -1.83958680214539; fWeightMatrix0to1[7][6] = 2.68698150551916; fWeightMatrix0to1[8][6] = 1.73768694462073; fWeightMatrix0to1[9][6] = 0.398471950612497; fWeightMatrix0to1[10][6] = -0.981420788241204; fWeightMatrix0to1[11][6] = 1.88597857970487; fWeightMatrix0to1[12][6] = -0.715337878554999; fWeightMatrix0to1[13][6] = -2.20049771801285; fWeightMatrix0to1[0][7] = -1.27853686857193; fWeightMatrix0to1[1][7] = 0.915535329822734; fWeightMatrix0to1[2][7] = -2.48634276675129; fWeightMatrix0to1[3][7] = -1.49601561737392; fWeightMatrix0to1[4][7] = 3.18864535029252; fWeightMatrix0to1[5][7] = 1.93120461410136; fWeightMatrix0to1[6][7] = -4.24161813510106; fWeightMatrix0to1[7][7] = 8.06791276628049; fWeightMatrix0to1[8][7] = -0.821064570448601; fWeightMatrix0to1[9][7] = 3.02357765991431; fWeightMatrix0to1[10][7] = -0.981962486319965; fWeightMatrix0to1[11][7] = 0.249428332355721; fWeightMatrix0to1[12][7] = -2.00007473614216; fWeightMatrix0to1[13][7] = -0.839846952114623; fWeightMatrix0to1[0][8] = 0.0367365289148692; fWeightMatrix0to1[1][8] = 0.176056391745969; fWeightMatrix0to1[2][8] = -1.93236182188649; fWeightMatrix0to1[3][8] = 1.4328184480029; fWeightMatrix0to1[4][8] = 0.0937352166892912; fWeightMatrix0to1[5][8] = -1.56315521196828; fWeightMatrix0to1[6][8] = 2.60638436912213; fWeightMatrix0to1[7][8] = -2.35698144208583; fWeightMatrix0to1[8][8] = -1.65774945471848; fWeightMatrix0to1[9][8] = 0.777233004303185; fWeightMatrix0to1[10][8] = 0.489915408065292; fWeightMatrix0to1[11][8] = 1.74246541995189; fWeightMatrix0to1[12][8] = -0.497473613312359; fWeightMatrix0to1[13][8] = 0.772628020463739; fWeightMatrix0to1[0][9] = -0.387363708813406; fWeightMatrix0to1[1][9] = 0.121238989427788; fWeightMatrix0to1[2][9] = -2.92745361293875; fWeightMatrix0to1[3][9] = 1.31408762952435; fWeightMatrix0to1[4][9] = 0.987413250237348; fWeightMatrix0to1[5][9] = -1.83553461252991; fWeightMatrix0to1[6][9] = 0.281634675530099; fWeightMatrix0to1[7][9] = 0.0446168633519038; fWeightMatrix0to1[8][9] = -0.0254777889057737; fWeightMatrix0to1[9][9] = 0.601216655571461; fWeightMatrix0to1[10][9] = 1.25349250546945; fWeightMatrix0to1[11][9] = -1.229299252963; fWeightMatrix0to1[12][9] = -0.0909120896811675; fWeightMatrix0to1[13][9] = 1.55771267033409; fWeightMatrix0to1[0][10] = 1.5808444457468; 
fWeightMatrix0to1[1][10] = -0.399513055569783; fWeightMatrix0to1[2][10] = -2.65558236466389; fWeightMatrix0to1[3][10] = 0.138115517827935; fWeightMatrix0to1[4][10] = -1.87106682716682; fWeightMatrix0to1[5][10] = 1.37240706557535; fWeightMatrix0to1[6][10] = -1.54569076765159; fWeightMatrix0to1[7][10] = 0.130003234385636; fWeightMatrix0to1[8][10] = -0.521373468876996; fWeightMatrix0to1[9][10] = -0.713216430267595; fWeightMatrix0to1[10][10] = -1.71215472055556; fWeightMatrix0to1[11][10] = 1.02774258276573; fWeightMatrix0to1[12][10] = 1.23968848066639; fWeightMatrix0to1[13][10] = -0.634952408712982; fWeightMatrix0to1[0][11] = -0.489234132040391; fWeightMatrix0to1[1][11] = 0.594505335733171; fWeightMatrix0to1[2][11] = 1.60015904623284; fWeightMatrix0to1[3][11] = 0.834380774251822; fWeightMatrix0to1[4][11] = 1.86129122826876; fWeightMatrix0to1[5][11] = -1.25538505261114; fWeightMatrix0to1[6][11] = 2.93628227293542; fWeightMatrix0to1[7][11] = 0.210402135890891; fWeightMatrix0to1[8][11] = 0.554306513161971; fWeightMatrix0to1[9][11] = 0.788949024300924; fWeightMatrix0to1[10][11] = 1.87623727908088; fWeightMatrix0to1[11][11] = -0.446573448139688; fWeightMatrix0to1[12][11] = -0.757455959648193; fWeightMatrix0to1[13][11] = 1.13016734028416; fWeightMatrix0to1[0][12] = -0.81648801902336; fWeightMatrix0to1[1][12] = -0.248524499154443; fWeightMatrix0to1[2][12] = 2.55262443684311; fWeightMatrix0to1[3][12] = 1.19950022991161; fWeightMatrix0to1[4][12] = -0.546876163323278; fWeightMatrix0to1[5][12] = 2.05800246265686; fWeightMatrix0to1[6][12] = 3.56227014690151; fWeightMatrix0to1[7][12] = -1.44269138575136; fWeightMatrix0to1[8][12] = 0.612913825176766; fWeightMatrix0to1[9][12] = 0.0590235961118073; fWeightMatrix0to1[10][12] = 0.816411441375226; fWeightMatrix0to1[11][12] = -1.735796284838; fWeightMatrix0to1[12][12] = -2.20667355539576; fWeightMatrix0to1[13][12] = 1.21235102357448; fWeightMatrix0to1[0][13] = 1.66423010560508; fWeightMatrix0to1[1][13] = -0.502901424720802; fWeightMatrix0to1[2][13] = -0.307815183389059; fWeightMatrix0to1[3][13] = -1.09174734496986; fWeightMatrix0to1[4][13] = 2.9175257083599; fWeightMatrix0to1[5][13] = 1.82777998011329; fWeightMatrix0to1[6][13] = -5.2772623816736; fWeightMatrix0to1[7][13] = 11.156917685313; fWeightMatrix0to1[8][13] = -1.12337335943931; fWeightMatrix0to1[9][13] = 6.56597017962155; fWeightMatrix0to1[10][13] = -0.55553625941257; fWeightMatrix0to1[11][13] = -0.181782337230126; fWeightMatrix0to1[12][13] = -0.745105131115661; fWeightMatrix0to1[13][13] = -2.88800985775118; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -1.32351875734635; fWeightMatrix1to2[1][0] = -0.389415066980367; fWeightMatrix1to2[2][0] = -0.0279052866801723; fWeightMatrix1to2[3][0] = -1.76328470918284; fWeightMatrix1to2[4][0] = -0.698948930071025; fWeightMatrix1to2[5][0] = -0.92590453439661; fWeightMatrix1to2[6][0] = -0.0864309675175915; fWeightMatrix1to2[7][0] = -2.15320933323762; fWeightMatrix1to2[8][0] = 0.277980533324642; fWeightMatrix1to2[9][0] = -0.170638872436261; fWeightMatrix1to2[10][0] = -1.36470676517877; fWeightMatrix1to2[11][0] = -0.18492106576885; fWeightMatrix1to2[12][0] = -0.199009571001434; fWeightMatrix1to2[0][1] = 1.98166083991601; fWeightMatrix1to2[1][1] = 0.18403089776931; fWeightMatrix1to2[2][1] = -0.222540905608673; fWeightMatrix1to2[3][1] = -2.07598382532341; fWeightMatrix1to2[4][1] = -0.936083855538514; fWeightMatrix1to2[5][1] = 1.64290854937972; fWeightMatrix1to2[6][1] = 1.26054528852186; fWeightMatrix1to2[7][1] = 0.667695816219133; fWeightMatrix1to2[8][1] = 
0.0926563717347396; fWeightMatrix1to2[9][1] = 1.89605876989719; fWeightMatrix1to2[10][1] = -1.74254390863067; fWeightMatrix1to2[11][1] = -1.31072893481635; fWeightMatrix1to2[12][1] = -0.266260570634618; fWeightMatrix1to2[0][2] = -3.55814789307377; fWeightMatrix1to2[1][2] = 1.92183357858134; fWeightMatrix1to2[2][2] = 1.69921975740603; fWeightMatrix1to2[3][2] = -1.76571507186962; fWeightMatrix1to2[4][2] = 0.777687770300489; fWeightMatrix1to2[5][2] = -0.194185904150317; fWeightMatrix1to2[6][2] = -1.3644624866284; fWeightMatrix1to2[7][2] = -1.1353757072548; fWeightMatrix1to2[8][2] = 0.921077109591406; fWeightMatrix1to2[9][2] = -0.136698395294774; fWeightMatrix1to2[10][2] = 0.287390972657846; fWeightMatrix1to2[11][2] = -1.20876860587491; fWeightMatrix1to2[12][2] = 1.70218259708938; fWeightMatrix1to2[0][3] = -1.52903466639574; fWeightMatrix1to2[1][3] = 0.613479744716518; fWeightMatrix1to2[2][3] = 0.0553538140857676; fWeightMatrix1to2[3][3] = -1.59527092667943; fWeightMatrix1to2[4][3] = -1.31183981921151; fWeightMatrix1to2[5][3] = -1.90605545143766; fWeightMatrix1to2[6][3] = -0.299927081420826; fWeightMatrix1to2[7][3] = -1.89276380260405; fWeightMatrix1to2[8][3] = -1.67427493046678; fWeightMatrix1to2[9][3] = -1.77961515927349; fWeightMatrix1to2[10][3] = 0.162600020669063; fWeightMatrix1to2[11][3] = -0.993968836227962; fWeightMatrix1to2[12][3] = -1.60767095336697; fWeightMatrix1to2[0][4] = 3.29758571148636; fWeightMatrix1to2[1][4] = -0.773000612167975; fWeightMatrix1to2[2][4] = -0.229163703877986; fWeightMatrix1to2[3][4] = -1.76719068928246; fWeightMatrix1to2[4][4] = 0.186965580076465; fWeightMatrix1to2[5][4] = 1.05579684768933; fWeightMatrix1to2[6][4] = 1.01363837678243; fWeightMatrix1to2[7][4] = -1.21137527493931; fWeightMatrix1to2[8][4] = -1.94339655172803; fWeightMatrix1to2[9][4] = -1.6874947712557; fWeightMatrix1to2[10][4] = -1.90491117722346; fWeightMatrix1to2[11][4] = -1.85412467945659; fWeightMatrix1to2[12][4] = -0.294248175204646; fWeightMatrix1to2[0][5] = 0.984906698099038; fWeightMatrix1to2[1][5] = 0.540262051855287; fWeightMatrix1to2[2][5] = -0.843360743488751; fWeightMatrix1to2[3][5] = -0.446684637277053; fWeightMatrix1to2[4][5] = -0.546160422645752; fWeightMatrix1to2[5][5] = -2.9387046669209; fWeightMatrix1to2[6][5] = 1.18973613528536; fWeightMatrix1to2[7][5] = 1.03521613624312; fWeightMatrix1to2[8][5] = -1.90557880807511; fWeightMatrix1to2[9][5] = -1.72135943576455; fWeightMatrix1to2[10][5] = 0.308302125853757; fWeightMatrix1to2[11][5] = 1.37094250261764; fWeightMatrix1to2[12][5] = -1.05157402361559; fWeightMatrix1to2[0][6] = -3.64165186250943; fWeightMatrix1to2[1][6] = 0.247545814183872; fWeightMatrix1to2[2][6] = 1.11437566633484; fWeightMatrix1to2[3][6] = 1.80750102648245; fWeightMatrix1to2[4][6] = 1.02764442170314; fWeightMatrix1to2[5][6] = -1.35193694464637; fWeightMatrix1to2[6][6] = 0.199317148163864; fWeightMatrix1to2[7][6] = -0.379302226805741; fWeightMatrix1to2[8][6] = -1.89806948538088; fWeightMatrix1to2[9][6] = -1.40223004599961; fWeightMatrix1to2[10][6] = -2.41201427364585; fWeightMatrix1to2[11][6] = -2.14018074870054; fWeightMatrix1to2[12][6] = -2.60787769317572; fWeightMatrix1to2[0][7] = 7.14469355118306; fWeightMatrix1to2[1][7] = -3.04445441835204; fWeightMatrix1to2[2][7] = 1.41397169294455; fWeightMatrix1to2[3][7] = -2.25974471179241; fWeightMatrix1to2[4][7] = 1.09720778268172; fWeightMatrix1to2[5][7] = -1.34350463267559; fWeightMatrix1to2[6][7] = 0.306155893729448; fWeightMatrix1to2[7][7] = 0.313905271999056; fWeightMatrix1to2[8][7] = -2.28375561210726; 
fWeightMatrix1to2[9][7] = 0.113411925508277; fWeightMatrix1to2[10][7] = -0.639755091847384; fWeightMatrix1to2[11][7] = -1.89129842875254; fWeightMatrix1to2[12][7] = -2.10016625374403; fWeightMatrix1to2[0][8] = -1.54194712944822; fWeightMatrix1to2[1][8] = -0.890620190867397; fWeightMatrix1to2[2][8] = 1.8421183298272; fWeightMatrix1to2[3][8] = -0.616572997835539; fWeightMatrix1to2[4][8] = 0.262742186320173; fWeightMatrix1to2[5][8] = -0.916482589627196; fWeightMatrix1to2[6][8] = -1.27471843423841; fWeightMatrix1to2[7][8] = -1.74605530561382; fWeightMatrix1to2[8][8] = -1.00960468177627; fWeightMatrix1to2[9][8] = -1.49776951119883; fWeightMatrix1to2[10][8] = 0.822579800539793; fWeightMatrix1to2[11][8] = -1.35650788033116; fWeightMatrix1to2[12][8] = -1.20406526324805; fWeightMatrix1to2[0][9] = 5.65119903468524; fWeightMatrix1to2[1][9] = -0.402022307692751; fWeightMatrix1to2[2][9] = -0.775943007899748; fWeightMatrix1to2[3][9] = 1.67635813195242; fWeightMatrix1to2[4][9] = -1.80337412343162; fWeightMatrix1to2[5][9] = 1.23452731269863; fWeightMatrix1to2[6][9] = 1.45498373663645; fWeightMatrix1to2[7][9] = -1.20351748226346; fWeightMatrix1to2[8][9] = -1.4104253147767; fWeightMatrix1to2[9][9] = 0.120429661350414; fWeightMatrix1to2[10][9] = -1.29968860453695; fWeightMatrix1to2[11][9] = 0.374623649373524; fWeightMatrix1to2[12][9] = 1.08631590612262; fWeightMatrix1to2[0][10] = -0.962903181175043; fWeightMatrix1to2[1][10] = 1.45451843723817; fWeightMatrix1to2[2][10] = -1.56035444898137; fWeightMatrix1to2[3][10] = 0.520934197385885; fWeightMatrix1to2[4][10] = -1.90849740524801; fWeightMatrix1to2[5][10] = -0.460694339777134; fWeightMatrix1to2[6][10] = -1.48109051864727; fWeightMatrix1to2[7][10] = -1.85486815970781; fWeightMatrix1to2[8][10] = -0.869886799749887; fWeightMatrix1to2[9][10] = 0.257772826180588; fWeightMatrix1to2[10][10] = -0.535851278874098; fWeightMatrix1to2[11][10] = 0.775108125239724; fWeightMatrix1to2[12][10] = -0.702412685676834; fWeightMatrix1to2[0][11] = 0.332221333645022; fWeightMatrix1to2[1][11] = -2.21927001734634; fWeightMatrix1to2[2][11] = -0.389755551447533; fWeightMatrix1to2[3][11] = 1.90832133468411; fWeightMatrix1to2[4][11] = -1.58510855392017; fWeightMatrix1to2[5][11] = -0.739761496532713; fWeightMatrix1to2[6][11] = -1.90103272796042; fWeightMatrix1to2[7][11] = -0.811459322768201; fWeightMatrix1to2[8][11] = -0.951299986336027; fWeightMatrix1to2[9][11] = -0.584334970244031; fWeightMatrix1to2[10][11] = -1.71099446737897; fWeightMatrix1to2[11][11] = 0.906955463745593; fWeightMatrix1to2[12][11] = 0.0257092262893646; fWeightMatrix1to2[0][12] = -0.880077371734364; fWeightMatrix1to2[1][12] = -0.993536012270824; fWeightMatrix1to2[2][12] = -1.3249210975518; fWeightMatrix1to2[3][12] = -1.16084552200211; fWeightMatrix1to2[4][12] = -1.76208022178563; fWeightMatrix1to2[5][12] = -0.0688644259227586; fWeightMatrix1to2[6][12] = -1.27287490299383; fWeightMatrix1to2[7][12] = -1.25312923538786; fWeightMatrix1to2[8][12] = -0.466456009642484; fWeightMatrix1to2[9][12] = -2.81017339833382; fWeightMatrix1to2[10][12] = -1.48397238335673; fWeightMatrix1to2[11][12] = -1.68528603731963; fWeightMatrix1to2[12][12] = -0.421230842263889; fWeightMatrix1to2[0][13] = -2.25480305677116; fWeightMatrix1to2[1][13] = -1.82048130660791; fWeightMatrix1to2[2][13] = -0.854525101758473; fWeightMatrix1to2[3][13] = -0.913885422132197; fWeightMatrix1to2[4][13] = -1.59308142165023; fWeightMatrix1to2[5][13] = 0.448475544561064; fWeightMatrix1to2[6][13] = -0.876652965796557; fWeightMatrix1to2[7][13] = -1.09110094864729; 
   fWeightMatrix1to2[8][13] = 0.945006869878002;
   fWeightMatrix1to2[9][13] = -0.0384177862244189;
   fWeightMatrix1to2[10][13] = 0.073691131405275;
   fWeightMatrix1to2[11][13] = -0.182090882095331;
   fWeightMatrix1to2[12][13] = -0.127913483675346;
   fWeightMatrix1to2[0][14] = 2.46048576831961;
   fWeightMatrix1to2[1][14] = -0.994786540450627;
   fWeightMatrix1to2[2][14] = -1.4935636456615;
   fWeightMatrix1to2[3][14] = -1.11092498457854;
   fWeightMatrix1to2[4][14] = -1.75281041093402;
   fWeightMatrix1to2[5][14] = -0.359324883694181;
   fWeightMatrix1to2[6][14] = 0.674529419103109;
   fWeightMatrix1to2[7][14] = -0.173748297095458;
   fWeightMatrix1to2[8][14] = -0.192932379342536;
   fWeightMatrix1to2[9][14] = -0.161461930261323;
   fWeightMatrix1to2[10][14] = 0.714410253396257;
   fWeightMatrix1to2[11][14] = -1.40973281790295;
   fWeightMatrix1to2[12][14] = -0.119788664664893;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = 0.664448682024908;
   fWeightMatrix2to3[0][1] = -0.703807451138189;
   fWeightMatrix2to3[0][2] = 0.257103701152336;
   fWeightMatrix2to3[0][3] = -1.2576916754568;
   fWeightMatrix2to3[0][4] = 0.895798904775559;
   fWeightMatrix2to3[0][5] = -0.13236229016189;
   fWeightMatrix2to3[0][6] = 0.245548503219875;
   fWeightMatrix2to3[0][7] = 1.63775883706169;
   fWeightMatrix2to3[0][8] = 0.179644101235615;
   fWeightMatrix2to3[0][9] = 0.640432743595119;
   fWeightMatrix2to3[0][10] = 0.696128103997344;
   fWeightMatrix2to3[0][11] = 0.904326030341091;
   fWeightMatrix2to3[0][12] = -0.305669430022453;
   fWeightMatrix2to3[0][13] = 0.106429083784534;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l
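// ---------------------------------------------------------------------------
// NOTE: the generated body of GetMvaValue__ is cut off above. For reference
// only, the commented-out sketch below shows how TMVA-generated MLP readers
// of this vintage typically evaluate the network: reset the neuron buffers,
// set the bias nodes, load the (normalised) inputs into layer 0, and propagate
// layer by layer through the weight matrices, applying the sigmoid
// ActivationFnc to the hidden layers. This is an assumption about the missing
// code, not the generated code itself; in particular, whether the output node
// is also passed through the activation depends on the TMVA version.
#if 0
   // reset all neuron values and set the bias node of each non-output layer to 1
   for (int l = 0; l < fLayers; l++)
      for (int i = 0; i < fLayerSize[l]; i++) fWeights[l][i] = 0.0;
   for (int l = 0; l < fLayers - 1; l++)
      fWeights[l][fLayerSize[l] - 1] = 1.0;

   // load the input variables into layer 0
   for (int i = 0; i < fLayerSize[0] - 1; i++)
      fWeights[0][i] = inputValues[i];

   // layer 0 -> 1
   for (int o = 0; o < fLayerSize[1] - 1; o++) {
      for (int i = 0; i < fLayerSize[0]; i++)
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // layer 1 -> 2
   for (int o = 0; o < fLayerSize[2] - 1; o++) {
      for (int i = 0; i < fLayerSize[1]; i++)
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // layer 2 -> 3 (single output node, assumed linear here)
   for (int i = 0; i < fLayerSize[2]; i++)
      fWeights[3][0] += fWeightMatrix2to3[0][i] * fWeights[2][i];

   return fWeights[3][0];
#endif
// ---------------------------------------------------------------------------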