// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6  [198662]
ROOT Release   : 5.26/00  [334336]
Creator        : stdenis
Date           : Tue Jan 10 17:29:18 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run21068/job400
Training events: 1260

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V:                  "False"      [Verbose mode]
NCycles:            "1000"       [Number of training cycles]
HiddenLayers:       "N+1,N"      [Specification of hidden layer architecture]
# Default:
D:                  "False"      [use-decorrelated-variables flag (deprecated)]
Normalise:          "True"       [Normalise input variables]
VarTransform:       "None"       [Variable transformation method]
VarTransformType:   "Signal"     [Use signal or background events for var transform]
NbinsMVAPdf:        "60"         [Number of bins used to create MVA PDF]
NsmoothMVAPdf:      "2"          [Number of smoothing iterations for MVA PDF]
VerboseLevel:       "Info"       [Verbosity level]
H:                  "False"      [Print classifier-specific help message]
CreateMVAPdfs:      "False"      [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True"       [if True, write all weights as text files]
NeuronType:         "sigmoid"    [Neuron activation function type]
NeuronInputType:    "sum"        [Neuron input function type]
RandomSeed:         "1"          [Random Number Seed for TRandom3]
RandomFile:         "None"       [Random Number input file for TRandom3]
TrainingMethod:     "BP"         [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate:       "0.02"       [ANN learning rate parameter]
DecayRate:          "0.01"       [Decay rate for learning parameter]
TestRate:           "10"         [Test for overtraining performed at each #th epochs]
BPMode:             "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize:          "-1"         [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                 Ht                 'F' [60.671245575,377.676025391]
LepAPt             LepAPt             'F' [20.0074310303,126.850402832]
LepBPt             LepBPt             'F' [10.0047864914,54.2998809814]
MetSigLeptonsJets  MetSigLeptonsJets  'F' [0.0460740737617,12.663476944]
MetSpec            MetSpec            'F' [0.000646304921247,155.952484131]
SumEtLeptonsJets   SumEtLeptonsJets   'F' [30.990530014,226.939117432]
VSumJetLeptonsPt   VSumJetLeptonsPt   'F' [0.726633906364,156.612548828]
addEt              addEt              'F' [33.6726493835,237.179931641]
dPhiLepSumMet      dPhiLepSumMet      'F' [0.0043980749324,3.14124560356]
dPhiLeptons        dPhiLeptons        'F' [0.000980377197266,1.03099870682]
dRLeptons          dRLeptons          'F' [0.200389400125,1.03571224213]
lep1_E             lep1_E             'F' [20.6455631256,303.092254639]
lep2_E             lep2_E             'F' [10.4138088226,131.389846802]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 13 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets",
                                  "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt",
                                  "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons",
                                  "lep1_E", "lep2_E" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str()
                      << " != " << inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 60.6712455749512;
      fVmax[0] = 377.676025390625;
      fVmin[1] = 20.0074310302734;
      fVmax[1] = 126.850402832031;
      fVmin[2] = 10.004786491394;
      fVmax[2] = 54.2998809814453;
      fVmin[3] = 0.0460740737617016;
      fVmax[3] = 12.6634769439697;
      fVmin[4] = 0.000646304921247065;
      fVmax[4] = 155.952484130859;
      fVmin[5] = 30.9905300140381;
      fVmax[5] = 226.939117431641;
      fVmin[6] = 0.726633906364441;
      fVmax[6] = 156.612548828125;
      fVmin[7] = 33.6726493835449;
      fVmax[7] = 237.179931640625;
      fVmin[8] = 0.00439807493239641;
      fVmax[8] = 3.1412456035614;
      fVmin[9] = 0.000980377197265625;
      fVmax[9] = 1.03099870681763;
      fVmin[10] = 0.20038940012455;
      fVmax[10] = 1.03571224212646;
      fVmin[11] = 20.6455631256104;
      fVmax[11] = 303.092254638672;
      fVmin[12] = 10.4138088226318;
      fVmax[12] = 131.389846801758;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';
      fType[11] = 'F';
      fType[12] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:
   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }
   double fVmin[13];
   double fVmax[13];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[13];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.286522366294792;
   fWeightMatrix0to1[1][0] = 1.88725540204571;
   fWeightMatrix0to1[2][0] = 1.03458813914293;
   fWeightMatrix0to1[3][0] = 1.75575092053297;
   fWeightMatrix0to1[4][0] = -1.52277940135079;
   fWeightMatrix0to1[5][0] = -1.73979170249005;
   fWeightMatrix0to1[6][0] = -1.09826026932613;
   fWeightMatrix0to1[7][0] = 2.63166964278042;
   fWeightMatrix0to1[8][0] = -1.21109954996247;
   fWeightMatrix0to1[9][0] = -0.704233819793322;
   fWeightMatrix0to1[10][0] = -1.64666351356841;
   fWeightMatrix0to1[11][0] = -0.573133919792165;
   fWeightMatrix0to1[12][0] = -1.61643756405462;
   fWeightMatrix0to1[13][0] = -0.115768814845877;
   fWeightMatrix0to1[0][1] = -0.613608908065789;
   fWeightMatrix0to1[1][1] = 0.526927012703438;
   fWeightMatrix0to1[2][1] = -0.400349841832598;
   fWeightMatrix0to1[3][1] = 1.72751933178592;
   fWeightMatrix0to1[4][1] = 0.254015679101072;
   fWeightMatrix0to1[5][1] = 1.3704863725679;
   fWeightMatrix0to1[6][1] = -0.526811735330286;
   fWeightMatrix0to1[7][1] = -0.695790289841607;
   fWeightMatrix0to1[8][1] = 0.406405689009652;
   fWeightMatrix0to1[9][1] = 0.411937772666198;
   fWeightMatrix0to1[10][1] = -0.799326757634946;
   fWeightMatrix0to1[11][1] = -0.439440582822004;
   fWeightMatrix0to1[12][1] = 1.49840539217291;
   fWeightMatrix0to1[13][1] = -1.41342051209975;
   fWeightMatrix0to1[0][2] = -1.92899290634003;
   fWeightMatrix0to1[1][2] = 0.240393970633132;
   fWeightMatrix0to1[2][2] = 0.749139054202634;
   fWeightMatrix0to1[3][2] = 1.64438951238075;
   fWeightMatrix0to1[4][2] = 0.011206174728277;
   fWeightMatrix0to1[5][2] = -0.558468364401088;
   fWeightMatrix0to1[6][2] = -0.345246690045118;
   fWeightMatrix0to1[7][2] = 0.493465756074663;
   fWeightMatrix0to1[8][2] = -1.43812202258306;
   fWeightMatrix0to1[9][2] = 2.16541736305644;
   fWeightMatrix0to1[10][2] = -1.14366484576979;
   fWeightMatrix0to1[11][2] = 0.887065256978124;
   fWeightMatrix0to1[12][2] = 0.599584593494266;
   fWeightMatrix0to1[13][2] = 2.02057006733763;
   fWeightMatrix0to1[0][3] = 2.07366961249606;
   fWeightMatrix0to1[1][3] = 1.43324929165644;
   fWeightMatrix0to1[2][3] = -0.607969072076741;
   fWeightMatrix0to1[3][3] = -1.6489886513003;
   fWeightMatrix0to1[4][3] = 0.667225918695308;
   fWeightMatrix0to1[5][3] = 0.0240166983352316;
   fWeightMatrix0to1[6][3] = 1.5609329760527;
   fWeightMatrix0to1[7][3] = 1.77965052595208;
   fWeightMatrix0to1[8][3] = 1.67960328186658;
   fWeightMatrix0to1[9][3] = 1.66796714891155;
   fWeightMatrix0to1[10][3] = -2.09064133510349;
   fWeightMatrix0to1[11][3] = 1.31198509004378;
   fWeightMatrix0to1[12][3] = -2.33553561756648;
   fWeightMatrix0to1[13][3] = -1.10093740020357;
   fWeightMatrix0to1[0][4] = -1.18912894270454;
   fWeightMatrix0to1[1][4] = -1.59622544544874;
   fWeightMatrix0to1[2][4] = 1.65937677816578;
   fWeightMatrix0to1[3][4] = 0.492791272182739;
   fWeightMatrix0to1[4][4] = -1.34838477236691;
   fWeightMatrix0to1[5][4] = 0.2835852255936;
   fWeightMatrix0to1[6][4] =
-0.396953108113276; fWeightMatrix0to1[7][4] = 0.926697831664822; fWeightMatrix0to1[8][4] = 1.96474029008502; fWeightMatrix0to1[9][4] = 0.169299452419955; fWeightMatrix0to1[10][4] = -0.014343403014478; fWeightMatrix0to1[11][4] = -1.29357229589032; fWeightMatrix0to1[12][4] = 0.520348498873308; fWeightMatrix0to1[13][4] = -0.85778223935176; fWeightMatrix0to1[0][5] = -0.790649697704057; fWeightMatrix0to1[1][5] = -1.62061455786194; fWeightMatrix0to1[2][5] = 0.855469218414659; fWeightMatrix0to1[3][5] = 1.19438326790571; fWeightMatrix0to1[4][5] = 1.86899764389085; fWeightMatrix0to1[5][5] = -0.660630583381346; fWeightMatrix0to1[6][5] = -2.50682741688847; fWeightMatrix0to1[7][5] = -1.12288459336878; fWeightMatrix0to1[8][5] = 1.16846689203211; fWeightMatrix0to1[9][5] = 0.724079930039047; fWeightMatrix0to1[10][5] = 2.07947234503915; fWeightMatrix0to1[11][5] = 0.468670398453757; fWeightMatrix0to1[12][5] = 0.453810944875461; fWeightMatrix0to1[13][5] = -0.156067922922021; fWeightMatrix0to1[0][6] = -0.730634664604929; fWeightMatrix0to1[1][6] = -0.253432549089898; fWeightMatrix0to1[2][6] = 1.10591512485936; fWeightMatrix0to1[3][6] = -1.0770943616184; fWeightMatrix0to1[4][6] = -1.28204379398945; fWeightMatrix0to1[5][6] = -1.88740928083461; fWeightMatrix0to1[6][6] = -1.09707870566463; fWeightMatrix0to1[7][6] = 0.393153435335319; fWeightMatrix0to1[8][6] = 1.87322191345686; fWeightMatrix0to1[9][6] = -1.28500510122909; fWeightMatrix0to1[10][6] = -0.706369860392116; fWeightMatrix0to1[11][6] = 1.52074921383537; fWeightMatrix0to1[12][6] = -1.58607325569131; fWeightMatrix0to1[13][6] = -1.2550221085557; fWeightMatrix0to1[0][7] = -1.3167250740687; fWeightMatrix0to1[1][7] = 0.0619706616198957; fWeightMatrix0to1[2][7] = -1.79570046090188; fWeightMatrix0to1[3][7] = -1.55032468208097; fWeightMatrix0to1[4][7] = 1.057270731562; fWeightMatrix0to1[5][7] = 1.38105569936535; fWeightMatrix0to1[6][7] = -2.21433561820731; fWeightMatrix0to1[7][7] = 2.58130674852154; fWeightMatrix0to1[8][7] = -0.74832120201518; fWeightMatrix0to1[9][7] = -0.385172859468216; fWeightMatrix0to1[10][7] = -0.175804479472569; fWeightMatrix0to1[11][7] = 0.220752343255371; fWeightMatrix0to1[12][7] = -3.03494704614933; fWeightMatrix0to1[13][7] = 1.01735389592462; fWeightMatrix0to1[0][8] = 0.38411314544751; fWeightMatrix0to1[1][8] = 0.604905671696411; fWeightMatrix0to1[2][8] = -1.54503645056559; fWeightMatrix0to1[3][8] = 1.84388630919833; fWeightMatrix0to1[4][8] = 0.109295026809923; fWeightMatrix0to1[5][8] = -0.986037863263279; fWeightMatrix0to1[6][8] = -0.870285069626333; fWeightMatrix0to1[7][8] = -0.607099042211552; fWeightMatrix0to1[8][8] = -1.40015174511671; fWeightMatrix0to1[9][8] = -0.0411784289785462; fWeightMatrix0to1[10][8] = -0.45276147877852; fWeightMatrix0to1[11][8] = 1.56719057690558; fWeightMatrix0to1[12][8] = -0.49784884005105; fWeightMatrix0to1[13][8] = 0.339560016215593; fWeightMatrix0to1[0][9] = -0.422080779186209; fWeightMatrix0to1[1][9] = 0.127739007074652; fWeightMatrix0to1[2][9] = -1.86930213101446; fWeightMatrix0to1[3][9] = 0.978850194284441; fWeightMatrix0to1[4][9] = 0.0866763827205046; fWeightMatrix0to1[5][9] = -1.97916039176449; fWeightMatrix0to1[6][9] = 1.03335346075438; fWeightMatrix0to1[7][9] = -1.20271874695533; fWeightMatrix0to1[8][9] = -0.0335454549367237; fWeightMatrix0to1[9][9] = 1.31716551360159; fWeightMatrix0to1[10][9] = 1.84578670876948; fWeightMatrix0to1[11][9] = -1.79595386496075; fWeightMatrix0to1[12][9] = 0.0868021654102777; fWeightMatrix0to1[13][9] = 1.50962912107103; fWeightMatrix0to1[0][10] = 1.51625426265888; 
fWeightMatrix0to1[1][10] = 0.416945755538514; fWeightMatrix0to1[2][10] = -1.34070688205791; fWeightMatrix0to1[3][10] = -0.26703483927344; fWeightMatrix0to1[4][10] = -2.02995477677108; fWeightMatrix0to1[5][10] = 1.36965819718447; fWeightMatrix0to1[6][10] = 0.565557550165653; fWeightMatrix0to1[7][10] = -0.381716186693492; fWeightMatrix0to1[8][10] = -0.776326101371904; fWeightMatrix0to1[9][10] = 0.667408788588328; fWeightMatrix0to1[10][10] = -1.29906306764684; fWeightMatrix0to1[11][10] = 0.747579022646064; fWeightMatrix0to1[12][10] = 0.288488535811337; fWeightMatrix0to1[13][10] = -0.404748222012029; fWeightMatrix0to1[0][11] = -0.708888181668677; fWeightMatrix0to1[1][11] = 0.431640120736054; fWeightMatrix0to1[2][11] = 1.09777415404333; fWeightMatrix0to1[3][11] = 0.60378718285837; fWeightMatrix0to1[4][11] = 0.595574283036833; fWeightMatrix0to1[5][11] = -1.40141909600458; fWeightMatrix0to1[6][11] = 0.650139537915781; fWeightMatrix0to1[7][11] = 0.646339982619605; fWeightMatrix0to1[8][11] = -0.20625352453703; fWeightMatrix0to1[9][11] = 0.723088760237859; fWeightMatrix0to1[10][11] = 1.56768123064046; fWeightMatrix0to1[11][11] = -0.970889689485199; fWeightMatrix0to1[12][11] = -0.401729659526059; fWeightMatrix0to1[13][11] = 0.124600417098888; fWeightMatrix0to1[0][12] = -0.993739414109334; fWeightMatrix0to1[1][12] = -0.0921472866060661; fWeightMatrix0to1[2][12] = 1.64907825038987; fWeightMatrix0to1[3][12] = 1.11801657612531; fWeightMatrix0to1[4][12] = -0.373854373709091; fWeightMatrix0to1[5][12] = 1.26650162677298; fWeightMatrix0to1[6][12] = 1.68797769957359; fWeightMatrix0to1[7][12] = -1.47913189270086; fWeightMatrix0to1[8][12] = 0.165797179010993; fWeightMatrix0to1[9][12] = 0.291229503248794; fWeightMatrix0to1[10][12] = 0.756435519750508; fWeightMatrix0to1[11][12] = -2.0063728048294; fWeightMatrix0to1[12][12] = -1.42883859780576; fWeightMatrix0to1[13][12] = 1.76999540964676; fWeightMatrix0to1[0][13] = 1.89171947260492; fWeightMatrix0to1[1][13] = -1.75205221023273; fWeightMatrix0to1[2][13] = -0.241699778819702; fWeightMatrix0to1[3][13] = -0.358472997708387; fWeightMatrix0to1[4][13] = 0.245164678029388; fWeightMatrix0to1[5][13] = 1.61359344678653; fWeightMatrix0to1[6][13] = -1.9259779321472; fWeightMatrix0to1[7][13] = 0.254868660770318; fWeightMatrix0to1[8][13] = -0.730802288590237; fWeightMatrix0to1[9][13] = 0.808077151063988; fWeightMatrix0to1[10][13] = 1.28671186237147; fWeightMatrix0to1[11][13] = 1.29973107673049; fWeightMatrix0to1[12][13] = -1.51757565793991; fWeightMatrix0to1[13][13] = 0.498085940867854; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -2.52486775467747; fWeightMatrix1to2[1][0] = 0.426023079236248; fWeightMatrix1to2[2][0] = 0.375835745452262; fWeightMatrix1to2[3][0] = -1.77456216890997; fWeightMatrix1to2[4][0] = -0.661518822127812; fWeightMatrix1to2[5][0] = -0.473444227443232; fWeightMatrix1to2[6][0] = -0.358497747643072; fWeightMatrix1to2[7][0] = -1.93294102532768; fWeightMatrix1to2[8][0] = 1.01382069632889; fWeightMatrix1to2[9][0] = 0.640479898739759; fWeightMatrix1to2[10][0] = -0.583640836132518; fWeightMatrix1to2[11][0] = 0.0615677396979849; fWeightMatrix1to2[12][0] = 1.56936575274021; fWeightMatrix1to2[0][1] = 1.02210227276727; fWeightMatrix1to2[1][1] = 0.314417290274737; fWeightMatrix1to2[2][1] = -0.176777152181447; fWeightMatrix1to2[3][1] = -2.00384278379447; fWeightMatrix1to2[4][1] = -0.921217799261972; fWeightMatrix1to2[5][1] = 1.63542906241445; fWeightMatrix1to2[6][1] = 1.31312600130518; fWeightMatrix1to2[7][1] = 0.754665346624678; fWeightMatrix1to2[8][1] = 
0.133575905676936; fWeightMatrix1to2[9][1] = 1.96819125948413; fWeightMatrix1to2[10][1] = -1.79196715017889; fWeightMatrix1to2[11][1] = -1.30545686783361; fWeightMatrix1to2[12][1] = -0.123463815692537; fWeightMatrix1to2[0][2] = -1.45984963228105; fWeightMatrix1to2[1][2] = 0.874845593518741; fWeightMatrix1to2[2][2] = 1.73699896571388; fWeightMatrix1to2[3][2] = -1.30621508154441; fWeightMatrix1to2[4][2] = 0.79846908421209; fWeightMatrix1to2[5][2] = -0.216151012204619; fWeightMatrix1to2[6][2] = -1.72377899999605; fWeightMatrix1to2[7][2] = -1.126964779818; fWeightMatrix1to2[8][2] = 0.973542744716368; fWeightMatrix1to2[9][2] = 0.0111356947820115; fWeightMatrix1to2[10][2] = 0.894454678606861; fWeightMatrix1to2[11][2] = -1.27016648004324; fWeightMatrix1to2[12][2] = 1.68178117093184; fWeightMatrix1to2[0][3] = -1.73117046524679; fWeightMatrix1to2[1][3] = 0.593280323033836; fWeightMatrix1to2[2][3] = 0.0553877535739254; fWeightMatrix1to2[3][3] = -1.57691508077604; fWeightMatrix1to2[4][3] = -1.31119383389925; fWeightMatrix1to2[5][3] = -1.88329218657836; fWeightMatrix1to2[6][3] = -0.38746645244843; fWeightMatrix1to2[7][3] = -1.91742720995605; fWeightMatrix1to2[8][3] = -1.65183417513039; fWeightMatrix1to2[9][3] = -1.79682885481369; fWeightMatrix1to2[10][3] = 0.180734733289192; fWeightMatrix1to2[11][3] = -1.01441065454723; fWeightMatrix1to2[12][3] = -1.59565368115017; fWeightMatrix1to2[0][4] = 1.01386449409726; fWeightMatrix1to2[1][4] = -0.668362387061251; fWeightMatrix1to2[2][4] = 0.284274213673215; fWeightMatrix1to2[3][4] = -1.30016764477459; fWeightMatrix1to2[4][4] = 0.241302854985123; fWeightMatrix1to2[5][4] = 1.08870046713619; fWeightMatrix1to2[6][4] = 0.877367135910187; fWeightMatrix1to2[7][4] = -0.891571683543565; fWeightMatrix1to2[8][4] = -1.95512711136616; fWeightMatrix1to2[9][4] = -1.71421038717968; fWeightMatrix1to2[10][4] = -0.780382431510122; fWeightMatrix1to2[11][4] = -1.61268228758049; fWeightMatrix1to2[12][4] = 0.292089994945369; fWeightMatrix1to2[0][5] = -0.100061696793301; fWeightMatrix1to2[1][5] = 1.19187262741015; fWeightMatrix1to2[2][5] = -0.425997276673979; fWeightMatrix1to2[3][5] = -0.10583887633885; fWeightMatrix1to2[4][5] = -0.497275437924329; fWeightMatrix1to2[5][5] = -2.6195397688325; fWeightMatrix1to2[6][5] = 0.918838041436973; fWeightMatrix1to2[7][5] = 1.28452583596286; fWeightMatrix1to2[8][5] = -1.30368555888358; fWeightMatrix1to2[9][5] = -0.997512530783628; fWeightMatrix1to2[10][5] = 1.03366624262064; fWeightMatrix1to2[11][5] = 1.61131400991339; fWeightMatrix1to2[12][5] = 0.380804155597417; fWeightMatrix1to2[0][6] = -1.00300520575886; fWeightMatrix1to2[1][6] = 1.38099991864204; fWeightMatrix1to2[2][6] = 1.52781349626197; fWeightMatrix1to2[3][6] = 0.772222085497882; fWeightMatrix1to2[4][6] = 1.03082286821288; fWeightMatrix1to2[5][6] = -1.03309541729085; fWeightMatrix1to2[6][6] = 0.311548353829738; fWeightMatrix1to2[7][6] = 0.20085436267719; fWeightMatrix1to2[8][6] = -1.58884403977022; fWeightMatrix1to2[9][6] = -0.0266183067314733; fWeightMatrix1to2[10][6] = -1.84119492465681; fWeightMatrix1to2[11][6] = -1.78591411757196; fWeightMatrix1to2[12][6] = -1.56627538199792; fWeightMatrix1to2[0][7] = 1.51691534090512; fWeightMatrix1to2[1][7] = -2.28752518516177; fWeightMatrix1to2[2][7] = 1.49159380104086; fWeightMatrix1to2[3][7] = -1.66591822730947; fWeightMatrix1to2[4][7] = 1.14901639311763; fWeightMatrix1to2[5][7] = -1.44716064793835; fWeightMatrix1to2[6][7] = 0.309608063862466; fWeightMatrix1to2[7][7] = 0.611297113870058; fWeightMatrix1to2[8][7] = -2.00188274484222; 
fWeightMatrix1to2[9][7] = -0.382677771349337; fWeightMatrix1to2[10][7] = -0.0825744997066915; fWeightMatrix1to2[11][7] = -2.46761443422454; fWeightMatrix1to2[12][7] = -1.54704289918988; fWeightMatrix1to2[0][8] = -1.75541704897924; fWeightMatrix1to2[1][8] = -0.828966478089693; fWeightMatrix1to2[2][8] = 1.82752266030993; fWeightMatrix1to2[3][8] = -0.580890324175979; fWeightMatrix1to2[4][8] = 0.288025868206231; fWeightMatrix1to2[5][8] = -0.973203843552008; fWeightMatrix1to2[6][8] = -1.18548651965347; fWeightMatrix1to2[7][8] = -1.77763712189069; fWeightMatrix1to2[8][8] = -0.993590809121902; fWeightMatrix1to2[9][8] = -1.48215403705191; fWeightMatrix1to2[10][8] = 0.846443651112434; fWeightMatrix1to2[11][8] = -1.4787078089688; fWeightMatrix1to2[12][8] = -1.24044556286773; fWeightMatrix1to2[0][9] = 1.71478779180549; fWeightMatrix1to2[1][9] = -1.0249716873773; fWeightMatrix1to2[2][9] = -0.759101428220276; fWeightMatrix1to2[3][9] = 1.49851994121841; fWeightMatrix1to2[4][9] = -1.78933514433764; fWeightMatrix1to2[5][9] = 1.23764756029631; fWeightMatrix1to2[6][9] = 0.791010793733435; fWeightMatrix1to2[7][9] = -1.01113523290261; fWeightMatrix1to2[8][9] = -1.08637529971915; fWeightMatrix1to2[9][9] = 0.34111539387551; fWeightMatrix1to2[10][9] = -1.21546403662599; fWeightMatrix1to2[11][9] = 0.766999516222317; fWeightMatrix1to2[12][9] = 1.88737133279635; fWeightMatrix1to2[0][10] = 1.03674014589697; fWeightMatrix1to2[1][10] = 0.921713823937121; fWeightMatrix1to2[2][10] = -1.23451398683451; fWeightMatrix1to2[3][10] = 0.760228158551058; fWeightMatrix1to2[4][10] = -1.92372826911368; fWeightMatrix1to2[5][10] = -0.00656916107015855; fWeightMatrix1to2[6][10] = -2.21151002596723; fWeightMatrix1to2[7][10] = -1.82257933906768; fWeightMatrix1to2[8][10] = -0.511939419462813; fWeightMatrix1to2[9][10] = 0.851120649994243; fWeightMatrix1to2[10][10] = 0.932621393557268; fWeightMatrix1to2[11][10] = 1.7176668472621; fWeightMatrix1to2[12][10] = 0.278373599107968; fWeightMatrix1to2[0][11] = -0.594202427224245; fWeightMatrix1to2[1][11] = -1.49272761006507; fWeightMatrix1to2[2][11] = 0.0559557722792408; fWeightMatrix1to2[3][11] = 1.51670652417111; fWeightMatrix1to2[4][11] = -1.5346941478812; fWeightMatrix1to2[5][11] = -0.383888039350927; fWeightMatrix1to2[6][11] = -1.70669077860042; fWeightMatrix1to2[7][11] = -0.52218425763458; fWeightMatrix1to2[8][11] = -0.634762177408585; fWeightMatrix1to2[9][11] = 0.373164889588931; fWeightMatrix1to2[10][11] = -0.898238224928875; fWeightMatrix1to2[11][11] = 1.27224488175385; fWeightMatrix1to2[12][11] = 1.60847303175671; fWeightMatrix1to2[0][12] = 0.392735755904236; fWeightMatrix1to2[1][12] = 0.246609683863294; fWeightMatrix1to2[2][12] = -0.729964687509518; fWeightMatrix1to2[3][12] = -1.51260874264665; fWeightMatrix1to2[4][12] = -1.73644016582005; fWeightMatrix1to2[5][12] = 0.60093337461827; fWeightMatrix1to2[6][12] = -1.13430767004406; fWeightMatrix1to2[7][12] = -1.11589859463008; fWeightMatrix1to2[8][12] = 0.206925835894317; fWeightMatrix1to2[9][12] = -2.34898478621622; fWeightMatrix1to2[10][12] = -0.740053475803545; fWeightMatrix1to2[11][12] = -1.12974612690341; fWeightMatrix1to2[12][12] = 1.02583074207755; fWeightMatrix1to2[0][13] = -1.4666883203598; fWeightMatrix1to2[1][13] = -2.67206430583463; fWeightMatrix1to2[2][13] = -0.715950838239593; fWeightMatrix1to2[3][13] = -1.25528439907061; fWeightMatrix1to2[4][13] = -1.60922983439281; fWeightMatrix1to2[5][13] = 1.01122439663145; fWeightMatrix1to2[6][13] = -1.55168282767227; fWeightMatrix1to2[7][13] = -1.18064021210667; 
   fWeightMatrix1to2[8][13] = 1.60622871538847;
   fWeightMatrix1to2[9][13] = 0.657952732331324;
   fWeightMatrix1to2[10][13] = 0.0833754831915906;
   fWeightMatrix1to2[11][13] = 0.0936652611602441;
   fWeightMatrix1to2[12][13] = 1.33604312773054;
   fWeightMatrix1to2[0][14] = 1.18882464979629;
   fWeightMatrix1to2[1][14] = -0.180217297472534;
   fWeightMatrix1to2[2][14] = -1.02101941836127;
   fWeightMatrix1to2[3][14] = -1.10308990768765;
   fWeightMatrix1to2[4][14] = -1.70761412064404;
   fWeightMatrix1to2[5][14] = 0.121948828218104;
   fWeightMatrix1to2[6][14] = 0.457992657476338;
   fWeightMatrix1to2[7][14] = 0.0521548684432605;
   fWeightMatrix1to2[8][14] = 0.542216776269295;
   fWeightMatrix1to2[9][14] = 0.673263174976122;
   fWeightMatrix1to2[10][14] = 1.59228419160967;
   fWeightMatrix1to2[11][14] = -1.13175892643266;
   fWeightMatrix1to2[12][14] = 1.66539018913589;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = 0.544940564843099;
   fWeightMatrix2to3[0][1] = -1.19748404385017;
   fWeightMatrix2to3[0][2] = 0.16074340814641;
   fWeightMatrix2to3[0][3] = -0.0625621939431144;
   fWeightMatrix2to3[0][4] = 0.946037474986381;
   fWeightMatrix2to3[0][5] = 0.114450836808419;
   fWeightMatrix2to3[0][6] = 0.570636937911124;
   fWeightMatrix2to3[0][7] = 1.75624472565978;
   fWeightMatrix2to3[0][8] = -0.214016970807011;
   fWeightMatrix2to3[0][9] = 0.71865079744677;
   fWeightMatrix2to3[0][10] = 0.871355048984397;
   fWeightMatrix2to3[0][11] = 1.47036574482029;
   fWeightMatrix2to3[0][12] = -1.62485065280112;
   fWeightMatrix2to3[0][13] = 1.22611938685994;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }
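   // NOTE (editor): the generated source is truncated here, in the middle of the
   // forward-pass loops of GetMvaValue__.  What follows is a hedged reconstruction,
   // not the original generated text: a sketch of the standard forward pass that
   // TMVA 3.8.x MakeClass emits for an MLP with the layer structure declared above
   // (fLayers, fLayerSize, fWeightMatrix0to1/1to2/2to3, sigmoid hidden activation,
   // linear output node), plus plausible ActivationFnc() and Clear() definitions.
   // Verify against the original weight file before relying on it.

   // clear the neuron buffers and set the bias nodes of each non-output layer to 1
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i] = 0;

   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1] = 1;

   // copy the (normalised) input values into layer 0
   for (int i=0; i<fLayerSize[0]-1; i++)
      fWeights[0][i] = inputValues[i];

   // layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++) {
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      }
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++) {
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      }
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // layer 2 to 3 (single linear output node)
   for (int o=0; o<fLayerSize[3]; o++) {
      for (int i=0; i<fLayerSize[2]; i++) {
         fWeights[3][o] += fWeightMatrix2to3[o][i] * fWeights[2][i];
      }
   }

   return fWeights[3][0];
}

// sigmoid neuron activation (NeuronType: "sigmoid" in the header above)
inline double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   return 1.0/(1.0 + std::exp(-x));
}

// method-specific cleanup: release the layer buffers allocated in Initialize()
inline void ReadH6AONN5MEMLP::Clear()
{
   for (int l=0; l<fLayers; l++) {
      delete[] fWeights[l];
      fWeights[l] = 0;
   }
}

// Usage sketch (editor addition, not part of the generated file): shows how this
// reader class is typically driven.  The macro guard name and the all-zero event
// values are hypothetical placeholders.
#ifdef READH6AONN5MEMLP_EXAMPLE_MAIN
int main()
{
   // variable names must match the training variables, in the same order
   const char* v[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets", "MetSpec",
                       "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt",
                       "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" };
   std::vector<std::string> names;
   for (int i = 0; i < 13; i++) names.push_back(v[i]);

   ReadH6AONN5MEMLP reader(names);

   std::vector<double> event(13, 0.0);   // fill with one event's variable values
   double mva = reader.GetMvaValue(event);
   std::cout << "MLP response: " << mva << std::endl;
   return 0;
}
#endif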