// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method          : MLP::H6AONN5MEMLP
TMVA Release    : 3.8.6         [198662]
ROOT Release    : 5.26/00       [334336]
Creator         : stdenis
Date            : Sun Jan 1 10:52:20 2012
Host            : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir             : /data/cdf04/stdenis/batch/run20572/job409
Training events : 52820

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
RandomSeed: "1" [Random Number Seed for TRandom3]
RandomFile: "None" [Random Number input file for TRandom3]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                  Ht                  'F'    [48.6484184265,888.225952148]
LepAPt              LepAPt              'F'    [20.0000362396,186.522079468]
LepBPt              LepBPt              'F'    [10.0008134842,74.1102600098]
MetSigLeptonsJets   MetSigLeptonsJets   'F'    [1.28498280048,23.5595493317]
MetSpec             MetSpec             'F'    [15.0020647049,207.062454224]
SumEtLeptonsJets    SumEtLeptonsJets    'F'    [30.2123203278,532.093322754]
VSumJetLeptonsPt    VSumJetLeptonsPt    'F'    [1.91322863102,348.466033936]
addEt               addEt               'F'    [48.6484184265,450.732757568]
dPhiLepSumMet       dPhiLepSumMet       'F'    [0.00602381536737,3.14159226418]
dPhiLeptons         dPhiLeptons         'F'    [5.72204589844e-06,1.13010489941]
dRLeptons           dRLeptons           'F'    [0.200046181679,1.15372478962]
lep1_E              lep1_E              'F'    [20.0376224518,203.165985107]
lep2_E              lep2_E              'F'    [10.0093269348,139.444885254]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 13 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "Ht", "LepAPt",
"LepBPt", "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 48.6484184265137; fVmax[0] = 888.225952148438; fVmin[1] = 20.000036239624; fVmax[1] = 186.522079467773; fVmin[2] = 10.0008134841919; fVmax[2] = 74.1102600097656; fVmin[3] = 1.2849828004837; fVmax[3] = 23.559549331665; fVmin[4] = 15.002064704895; fVmax[4] = 207.062454223633; fVmin[5] = 30.2123203277588; fVmax[5] = 532.093322753906; fVmin[6] = 1.91322863101959; fVmax[6] = 348.466033935547; fVmin[7] = 48.6484184265137; fVmax[7] = 450.732757568359; fVmin[8] = 0.00602381536737084; fVmax[8] = 3.14159226417542; fVmin[9] = 5.7220458984375e-06; fVmax[9] = 1.13010489940643; fVmin[10] = 0.200046181678772; fVmax[10] = 1.15372478961945; fVmin[11] = 20.0376224517822; fVmax[11] = 203.165985107422; fVmin[12] = 10.0093269348145; fVmax[12] = 139.444885253906; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void Initialize(); 
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.318587829187436;
   fWeightMatrix0to1[1][0] = 2.24608171861358;
   fWeightMatrix0to1[2][0] = 0.801365715311199;
   fWeightMatrix0to1[3][0] = 2.14056766931916;
   fWeightMatrix0to1[4][0] = -2.41649202388;
   fWeightMatrix0to1[5][0] = -1.21236763034993;
   fWeightMatrix0to1[6][0] = -1.35605402549028;
   fWeightMatrix0to1[7][0] = 1.0963172468029;
   fWeightMatrix0to1[8][0] = -1.76994363694538;
   fWeightMatrix0to1[9][0] = -0.845219570368531;
   fWeightMatrix0to1[10][0] = -0.911185710210365;
   fWeightMatrix0to1[11][0] = -0.193222627129629;
   fWeightMatrix0to1[12][0] = -0.332730078375281;
   fWeightMatrix0to1[13][0] = -0.552787834720972;
   fWeightMatrix0to1[0][1] = -0.663074607209087;
   fWeightMatrix0to1[1][1] = 0.885151449015055;
   fWeightMatrix0to1[2][1] = -0.579015587958013;
   fWeightMatrix0to1[3][1] = 3.65392000822174;
   fWeightMatrix0to1[4][1] = -0.369922613254455;
   fWeightMatrix0to1[5][1] = 3.10026525690683;
   fWeightMatrix0to1[6][1] = -1.22863979591801;
   fWeightMatrix0to1[7][1] = 3.99470620842503;
   fWeightMatrix0to1[8][1] = 0.613476934061556;
   fWeightMatrix0to1[9][1] = 0.817300427647039;
   fWeightMatrix0to1[10][1] = 0.700489223138961;
   fWeightMatrix0to1[11][1] = 0.0466911191823546;
   fWeightMatrix0to1[12][1] = 3.73694891910802;
   fWeightMatrix0to1[13][1] = -1.97656817626367;
   fWeightMatrix0to1[0][2] = -1.64608165272065;
   fWeightMatrix0to1[1][2] = 0.665236157665833;
   fWeightMatrix0to1[2][2] = 1.2111990808854;
   fWeightMatrix0to1[3][2] = 2.20064885664614;
   fWeightMatrix0to1[4][2] = -0.522864820753782;
   fWeightMatrix0to1[5][2] = 1.1258137198711;
   fWeightMatrix0to1[6][2] = -0.849541338162737;
   fWeightMatrix0to1[7][2] = 3.31726855727733;
   fWeightMatrix0to1[8][2] = -1.9768915230344;
   fWeightMatrix0to1[9][2] = 4.39187955631914;
   fWeightMatrix0to1[10][2] = -3.41859827093214;
   fWeightMatrix0to1[11][2] = 0.497356193489005;
   fWeightMatrix0to1[12][2] = 0.13781340809089;
   fWeightMatrix0to1[13][2] = -0.949607985825751;
   fWeightMatrix0to1[0][3] = 1.85088115881687;
   fWeightMatrix0to1[1][3] = 1.32628743592119;
   fWeightMatrix0to1[2][3] = -1.84618500320577;
   fWeightMatrix0to1[3][3] = -3.22266329804571;
   fWeightMatrix0to1[4][3] = 0.495146309910934;
   fWeightMatrix0to1[5][3] = -1.54319414415649;
   fWeightMatrix0to1[6][3] = 1.85382947657235;
   fWeightMatrix0to1[7][3] = -0.471064206014345;
   fWeightMatrix0to1[8][3] = 1.83277153608121;
   fWeightMatrix0to1[9][3] = 0.619118821802352;
   fWeightMatrix0to1[10][3] = -0.923300759288939;
   fWeightMatrix0to1[11][3] = 2.57220222413991;
   fWeightMatrix0to1[12][3] = -1.69329200886949;
   fWeightMatrix0to1[13][3] = 1.1521713877954;
   fWeightMatrix0to1[0][4] = -1.16251195489149;
   fWeightMatrix0to1[1][4] = -1.50239674601359;
   fWeightMatrix0to1[2][4] = 1.37004423156711;
   fWeightMatrix0to1[3][4] = -0.396868264410573;
   fWeightMatrix0to1[4][4] = -1.7424736362304;
   fWeightMatrix0to1[5][4] = 0.34316230400857;
   fWeightMatrix0to1[6][4] =
-0.603660809324365; fWeightMatrix0to1[7][4] = 5.91336738101187; fWeightMatrix0to1[8][4] = 1.93364891993094; fWeightMatrix0to1[9][4] = -0.0891837993852124; fWeightMatrix0to1[10][4] = 0.474052139079397; fWeightMatrix0to1[11][4] = -0.592136051160592; fWeightMatrix0to1[12][4] = 1.50193853802965; fWeightMatrix0to1[13][4] = -0.999528642080342; fWeightMatrix0to1[0][5] = -0.709935948492873; fWeightMatrix0to1[1][5] = -1.15793412483071; fWeightMatrix0to1[2][5] = 0.99374357783625; fWeightMatrix0to1[3][5] = 2.09675991989141; fWeightMatrix0to1[4][5] = 0.911469532813814; fWeightMatrix0to1[5][5] = 0.318333531577822; fWeightMatrix0to1[6][5] = -2.60964545903953; fWeightMatrix0to1[7][5] = -2.46856958148899; fWeightMatrix0to1[8][5] = 0.531927806710962; fWeightMatrix0to1[9][5] = 0.979057999362726; fWeightMatrix0to1[10][5] = 2.6349120077548; fWeightMatrix0to1[11][5] = 0.611196443975042; fWeightMatrix0to1[12][5] = 2.01098838422679; fWeightMatrix0to1[13][5] = -1.39514635667974; fWeightMatrix0to1[0][6] = -0.881323151033509; fWeightMatrix0to1[1][6] = 0.0411818721606623; fWeightMatrix0to1[2][6] = 0.716119099050555; fWeightMatrix0to1[3][6] = -1.56819896322444; fWeightMatrix0to1[4][6] = -1.91535455553707; fWeightMatrix0to1[5][6] = -1.28296716765534; fWeightMatrix0to1[6][6] = -0.96445880007364; fWeightMatrix0to1[7][6] = 2.30463580695632; fWeightMatrix0to1[8][6] = 1.49819162043761; fWeightMatrix0to1[9][6] = -1.77523281530309; fWeightMatrix0to1[10][6] = -0.671042384396747; fWeightMatrix0to1[11][6] = 1.53714667306957; fWeightMatrix0to1[12][6] = -0.208949631098885; fWeightMatrix0to1[13][6] = -1.75484870867227; fWeightMatrix0to1[0][7] = -1.36377578146893; fWeightMatrix0to1[1][7] = 0.43188281998914; fWeightMatrix0to1[2][7] = -2.5008475882089; fWeightMatrix0to1[3][7] = 0.169478543006714; fWeightMatrix0to1[4][7] = 0.452643356581506; fWeightMatrix0to1[5][7] = 3.77243240999095; fWeightMatrix0to1[6][7] = -3.03783676079652; fWeightMatrix0to1[7][7] = 9.72441811710481; fWeightMatrix0to1[8][7] = -0.750669631787249; fWeightMatrix0to1[9][7] = 1.0622260673994; fWeightMatrix0to1[10][7] = 0.160593158740319; fWeightMatrix0to1[11][7] = 0.377334937927205; fWeightMatrix0to1[12][7] = -0.608847629767673; fWeightMatrix0to1[13][7] = -1.37296981923402; fWeightMatrix0to1[0][8] = 0.325452267491157; fWeightMatrix0to1[1][8] = 0.253068919003431; fWeightMatrix0to1[2][8] = -2.10047359541101; fWeightMatrix0to1[3][8] = 3.41283801680661; fWeightMatrix0to1[4][8] = 0.686424443108555; fWeightMatrix0to1[5][8] = -3.14815879950033; fWeightMatrix0to1[6][8] = 1.39313448059811; fWeightMatrix0to1[7][8] = 1.04079257207933; fWeightMatrix0to1[8][8] = -3.58756384389903; fWeightMatrix0to1[9][8] = -0.30394660321237; fWeightMatrix0to1[10][8] = -1.0714410067985; fWeightMatrix0to1[11][8] = 2.04577639204671; fWeightMatrix0to1[12][8] = -0.305105792988646; fWeightMatrix0to1[13][8] = 1.32074026843218; fWeightMatrix0to1[0][9] = -0.566132010886408; fWeightMatrix0to1[1][9] = 0.0422001059183611; fWeightMatrix0to1[2][9] = -2.63475499173797; fWeightMatrix0to1[3][9] = 0.190399887637359; fWeightMatrix0to1[4][9] = -0.113722142418874; fWeightMatrix0to1[5][9] = -1.10820010727696; fWeightMatrix0to1[6][9] = -1.23678806936781; fWeightMatrix0to1[7][9] = -0.491303188701959; fWeightMatrix0to1[8][9] = -0.166443635325742; fWeightMatrix0to1[9][9] = 1.49153003436028; fWeightMatrix0to1[10][9] = 0.952785448860988; fWeightMatrix0to1[11][9] = -0.459315077484854; fWeightMatrix0to1[12][9] = -1.17153280821594; fWeightMatrix0to1[13][9] = 1.31345123643575; fWeightMatrix0to1[0][10] = 1.29160504362927; 
fWeightMatrix0to1[1][10] = 0.047014091103529; fWeightMatrix0to1[2][10] = -2.82424363591553; fWeightMatrix0to1[3][10] = 0.727275661772358; fWeightMatrix0to1[4][10] = -1.86044597021555; fWeightMatrix0to1[5][10] = 0.388106291628933; fWeightMatrix0to1[6][10] = 0.0373258619001816; fWeightMatrix0to1[7][10] = 0.0434477553757131; fWeightMatrix0to1[8][10] = -1.52530327737371; fWeightMatrix0to1[9][10] = -0.228976891325984; fWeightMatrix0to1[10][10] = -0.667330780082385; fWeightMatrix0to1[11][10] = 1.67401407895771; fWeightMatrix0to1[12][10] = 0.283776925421255; fWeightMatrix0to1[13][10] = -0.737754078074476; fWeightMatrix0to1[0][11] = -0.660865195714469; fWeightMatrix0to1[1][11] = 0.363967978420443; fWeightMatrix0to1[2][11] = 0.0290781914251291; fWeightMatrix0to1[3][11] = 1.0183702974136; fWeightMatrix0to1[4][11] = 0.270632865277266; fWeightMatrix0to1[5][11] = -1.9726918002536; fWeightMatrix0to1[6][11] = 4.07835069436509; fWeightMatrix0to1[7][11] = -0.0953256229753874; fWeightMatrix0to1[8][11] = -0.706270794458552; fWeightMatrix0to1[9][11] = -0.558022410514442; fWeightMatrix0to1[10][11] = 2.79236478073397; fWeightMatrix0to1[11][11] = -0.498526406060181; fWeightMatrix0to1[12][11] = -0.0393552325477118; fWeightMatrix0to1[13][11] = 1.58559598571047; fWeightMatrix0to1[0][12] = -0.906603246260435; fWeightMatrix0to1[1][12] = -0.302131386008793; fWeightMatrix0to1[2][12] = 1.15990682477148; fWeightMatrix0to1[3][12] = 0.539952692790772; fWeightMatrix0to1[4][12] = -0.747704500457795; fWeightMatrix0to1[5][12] = 0.705627356445346; fWeightMatrix0to1[6][12] = 3.48331780560342; fWeightMatrix0to1[7][12] = -0.209726289024601; fWeightMatrix0to1[8][12] = -0.270073771590601; fWeightMatrix0to1[9][12] = 0.871450629672878; fWeightMatrix0to1[10][12] = -0.426277059508375; fWeightMatrix0to1[11][12] = -1.60902636038153; fWeightMatrix0to1[12][12] = -3.19538077671; fWeightMatrix0to1[13][12] = 1.95155199442994; fWeightMatrix0to1[0][13] = 2.04782206705652; fWeightMatrix0to1[1][13] = -1.82296342640204; fWeightMatrix0to1[2][13] = -0.485988133627702; fWeightMatrix0to1[3][13] = 2.5788335732292; fWeightMatrix0to1[4][13] = 1.09684173743826; fWeightMatrix0to1[5][13] = 5.27711948390852; fWeightMatrix0to1[6][13] = -2.81214687034107; fWeightMatrix0to1[7][13] = 15.6908786136547; fWeightMatrix0to1[8][13] = 0.126953777866336; fWeightMatrix0to1[9][13] = 4.23284510066892; fWeightMatrix0to1[10][13] = -0.537399539153471; fWeightMatrix0to1[11][13] = -0.0640349203542944; fWeightMatrix0to1[12][13] = -0.258931077278313; fWeightMatrix0to1[13][13] = -4.41759093529837; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -3.04221774656392; fWeightMatrix1to2[1][0] = -0.0690466512172958; fWeightMatrix1to2[2][0] = -0.566573594543459; fWeightMatrix1to2[3][0] = -1.95178234267826; fWeightMatrix1to2[4][0] = -1.02540225363023; fWeightMatrix1to2[5][0] = -1.3834802614511; fWeightMatrix1to2[6][0] = -0.824647671482816; fWeightMatrix1to2[7][0] = -1.8482766602634; fWeightMatrix1to2[8][0] = 1.34549859652613; fWeightMatrix1to2[9][0] = -0.320692287716411; fWeightMatrix1to2[10][0] = -0.87828254194366; fWeightMatrix1to2[11][0] = -0.207845533026066; fWeightMatrix1to2[12][0] = -0.384057926627033; fWeightMatrix1to2[0][1] = 1.05858432166294; fWeightMatrix1to2[1][1] = 0.345864852367204; fWeightMatrix1to2[2][1] = -0.160427512770581; fWeightMatrix1to2[3][1] = -2.03387713870974; fWeightMatrix1to2[4][1] = -0.929721724857499; fWeightMatrix1to2[5][1] = 1.64197498090561; fWeightMatrix1to2[6][1] = 1.32661054732398; fWeightMatrix1to2[7][1] = 0.763114626462325; 
fWeightMatrix1to2[8][1] = -0.109853121207158; fWeightMatrix1to2[9][1] = 1.95498530881289; fWeightMatrix1to2[10][1] = -1.69394591107885; fWeightMatrix1to2[11][1] = -1.31144192096933; fWeightMatrix1to2[12][1] = -0.261977158437405; fWeightMatrix1to2[0][2] = -1.52321515380432; fWeightMatrix1to2[1][2] = 0.953794723316451; fWeightMatrix1to2[2][2] = 0.861236200630296; fWeightMatrix1to2[3][2] = -1.59465492220381; fWeightMatrix1to2[4][2] = 0.624123021608635; fWeightMatrix1to2[5][2] = -0.300194852987334; fWeightMatrix1to2[6][2] = -1.17646898178916; fWeightMatrix1to2[7][2] = -1.20976617051151; fWeightMatrix1to2[8][2] = 2.30743780700564; fWeightMatrix1to2[9][2] = -0.0178585122434179; fWeightMatrix1to2[10][2] = 0.478807414566625; fWeightMatrix1to2[11][2] = -1.15554365612749; fWeightMatrix1to2[12][2] = 2.04452652790247; fWeightMatrix1to2[0][3] = -1.68777929635584; fWeightMatrix1to2[1][3] = -0.164528169067619; fWeightMatrix1to2[2][3] = 0.19909326832523; fWeightMatrix1to2[3][3] = -1.88926013012331; fWeightMatrix1to2[4][3] = -1.22428604001335; fWeightMatrix1to2[5][3] = -2.1178714291233; fWeightMatrix1to2[6][3] = 0.556126529016562; fWeightMatrix1to2[7][3] = -1.80308155885222; fWeightMatrix1to2[8][3] = -4.31845182828314; fWeightMatrix1to2[9][3] = -1.55041864186227; fWeightMatrix1to2[10][3] = 0.574117882420677; fWeightMatrix1to2[11][3] = -1.02367505383532; fWeightMatrix1to2[12][3] = -2.16783151116509; fWeightMatrix1to2[0][4] = 0.262357220130376; fWeightMatrix1to2[1][4] = -1.8385743158013; fWeightMatrix1to2[2][4] = -1.0131523955637; fWeightMatrix1to2[3][4] = -1.54376098772157; fWeightMatrix1to2[4][4] = -0.121139623043545; fWeightMatrix1to2[5][4] = 0.214435853001023; fWeightMatrix1to2[6][4] = 0.153194407340387; fWeightMatrix1to2[7][4] = -0.816171964892507; fWeightMatrix1to2[8][4] = -1.56213355166625; fWeightMatrix1to2[9][4] = -2.26932091515546; fWeightMatrix1to2[10][4] = -1.3642289082282; fWeightMatrix1to2[11][4] = -1.73315900974921; fWeightMatrix1to2[12][4] = -1.61791654365105; fWeightMatrix1to2[0][5] = -0.443538917930014; fWeightMatrix1to2[1][5] = 2.48397829683834; fWeightMatrix1to2[2][5] = -0.657787201241627; fWeightMatrix1to2[3][5] = -0.120906108935621; fWeightMatrix1to2[4][5] = -0.498688812470758; fWeightMatrix1to2[5][5] = -3.00345217161517; fWeightMatrix1to2[6][5] = 0.704612985963917; fWeightMatrix1to2[7][5] = 1.30435386434813; fWeightMatrix1to2[8][5] = -3.66628270760692; fWeightMatrix1to2[9][5] = -1.3841552448053; fWeightMatrix1to2[10][5] = 0.239581899158921; fWeightMatrix1to2[11][5] = 1.18447028632621; fWeightMatrix1to2[12][5] = 0.0707456844924219; fWeightMatrix1to2[0][6] = -1.61895440106382; fWeightMatrix1to2[1][6] = 0.34586602560198; fWeightMatrix1to2[2][6] = 0.972614206531274; fWeightMatrix1to2[3][6] = 1.69012545563115; fWeightMatrix1to2[4][6] = 0.65297211724043; fWeightMatrix1to2[5][6] = -0.88135948856485; fWeightMatrix1to2[6][6] = -0.175260408560798; fWeightMatrix1to2[7][6] = -0.329174076666259; fWeightMatrix1to2[8][6] = 3.76241546319364; fWeightMatrix1to2[9][6] = -1.74996443002634; fWeightMatrix1to2[10][6] = -1.87978656257326; fWeightMatrix1to2[11][6] = -1.52327484810712; fWeightMatrix1to2[12][6] = -1.37402986344577; fWeightMatrix1to2[0][7] = 1.60831673416134; fWeightMatrix1to2[1][7] = -3.84533395738249; fWeightMatrix1to2[2][7] = 1.54790912922056; fWeightMatrix1to2[3][7] = -3.23910313215837; fWeightMatrix1to2[4][7] = 1.05553319412942; fWeightMatrix1to2[5][7] = -1.72768665724334; fWeightMatrix1to2[6][7] = 1.68967540424523; fWeightMatrix1to2[7][7] = 1.65572605814363; fWeightMatrix1to2[8][7] = 
-8.14567094785881; fWeightMatrix1to2[9][7] = 0.658597089941259; fWeightMatrix1to2[10][7] = -0.592855532926148; fWeightMatrix1to2[11][7] = -1.8432207481652; fWeightMatrix1to2[12][7] = -3.45757580488504; fWeightMatrix1to2[0][8] = -1.76896144680834; fWeightMatrix1to2[1][8] = -2.16209341546042; fWeightMatrix1to2[2][8] = 1.49133055867585; fWeightMatrix1to2[3][8] = -0.833402617224421; fWeightMatrix1to2[4][8] = 0.269409661099169; fWeightMatrix1to2[5][8] = -1.11716317199837; fWeightMatrix1to2[6][8] = -1.3816975743005; fWeightMatrix1to2[7][8] = -1.49712823838257; fWeightMatrix1to2[8][8] = -2.07359735201312; fWeightMatrix1to2[9][8] = -1.41026565441212; fWeightMatrix1to2[10][8] = 0.904484218123293; fWeightMatrix1to2[11][8] = -1.41373128449721; fWeightMatrix1to2[12][8] = -1.87274352922208; fWeightMatrix1to2[0][9] = 2.1449291079807; fWeightMatrix1to2[1][9] = -0.728697703785453; fWeightMatrix1to2[2][9] = 0.439715283770402; fWeightMatrix1to2[3][9] = 1.12088887391886; fWeightMatrix1to2[4][9] = -1.69544472084388; fWeightMatrix1to2[5][9] = 1.13595709155518; fWeightMatrix1to2[6][9] = 1.52299819294674; fWeightMatrix1to2[7][9] = -1.05147922762897; fWeightMatrix1to2[8][9] = -4.03050733152446; fWeightMatrix1to2[9][9] = 0.230041639335681; fWeightMatrix1to2[10][9] = -1.06020043568762; fWeightMatrix1to2[11][9] = 0.599344181707703; fWeightMatrix1to2[12][9] = 2.35008298249491; fWeightMatrix1to2[0][10] = 0.43310071538202; fWeightMatrix1to2[1][10] = -0.17124802863014; fWeightMatrix1to2[2][10] = -1.96201218624857; fWeightMatrix1to2[3][10] = 0.755121366985355; fWeightMatrix1to2[4][10] = -1.99618741678358; fWeightMatrix1to2[5][10] = -0.829919369371453; fWeightMatrix1to2[6][10] = -2.43588920490717; fWeightMatrix1to2[7][10] = -1.99787449474378; fWeightMatrix1to2[8][10] = 1.02377092080923; fWeightMatrix1to2[9][10] = 0.255856971837253; fWeightMatrix1to2[10][10] = 0.674773485568675; fWeightMatrix1to2[11][10] = 0.959677362896679; fWeightMatrix1to2[12][10] = -1.74147030391163; fWeightMatrix1to2[0][11] = -1.232143478456; fWeightMatrix1to2[1][11] = -1.86272561912482; fWeightMatrix1to2[2][11] = -0.547664927662138; fWeightMatrix1to2[3][11] = 0.96687311951584; fWeightMatrix1to2[4][11] = -1.73938946263497; fWeightMatrix1to2[5][11] = -1.06435870811396; fWeightMatrix1to2[6][11] = -1.71251867176765; fWeightMatrix1to2[7][11] = 0.0729962954941336; fWeightMatrix1to2[8][11] = -0.463956812101544; fWeightMatrix1to2[9][11] = -0.679961037668981; fWeightMatrix1to2[10][11] = -1.82885191220471; fWeightMatrix1to2[11][11] = 0.842665787836321; fWeightMatrix1to2[12][11] = -0.111788450201664; fWeightMatrix1to2[0][12] = -0.258723200079076; fWeightMatrix1to2[1][12] = 0.355567608272885; fWeightMatrix1to2[2][12] = -2.1515480220778; fWeightMatrix1to2[3][12] = -1.69253832238098; fWeightMatrix1to2[4][12] = -1.86447146125158; fWeightMatrix1to2[5][12] = -0.524560543636224; fWeightMatrix1to2[6][12] = -1.6242560342252; fWeightMatrix1to2[7][12] = -1.43156644723988; fWeightMatrix1to2[8][12] = -1.44350327842647; fWeightMatrix1to2[9][12] = -2.47084336858066; fWeightMatrix1to2[10][12] = -1.0765165769583; fWeightMatrix1to2[11][12] = -1.76552603824534; fWeightMatrix1to2[12][12] = -0.487219560082829; fWeightMatrix1to2[0][13] = -1.81831505272241; fWeightMatrix1to2[1][13] = -3.03064848362648; fWeightMatrix1to2[2][13] = -0.65325999288921; fWeightMatrix1to2[3][13] = -1.34674218307031; fWeightMatrix1to2[4][13] = -1.73592733772486; fWeightMatrix1to2[5][13] = 0.0272675420551247; fWeightMatrix1to2[6][13] = -1.73952473590335; fWeightMatrix1to2[7][13] = -1.5704719104159; 
   fWeightMatrix1to2[8][13] = 2.90892476731973;
   fWeightMatrix1to2[9][13] = -0.0412472373515762;
   fWeightMatrix1to2[10][13] = 0.810869787263324;
   fWeightMatrix1to2[11][13] = 0.0438834617900117;
   fWeightMatrix1to2[12][13] = -0.0937021348756732;
   fWeightMatrix1to2[0][14] = 0.600202513081836;
   fWeightMatrix1to2[1][14] = -0.704428959658678;
   fWeightMatrix1to2[2][14] = -2.16344782611475;
   fWeightMatrix1to2[3][14] = -1.29616048462656;
   fWeightMatrix1to2[4][14] = -2.07172469960806;
   fWeightMatrix1to2[5][14] = -0.79860092174591;
   fWeightMatrix1to2[6][14] = -0.212638271441413;
   fWeightMatrix1to2[7][14] = 0.141992237618711;
   fWeightMatrix1to2[8][14] = 0.78423920431216;
   fWeightMatrix1to2[9][14] = -0.242137106587152;
   fWeightMatrix1to2[10][14] = 1.09468474062975;
   fWeightMatrix1to2[11][14] = -1.42579847450069;
   fWeightMatrix1to2[12][14] = -0.275127662241509;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.349993070223721;
   fWeightMatrix2to3[0][1] = -1.50222021411281;
   fWeightMatrix2to3[0][2] = 0.687939974573129;
   fWeightMatrix2to3[0][3] = -0.999553942841023;
   fWeightMatrix2to3[0][4] = 0.476013395733573;
   fWeightMatrix2to3[0][5] = -0.624023725348389;
   fWeightMatrix2to3[0][6] = 0.373422192407752;
   fWeightMatrix2to3[0][7] = 1.85865845146971;
   fWeightMatrix2to3[0][8] = -0.625166706722248;
   fWeightMatrix2to3[0][9] = 0.629207708406663;
   fWeightMatrix2to3[0][10] = 0.858239815901647;
   fWeightMatrix2to3[0][11] = -0.403988800109799;
   fWeightMatrix2to3[0][12] = -0.814082965056632;
   fWeightMatrix2to3[0][13] = 0.578893774600044;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l
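
// ---------------------------------------------------------------------------
// The generated source is cut off in the middle of GetMvaValue__ above. For
// reference, the commented block below sketches how the remainder of a
// TMVA 3.x MLP reader of this shape typically continues: reset the layer
// buffers, forward-propagate through the three weight matrices, and provide
// ActivationFnc and Clear. It assumes a sigmoid hidden-layer activation
// (matching the NeuronType option in the header) and a plain weighted sum on
// the single output node; it is an illustration of the forward pass, not a
// verbatim copy of the missing original code.
//
//    for (int l=0; l<fLayers; l++)
//       for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i] = 0;
//
//    // bias nodes of all but the output layer are held at 1
//    for (int l=0; l<fLayers-1; l++)
//       fWeights[l][fLayerSize[l]-1] = 1;
//
//    // feed the (already normalised) input values into layer 0
//    for (int i=0; i<fLayerSize[0]-1; i++)
//       fWeights[0][i] = inputValues[i];
//
//    // layer 0 -> 1: weighted sum followed by the activation function
//    for (int o=0; o<fLayerSize[1]-1; o++) {
//       for (int i=0; i<fLayerSize[0]; i++)
//          fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
//       fWeights[1][o] = ActivationFnc(fWeights[1][o]);
//    }
//
//    // layer 1 -> 2: same pattern with fWeightMatrix1to2
//    for (int o=0; o<fLayerSize[2]-1; o++) {
//       for (int i=0; i<fLayerSize[1]; i++)
//          fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
//       fWeights[2][o] = ActivationFnc(fWeights[2][o]);
//    }
//
//    // layer 2 -> 3: single output node, weighted sum only
//    for (int i=0; i<fLayerSize[2]; i++)
//       fWeights[3][0] += fWeightMatrix2to3[0][i] * fWeights[2][i];
//
//    return fWeights[3][0];
// }
//
// inline double ReadH6AONN5MEMLP::ActivationFnc(double x) const
// {
//    // sigmoid neuron response (NeuronType: "sigmoid")
//    return 1.0/(1.0+exp(-x));
// }
//
// inline void ReadH6AONN5MEMLP::Clear()
// {
//    // release the layer buffers allocated in Initialize()
//    for (int l=0; l<fLayers; l++) { delete[] fWeights[l]; fWeights[l] = 0; }
// }
// ---------------------------------------------------------------------------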
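
// ---------------------------------------------------------------------------
// Illustrative usage sketch (hypothetical, not part of the generated file):
// the reader is constructed once with the input-variable names in training
// order, and GetMvaValue is then called per event with the raw variable
// values; normalisation to [-1,1] is done internally via fVmin/fVmax.
//
//    const char* names[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets",
//                            "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt",
//                            "addEt", "dPhiLepSumMet", "dPhiLeptons",
//                            "dRLeptons", "lep1_E", "lep2_E" };
//    std::vector<std::string> vars(names, names + 13);
//    ReadH6AONN5MEMLP mlp(vars);                // verifies names and count
//
//    std::vector<double> event(13);
//    event[0] = 250.0;                          // Ht of this event, and so on
//    // ... fill the remaining 12 entries from the event record ...
//    double response = mlp.GetMvaValue(event);  // classifier output
// ---------------------------------------------------------------------------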