// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6 [198662]
ROOT Release   : 5.26/00 [334336]
Creator        : stdenis
Date           : Sun Jan 1 10:27:01 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run20572/job418
Training events: 23126

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V:                  "False"      [Verbose mode]
NCycles:            "1000"       [Number of training cycles]
HiddenLayers:       "N+1,N"      [Specification of hidden layer architecture]
# Default:
D:                  "False"      [use-decorrelated-variables flag (deprecated)]
Normalise:          "True"       [Normalise input variables]
VarTransform:       "None"       [Variable transformation method]
VarTransformType:   "Signal"     [Use signal or background events for var transform]
NbinsMVAPdf:        "60"         [Number of bins used to create MVA PDF]
NsmoothMVAPdf:      "2"          [Number of smoothing iterations for MVA PDF]
VerboseLevel:       "Info"       [Verbosity level]
H:                  "False"      [Print classifier-specific help message]
CreateMVAPdfs:      "False"      [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True"       [if True, write all weights as text files]
NeuronType:         "sigmoid"    [Neuron activation function type]
NeuronInputType:    "sum"        [Neuron input function type]
RandomSeed:         "1"          [Random Number Seed for TRandom3]
RandomFile:         "None"       [Random Number input file for TRandom3]
TrainingMethod:     "BP"         [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate:       "0.02"       [ANN learning rate parameter]
DecayRate:          "0.01"       [Decay rate for learning parameter]
TestRate:           "10"         [Test for overtraining performed at each #th epochs]
BPMode:             "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize:          "-1"         [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                  Ht                  'F' [50.1963043213,950.957641602]
LepAPt              LepAPt              'F' [20.001373291,127.397720337]
LepBPt              LepBPt              'F' [10.0010261536,71.4863357544]
MetSigLeptonsJets   MetSigLeptonsJets   'F' [1.19181919098,23.3634910583]
MetSpec             MetSpec             'F' [15.0101337433,210.330810547]
SumEtLeptonsJets    SumEtLeptonsJets    'F' [30.2201824188,513.559753418]
VSumJetLeptonsPt    VSumJetLeptonsPt    'F' [1.44759953022,434.137237549]
addEt               addEt               'F' [48.2450904846,476.475494385]
dPhiLepSumMet       dPhiLepSumMet       'F' [0.000472510961117,3.14157700539]
dPhiLeptons         dPhiLeptons         'F' [3.32593917847e-05,1.06520390511]
dRLeptons           dRLeptons           'F' [0.200003907084,1.13498270512]
lep1_E              lep1_E              'F' [20.0262546539,208.21144104]
lep2_E              lep2_E              'F' [10.0028533936,105.04108429]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {

 public:

   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {

 public:

   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 13 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "Ht", "LepAPt",
"LepBPt", "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 50.1963043212891; fVmax[0] = 950.957641601562; fVmin[1] = 20.0013732910156; fVmax[1] = 127.397720336914; fVmin[2] = 10.0010261535645; fVmax[2] = 71.4863357543945; fVmin[3] = 1.191819190979; fVmax[3] = 23.3634910583496; fVmin[4] = 15.0101337432861; fVmax[4] = 210.330810546875; fVmin[5] = 30.2201824188232; fVmax[5] = 513.559753417969; fVmin[6] = 1.44759953022003; fVmax[6] = 434.137237548828; fVmin[7] = 48.2450904846191; fVmax[7] = 476.475494384766; fVmin[8] = 0.000472510961117223; fVmax[8] = 3.14157700538635; fVmin[9] = 3.3259391784668e-05; fVmax[9] = 1.06520390510559; fVmin[10] = 0.200003907084465; fVmax[10] = 1.13498270511627; fVmin[11] = 20.0262546539307; fVmax[11] = 208.211441040039; fVmin[12] = 10.0028533935547; fVmax[12] = 105.041084289551; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void 
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.107499542646937;
   fWeightMatrix0to1[1][0] = 2.15511816332034;
   fWeightMatrix0to1[2][0] = 0.993734115140769;
   fWeightMatrix0to1[3][0] = 1.70050077764577;
   fWeightMatrix0to1[4][0] = -1.94356642506852;
   fWeightMatrix0to1[5][0] = -1.45177046944385;
   fWeightMatrix0to1[6][0] = -0.665819627335347;
   fWeightMatrix0to1[7][0] = 1.56276468605614;
   fWeightMatrix0to1[8][0] = -1.29644344610286;
   fWeightMatrix0to1[9][0] = -1.36750099050923;
   fWeightMatrix0to1[10][0] = -1.43786539579775;
   fWeightMatrix0to1[11][0] = -0.336948362612245;
   fWeightMatrix0to1[12][0] = -1.37356184444456;
   fWeightMatrix0to1[13][0] = -0.134146566788349;
   fWeightMatrix0to1[0][1] = -0.483127957081558;
   fWeightMatrix0to1[1][1] = 1.16116649014738;
   fWeightMatrix0to1[2][1] = -0.509377875319868;
   fWeightMatrix0to1[3][1] = 2.27778048847054;
   fWeightMatrix0to1[4][1] = 2.91658701281398;
   fWeightMatrix0to1[5][1] = 2.60075411792021;
   fWeightMatrix0to1[6][1] = -0.693135948299256;
   fWeightMatrix0to1[7][1] = 0.839422905718943;
   fWeightMatrix0to1[8][1] = 0.799271285420497;
   fWeightMatrix0to1[9][1] = 2.70131886333789;
   fWeightMatrix0to1[10][1] = -1.06863788948008;
   fWeightMatrix0to1[11][1] = -0.287824561724324;
   fWeightMatrix0to1[12][1] = 1.25968313734249;
   fWeightMatrix0to1[13][1] = -2.43041414450377;
   fWeightMatrix0to1[0][2] = -1.75726703866116;
   fWeightMatrix0to1[1][2] = 0.653364950461056;
   fWeightMatrix0to1[2][2] = 0.689637133143548;
   fWeightMatrix0to1[3][2] = 1.56789894180951;
   fWeightMatrix0to1[4][2] = 1.57123034266326;
   fWeightMatrix0to1[5][2] = 1.85460706149602;
   fWeightMatrix0to1[6][2] = -2.62719092573287;
   fWeightMatrix0to1[7][2] = 1.74351004917877;
   fWeightMatrix0to1[8][2] = -1.42867212659026;
   fWeightMatrix0to1[9][2] = 4.44788161085272;
   fWeightMatrix0to1[10][2] = -0.869528257983791;
   fWeightMatrix0to1[11][2] = 0.971920303484319;
   fWeightMatrix0to1[12][2] = 0.810829711467928;
   fWeightMatrix0to1[13][2] = -1.93273007709395;
   fWeightMatrix0to1[0][3] = 1.99401923558937;
   fWeightMatrix0to1[1][3] = 1.20408259146712;
   fWeightMatrix0to1[2][3] = -0.609194671988468;
   fWeightMatrix0to1[3][3] = -1.73011415586817;
   fWeightMatrix0to1[4][3] = 0.28949473127863;
   fWeightMatrix0to1[5][3] = -0.253415823297862;
   fWeightMatrix0to1[6][3] = 1.95572967853401;
   fWeightMatrix0to1[7][3] = 0.966309499381031;
   fWeightMatrix0to1[8][3] = 1.59876018177627;
   fWeightMatrix0to1[9][3] = 0.96678857688044;
   fWeightMatrix0to1[10][3] = -1.3861844879277;
   fWeightMatrix0to1[11][3] = 1.17498973716157;
   fWeightMatrix0to1[12][3] = -1.96040997584968;
   fWeightMatrix0to1[13][3] = -0.847741879162851;
   fWeightMatrix0to1[0][4] = -1.13723150376612;
   fWeightMatrix0to1[1][4] = -1.53934933655364;
   fWeightMatrix0to1[2][4] = 1.70344789019179;
   fWeightMatrix0to1[3][4] = 0.935687250121877;
   fWeightMatrix0to1[4][4] = -0.372852601802916;
   fWeightMatrix0to1[5][4] = 1.94838026127576;
   fWeightMatrix0to1[6][4] =
-2.19600234265932; fWeightMatrix0to1[7][4] = 3.79938892972426; fWeightMatrix0to1[8][4] = 1.87031128487609; fWeightMatrix0to1[9][4] = 1.52678538865858; fWeightMatrix0to1[10][4] = 0.552568226585793; fWeightMatrix0to1[11][4] = -1.51235242998282; fWeightMatrix0to1[12][4] = 0.863840427388944; fWeightMatrix0to1[13][4] = -3.6387524393381; fWeightMatrix0to1[0][5] = -0.503915522765194; fWeightMatrix0to1[1][5] = -1.18401820907198; fWeightMatrix0to1[2][5] = 0.803613806562957; fWeightMatrix0to1[3][5] = 1.13499382737343; fWeightMatrix0to1[4][5] = 1.7685232364702; fWeightMatrix0to1[5][5] = 0.110836898113055; fWeightMatrix0to1[6][5] = -2.38661633399668; fWeightMatrix0to1[7][5] = -1.67980572566656; fWeightMatrix0to1[8][5] = 1.13257262045551; fWeightMatrix0to1[9][5] = 0.747807069517102; fWeightMatrix0to1[10][5] = 2.12004270382359; fWeightMatrix0to1[11][5] = 0.713716963777685; fWeightMatrix0to1[12][5] = 0.924819083427154; fWeightMatrix0to1[13][5] = -0.980137924186968; fWeightMatrix0to1[0][6] = -0.694246515216403; fWeightMatrix0to1[1][6] = -0.0747101919714384; fWeightMatrix0to1[2][6] = 1.28454466392856; fWeightMatrix0to1[3][6] = -0.885544077258453; fWeightMatrix0to1[4][6] = -1.20280803583319; fWeightMatrix0to1[5][6] = -1.53668722417069; fWeightMatrix0to1[6][6] = 0.0736029897576562; fWeightMatrix0to1[7][6] = -0.292824435810951; fWeightMatrix0to1[8][6] = 1.72733931116689; fWeightMatrix0to1[9][6] = -1.98207780778433; fWeightMatrix0to1[10][6] = -0.616924987457077; fWeightMatrix0to1[11][6] = 1.56733395248436; fWeightMatrix0to1[12][6] = -0.966802158874161; fWeightMatrix0to1[13][6] = -1.34311521784267; fWeightMatrix0to1[0][7] = -1.30644814492603; fWeightMatrix0to1[1][7] = 0.48785264099044; fWeightMatrix0to1[2][7] = -1.85447913154064; fWeightMatrix0to1[3][7] = -1.26735697956272; fWeightMatrix0to1[4][7] = 2.48224472675538; fWeightMatrix0to1[5][7] = 3.19160805039111; fWeightMatrix0to1[6][7] = -2.53715025107365; fWeightMatrix0to1[7][7] = 3.48394683013319; fWeightMatrix0to1[8][7] = -0.861118022020971; fWeightMatrix0to1[9][7] = 1.03107768380798; fWeightMatrix0to1[10][7] = 0.244520079366989; fWeightMatrix0to1[11][7] = 0.126583873842757; fWeightMatrix0to1[12][7] = -1.86749494352023; fWeightMatrix0to1[13][7] = -1.63731495503185; fWeightMatrix0to1[0][8] = 0.0478702713279933; fWeightMatrix0to1[1][8] = 0.381789638183922; fWeightMatrix0to1[2][8] = -1.35255757347603; fWeightMatrix0to1[3][8] = 2.46909382690686; fWeightMatrix0to1[4][8] = 1.16246746623273; fWeightMatrix0to1[5][8] = -1.14324241451009; fWeightMatrix0to1[6][8] = 1.31145205843814; fWeightMatrix0to1[7][8] = 0.0777988725258128; fWeightMatrix0to1[8][8] = -1.75667207332687; fWeightMatrix0to1[9][8] = -0.0607251347482949; fWeightMatrix0to1[10][8] = -0.244692685837634; fWeightMatrix0to1[11][8] = 3.14198301905533; fWeightMatrix0to1[12][8] = 0.659219640602442; fWeightMatrix0to1[13][8] = -1.13020357475864; fWeightMatrix0to1[0][9] = -0.302915897632607; fWeightMatrix0to1[1][9] = -0.0960425891903053; fWeightMatrix0to1[2][9] = -1.68254619355352; fWeightMatrix0to1[3][9] = 0.144003179243364; fWeightMatrix0to1[4][9] = 1.10770095198939; fWeightMatrix0to1[5][9] = -0.900881707921338; fWeightMatrix0to1[6][9] = -0.264347398838281; fWeightMatrix0to1[7][9] = -0.824267991334738; fWeightMatrix0to1[8][9] = 0.0295509585579907; fWeightMatrix0to1[9][9] = 1.35207005122482; fWeightMatrix0to1[10][9] = 2.05368034411641; fWeightMatrix0to1[11][9] = -2.39738717126541; fWeightMatrix0to1[12][9] = 0.40888900267283; fWeightMatrix0to1[13][9] = 0.236095157978623; fWeightMatrix0to1[0][10] = 1.62784162704322; 
fWeightMatrix0to1[1][10] = -0.157117602701044; fWeightMatrix0to1[2][10] = -1.45737667523636; fWeightMatrix0to1[3][10] = -0.186342724268856; fWeightMatrix0to1[4][10] = -1.55029663952395; fWeightMatrix0to1[5][10] = 1.55095922794929; fWeightMatrix0to1[6][10] = -0.975733602063508; fWeightMatrix0to1[7][10] = -0.731470334214361; fWeightMatrix0to1[8][10] = -0.400758740174086; fWeightMatrix0to1[9][10] = -0.769056543884727; fWeightMatrix0to1[10][10] = -1.11692709213566; fWeightMatrix0to1[11][10] = 0.914623795599494; fWeightMatrix0to1[12][10] = 1.44266694993522; fWeightMatrix0to1[13][10] = -0.28544251035051; fWeightMatrix0to1[0][11] = -0.507438417257472; fWeightMatrix0to1[1][11] = 0.445779368650434; fWeightMatrix0to1[2][11] = 0.916756530295992; fWeightMatrix0to1[3][11] = 0.749343932472637; fWeightMatrix0to1[4][11] = 2.52548292752001; fWeightMatrix0to1[5][11] = -2.1016074569145; fWeightMatrix0to1[6][11] = 3.99828159900511; fWeightMatrix0to1[7][11] = 0.999523358649717; fWeightMatrix0to1[8][11] = 0.503031276520208; fWeightMatrix0to1[9][11] = 0.0753946875278917; fWeightMatrix0to1[10][11] = 1.12566223985028; fWeightMatrix0to1[11][11] = -0.122862117810381; fWeightMatrix0to1[12][11] = -0.536035595013158; fWeightMatrix0to1[13][11] = 2.58109458106566; fWeightMatrix0to1[0][12] = -0.787823493501777; fWeightMatrix0to1[1][12] = -0.353825716917434; fWeightMatrix0to1[2][12] = 1.69257368174655; fWeightMatrix0to1[3][12] = 0.3437660756037; fWeightMatrix0to1[4][12] = 0.347898308282861; fWeightMatrix0to1[5][12] = 1.29657576152365; fWeightMatrix0to1[6][12] = 1.78370409317518; fWeightMatrix0to1[7][12] = -1.1281210954386; fWeightMatrix0to1[8][12] = 0.63964244607156; fWeightMatrix0to1[9][12] = -0.417444300649753; fWeightMatrix0to1[10][12] = 0.872054271531697; fWeightMatrix0to1[11][12] = -1.01560162164888; fWeightMatrix0to1[12][12] = -1.89828647782925; fWeightMatrix0to1[13][12] = 0.361230290619605; fWeightMatrix0to1[0][13] = 1.5663968221888; fWeightMatrix0to1[1][13] = -1.41749424907269; fWeightMatrix0to1[2][13] = -0.413363529024557; fWeightMatrix0to1[3][13] = 0.325051287437716; fWeightMatrix0to1[4][13] = 3.41226576207286; fWeightMatrix0to1[5][13] = 5.40777966261254; fWeightMatrix0to1[6][13] = -4.77203629645541; fWeightMatrix0to1[7][13] = 5.54460119303664; fWeightMatrix0to1[8][13] = -1.14598091579308; fWeightMatrix0to1[9][13] = 5.65406467481118; fWeightMatrix0to1[10][13] = 1.56444102630861; fWeightMatrix0to1[11][13] = 0.407106082565903; fWeightMatrix0to1[12][13] = 0.855804298453703; fWeightMatrix0to1[13][13] = -6.14579900539814; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -1.90137592189204; fWeightMatrix1to2[1][0] = 0.111702227665584; fWeightMatrix1to2[2][0] = 0.359851208450754; fWeightMatrix1to2[3][0] = -1.71516476824851; fWeightMatrix1to2[4][0] = -0.703653285479839; fWeightMatrix1to2[5][0] = -0.881401978471904; fWeightMatrix1to2[6][0] = -0.221625795140342; fWeightMatrix1to2[7][0] = -1.84548347278257; fWeightMatrix1to2[8][0] = 0.603198238011516; fWeightMatrix1to2[9][0] = -0.0248861419560847; fWeightMatrix1to2[10][0] = -1.08186934549478; fWeightMatrix1to2[11][0] = -0.3646127134211; fWeightMatrix1to2[12][0] = -0.187204004980309; fWeightMatrix1to2[0][1] = 1.38635316373237; fWeightMatrix1to2[1][1] = 0.451372314114454; fWeightMatrix1to2[2][1] = -0.188997131181193; fWeightMatrix1to2[3][1] = -2.04517861712453; fWeightMatrix1to2[4][1] = -0.924140192600136; fWeightMatrix1to2[5][1] = 1.62184629963294; fWeightMatrix1to2[6][1] = 1.33706868177073; fWeightMatrix1to2[7][1] = 0.7796793043946; fWeightMatrix1to2[8][1] = 
0.0780164639238076; fWeightMatrix1to2[9][1] = 1.93909643375072; fWeightMatrix1to2[10][1] = -1.71432683179219; fWeightMatrix1to2[11][1] = -1.30419279471409; fWeightMatrix1to2[12][1] = -0.256736449739028; fWeightMatrix1to2[0][2] = -1.3326601699853; fWeightMatrix1to2[1][2] = 0.861159063945738; fWeightMatrix1to2[2][2] = 1.67799001980145; fWeightMatrix1to2[3][2] = -1.30949425453467; fWeightMatrix1to2[4][2] = 0.786135200380954; fWeightMatrix1to2[5][2] = -0.156097713627394; fWeightMatrix1to2[6][2] = -1.85455281121594; fWeightMatrix1to2[7][2] = -1.14231629685176; fWeightMatrix1to2[8][2] = 0.876748916634797; fWeightMatrix1to2[9][2] = 0.0332319940607209; fWeightMatrix1to2[10][2] = 0.918911067216671; fWeightMatrix1to2[11][2] = -1.1600344903218; fWeightMatrix1to2[12][2] = 1.53743853354256; fWeightMatrix1to2[0][3] = 0.795003159227937; fWeightMatrix1to2[1][3] = 0.115309742408345; fWeightMatrix1to2[2][3] = -0.166909099210979; fWeightMatrix1to2[3][3] = -1.67778334108965; fWeightMatrix1to2[4][3] = -1.31004452127575; fWeightMatrix1to2[5][3] = -2.1995680336239; fWeightMatrix1to2[6][3] = -0.464930832622081; fWeightMatrix1to2[7][3] = -1.85600889205106; fWeightMatrix1to2[8][3] = -2.45087110315287; fWeightMatrix1to2[9][3] = -1.93607050600148; fWeightMatrix1to2[10][3] = 0.126732380180304; fWeightMatrix1to2[11][3] = -1.09801960156692; fWeightMatrix1to2[12][3] = -2.20650201216542; fWeightMatrix1to2[0][4] = 4.5050217059935; fWeightMatrix1to2[1][4] = -1.20375887420063; fWeightMatrix1to2[2][4] = -0.326051119742761; fWeightMatrix1to2[3][4] = -2.13866162586018; fWeightMatrix1to2[4][4] = 0.211225265058646; fWeightMatrix1to2[5][4] = 0.857092187250771; fWeightMatrix1to2[6][4] = 0.73544129204813; fWeightMatrix1to2[7][4] = -0.828642661142552; fWeightMatrix1to2[8][4] = -2.31642086392485; fWeightMatrix1to2[9][4] = -1.70743556717549; fWeightMatrix1to2[10][4] = -1.40253024697189; fWeightMatrix1to2[11][4] = -1.74729659937703; fWeightMatrix1to2[12][4] = -0.538456052126306; fWeightMatrix1to2[0][5] = 3.5167328890068; fWeightMatrix1to2[1][5] = 1.3213704525402; fWeightMatrix1to2[2][5] = -0.738312809104612; fWeightMatrix1to2[3][5] = 0.343082683440775; fWeightMatrix1to2[4][5] = -0.532796158069822; fWeightMatrix1to2[5][5] = -3.0404305070202; fWeightMatrix1to2[6][5] = 0.738856868596543; fWeightMatrix1to2[7][5] = 1.26649083278503; fWeightMatrix1to2[8][5] = -1.29691149823823; fWeightMatrix1to2[9][5] = -1.47975494886058; fWeightMatrix1to2[10][5] = 0.486538766196163; fWeightMatrix1to2[11][5] = 1.21018625174943; fWeightMatrix1to2[12][5] = -0.4965353144411; fWeightMatrix1to2[0][6] = -3.68760368848035; fWeightMatrix1to2[1][6] = -0.676057847884836; fWeightMatrix1to2[2][6] = 1.5753234809501; fWeightMatrix1to2[3][6] = 1.54729684610421; fWeightMatrix1to2[4][6] = 0.980227192217137; fWeightMatrix1to2[5][6] = -1.08242402828208; fWeightMatrix1to2[6][6] = 0.0494600055045176; fWeightMatrix1to2[7][6] = -0.398367346800374; fWeightMatrix1to2[8][6] = -2.02515031208299; fWeightMatrix1to2[9][6] = -1.46867912378948; fWeightMatrix1to2[10][6] = -2.20425035245061; fWeightMatrix1to2[11][6] = -1.7314469004275; fWeightMatrix1to2[12][6] = -2.38419726367276; fWeightMatrix1to2[0][7] = 4.29590030930756; fWeightMatrix1to2[1][7] = -2.32036430292168; fWeightMatrix1to2[2][7] = 1.52231579846855; fWeightMatrix1to2[3][7] = -2.26592565295882; fWeightMatrix1to2[4][7] = 1.12440955395031; fWeightMatrix1to2[5][7] = -1.38315399579246; fWeightMatrix1to2[6][7] = -0.0828665769232915; fWeightMatrix1to2[7][7] = 0.975710715160979; fWeightMatrix1to2[8][7] = -2.39813265982675; 
fWeightMatrix1to2[9][7] = 0.114890087937389; fWeightMatrix1to2[10][7] = -0.115980391124092; fWeightMatrix1to2[11][7] = -2.20102349156071; fWeightMatrix1to2[12][7] = -2.76395675820439; fWeightMatrix1to2[0][8] = -1.52296669585513; fWeightMatrix1to2[1][8] = -0.83673150886561; fWeightMatrix1to2[2][8] = 1.84901839383738; fWeightMatrix1to2[3][8] = -0.628082906912671; fWeightMatrix1to2[4][8] = 0.270378640842148; fWeightMatrix1to2[5][8] = -0.918513914739556; fWeightMatrix1to2[6][8] = -1.28025416062014; fWeightMatrix1to2[7][8] = -1.76541726070047; fWeightMatrix1to2[8][8] = -1.04849625093539; fWeightMatrix1to2[9][8] = -1.42417766659857; fWeightMatrix1to2[10][8] = 0.803294435631897; fWeightMatrix1to2[11][8] = -1.34754645293448; fWeightMatrix1to2[12][8] = -1.25699198471266; fWeightMatrix1to2[0][9] = 4.96729260778584; fWeightMatrix1to2[1][9] = -0.124572124583328; fWeightMatrix1to2[2][9] = -0.951716487362472; fWeightMatrix1to2[3][9] = 2.13793735774988; fWeightMatrix1to2[4][9] = -1.7975071673958; fWeightMatrix1to2[5][9] = 0.867158282604903; fWeightMatrix1to2[6][9] = 0.70679523036574; fWeightMatrix1to2[7][9] = -0.740942800839077; fWeightMatrix1to2[8][9] = -1.99237268526953; fWeightMatrix1to2[9][9] = -0.244332619597355; fWeightMatrix1to2[10][9] = -1.33179238015619; fWeightMatrix1to2[11][9] = 0.76896366747498; fWeightMatrix1to2[12][9] = 1.9578529790866; fWeightMatrix1to2[0][10] = 0.820099358452761; fWeightMatrix1to2[1][10] = 0.683233679392632; fWeightMatrix1to2[2][10] = -1.63745872941119; fWeightMatrix1to2[3][10] = 0.915382730282799; fWeightMatrix1to2[4][10] = -1.93325017884947; fWeightMatrix1to2[5][10] = -0.506849661166192; fWeightMatrix1to2[6][10] = -2.24033648743769; fWeightMatrix1to2[7][10] = -1.65688112966872; fWeightMatrix1to2[8][10] = -0.868592600212646; fWeightMatrix1to2[9][10] = 0.346169815978516; fWeightMatrix1to2[10][10] = -0.215867444779777; fWeightMatrix1to2[11][10] = 0.873204284419449; fWeightMatrix1to2[12][10] = -1.25120297397439; fWeightMatrix1to2[0][11] = -1.66042721534665; fWeightMatrix1to2[1][11] = -1.65313138700288; fWeightMatrix1to2[2][11] = 0.0233703268454236; fWeightMatrix1to2[3][11] = 1.40038560503042; fWeightMatrix1to2[4][11] = -1.5706679993803; fWeightMatrix1to2[5][11] = -0.702609382799724; fWeightMatrix1to2[6][11] = -1.63756385201199; fWeightMatrix1to2[7][11] = -0.25120682941356; fWeightMatrix1to2[8][11] = -0.752893424713007; fWeightMatrix1to2[9][11] = -0.637346002149919; fWeightMatrix1to2[10][11] = -1.73522075526524; fWeightMatrix1to2[11][11] = 0.74457028502025; fWeightMatrix1to2[12][11] = 0.249119535730737; fWeightMatrix1to2[0][12] = 0.577953343043415; fWeightMatrix1to2[1][12] = -1.08185527832529; fWeightMatrix1to2[2][12] = -0.992871574030252; fWeightMatrix1to2[3][12] = -1.71981037305956; fWeightMatrix1to2[4][12] = -1.7429331736726; fWeightMatrix1to2[5][12] = -0.0664025462728371; fWeightMatrix1to2[6][12] = -0.748900354066165; fWeightMatrix1to2[7][12] = -1.00046110245023; fWeightMatrix1to2[8][12] = -0.339871009158598; fWeightMatrix1to2[9][12] = -2.47384697335482; fWeightMatrix1to2[10][12] = -1.49494741520652; fWeightMatrix1to2[11][12] = -1.84572191950557; fWeightMatrix1to2[12][12] = -0.685823784425245; fWeightMatrix1to2[0][13] = -4.47290102301002; fWeightMatrix1to2[1][13] = -2.14844113772656; fWeightMatrix1to2[2][13] = -0.510859299292962; fWeightMatrix1to2[3][13] = -0.939115570806873; fWeightMatrix1to2[4][13] = -1.6268600139809; fWeightMatrix1to2[5][13] = 0.498390561413771; fWeightMatrix1to2[6][13] = -1.44701259748806; fWeightMatrix1to2[7][13] = -1.28198273591593; 
   fWeightMatrix1to2[8][13] = 1.64757957526507;
   fWeightMatrix1to2[9][13] = -0.225160244090755;
   fWeightMatrix1to2[10][13] = 0.347387367077968;
   fWeightMatrix1to2[11][13] = -0.0945569280172272;
   fWeightMatrix1to2[12][13] = 0.714945071138509;
   fWeightMatrix1to2[0][14] = 1.91973476879886;
   fWeightMatrix1to2[1][14] = -0.513341395074803;
   fWeightMatrix1to2[2][14] = -1.04036736895177;
   fWeightMatrix1to2[3][14] = -1.07080920870108;
   fWeightMatrix1to2[4][14] = -1.74699069550532;
   fWeightMatrix1to2[5][14] = -0.292402151143841;
   fWeightMatrix1to2[6][14] = 0.432532255438713;
   fWeightMatrix1to2[7][14] = 0.169653722703992;
   fWeightMatrix1to2[8][14] = 0.119092927660091;
   fWeightMatrix1to2[9][14] = 0.0886225279569349;
   fWeightMatrix1to2[10][14] = 1.0493149105878;
   fWeightMatrix1to2[11][14] = -1.58112647271433;
   fWeightMatrix1to2[12][14] = -0.123873008307847;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = 0.766812939202826;
   fWeightMatrix2to3[0][1] = -0.316134910255038;
   fWeightMatrix2to3[0][2] = -0.878253090378915;
   fWeightMatrix2to3[0][3] = -1.38366547375587;
   fWeightMatrix2to3[0][4] = 0.902280508351926;
   fWeightMatrix2to3[0][5] = -0.380034864847104;
   fWeightMatrix2to3[0][6] = 0.404126347400581;
   fWeightMatrix2to3[0][7] = 1.82955481261194;
   fWeightMatrix2to3[0][8] = 0.867458675933583;
   fWeightMatrix2to3[0][9] = 0.931344783635016;
   fWeightMatrix2to3[0][10] = 1.12923091921339;
   fWeightMatrix2to3[0][11] = -0.712534134880202;
   fWeightMatrix2to3[0][12] = -0.922179729327231;
   fWeightMatrix2to3[0][13] = 0.206085985143468;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }
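   // -------------------------------------------------------------------------
   // NOTE: the generated source was truncated at this point.  What follows is
   // a reconstruction sketch based on the usual layout of a TMVA standalone
   // reader (MethodBase::MakeClass output): reset the activation buffers, set
   // the bias nodes, copy in the (already normalised) inputs, then propagate
   // through the two sigmoid hidden layers ("N+1,N" = 14 and 13 neurons) to
   // the single output node.  Whether the output node also passes through the
   // activation function depends on the TMVA version; a plain linear sum is
   // assumed here.  Treat this as a sketch, not the original generated code.
   // -------------------------------------------------------------------------
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i] = 0;

   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1] = 1;   // bias node of each non-output layer

   for (int ivar=0; ivar<fLayerSize[0]-1; ivar++)
      fWeights[0][ivar] = inputValues[ivar];

   // layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++) {
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      }
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++) {
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      }
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // layer 2 to 3 (single output node, linear sum assumed)
   for (int i=0; i<fLayerSize[2]; i++) {
      fWeights[3][0] += fWeightMatrix2to3[0][i] * fWeights[2][i];
   }

   return fWeights[3][0];
}

// Neuron activation: the #OPT block above specifies NeuronType "sigmoid".
inline double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   // sigmoid
   return 1.0/(1.0+exp(-x));
}

// Method-specific cleanup: release the per-layer buffers allocated in
// Initialize().  (Also reconstructed; the original text broke off before
// reaching Clear().)
inline void ReadH6AONN5MEMLP::Clear()
{
   for (int l=0; l<fLayers; l++) {
      delete[] fWeights[l];
      fWeights[l] = 0;
   }
}

// Usage sketch (illustrative only, not part of the generated file): construct
// the reader with the training variable names in their original order, then
// evaluate one event at a time.  The guard macro and the numerical values are
// hypothetical; the values merely lie inside the training ranges listed in
// the #VAR block.
#ifdef READH6AONN5MEMLP_STANDALONE_EXAMPLE
int main()
{
   const char* names[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets",
                           "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt",
                           "addEt", "dPhiLepSumMet", "dPhiLeptons",
                           "dRLeptons", "lep1_E", "lep2_E" };
   std::vector<std::string> inputVars( names, names+13 );
   ReadH6AONN5MEMLP reader( inputVars );

   // one hypothetical event, values in the same order as the variable names
   double vals[] = { 300.0, 45.0, 25.0, 5.0, 60.0, 180.0, 90.0,
                     150.0, 1.5, 0.5, 0.7, 80.0, 40.0 };
   std::vector<double> event( vals, vals+13 );

   std::cout << "MVA output: " << reader.GetMvaValue( event ) << std::endl;
   return 0;
}
#endif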