// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6         [198662]
ROOT Release   : 5.26/00       [334336]
Creator        : stdenis
Date           : Sun Jan 1 10:29:47 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run20572/job417
Training events: 26280

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
RandomSeed: "1" [Random Number Seed for TRandom3]
RandomFile: "None" [Random Number input file for TRandom3]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                   Ht                   'F'    [49.2191009521,850.621887207]
LepAPt               LepAPt               'F'    [20.0000362396,162.429443359]
LepBPt               LepBPt               'F'    [10.0018510818,64.8868713379]
MetSigLeptonsJets    MetSigLeptonsJets    'F'    [1.16568470001,17.5202083588]
MetSpec              MetSpec              'F'    [15.0119848251,252.419906616]
SumEtLeptonsJets     SumEtLeptonsJets     'F'    [30.1229667664,533.675720215]
VSumJetLeptonsPt     VSumJetLeptonsPt     'F'    [5.16211557388,344.837463379]
addEt                addEt                'F'    [49.2191009521,422.705413818]
dPhiLepSumMet        dPhiLepSumMet        'F'    [0.054340723902,3.14157605171]
dPhiLeptons          dPhiLeptons          'F'    [3.08752059937e-05,1.13010489941]
dRLeptons            dRLeptons            'F'    [0.200001657009,1.13414525986]
lep1_E               lep1_E               'F'    [20.0598621368,220.396377563]
lep2_E               lep2_E               'F'    [10.0297393799,100.190933228]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {

 public:

   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif
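// ---------------------------------------------------------------------------
// Editorial note: a minimal usage sketch for the generated reader defined
// below.  This is illustrative only and is not part of the TMVA-generated
// code; the event values are hypothetical and must be supplied per event in
// the same order as the training variables listed above.
//
//   std::vector<std::string> inputVars;
//   inputVars.push_back("Ht");
//   inputVars.push_back("LepAPt");
//   // ... the remaining training variables, in the order listed above ...
//   inputVars.push_back("lep2_E");
//
//   ReadH6AONN5MEMLP reader( inputVars );
//
//   std::vector<double> inputValues(13);        // filled once per event
//   double mva = reader.GetMvaValue( inputValues );
// ---------------------------------------------------------------------------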
"LepBPt", "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 49.2191009521484; fVmax[0] = 850.621887207031; fVmin[1] = 20.000036239624; fVmax[1] = 162.429443359375; fVmin[2] = 10.0018510818481; fVmax[2] = 64.8868713378906; fVmin[3] = 1.16568470001221; fVmax[3] = 17.5202083587646; fVmin[4] = 15.0119848251343; fVmax[4] = 252.419906616211; fVmin[5] = 30.1229667663574; fVmax[5] = 533.675720214844; fVmin[6] = 5.16211557388306; fVmax[6] = 344.837463378906; fVmin[7] = 49.2191009521484; fVmax[7] = 422.705413818359; fVmin[8] = 0.0543407239019871; fVmax[8] = 3.14157605171204; fVmin[9] = 3.08752059936523e-05; fVmax[9] = 1.13010489940643; fVmin[10] = 0.200001657009125; fVmax[10] = 1.13414525985718; fVmin[11] = 20.0598621368408; fVmax[11] = 220.396377563477; fVmin[12] = 10.0297393798828; fVmax[12] = 100.190933227539; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void 
   // type of input variable: 'F' or 'I'
   char   fType[13];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];

   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.27788759412375;
   fWeightMatrix0to1[1][0] = 2.06851442741713;
   fWeightMatrix0to1[2][0] = 1.16400509285645;
   fWeightMatrix0to1[3][0] = 1.65541881883671;
   fWeightMatrix0to1[4][0] = -1.54002953120279;
   fWeightMatrix0to1[5][0] = -1.13480136272062;
   fWeightMatrix0to1[6][0] = -0.864667631264863;
   fWeightMatrix0to1[7][0] = 1.07539308842296;
   fWeightMatrix0to1[8][0] = -1.71774097877057;
   fWeightMatrix0to1[9][0] = -0.154078535503003;
   fWeightMatrix0to1[10][0] = -1.0816237804885;
   fWeightMatrix0to1[11][0] = -0.199757229141263;
   fWeightMatrix0to1[12][0] = -0.540404327854663;
   fWeightMatrix0to1[13][0] = -0.225214115171391;
   fWeightMatrix0to1[0][1] = -0.565757237337637;
   fWeightMatrix0to1[1][1] = 1.10351456142242;
   fWeightMatrix0to1[2][1] = -0.132805576412873;
   fWeightMatrix0to1[3][1] = 2.67614348463749;
   fWeightMatrix0to1[4][1] = 0.53841883209347;
   fWeightMatrix0to1[5][1] = 1.95322493802253;
   fWeightMatrix0to1[6][1] = -1.67034220646605;
   fWeightMatrix0to1[7][1] = 2.15047067579573;
   fWeightMatrix0to1[8][1] = 0.611395234997826;
   fWeightMatrix0to1[9][1] = 0.889493616868136;
   fWeightMatrix0to1[10][1] = -1.55371343840759;
   fWeightMatrix0to1[11][1] = -0.454853125164405;
   fWeightMatrix0to1[12][1] = 1.94203601397419;
   fWeightMatrix0to1[13][1] = -1.1288620438561;
   fWeightMatrix0to1[0][2] = -1.90034981643979;
   fWeightMatrix0to1[1][2] = 0.42155411364907;
   fWeightMatrix0to1[2][2] = 0.9452923468138;
   fWeightMatrix0to1[3][2] = 2.20807699384876;
   fWeightMatrix0to1[4][2] = 0.331639329764352;
   fWeightMatrix0to1[5][2] = 2.77230497572307;
   fWeightMatrix0to1[6][2] = -0.871923806009563;
   fWeightMatrix0to1[7][2] = 4.38264840537621;
   fWeightMatrix0to1[8][2] = -1.84376001817985;
   fWeightMatrix0to1[9][2] = 2.38682089034733;
   fWeightMatrix0to1[10][2] = -4.52618151098667;
   fWeightMatrix0to1[11][2] = 1.50561076492281;
   fWeightMatrix0to1[12][2] = 0.660578721830998;
   fWeightMatrix0to1[13][2] = 1.67299857769122;
   fWeightMatrix0to1[0][3] = 1.96135359173406;
   fWeightMatrix0to1[1][3] = 1.12816685182941;
   fWeightMatrix0to1[2][3] = -0.441675159921396;
   fWeightMatrix0to1[3][3] = -1.0374327014586;
   fWeightMatrix0to1[4][3] = 1.42051049290089;
   fWeightMatrix0to1[5][3] = 0.544778678627522;
   fWeightMatrix0to1[6][3] = 2.04419385097686;
   fWeightMatrix0to1[7][3] = 0.614326731927812;
   fWeightMatrix0to1[8][3] = 1.26177970134436;
   fWeightMatrix0to1[9][3] = 1.58695636163129;
   fWeightMatrix0to1[10][3] = -0.83933763708416;
   fWeightMatrix0to1[11][3] = 0.398962999276005;
   fWeightMatrix0to1[12][3] = -1.52030201454386;
   fWeightMatrix0to1[13][3] = -0.143571324240249;
   fWeightMatrix0to1[0][4] = -1.25142900098117;
   fWeightMatrix0to1[1][4] = -1.66526735297449;
   fWeightMatrix0to1[2][4] = 1.80753114448334;
   fWeightMatrix0to1[3][4] = 0.913338511815218;
   fWeightMatrix0to1[4][4] = -0.900073021884232;
   fWeightMatrix0to1[5][4] = 1.10885174042453;
   fWeightMatrix0to1[6][4] =
-0.467870907094025; fWeightMatrix0to1[7][4] = 1.7503521643272; fWeightMatrix0to1[8][4] = 1.66520897104546; fWeightMatrix0to1[9][4] = 0.349408098814426; fWeightMatrix0to1[10][4] = 0.399303094911451; fWeightMatrix0to1[11][4] = -1.70256048147209; fWeightMatrix0to1[12][4] = 1.42258942407801; fWeightMatrix0to1[13][4] = -0.336696402436948; fWeightMatrix0to1[0][5] = -0.698971141764014; fWeightMatrix0to1[1][5] = -1.37014065054836; fWeightMatrix0to1[2][5] = 1.00512233878001; fWeightMatrix0to1[3][5] = 0.87800266491681; fWeightMatrix0to1[4][5] = 1.75852298050488; fWeightMatrix0to1[5][5] = -0.281411481540648; fWeightMatrix0to1[6][5] = -1.83954511402221; fWeightMatrix0to1[7][5] = -3.16028443701925; fWeightMatrix0to1[8][5] = 0.682392648405269; fWeightMatrix0to1[9][5] = 1.44594262704642; fWeightMatrix0to1[10][5] = 2.57960011552325; fWeightMatrix0to1[11][5] = 1.10857324680525; fWeightMatrix0to1[12][5] = 1.67924621415834; fWeightMatrix0to1[13][5] = -0.703167034809635; fWeightMatrix0to1[0][6] = -0.819201420294396; fWeightMatrix0to1[1][6] = 0.097675705665198; fWeightMatrix0to1[2][6] = 1.4291472254362; fWeightMatrix0to1[3][6] = -0.718847094309017; fWeightMatrix0to1[4][6] = -1.15935893867888; fWeightMatrix0to1[5][6] = 0.0714491066186137; fWeightMatrix0to1[6][6] = -1.61368036043478; fWeightMatrix0to1[7][6] = 2.44115038738559; fWeightMatrix0to1[8][6] = 1.64300361260715; fWeightMatrix0to1[9][6] = -0.844201512870239; fWeightMatrix0to1[10][6] = -1.35776714610075; fWeightMatrix0to1[11][6] = 1.2306146542129; fWeightMatrix0to1[12][6] = -0.0773331050362361; fWeightMatrix0to1[13][6] = -1.19155768191787; fWeightMatrix0to1[0][7] = -1.40354113544088; fWeightMatrix0to1[1][7] = 0.581610320685711; fWeightMatrix0to1[2][7] = -1.59149266074568; fWeightMatrix0to1[3][7] = -0.415911440972635; fWeightMatrix0to1[4][7] = 1.36298983887063; fWeightMatrix0to1[5][7] = 4.04659002532859; fWeightMatrix0to1[6][7] = -3.57061806203721; fWeightMatrix0to1[7][7] = 7.13276045012594; fWeightMatrix0to1[8][7] = -0.900089814249177; fWeightMatrix0to1[9][7] = -0.132473204701088; fWeightMatrix0to1[10][7] = -1.74483661183365; fWeightMatrix0to1[11][7] = -0.387417092934503; fWeightMatrix0to1[12][7] = -1.32331129384247; fWeightMatrix0to1[13][7] = 0.731386649336561; fWeightMatrix0to1[0][8] = 0.273086936940645; fWeightMatrix0to1[1][8] = 0.38249432214788; fWeightMatrix0to1[2][8] = -1.70789245657758; fWeightMatrix0to1[3][8] = 2.75443173807246; fWeightMatrix0to1[4][8] = 0.0522720823564103; fWeightMatrix0to1[5][8] = -2.4716816551514; fWeightMatrix0to1[6][8] = 2.47251673766045; fWeightMatrix0to1[7][8] = -0.0936985513105139; fWeightMatrix0to1[8][8] = -1.81581032292784; fWeightMatrix0to1[9][8] = -0.819743894008365; fWeightMatrix0to1[10][8] = -0.0321177279668725; fWeightMatrix0to1[11][8] = 1.79032818374508; fWeightMatrix0to1[12][8] = -0.75745155326321; fWeightMatrix0to1[13][8] = -0.273116074131065; fWeightMatrix0to1[0][9] = -0.327573375908992; fWeightMatrix0to1[1][9] = 0.343251019208656; fWeightMatrix0to1[2][9] = -1.44292175763319; fWeightMatrix0to1[3][9] = 0.594901570525796; fWeightMatrix0to1[4][9] = -0.0226086136281507; fWeightMatrix0to1[5][9] = -0.411765502548769; fWeightMatrix0to1[6][9] = 0.585367083789852; fWeightMatrix0to1[7][9] = 0.926490061740278; fWeightMatrix0to1[8][9] = 0.660760671687608; fWeightMatrix0to1[9][9] = 1.51107557968835; fWeightMatrix0to1[10][9] = 1.5107641108155; fWeightMatrix0to1[11][9] = -2.79373078316481; fWeightMatrix0to1[12][9] = -0.270432347136698; fWeightMatrix0to1[13][9] = 1.58210880147331; fWeightMatrix0to1[0][10] = 1.63201914140062; 
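   // ------------------------------------------------------------------------
   // Editorial note (interpretation, hedged): with 13 input variables and the
   // option HiddenLayers="N+1,N", the network appears to be
   //   13 inputs (+ bias) -> 14 hidden nodes (+ bias) -> 13 hidden nodes (+ bias) -> 1 output,
   // which matches fLayerSize[] = {14, 15, 14, 1} set above.  On that reading,
   // fWeightMatrixAtoB[j][i] is the weight from node i of layer A (the last
   // source index, e.g. i = 13 for layer 0, being the bias node) to node j of
   // layer B.  The propagation loop that consumes these matrices is not fully
   // shown in this excerpt, so this reading should be checked against
   // GetMvaValue__().
   // ------------------------------------------------------------------------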
fWeightMatrix0to1[1][10] = -0.255495391138869; fWeightMatrix0to1[2][10] = -1.19331416858073; fWeightMatrix0to1[3][10] = 0.0980240772178642; fWeightMatrix0to1[4][10] = -1.997250040769; fWeightMatrix0to1[5][10] = 2.09276529332874; fWeightMatrix0to1[6][10] = -1.03334846648976; fWeightMatrix0to1[7][10] = -1.1371102702938; fWeightMatrix0to1[8][10] = -0.21196994233679; fWeightMatrix0to1[9][10] = 0.516935808968038; fWeightMatrix0to1[10][10] = -0.175838868250948; fWeightMatrix0to1[11][10] = -0.0569998102799626; fWeightMatrix0to1[12][10] = 1.76520597124947; fWeightMatrix0to1[13][10] = -0.12630653793859; fWeightMatrix0to1[0][11] = -0.577972407919066; fWeightMatrix0to1[1][11] = 0.682758606180819; fWeightMatrix0to1[2][11] = 1.15835211744495; fWeightMatrix0to1[3][11] = 0.615186395528901; fWeightMatrix0to1[4][11] = 1.14083570029934; fWeightMatrix0to1[5][11] = -3.03634253397125; fWeightMatrix0to1[6][11] = 2.7674323294427; fWeightMatrix0to1[7][11] = 1.6362147802053; fWeightMatrix0to1[8][11] = 0.226726136420128; fWeightMatrix0to1[9][11] = 1.37127723640069; fWeightMatrix0to1[10][11] = 1.7741728320557; fWeightMatrix0to1[11][11] = -0.463908001565868; fWeightMatrix0to1[12][11] = -0.342586173566552; fWeightMatrix0to1[13][11] = 0.810987145723433; fWeightMatrix0to1[0][12] = -0.947443556346031; fWeightMatrix0to1[1][12] = -0.358036172725117; fWeightMatrix0to1[2][12] = 1.49049291748967; fWeightMatrix0to1[3][12] = -0.194784223578033; fWeightMatrix0to1[4][12] = -0.0465803816699066; fWeightMatrix0to1[5][12] = 0.31736728718563; fWeightMatrix0to1[6][12] = 3.42288236815238; fWeightMatrix0to1[7][12] = -1.12012435976072; fWeightMatrix0to1[8][12] = 0.14294718931531; fWeightMatrix0to1[9][12] = 0.713597278087468; fWeightMatrix0to1[10][12] = 0.25882369145482; fWeightMatrix0to1[11][12] = -1.00469724859612; fWeightMatrix0to1[12][12] = -1.7494258968914; fWeightMatrix0to1[13][12] = 1.43211463910688; fWeightMatrix0to1[0][13] = 1.78129812106171; fWeightMatrix0to1[1][13] = -1.15821553659715; fWeightMatrix0to1[2][13] = -0.421040263620839; fWeightMatrix0to1[3][13] = 1.35468727883687; fWeightMatrix0to1[4][13] = 0.265845362107413; fWeightMatrix0to1[5][13] = 5.04423372117627; fWeightMatrix0to1[6][13] = -4.87099678618584; fWeightMatrix0to1[7][13] = 11.6011970383281; fWeightMatrix0to1[8][13] = 0.0910406619629793; fWeightMatrix0to1[9][13] = -0.106295494486396; fWeightMatrix0to1[10][13] = -2.74860947159645; fWeightMatrix0to1[11][13] = -0.531609830274449; fWeightMatrix0to1[12][13] = -0.407834511871752; fWeightMatrix0to1[13][13] = -0.712473258051706; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -2.92804404749358; fWeightMatrix1to2[1][0] = -0.236374310404925; fWeightMatrix1to2[2][0] = -0.364761490844343; fWeightMatrix1to2[3][0] = -1.83246951519543; fWeightMatrix1to2[4][0] = -0.971777502868807; fWeightMatrix1to2[5][0] = -1.18709402913202; fWeightMatrix1to2[6][0] = -0.422081930394634; fWeightMatrix1to2[7][0] = -2.50997481144091; fWeightMatrix1to2[8][0] = 0.43351387853797; fWeightMatrix1to2[9][0] = -0.324337094175057; fWeightMatrix1to2[10][0] = -0.960974258317824; fWeightMatrix1to2[11][0] = -0.203722988610531; fWeightMatrix1to2[12][0] = -0.145391549355634; fWeightMatrix1to2[0][1] = 1.10807089990687; fWeightMatrix1to2[1][1] = 0.34980288009676; fWeightMatrix1to2[2][1] = -0.148672797137792; fWeightMatrix1to2[3][1] = -2.44133976949562; fWeightMatrix1to2[4][1] = -0.907828011557765; fWeightMatrix1to2[5][1] = 1.50654360683885; fWeightMatrix1to2[6][1] = 1.42885861477475; fWeightMatrix1to2[7][1] = 0.764031904170815; fWeightMatrix1to2[8][1] = 
0.092883883054202; fWeightMatrix1to2[9][1] = 1.96787666355831; fWeightMatrix1to2[10][1] = -1.59311869467441; fWeightMatrix1to2[11][1] = -1.31335704749004; fWeightMatrix1to2[12][1] = -0.221584952043482; fWeightMatrix1to2[0][2] = -1.4868185219561; fWeightMatrix1to2[1][2] = 0.859065877732048; fWeightMatrix1to2[2][2] = 1.66389226572444; fWeightMatrix1to2[3][2] = -1.1880158216674; fWeightMatrix1to2[4][2] = 0.787199747351992; fWeightMatrix1to2[5][2] = -0.158941500454308; fWeightMatrix1to2[6][2] = -1.71502894465121; fWeightMatrix1to2[7][2] = -1.13453702215312; fWeightMatrix1to2[8][2] = 0.936996845322039; fWeightMatrix1to2[9][2] = -0.0853137020769664; fWeightMatrix1to2[10][2] = 0.920664769898754; fWeightMatrix1to2[11][2] = -1.18623322197322; fWeightMatrix1to2[12][2] = 1.70700768812624; fWeightMatrix1to2[0][3] = -1.36337746166923; fWeightMatrix1to2[1][3] = 0.441917681239279; fWeightMatrix1to2[2][3] = 0.0608031574266282; fWeightMatrix1to2[3][3] = -2.70883418720362; fWeightMatrix1to2[4][3] = -1.26203010729245; fWeightMatrix1to2[5][3] = -2.17684359848637; fWeightMatrix1to2[6][3] = 0.135226612327203; fWeightMatrix1to2[7][3] = -1.86911105128734; fWeightMatrix1to2[8][3] = -1.73772573128386; fWeightMatrix1to2[9][3] = -1.75585403950354; fWeightMatrix1to2[10][3] = 0.474476674695516; fWeightMatrix1to2[11][3] = -1.01473839828486; fWeightMatrix1to2[12][3] = -1.81386045577336; fWeightMatrix1to2[0][4] = 0.884083573727223; fWeightMatrix1to2[1][4] = -1.55023334963667; fWeightMatrix1to2[2][4] = -0.586016390189219; fWeightMatrix1to2[3][4] = 0.206946574149013; fWeightMatrix1to2[4][4] = 0.144253097167291; fWeightMatrix1to2[5][4] = 1.5175383342404; fWeightMatrix1to2[6][4] = 1.21447880704042; fWeightMatrix1to2[7][4] = -1.21096793982619; fWeightMatrix1to2[8][4] = -2.04080720203983; fWeightMatrix1to2[9][4] = -2.20130432985485; fWeightMatrix1to2[10][4] = -1.44450451513293; fWeightMatrix1to2[11][4] = -2.0077368302499; fWeightMatrix1to2[12][4] = -0.661200071632214; fWeightMatrix1to2[0][5] = -0.590894639747337; fWeightMatrix1to2[1][5] = 1.00345824018125; fWeightMatrix1to2[2][5] = -0.708155831339981; fWeightMatrix1to2[3][5] = -3.88439720100192; fWeightMatrix1to2[4][5] = -0.61761456755408; fWeightMatrix1to2[5][5] = -3.20415474512468; fWeightMatrix1to2[6][5] = 0.685647100218492; fWeightMatrix1to2[7][5] = 0.89083170262466; fWeightMatrix1to2[8][5] = -1.45064375839084; fWeightMatrix1to2[9][5] = -1.45795835006626; fWeightMatrix1to2[10][5] = 1.13465702969244; fWeightMatrix1to2[11][5] = 1.16269234896236; fWeightMatrix1to2[12][5] = -0.319108060716613; fWeightMatrix1to2[0][6] = -1.11287073916899; fWeightMatrix1to2[1][6] = -0.890997686713517; fWeightMatrix1to2[2][6] = 0.790197065373091; fWeightMatrix1to2[3][6] = 3.54890247797312; fWeightMatrix1to2[4][6] = 0.697845273825226; fWeightMatrix1to2[5][6] = -0.905001303444611; fWeightMatrix1to2[6][6] = -0.532660068812292; fWeightMatrix1to2[7][6] = -0.758205592843213; fWeightMatrix1to2[8][6] = -1.82351824055827; fWeightMatrix1to2[9][6] = -1.71194973139813; fWeightMatrix1to2[10][6] = -3.04817833070521; fWeightMatrix1to2[11][6] = -1.63197430511409; fWeightMatrix1to2[12][6] = -2.68773951074219; fWeightMatrix1to2[0][7] = 1.82434322338214; fWeightMatrix1to2[1][7] = -2.45521391739646; fWeightMatrix1to2[2][7] = 1.32086004537677; fWeightMatrix1to2[3][7] = -7.61165967843263; fWeightMatrix1to2[4][7] = 0.989217393799674; fWeightMatrix1to2[5][7] = -2.0003130827827; fWeightMatrix1to2[6][7] = 0.682634153811315; fWeightMatrix1to2[7][7] = 0.272154693134497; fWeightMatrix1to2[8][7] = -2.33628259269072; 
fWeightMatrix1to2[9][7] = 0.0378079622216236; fWeightMatrix1to2[10][7] = 1.0000055440371; fWeightMatrix1to2[11][7] = -2.05253961532986; fWeightMatrix1to2[12][7] = -2.43450354616031; fWeightMatrix1to2[0][8] = -1.73733279344608; fWeightMatrix1to2[1][8] = -0.89520450512858; fWeightMatrix1to2[2][8] = 1.78704749974873; fWeightMatrix1to2[3][8] = -1.74895509580211; fWeightMatrix1to2[4][8] = 0.316783826861684; fWeightMatrix1to2[5][8] = -1.13100533606827; fWeightMatrix1to2[6][8] = -1.08767678764262; fWeightMatrix1to2[7][8] = -1.72722440593941; fWeightMatrix1to2[8][8] = -1.07892985675957; fWeightMatrix1to2[9][8] = -1.44200664517859; fWeightMatrix1to2[10][8] = 0.802233085856469; fWeightMatrix1to2[11][8] = -1.28628454800698; fWeightMatrix1to2[12][8] = -1.26962847953934; fWeightMatrix1to2[0][9] = 1.57921972290466; fWeightMatrix1to2[1][9] = -0.0514471494247006; fWeightMatrix1to2[2][9] = -0.493092308849196; fWeightMatrix1to2[3][9] = 1.4037234566834; fWeightMatrix1to2[4][9] = -1.79458408158199; fWeightMatrix1to2[5][9] = 1.14836665312587; fWeightMatrix1to2[6][9] = 1.01442811784496; fWeightMatrix1to2[7][9] = -0.995662696394587; fWeightMatrix1to2[8][9] = -1.15126062690944; fWeightMatrix1to2[9][9] = -0.142225385161192; fWeightMatrix1to2[10][9] = -0.820451091196149; fWeightMatrix1to2[11][9] = 0.486771287618096; fWeightMatrix1to2[12][9] = 1.85735589266022; fWeightMatrix1to2[0][10] = 0.484684081842917; fWeightMatrix1to2[1][10] = 0.810448264807002; fWeightMatrix1to2[2][10] = -2.00937062663011; fWeightMatrix1to2[3][10] = 3.43359978981522; fWeightMatrix1to2[4][10] = -1.95347510914723; fWeightMatrix1to2[5][10] = -0.521327932627732; fWeightMatrix1to2[6][10] = -1.95994872835744; fWeightMatrix1to2[7][10] = -1.83176181287732; fWeightMatrix1to2[8][10] = -0.745715801480469; fWeightMatrix1to2[9][10] = 0.171874989162752; fWeightMatrix1to2[10][10] = -0.273936518867073; fWeightMatrix1to2[11][10] = 0.884399089005478; fWeightMatrix1to2[12][10] = -0.739920565759247; fWeightMatrix1to2[0][11] = -1.00084514303694; fWeightMatrix1to2[1][11] = -2.09569445808595; fWeightMatrix1to2[2][11] = -0.724882902279395; fWeightMatrix1to2[3][11] = 2.11850682718183; fWeightMatrix1to2[4][11] = -1.87045471348315; fWeightMatrix1to2[5][11] = -0.975211816665716; fWeightMatrix1to2[6][11] = -1.98514385113776; fWeightMatrix1to2[7][11] = -1.0376631604327; fWeightMatrix1to2[8][11] = -1.3823515493906; fWeightMatrix1to2[9][11] = -0.893506174772181; fWeightMatrix1to2[10][11] = -1.49215072147283; fWeightMatrix1to2[11][11] = 1.04238321929712; fWeightMatrix1to2[12][11] = -0.269684649891064; fWeightMatrix1to2[0][12] = -0.143988474316619; fWeightMatrix1to2[1][12] = -0.481227440661623; fWeightMatrix1to2[2][12] = -1.19700892162344; fWeightMatrix1to2[3][12] = -1.55847781894402; fWeightMatrix1to2[4][12] = -1.74424893514536; fWeightMatrix1to2[5][12] = 0.0561769059889694; fWeightMatrix1to2[6][12] = -0.911937968346426; fWeightMatrix1to2[7][12] = -1.11744360335239; fWeightMatrix1to2[8][12] = 0.189959365707129; fWeightMatrix1to2[9][12] = -2.21981257909933; fWeightMatrix1to2[10][12] = -0.68227901421009; fWeightMatrix1to2[11][12] = -1.93428312793158; fWeightMatrix1to2[12][12] = 0.439765750354457; fWeightMatrix1to2[0][13] = -1.6120834463365; fWeightMatrix1to2[1][13] = -2.19483521933578; fWeightMatrix1to2[2][13] = -0.534544523433113; fWeightMatrix1to2[3][13] = -1.15131573857447; fWeightMatrix1to2[4][13] = -1.64613721695555; fWeightMatrix1to2[5][13] = 0.751459668332294; fWeightMatrix1to2[6][13] = -1.21653942064809; fWeightMatrix1to2[7][13] = -1.19682106116584; 
   fWeightMatrix1to2[8][13] = 1.66844975252422;
   fWeightMatrix1to2[9][13] = 0.132460936608373;
   fWeightMatrix1to2[10][13] = 0.66998730956645;
   fWeightMatrix1to2[11][13] = -0.0356297069945564;
   fWeightMatrix1to2[12][13] = 0.841772184175837;
   fWeightMatrix1to2[0][14] = 0.767111046658646;
   fWeightMatrix1to2[1][14] = -0.844951305544016;
   fWeightMatrix1to2[2][14] = -1.79570079885625;
   fWeightMatrix1to2[3][14] = -1.16700437810679;
   fWeightMatrix1to2[4][14] = -2.01349948642154;
   fWeightMatrix1to2[5][14] = -0.596191798378007;
   fWeightMatrix1to2[6][14] = 0.429085363084945;
   fWeightMatrix1to2[7][14] = -0.464779659898447;
   fWeightMatrix1to2[8][14] = -0.0354694218579089;
   fWeightMatrix1to2[9][14] = -0.237033198321974;
   fWeightMatrix1to2[10][14] = 1.21059803460927;
   fWeightMatrix1to2[11][14] = -1.42349119235359;
   fWeightMatrix1to2[12][14] = -0.0451333984189299;

   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = 0.50633428608185;
   fWeightMatrix2to3[0][1] = -0.292277882221236;
   fWeightMatrix2to3[0][2] = 0.11588782411623;
   fWeightMatrix2to3[0][3] = -0.827764817358601;
   fWeightMatrix2to3[0][4] = 0.529774318013775;
   fWeightMatrix2to3[0][5] = -0.744043648790333;
   fWeightMatrix2to3[0][6] = 0.334432659091631;
   fWeightMatrix2to3[0][7] = 1.43888018731392;
   fWeightMatrix2to3[0][8] = -0.582566375131637;
   fWeightMatrix2to3[0][9] = 0.6490121498316;
   fWeightMatrix2to3[0][10] = -0.00010004010477239;
   fWeightMatrix2to3[0][11] = 1.01411095417848;
   fWeightMatrix2to3[0][12] = -0.601399311690876;
   fWeightMatrix2to3[0][13] = 0.745986741149893;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l