// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6 [198662]
ROOT Release   : 5.10/00 [330240]
Creator        : thompson
Date           : Tue Jan 22 16:28:19 2008
Host           : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events: 21736

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]

# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW        LRHWW        'F' [0,1]
LRWW         LRWW         'F' [0,1]
LRWg         LRWg         'F' [0,0.999932587147]
LRWj         LRWj         'F' [-0.00544115481898,1]
LRZZ         LRZZ         'F' [0,1]
Met          Met          'F' [15.0806331635,275.269592285]
MetDelPhi    MetDelPhi    'F' [0.120642267168,3.12748098373]
MetSpec      MetSpec      'F' [15.0024642944,178.184524536]
dPhiLeptons  dPhiLeptons  'F' [0.00023341178894,3.14049983025]
dRLeptons    dRLeptons    'F' [0.403980851173,4.5256857872]
dimass       dimass       'F' [16.0008029938,557.91784668]

============================================================================ */

// NOTE: the include targets and the std::vector template arguments below were
// lost in extraction; they are restored here from how the code uses them
// (string names in the constructor, double values everywhere else), matching
// the standard TMVA-generated reader.
#include <vector>
#include <string>
#include <iostream>
#include <cmath>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ",
                                  "Met", "MetDelPhi", "MetSpec",
                                  "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] <<
 std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;
      fVmax[0] = 1;
      fVmin[1] = 0;
      fVmax[1] = 1;
      fVmin[2] = 0;
      fVmax[2] = 0.999932587146759;
      fVmin[3] = -0.00544115481898189;
      fVmax[3] = 1;
      fVmin[4] = 0;
      fVmax[4] = 1;
      fVmin[5] = 15.0806331634521;
      fVmax[5] = 275.269592285156;
      fVmin[6] = 0.120642267167568;
      fVmax[6] = 3.12748098373413;
      fVmin[7] = 15.0024642944336;
      fVmax[7] = 178.184524536133;
      fVmin[8] = 0.00023341178894043;
      fVmax[8] = 3.14049983024597;
      fVmin[9] = 0.403980851173401;
      fVmax[9] = 4.52568578720093;
      fVmin[10] = 16.0008029937744;
      fVmax[10] = 557.917846679688;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return
 fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }
   double fVmin[11];
   double fVmax[11];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
   fLayerSize[2] = 12; fWeights[2] = new double[12];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];

   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = 1.78587534612727;
   fWeightMatrix0to1[1][0] = 1.39808145070985;
   fWeightMatrix0to1[2][0] = -0.513179585912009;
   fWeightMatrix0to1[3][0] = 1.51208143475257;
   fWeightMatrix0to1[4][0] = 0.852583705928061;
   fWeightMatrix0to1[5][0] = 3.58016453718151;
   fWeightMatrix0to1[6][0] = 1.86807322679552;
   fWeightMatrix0to1[7][0] = -0.389771790886047;
   fWeightMatrix0to1[8][0] = 0.898411025311535;
   fWeightMatrix0to1[9][0] = 1.52615221570026;
   fWeightMatrix0to1[10][0] = 0.322598798463723;
   fWeightMatrix0to1[11][0] = 0.127620930661387;
   fWeightMatrix0to1[0][1] = 1.61856300064746;
   fWeightMatrix0to1[1][1] = 1.2508004137362;
   fWeightMatrix0to1[2][1] = 1.14555273475568;
   fWeightMatrix0to1[3][1] = -2.38685520498847;
   fWeightMatrix0to1[4][1] = 0.990120560587827;
   fWeightMatrix0to1[5][1] = 0.0582594506025728;
   fWeightMatrix0to1[6][1] = -1.67994005255503;
   fWeightMatrix0to1[7][1] = 1.57777338515478;
   fWeightMatrix0to1[8][1] = 2.47311490049152;
   fWeightMatrix0to1[9][1] = 1.26399627703814;
   fWeightMatrix0to1[10][1] = 0.546402453973295;
   fWeightMatrix0to1[11][1] = -0.795597747290522;
   fWeightMatrix0to1[0][2] = 0.560657199996271;
   fWeightMatrix0to1[1][2] = 0.220959366771511;
   fWeightMatrix0to1[2][2] = 0.00793419741458121;
   fWeightMatrix0to1[3][2] = -1.42763407243903;
   fWeightMatrix0to1[4][2] = 1.90474548272643;
   fWeightMatrix0to1[5][2] = 0.0110947021589814;
   fWeightMatrix0to1[6][2] = -0.0703674278991695;
   fWeightMatrix0to1[7][2] = -0.952902179241417;
   fWeightMatrix0to1[8][2] = -0.281751122975828;
   fWeightMatrix0to1[9][2] = -0.579020532914903;
   fWeightMatrix0to1[10][2] = -0.44121866023538;
   fWeightMatrix0to1[11][2] = 0.726954566976275;
   fWeightMatrix0to1[0][3] = -0.0781702638272787;
   fWeightMatrix0to1[1][3] = 0.562114095248046;
   fWeightMatrix0to1[2][3] = 1.29415904367419;
   fWeightMatrix0to1[3][3] = 1.40431433542528;
   fWeightMatrix0to1[4][3] = -0.918257353806234;
   fWeightMatrix0to1[5][3] = -0.334981201005296;
   fWeightMatrix0to1[6][3] = -0.726115191597661;
   fWeightMatrix0to1[7][3] = -1.62451910288858;
   fWeightMatrix0to1[8][3] = -0.505128752599966;
   fWeightMatrix0to1[9][3] = -1.28850615211903;
   fWeightMatrix0to1[10][3] = -0.0900617620202149;
   fWeightMatrix0to1[11][3] = 0.797913880774829;
   fWeightMatrix0to1[0][4] = 0.140304025790977;
   fWeightMatrix0to1[1][4] = 0.0501130209250728;
   fWeightMatrix0to1[2][4] = 1.34292032114309;
   fWeightMatrix0to1[3][4] = -1.0245532685625;
   fWeightMatrix0to1[4][4] = 0.0491787651146999;
   fWeightMatrix0to1[5][4] = -0.851862601083288;
   fWeightMatrix0to1[6][4] = -0.583816660592497;
   fWeightMatrix0to1[7][4] = 0.165750825797612;
   fWeightMatrix0to1[8][4] = -0.326577672160567;
   fWeightMatrix0to1[9][4] = 0.566084716959858;
   fWeightMatrix0to1[10][4] = -0.258322242810502;
   fWeightMatrix0to1[11][4] = -0.249932215644494;
   fWeightMatrix0to1[0][5] = -0.919309862678924;
   fWeightMatrix0to1[1][5] = -0.00733193235709649;
   fWeightMatrix0to1[2][5] =
 -1.65992430883959;
   fWeightMatrix0to1[3][5] = 0.391554090482956;
   fWeightMatrix0to1[4][5] = 1.02445360375544;
   fWeightMatrix0to1[5][5] = -1.69251980120263;
   fWeightMatrix0to1[6][5] = 1.85658327465462;
   fWeightMatrix0to1[7][5] = -0.535519946229393;
   fWeightMatrix0to1[8][5] = -0.254714473505672;
   fWeightMatrix0to1[9][5] = 0.545690135962659;
   fWeightMatrix0to1[10][5] = 0.384433903533899;
   fWeightMatrix0to1[11][5] = -1.05937703553016;
   fWeightMatrix0to1[0][6] = -1.03923798976492;
   fWeightMatrix0to1[1][6] = -1.15543851270292;
   fWeightMatrix0to1[2][6] = 1.01609050199625;
   fWeightMatrix0to1[3][6] = -0.832481000803386;
   fWeightMatrix0to1[4][6] = -0.0854564786520992;
   fWeightMatrix0to1[5][6] = -1.1830354136183;
   fWeightMatrix0to1[6][6] = -1.94087855410286;
   fWeightMatrix0to1[7][6] = 1.31281697005951;
   fWeightMatrix0to1[8][6] = 0.253422535589372;
   fWeightMatrix0to1[9][6] = 0.734511619609631;
   fWeightMatrix0to1[10][6] = 2.28400481988691;
   fWeightMatrix0to1[11][6] = 1.01609987470097;
   fWeightMatrix0to1[0][7] = -0.386018625385843;
   fWeightMatrix0to1[1][7] = 0.631857079168355;
   fWeightMatrix0to1[2][7] = 2.3335246906805;
   fWeightMatrix0to1[3][7] = -0.762181312909539;
   fWeightMatrix0to1[4][7] = 1.37753628973764;
   fWeightMatrix0to1[5][7] = -0.382022125376427;
   fWeightMatrix0to1[6][7] = -0.224536515622063;
   fWeightMatrix0to1[7][7] = -0.531158646382675;
   fWeightMatrix0to1[8][7] = 0.189572904496545;
   fWeightMatrix0to1[9][7] = 1.61993039856253;
   fWeightMatrix0to1[10][7] = -0.470629983218253;
   fWeightMatrix0to1[11][7] = 2.35309630899695;
   fWeightMatrix0to1[0][8] = -0.615413764907922;
   fWeightMatrix0to1[1][8] = -1.44533908014094;
   fWeightMatrix0to1[2][8] = 2.28539625722137;
   fWeightMatrix0to1[3][8] = -2.07463558144693;
   fWeightMatrix0to1[4][8] = 0.0668552595596359;
   fWeightMatrix0to1[5][8] = -0.232390485548503;
   fWeightMatrix0to1[6][8] = 0.116822018085899;
   fWeightMatrix0to1[7][8] = -0.220023192603143;
   fWeightMatrix0to1[8][8] = 1.92072918570465;
   fWeightMatrix0to1[9][8] = -0.890910384562118;
   fWeightMatrix0to1[10][8] =
 1.01127565252843;
   fWeightMatrix0to1[11][8] = -1.20895128744163;
   fWeightMatrix0to1[0][9] = -0.473725172165015;
   fWeightMatrix0to1[1][9] = -1.91107762463087;
   fWeightMatrix0to1[2][9] = 1.9508689425643;
   fWeightMatrix0to1[3][9] = -1.20472049190951;
   fWeightMatrix0to1[4][9] = -0.299178271143344;
   fWeightMatrix0to1[5][9] = 0.191574911478058;
   fWeightMatrix0to1[6][9] = -1.04047105357132;
   fWeightMatrix0to1[7][9] = 0.917274884257074;
   fWeightMatrix0to1[8][9] = -1.17343539814481;
   fWeightMatrix0to1[9][9] = -2.07900089914366;
   fWeightMatrix0to1[10][9] = -2.32813346815752;
   fWeightMatrix0to1[11][9] = -0.0510817659041139;
   fWeightMatrix0to1[0][10] = -0.907250773369012;
   fWeightMatrix0to1[1][10] = 1.03489314876049;
   fWeightMatrix0to1[2][10] = 1.54923424230854;
   fWeightMatrix0to1[3][10] = -2.16349497442715;
   fWeightMatrix0to1[4][10] = 0.833086179383921;
   fWeightMatrix0to1[5][10] = -1.54394960967983;
   fWeightMatrix0to1[6][10] = -2.64614183107923;
   fWeightMatrix0to1[7][10] = 0.795304628602647;
   fWeightMatrix0to1[8][10] = -0.448364610399188;
   fWeightMatrix0to1[9][10] = -1.02978250736524;
   fWeightMatrix0to1[10][10] = -1.28196360338961;
   fWeightMatrix0to1[11][10] = -1.62943924681877;
   fWeightMatrix0to1[0][11] = -0.852252643139408;
   fWeightMatrix0to1[1][11] = -0.910045488488091;
   fWeightMatrix0to1[2][11] = -0.125972272991876;
   fWeightMatrix0to1[3][11] = 1.08824647278734;
   fWeightMatrix0to1[4][11] = 1.23116163383048;
   fWeightMatrix0to1[5][11] = 0.903224012560885;
   fWeightMatrix0to1[6][11] = -1.68357509248026;
   fWeightMatrix0to1[7][11] = -1.17178808336037;
   fWeightMatrix0to1[8][11] = 2.70979791194146;
   fWeightMatrix0to1[9][11] = -1.69153314488656;
   fWeightMatrix0to1[10][11] = 0.0393515173545834;
   fWeightMatrix0to1[11][11] = -0.243822819667016;

   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -1.48026923851191;
   fWeightMatrix1to2[1][0] = -0.0529813420644063;
   fWeightMatrix1to2[2][0] = -0.943300338155797;
   fWeightMatrix1to2[3][0] = -0.796627211852415;
   fWeightMatrix1to2[4][0] = -2.0501407612521;
   fWeightMatrix1to2[5][0] = -0.207798734698458;
   fWeightMatrix1to2[6][0] = 1.61033349643164;
   fWeightMatrix1to2[7][0] = 1.34110048241957;
   fWeightMatrix1to2[8][0] = 1.45759366425646;
   fWeightMatrix1to2[9][0] = -2.69840357644617;
   fWeightMatrix1to2[10][0] = -0.660030086281013;
   fWeightMatrix1to2[0][1] = 0.849091626449971;
   fWeightMatrix1to2[1][1] = -1.6200058312565;
   fWeightMatrix1to2[2][1] = -0.995015031428817;
   fWeightMatrix1to2[3][1] = -0.0705320881541544;
   fWeightMatrix1to2[4][1] = 1.20748490197218;
   fWeightMatrix1to2[5][1] = -1.96902592663284;
   fWeightMatrix1to2[6][1] = -0.797408582652731;
   fWeightMatrix1to2[7][1] = -0.0093701446065152;
   fWeightMatrix1to2[8][1] = -1.41496412565125;
   fWeightMatrix1to2[9][1] = -1.57877322798366;
   fWeightMatrix1to2[10][1] = -0.248218081088283;
   fWeightMatrix1to2[0][2] = 0.549614170893893;
   fWeightMatrix1to2[1][2] = -0.268686155463622;
   fWeightMatrix1to2[2][2] = -1.60859816122785;
   fWeightMatrix1to2[3][2] = -0.20866214986771;
   fWeightMatrix1to2[4][2] = 1.43268513636993;
   fWeightMatrix1to2[5][2] = -0.933501161014245;
   fWeightMatrix1to2[6][2] = 0.893446649415195;
   fWeightMatrix1to2[7][2] = -2.07996145299459;
   fWeightMatrix1to2[8][2] = 0.387455340748484;
   fWeightMatrix1to2[9][2] = 0.0929876041234388;
   fWeightMatrix1to2[10][2] = -0.554585190889391;
   fWeightMatrix1to2[0][3] = -1.34809636919847;
   fWeightMatrix1to2[1][3] = -0.034265775111514;
   fWeightMatrix1to2[2][3] = 1.05458532289823;
   fWeightMatrix1to2[3][3] = -2.0543241753738;
   fWeightMatrix1to2[4][3] = 0.495673335453956;
   fWeightMatrix1to2[5][3] = -0.0657407558139899;
   fWeightMatrix1to2[6][3] = 1.26690693529904;
   fWeightMatrix1to2[7][3] = 0.0781644158500635;
   fWeightMatrix1to2[8][3] = -1.59460929823312;
   fWeightMatrix1to2[9][3] = -1.92717957764898;
   fWeightMatrix1to2[10][3] = -0.445402471283728;
   fWeightMatrix1to2[0][4] = 0.462919883497094;
   fWeightMatrix1to2[1][4] = 0.152810776001547;
   fWeightMatrix1to2[2][4] = 0.483912935295995;
   fWeightMatrix1to2[3][4] = -0.989908211759483;
   fWeightMatrix1to2[4][4] = 0.582540962275411;
   fWeightMatrix1to2[5][4] = 0.877778306343094;
   fWeightMatrix1to2[6][4] = -1.26169810077222;
   fWeightMatrix1to2[7][4] = -0.0545135222676675;
   fWeightMatrix1to2[8][4] = 0.848820137881586;
   fWeightMatrix1to2[9][4] = -0.138620197543513;
   fWeightMatrix1to2[10][4] = 0.112296278822955;
   fWeightMatrix1to2[0][5] = -0.498647951904513;
   fWeightMatrix1to2[1][5] = -0.0257218937422137;
   fWeightMatrix1to2[2][5] = -0.559753622920485;
   fWeightMatrix1to2[3][5] = -0.468688627169964;
   fWeightMatrix1to2[4][5] = -2.22051306476539;
   fWeightMatrix1to2[5][5] = -1.51592431523265;
   fWeightMatrix1to2[6][5] = -0.958561972129505;
   fWeightMatrix1to2[7][5] = 0.0830162284400266;
   fWeightMatrix1to2[8][5] = -2.24225987513636;
   fWeightMatrix1to2[9][5] = 1.67235177504493;
   fWeightMatrix1to2[10][5] = -0.871511968024477;
   fWeightMatrix1to2[0][6] = -1.25460203276195;
   fWeightMatrix1to2[1][6] = 0.975068264252646;
   fWeightMatrix1to2[2][6] = -1.24903629478271;
   fWeightMatrix1to2[3][6] = -1.88148350000168;
   fWeightMatrix1to2[4][6] = 0.689157916980315;
   fWeightMatrix1to2[5][6] = -1.52701382785529;
   fWeightMatrix1to2[6][6] = 0.317619545721224;
   fWeightMatrix1to2[7][6] = -2.6625109491886;
   fWeightMatrix1to2[8][6] = -1.08539330939571;
   fWeightMatrix1to2[9][6] = 0.959177358433797;
   fWeightMatrix1to2[10][6] = -2.99439722025558;
   fWeightMatrix1to2[0][7] = 1.09454927406465;
   fWeightMatrix1to2[1][7] = 0.18955193220264;
   fWeightMatrix1to2[2][7] = -1.19996612922173;
   fWeightMatrix1to2[3][7] = -1.15339771558486;
   fWeightMatrix1to2[4][7] = -1.27439268733743;
   fWeightMatrix1to2[5][7] = -0.137688848406843;
   fWeightMatrix1to2[6][7] = -0.76806148413452;
   fWeightMatrix1to2[7][7] = -2.02364203675715;
   fWeightMatrix1to2[8][7] = -0.0879662319974191;
   fWeightMatrix1to2[9][7] = -1.13775288027437;
   fWeightMatrix1to2[10][7] = 0.998469398041856;
   fWeightMatrix1to2[0][8] = -1.54109101840862;
   fWeightMatrix1to2[1][8] = -0.508984415664647;
   fWeightMatrix1to2[2][8] = 0.400582739352957;
   fWeightMatrix1to2[3][8] = 0.795745701566909;
   fWeightMatrix1to2[4][8] = -1.50437710176079;
   fWeightMatrix1to2[5][8] = 1.99711049557582;
   fWeightMatrix1to2[6][8] = -2.20585608471663;
   fWeightMatrix1to2[7][8] = -0.0896706657101816;
   fWeightMatrix1to2[8][8] = -0.523564200788631;
   fWeightMatrix1to2[9][8] = -2.38885280618939;
   fWeightMatrix1to2[10][8] = 1.21191030190497;
   fWeightMatrix1to2[0][9] = -0.308310305408622;
   fWeightMatrix1to2[1][9] = 1.04679600917199;
   fWeightMatrix1to2[2][9] = 1.40718191635588;
   fWeightMatrix1to2[3][9] = 0.113889631553552;
   fWeightMatrix1to2[4][9] = 0.956660537509395;
   fWeightMatrix1to2[5][9] = -0.886054403138063;
   fWeightMatrix1to2[6][9] = -1.3115949243087;
   fWeightMatrix1to2[7][9] = -0.611935635003494;
   fWeightMatrix1to2[8][9] = -0.708720173755782;
   fWeightMatrix1to2[9][9] = -1.33232664693101;
   fWeightMatrix1to2[10][9] = -0.847441703703753;
   fWeightMatrix1to2[0][10] = 0.898271163901063;
   fWeightMatrix1to2[1][10] = -0.920411560706986;
   fWeightMatrix1to2[2][10] = -2.02082538524965;
   fWeightMatrix1to2[3][10] = 1.44984400636241;
   fWeightMatrix1to2[4][10] = -1.56425523958852;
   fWeightMatrix1to2[5][10] = -0.811266870714434;
   fWeightMatrix1to2[6][10] = -1.42802833150812;
   fWeightMatrix1to2[7][10] = 0.1706406075782;
   fWeightMatrix1to2[8][10] = -0.405208407769471;
   fWeightMatrix1to2[9][10] = -0.346776591540579;
   fWeightMatrix1to2[10][10] = -1.13618370316389;
   fWeightMatrix1to2[0][11] = 0.705158127683871;
   fWeightMatrix1to2[1][11] = -0.718228474205091;
   fWeightMatrix1to2[2][11] = 0.572858698481044;
   fWeightMatrix1to2[3][11] = -1.20573270084937;
   fWeightMatrix1to2[4][11] = -0.730375891455865;
   fWeightMatrix1to2[5][11] = -1.14523010147381;
   fWeightMatrix1to2[6][11] = 0.993727397564879;
   fWeightMatrix1to2[7][11] = 0.899136435062973;
   fWeightMatrix1to2[8][11] = -0.307030050489899;
   fWeightMatrix1to2[9][11] = 1.19597724649522;
   fWeightMatrix1to2[10][11] = 0.593180252471612;
   fWeightMatrix1to2[0][12] = -0.136776830603561;
   fWeightMatrix1to2[1][12] = -0.320854487333603;
   fWeightMatrix1to2[2][12] = -0.723623926400946;
   fWeightMatrix1to2[3][12] = -0.408491576026416;
   fWeightMatrix1to2[4][12]
 = 1.50427613404371;
   fWeightMatrix1to2[5][12] = 1.77928847139841;
   fWeightMatrix1to2[6][12] = -2.03511534846127;
   fWeightMatrix1to2[7][12] = -2.28394105046309;
   fWeightMatrix1to2[8][12] = 0.102313905802244;
   fWeightMatrix1to2[9][12] = -0.173623877480985;
   fWeightMatrix1to2[10][12] = 1.28823378545119;

   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.0541489931692151;
   fWeightMatrix2to3[0][1] = -0.187997845817641;
   fWeightMatrix2to3[0][2] = -1.1600337343807;
   fWeightMatrix2to3[0][3] = -0.436593475003022;
   fWeightMatrix2to3[0][4] = 0.03912419657247;
   fWeightMatrix2to3[0][5] = -0.71507457235993;
   fWeightMatrix2to3[0][6] = -1.44341444221455;
   fWeightMatrix2to3[0][7] = 0.0144030382603487;
   fWeightMatrix2to3[0][8] = 0.679143763588042;
   fWeightMatrix2to3[0][9] = 1.18425831428716;
   fWeightMatrix2to3[0][10] = -0.437568804092269;
   fWeightMatrix2to3[0][11] = 1.01477392138237;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   // NOTE: the generated file was truncated at this point. The remainder of
   // this function, ActivationFnc() and Clear() below are reconstructed to
   // follow the standard TMVA 3.8.x MLP code-generator output (zero the
   // layers, set the bias nodes, feed the inputs forward through sigmoid
   // hidden layers to a linear output node); treat them as a best-effort
   // restoration, not verbatim generator output.
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i] = 0;

   // bias nodes
   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1] = 1;

   // input layer
   for (int i=0; i<fLayerSize[0]-1; i++)
      fWeights[0][i] = inputValues[i];

   // layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++)
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++)
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // layer 2 to 3 (output node, no activation)
   for (int o=0; o<fLayerSize[3]; o++) {
      for (int i=0; i<fLayerSize[2]; i++)
         fWeights[3][o] += fWeightMatrix2to3[o][i] * fWeights[2][i];
   }

   return fWeights[3][0];
}

inline double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   // sigmoid (NeuronType: "sigmoid")
   return 1.0/(1.0+exp(-x));
}

inline void ReadH6AONN5MEMLP::Clear()
{
   // delete the network layer buffers allocated in Initialize()
   for (int l=0; l<fLayers; l++) delete[] fWeights[l];
}