// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6  [198662]
ROOT Release   : 5.10/00  [330240]
Creator        : thompson
Date           : Tue Jan 22 16:28:18 2008
Host           : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events: 21486

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [If True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with back-propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epoch]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables
*-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW        LRHWW        'F' [0,1]
LRWW         LRWW         'F' [0,1]
LRWg         LRWg         'F' [0,0.99990350008]
LRWj         LRWj         'F' [-0.00370291154832,1]
LRZZ         LRZZ         'F' [0,1]
Met          Met          'F' [15.1914625168,199.549758911]
MetDelPhi    MetDelPhi    'F' [0.143834546208,3.13655138016]
MetSpec      MetSpec      'F' [15.0028028488,188.142364502]
dPhiLeptons  dPhiLeptons  'F' [5.24520874023e-06,3.14073061943]
dRLeptons    dRLeptons    'F' [0.401803344488,4.52171230316]
dimass       dimass       'F' [16.0141124725,606.535461426]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ", "Met",
                                  "MetDelPhi", "MetSpec", "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] <<
std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;
      fVmax[0] = 1;
      fVmin[1] = 0;
      fVmax[1] = 1;
      fVmin[2] = 0;
      fVmax[2] = 0.999903500080109;
      fVmin[3] = -0.00370291154831648;
      fVmax[3] = 1;
      fVmin[4] = 0;
      fVmax[4] = 1;
      fVmin[5] = 15.1914625167847;
      fVmax[5] = 199.549758911133;
      fVmin[6] = 0.143834546208382;
      fVmax[6] = 3.13655138015747;
      fVmin[7] = 15.0028028488159;
      fVmax[7] = 188.142364501953;
      fVmin[8] = 5.24520874023438e-06;
      fVmax[8] = 3.14073061943054;
      fVmin[9] = 0.401803344488144;
      fVmax[9] = 4.52171230316162;
      fVmin[10] = 16.0141124725342;
      fVmax[10] = 606.535461425781;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return
fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }

   double fVmin[11];
   double fVmax[11];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
   fLayerSize[2] = 12; fWeights[2] = new double[12];
   fLayerSize[3] = 1; fWeights[3] = new double[1];

   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = 1.60202488528879;
   fWeightMatrix0to1[1][0] = 1.37153998484192;
   fWeightMatrix0to1[2][0] = -0.206602387506984;
   fWeightMatrix0to1[3][0] = 1.02222189561983;
   fWeightMatrix0to1[4][0] = 0.180302819878266;
   fWeightMatrix0to1[5][0] = 2.63901280369413;
   fWeightMatrix0to1[6][0] = 1.99335129679105;
   fWeightMatrix0to1[7][0] = -0.238019920693244;
   fWeightMatrix0to1[8][0] = 1.11009860136848;
   fWeightMatrix0to1[9][0] = 1.22138782391316;
   fWeightMatrix0to1[10][0] = 1.29222543020177;
   fWeightMatrix0to1[11][0] = 1.36857024149885;
   fWeightMatrix0to1[0][1] = 1.1145825194367;
   fWeightMatrix0to1[1][1] = 1.0733177656768;
   fWeightMatrix0to1[2][1] = 1.10589906073832;
   fWeightMatrix0to1[3][1] = -1.90616314833042;
   fWeightMatrix0to1[4][1] = 1.46831211248683;
   fWeightMatrix0to1[5][1] = 0.172627210504629;
   fWeightMatrix0to1[6][1] = -1.04551669163545;
   fWeightMatrix0to1[7][1] = 1.74464893067914;
   fWeightMatrix0to1[8][1] = 2.44093728618836;
   fWeightMatrix0to1[9][1] = 1.32812670558431;
   fWeightMatrix0to1[10][1] = 0.217950873886736;
   fWeightMatrix0to1[11][1] = -1.01988316747294;
   fWeightMatrix0to1[0][2] = 0.462182508480596;
   fWeightMatrix0to1[1][2] = -0.0911553940070854;
   fWeightMatrix0to1[2][2] = 0.0172475116367192;
   fWeightMatrix0to1[3][2] = -1.70297230257398;
   fWeightMatrix0to1[4][2] = 1.56966377775935;
   fWeightMatrix0to1[5][2] = -0.222950695769698;
   fWeightMatrix0to1[6][2] = -0.261750333717998;
   fWeightMatrix0to1[7][2] = -0.72417013271158;
   fWeightMatrix0to1[8][2] = -0.185920353174587;
   fWeightMatrix0to1[9][2] = -0.547390170230584;
   fWeightMatrix0to1[10][2] = -0.235318779261035;
   fWeightMatrix0to1[11][2] = 1.47011628968466;
   fWeightMatrix0to1[0][3] = 0.567497996599294;
   fWeightMatrix0to1[1][3] = 0.585563715120258;
   fWeightMatrix0to1[2][3] = 1.23524070652342;
   fWeightMatrix0to1[3][3] = 1.42117741941135;
   fWeightMatrix0to1[4][3] = -1.05771471533766;
   fWeightMatrix0to1[5][3] = -0.909036471216505;
   fWeightMatrix0to1[6][3] = -0.238720981261707;
   fWeightMatrix0to1[7][3] = -1.92510652857268;
   fWeightMatrix0to1[8][3] = -0.449129968892899;
   fWeightMatrix0to1[9][3] = -1.5701593618778;
   fWeightMatrix0to1[10][3] = -0.118006795930995;
   fWeightMatrix0to1[11][3] = 1.16357215916554;
   fWeightMatrix0to1[0][4] = 0.695357452802014;
   fWeightMatrix0to1[1][4] = -0.678241281772734;
   fWeightMatrix0to1[2][4] = 1.0417587076114;
   fWeightMatrix0to1[3][4] = -1.50402128898845;
   fWeightMatrix0to1[4][4] = 0.0959260538259567;
   fWeightMatrix0to1[5][4] = -0.868648984458747;
   fWeightMatrix0to1[6][4] = -0.98534717215807;
   fWeightMatrix0to1[7][4] = 0.0344252976919689;
   fWeightMatrix0to1[8][4] = -0.612103725406909;
   fWeightMatrix0to1[9][4] = 0.508777297604143;
   fWeightMatrix0to1[10][4] = 0.0769654690977995;
   fWeightMatrix0to1[11][4] = -0.331673337041849;
   fWeightMatrix0to1[0][5] = -1.07872025131487;
   fWeightMatrix0to1[1][5] = 0.531712813988417;
   fWeightMatrix0to1[2][5] = -2.08357082233969;
   fWeightMatrix0to1[3][5] = 0.511077794816647;
   fWeightMatrix0to1[4][5] = 0.730591326718377;
   fWeightMatrix0to1[5][5] = -2.11900929804547;
   fWeightMatrix0to1[6][5] = 0.940525114502817;
   fWeightMatrix0to1[7][5] = -0.418715881486912;
   fWeightMatrix0to1[8][5] = 0.316842176440395;
   fWeightMatrix0to1[9][5] = 0.608039890048736;
   fWeightMatrix0to1[10][5] = 0.362533936259384;
   fWeightMatrix0to1[11][5] = -0.88291641612719;
   fWeightMatrix0to1[0][6] = -1.10909555417684;
   fWeightMatrix0to1[1][6] = -1.41623669020788;
   fWeightMatrix0to1[2][6] = 1.30352375514459;
   fWeightMatrix0to1[3][6] = -0.982489435598934;
   fWeightMatrix0to1[4][6] = -0.139602011977239;
   fWeightMatrix0to1[5][6] = -0.986344546001909;
   fWeightMatrix0to1[6][6] = -2.67744019192771;
   fWeightMatrix0to1[7][6] = 1.43150396307703;
   fWeightMatrix0to1[8][6] = 0.373099623498785;
   fWeightMatrix0to1[9][6] = 0.894570620211401;
   fWeightMatrix0to1[10][6] = 2.20896624976744;
   fWeightMatrix0to1[11][6] = 0.78692365228047;
   fWeightMatrix0to1[0][7] = -0.654284352663636;
   fWeightMatrix0to1[1][7] = 1.10426405089876;
   fWeightMatrix0to1[2][7] = 1.75504027196616;
   fWeightMatrix0to1[3][7] = -0.783698825809501;
   fWeightMatrix0to1[4][7] = 0.995807565952242;
   fWeightMatrix0to1[5][7] = -0.299062267923051;
   fWeightMatrix0to1[6][7] = 0.0105899498331152;
   fWeightMatrix0to1[7][7] = -0.446635075656651;
   fWeightMatrix0to1[8][7] = 0.0744220541925907;
   fWeightMatrix0to1[9][7] = 1.72313579860626;
   fWeightMatrix0to1[10][7] = -0.452045090965562;
   fWeightMatrix0to1[11][7] = 2.41539906514047;
   fWeightMatrix0to1[0][8] = -0.609628721725797;
   fWeightMatrix0to1[1][8] = -1.11034139997077;
   fWeightMatrix0to1[2][8] = 2.06440989365604;
   fWeightMatrix0to1[3][8] = -1.92807347802307;
   fWeightMatrix0to1[4][8] = 0.554291181067865;
   fWeightMatrix0to1[5][8] = 0.60869871415442;
   fWeightMatrix0to1[6][8] = -0.0760054443239105;
   fWeightMatrix0to1[7][8] = -0.474789462419459;
   fWeightMatrix0to1[8][8] = 2.06027680552804;
   fWeightMatrix0to1[9][8] = -0.325415058146514;
   fWeightMatrix0to1[10][8] = 0.307974855686314;
   fWeightMatrix0to1[11][8] = -0.676544680702478;
   fWeightMatrix0to1[0][9] = -0.209948080687368;
   fWeightMatrix0to1[1][9] = -2.24743916437538;
   fWeightMatrix0to1[2][9] = 2.0758680905246;
   fWeightMatrix0to1[3][9] = -1.3619458708188;
   fWeightMatrix0to1[4][9] = 0.667675691599695;
   fWeightMatrix0to1[5][9] = 0.43502259630326;
   fWeightMatrix0to1[6][9] = -0.560065408957576;
   fWeightMatrix0to1[7][9] = 0.634960231892051;
   fWeightMatrix0to1[8][9] = -0.747568229177485;
   fWeightMatrix0to1[9][9] = -2.17968237875093;
   fWeightMatrix0to1[10][9] = -1.52025020062209;
   fWeightMatrix0to1[11][9] = 0.54461998681323;
   fWeightMatrix0to1[0][10] = -0.542658392867209;
   fWeightMatrix0to1[1][10] = 0.971647007775896;
   fWeightMatrix0to1[2][10] = 1.39619417387192;
   fWeightMatrix0to1[3][10] = -2.08740915949364;
   fWeightMatrix0to1[4][10] = 0.985915894171224;
   fWeightMatrix0to1[5][10] = -1.7334198527299;
   fWeightMatrix0to1[6][10] = -1.96316200653664;
   fWeightMatrix0to1[7][10] = 0.800977703873895;
   fWeightMatrix0to1[8][10] = -0.30787235696974;
   fWeightMatrix0to1[9][10] = -1.01897725458524;
   fWeightMatrix0to1[10][10] = -1.90222510584003;
   fWeightMatrix0to1[11][10] = -1.26575026583534;
   fWeightMatrix0to1[0][11] = -1.13761985786775;
   fWeightMatrix0to1[1][11] = -0.822183684240504;
   fWeightMatrix0to1[2][11] = 0.0180501757062828;
   fWeightMatrix0to1[3][11] = 1.38896959566906;
   fWeightMatrix0to1[4][11] = 1.55857338020507;
   fWeightMatrix0to1[5][11] = 1.16652622258107;
   fWeightMatrix0to1[6][11] = -1.45726166258669;
   fWeightMatrix0to1[7][11] = -1.4404606775281;
   fWeightMatrix0to1[8][11] = 3.12319488959208;
   fWeightMatrix0to1[9][11] = -1.5797892884316;
   fWeightMatrix0to1[10][11] = -0.0307374729687434;
   fWeightMatrix0to1[11][11] = -0.598159640858692;

   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -1.44390248699727;
   fWeightMatrix1to2[1][0] = 0.144053283358076;
   fWeightMatrix1to2[2][0] = -0.899449577728178;
   fWeightMatrix1to2[3][0] = -0.892862684745961;
   fWeightMatrix1to2[4][0] = -2.11793671138925;
   fWeightMatrix1to2[5][0] = 0.723363756540363;
   fWeightMatrix1to2[6][0] = 1.76603172266598;
   fWeightMatrix1to2[7][0] = 1.53095985917761;
   fWeightMatrix1to2[8][0] = 1.66156559072622;
   fWeightMatrix1to2[9][0] = -2.11749620902932;
   fWeightMatrix1to2[10][0] = -0.520222731639929;
   fWeightMatrix1to2[0][1] = 0.918159979935365;
   fWeightMatrix1to2[1][1] = -1.68477945017206;
   fWeightMatrix1to2[2][1] = -0.985735017980359;
   fWeightMatrix1to2[3][1] = -0.0679311773554322;
   fWeightMatrix1to2[4][1] = 1.2032815487021;
   fWeightMatrix1to2[5][1] = -2.28598128405602;
   fWeightMatrix1to2[6][1] = -0.951767654327175;
   fWeightMatrix1to2[7][1] = 0.0416625180714081;
   fWeightMatrix1to2[8][1] = -1.38625950173559;
   fWeightMatrix1to2[9][1] = -1.29611036306735;
   fWeightMatrix1to2[10][1] = -0.052808298079043;
   fWeightMatrix1to2[0][2] = 0.569495316564171;
   fWeightMatrix1to2[1][2] = -0.279717129647365;
   fWeightMatrix1to2[2][2] = -1.58538538783118;
   fWeightMatrix1to2[3][2] = -0.157536493932877;
   fWeightMatrix1to2[4][2] = 1.63676108877117;
   fWeightMatrix1to2[5][2] = -0.0577006357929955;
   fWeightMatrix1to2[6][2] = 1.04119297335105;
   fWeightMatrix1to2[7][2] = -2.03873594218162;
   fWeightMatrix1to2[8][2] = 0.275964789265028;
   fWeightMatrix1to2[9][2] = -0.154362562565193;
   fWeightMatrix1to2[10][2] = -0.170902755175201;
   fWeightMatrix1to2[0][3] = -1.41550479924087;
   fWeightMatrix1to2[1][3] = 0.15115875544677;
   fWeightMatrix1to2[2][3] = 1.07582598844466;
   fWeightMatrix1to2[3][3] = -2.16564601838452;
   fWeightMatrix1to2[4][3] = 0.445433195589241;
   fWeightMatrix1to2[5][3] = 0.0463821493045683;
   fWeightMatrix1to2[6][3] = 1.279591248883;
   fWeightMatrix1to2[7][3] = 0.0672848691216471;
   fWeightMatrix1to2[8][3] = -1.60359064681211;
   fWeightMatrix1to2[9][3] = -1.84530373446499;
   fWeightMatrix1to2[10][3] = -0.614736187728743;
   fWeightMatrix1to2[0][4] = 0.341421219394221;
   fWeightMatrix1to2[1][4] = -0.0175688845015356;
   fWeightMatrix1to2[2][4] = -0.0718334317898256;
   fWeightMatrix1to2[3][4] = -1.06765709654168;
   fWeightMatrix1to2[4][4] = 0.367754228335443;
   fWeightMatrix1to2[5][4] = 1.14329219038039;
   fWeightMatrix1to2[6][4] = -1.37966991590769;
   fWeightMatrix1to2[7][4] = -0.218254469638079;
   fWeightMatrix1to2[8][4] = 0.762313983098308;
   fWeightMatrix1to2[9][4] = 0.25349433804128;
   fWeightMatrix1to2[10][4] = -0.153101720535356;
   fWeightMatrix1to2[0][5] = -0.58198159114623;
   fWeightMatrix1to2[1][5] = 0.0962955415486077;
   fWeightMatrix1to2[2][5] = -0.653572849472981;
   fWeightMatrix1to2[3][5] = -0.742027461355096;
   fWeightMatrix1to2[4][5] = -2.20741982625346;
   fWeightMatrix1to2[5][5] = -1.26401551302733;
   fWeightMatrix1to2[6][5] = -0.862518366224157;
   fWeightMatrix1to2[7][5] = 0.100549779005898;
   fWeightMatrix1to2[8][5] = -2.09098746692932;
   fWeightMatrix1to2[9][5] = 1.62609581062098;
   fWeightMatrix1to2[10][5] = -0.98882053741829;
   fWeightMatrix1to2[0][6] = -1.16798068267479;
   fWeightMatrix1to2[1][6] = 1.00340929573182;
   fWeightMatrix1to2[2][6] = -1.35334217444077;
   fWeightMatrix1to2[3][6] = -1.93065972460268;
   fWeightMatrix1to2[4][6] = 0.667977908877992;
   fWeightMatrix1to2[5][6] = -1.72509087691684;
   fWeightMatrix1to2[6][6] = 0.718487079403886;
   fWeightMatrix1to2[7][6] = -2.65153880654993;
   fWeightMatrix1to2[8][6] = -0.955169765915787;
   fWeightMatrix1to2[9][6] = 0.6499951865469;
   fWeightMatrix1to2[10][6] = -3.10398032484893;
   fWeightMatrix1to2[0][7] = 1.09201832961266;
   fWeightMatrix1to2[1][7] = 0.272758207731386;
   fWeightMatrix1to2[2][7] = -1.18953960959152;
   fWeightMatrix1to2[3][7] = -1.26407613268897;
   fWeightMatrix1to2[4][7] = -1.21882103545016;
   fWeightMatrix1to2[5][7] = -0.301220564578962;
   fWeightMatrix1to2[6][7] = -0.704562248523765;
   fWeightMatrix1to2[7][7] = -1.93649078565364;
   fWeightMatrix1to2[8][7] = 0.0118003744601894;
   fWeightMatrix1to2[9][7] = -1.09223926735438;
   fWeightMatrix1to2[10][7] = 0.896644566894294;
   fWeightMatrix1to2[0][8] = -1.62497105920969;
   fWeightMatrix1to2[1][8] = -0.420689453361637;
   fWeightMatrix1to2[2][8] = 1.02162245457879;
   fWeightMatrix1to2[3][8] = 0.642047942997396;
   fWeightMatrix1to2[4][8] = -1.51150968203808;
   fWeightMatrix1to2[5][8] = 2.2713083036336;
   fWeightMatrix1to2[6][8] = -2.01542457913149;
   fWeightMatrix1to2[7][8] = 0.0569822860580738;
   fWeightMatrix1to2[8][8] = -0.636431484066979;
   fWeightMatrix1to2[9][8] = -2.24792165718565;
   fWeightMatrix1to2[10][8] = 1.03861745750606;
   fWeightMatrix1to2[0][9] = -0.395494060581187;
   fWeightMatrix1to2[1][9] = 0.949838341993549;
   fWeightMatrix1to2[2][9] = 1.10365654264673;
   fWeightMatrix1to2[3][9] = 0.0489544947387866;
   fWeightMatrix1to2[4][9] = 1.00378993849117;
   fWeightMatrix1to2[5][9] = -0.928198402509864;
   fWeightMatrix1to2[6][9] = -1.229857245721;
   fWeightMatrix1to2[7][9] = -0.567736390887254;
   fWeightMatrix1to2[8][9] = -0.583298747641505;
   fWeightMatrix1to2[9][9] = -1.21821449875529;
   fWeightMatrix1to2[10][9] = -0.353843666843441;
   fWeightMatrix1to2[0][10] = 0.784347341030798;
   fWeightMatrix1to2[1][10] = -0.683911500608627;
   fWeightMatrix1to2[2][10] = -1.90700255604345;
   fWeightMatrix1to2[3][10] = 1.18068826803043;
   fWeightMatrix1to2[4][10] = -1.51907333381366;
   fWeightMatrix1to2[5][10] = -1.14452076760072;
   fWeightMatrix1to2[6][10] = -1.45906990149579;
   fWeightMatrix1to2[7][10] = 0.217664613140259;
   fWeightMatrix1to2[8][10] = -0.313884033009556;
   fWeightMatrix1to2[9][10] = -0.38468649356597;
   fWeightMatrix1to2[10][10] = -1.07644049917756;
   fWeightMatrix1to2[0][11] = 0.807587626380901;
   fWeightMatrix1to2[1][11] = -0.394633261777069;
   fWeightMatrix1to2[2][11] = 0.198000449067015;
   fWeightMatrix1to2[3][11] = -1.27559350376789;
   fWeightMatrix1to2[4][11] = -0.761352774063405;
   fWeightMatrix1to2[5][11] = -1.69385950354516;
   fWeightMatrix1to2[6][11] = 0.791998859365806;
   fWeightMatrix1to2[7][11] = 0.94794570157561;
   fWeightMatrix1to2[8][11] = -0.359213871536133;
   fWeightMatrix1to2[9][11] = 1.29931455200827;
   fWeightMatrix1to2[10][11] = 0.911411250498975;
   fWeightMatrix1to2[0][12] = -0.209945438236411;
   fWeightMatrix1to2[1][12] = -0.130943704463641;
   fWeightMatrix1to2[2][12] = -0.591224130627158;
   fWeightMatrix1to2[3][12] = -0.549314514694959;
   fWeightMatrix1to2[4][12] = 1.4509077710142;
   fWeightMatrix1to2[5][12] =
1.66946721140785;
   fWeightMatrix1to2[6][12] = -2.02792310340673;
   fWeightMatrix1to2[7][12] = -2.25934540958845;
   fWeightMatrix1to2[8][12] = -0.0118772428763881;
   fWeightMatrix1to2[9][12] = -0.22556011133757;
   fWeightMatrix1to2[10][12] = 0.663012510336852;

   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.0489152900329682;
   fWeightMatrix2to3[0][1] = 0.0302660081172627;
   fWeightMatrix2to3[0][2] = -1.0516588251224;
   fWeightMatrix2to3[0][3] = -0.0372827478424661;
   fWeightMatrix2to3[0][4] = 0.270062617835979;
   fWeightMatrix2to3[0][5] = -0.732930489357967;
   fWeightMatrix2to3[0][6] = -1.49255938441804;
   fWeightMatrix2to3[0][7] = -0.0802877875590936;
   fWeightMatrix2to3[0][8] = 0.410796539470536;
   fWeightMatrix2to3[0][9] = 0.749266697055274;
   fWeightMatrix2to3[0][10] = -0.501374424157965;
   fWeightMatrix2to3[0][11] = 0.965444032104413;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l