// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method          : MLP::H6AONN5MEMLP
TMVA Release    : 3.8.6 [198662]
ROOT Release    : 5.10/00 [330240]
Creator         : thompson
Date            : Tue Jan 22 16:30:01 2008
Host            : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir             : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events : 22756

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]

# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [If True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with back-propagation (BP, default) or a genetic algorithm (GA, slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at every #th epoch]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events per batch; only used in batch mode, -1 for BatchSize=number_of_events]

##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables
*-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW        LRHWW        'F' [0,1]
LRWW         LRWW         'F' [0,1]
LRWg         LRWg         'F' [0,0.999932587147]
LRWj         LRWj         'F' [-0.0367123484612,1]
LRZZ         LRZZ         'F' [0,1]
Met          Met          'F' [15.0806331635,275.269592285]
MetDelPhi    MetDelPhi    'F' [0.126148730516,3.13422250748]
MetSpec      MetSpec      'F' [15.0024642944,215.058685303]
dPhiLeptons  dPhiLeptons  'F' [2.5749206543e-05,3.14062619209]
dRLeptons    dRLeptons    'F' [0.397228628397,4.5256857872]
dimass       dimass       'F' [16.0042724609,557.91784668]

============================================================================ */

#include <vector>
#include <string>
#include <cmath>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ",
                                  "Met", "MetDelPhi", "MetSpec",
                                  "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() == 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] <<
std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;
      fVmax[0] = 1;
      fVmin[1] = 0;
      fVmax[1] = 1;
      fVmin[2] = 0;
      fVmax[2] = 0.999932587146759;
      fVmin[3] = -0.0367123484611511;
      fVmax[3] = 1;
      fVmin[4] = 0;
      fVmax[4] = 1;
      fVmin[5] = 15.0806331634521;
      fVmax[5] = 275.269592285156;
      fVmin[6] = 0.126148730516434;
      fVmax[6] = 3.13422250747681;
      fVmin[7] = 15.0024642944336;
      fVmax[7] = 215.058685302734;
      fVmin[8] = 2.57492065429688e-05;
      fVmax[8] = 3.1406261920929;
      fVmin[9] = 0.397228628396988;
      fVmax[9] = 4.52568578720093;
      fVmin[10] = 16.0042724609375;
      fVmax[10] = 557.917846679688;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName
                   << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return
fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }

   double fVmin[11];
   double fVmax[11];

   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double* fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
   fLayerSize[2] = 12; fWeights[2] = new double[12];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];

   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = 1.33329445528617;
   fWeightMatrix0to1[1][0] = 1.16277771851443;
   fWeightMatrix0to1[2][0] = 1.01505145721233;
   fWeightMatrix0to1[3][0] = 2.43873945526412;
   fWeightMatrix0to1[4][0] = 0.282207840499145;
   fWeightMatrix0to1[5][0] = 1.97893752772164;
   fWeightMatrix0to1[6][0] = 0.980825847721466;
   fWeightMatrix0to1[7][0] = -0.49744443365359;
   fWeightMatrix0to1[8][0] = 0.335050756305815;
   fWeightMatrix0to1[9][0] = 1.09622508030692;
   fWeightMatrix0to1[10][0] = -0.253988971200609;
   fWeightMatrix0to1[11][0] = -0.267675151556724;
   fWeightMatrix0to1[0][1] = 1.09971141828846;
   fWeightMatrix0to1[1][1] = 0.149953386104817;
   fWeightMatrix0to1[2][1] = 0.843090563190795;
   fWeightMatrix0to1[3][1] = -1.60706900818048;
   fWeightMatrix0to1[4][1] = 1.43681882699378;
   fWeightMatrix0to1[5][1] = -0.202057876952845;
   fWeightMatrix0to1[6][1] = -1.19666911067995;
   fWeightMatrix0to1[7][1] = 1.33915873993917;
   fWeightMatrix0to1[8][1] = 0.602273967674004;
   fWeightMatrix0to1[9][1] = 1.26372118782732;
   fWeightMatrix0to1[10][1] = -0.654516104831529;
   fWeightMatrix0to1[11][1] = 0.0014099523727459;
   fWeightMatrix0to1[0][2] = 0.759196590576108;
   fWeightMatrix0to1[1][2] = 0.815004205907034;
   fWeightMatrix0to1[2][2] = 0.616173541745705;
   fWeightMatrix0to1[3][2] = -1.52849890267838;
   fWeightMatrix0to1[4][2] = 1.96121323113922;
   fWeightMatrix0to1[5][2] = 0.192418399928705;
   fWeightMatrix0to1[6][2] = -0.863394780056049;
   fWeightMatrix0to1[7][2] = -0.371056006997744;
   fWeightMatrix0to1[8][2] = -0.697601965212351;
   fWeightMatrix0to1[9][2] = -0.743056689819698;
   fWeightMatrix0to1[10][2] = 0.193415301314325;
   fWeightMatrix0to1[11][2] = 0.75770864643659;
   fWeightMatrix0to1[0][3] = 0.473593201304116;
   fWeightMatrix0to1[1][3] = 0.393657302073201;
   fWeightMatrix0to1[2][3] = 0.791034796118637;
   fWeightMatrix0to1[3][3] = 1.65996543257492;
   fWeightMatrix0to1[4][3] = -0.908458372716504;
   fWeightMatrix0to1[5][3] = -0.440228385776423;
   fWeightMatrix0to1[6][3] = -0.367107450725105;
   fWeightMatrix0to1[7][3] = -1.8717782480327;
   fWeightMatrix0to1[8][3] = -0.971562482515616;
   fWeightMatrix0to1[9][3] = -1.27377717446608;
   fWeightMatrix0to1[10][3] = -0.377500764118501;
   fWeightMatrix0to1[11][3] = 1.07692461516392;
   fWeightMatrix0to1[0][4] = 0.340514893673556;
   fWeightMatrix0to1[1][4] = 0.535632224968392;
   fWeightMatrix0to1[2][4] = 1.21048763668143;
   fWeightMatrix0to1[3][4] = -1.45046362655543;
   fWeightMatrix0to1[4][4] = -0.528070813529872;
   fWeightMatrix0to1[5][4] = -1.55288250963667;
   fWeightMatrix0to1[6][4] = -0.33941337326042;
   fWeightMatrix0to1[7][4] = 0.594978500729115;
   fWeightMatrix0to1[8][4] = 0.335338524301177;
   fWeightMatrix0to1[9][4] = 0.169926658425161;
   fWeightMatrix0to1[10][4] = -0.994075175568608;
   fWeightMatrix0to1[11][4] = -0.705430305572987;
   fWeightMatrix0to1[0][5] = -0.801875085668513;
   fWeightMatrix0to1[1][5] = -1.21749274576568;
   fWeightMatrix0to1[2][5] = -2.52246299848429;
   fWeightMatrix0to1[3][5] = 0.827068159990063;
   fWeightMatrix0to1[4][5] = 1.50517666641559;
   fWeightMatrix0to1[5][5] = -2.12810266933832;
   fWeightMatrix0to1[6][5] = 2.00718292821866;
   fWeightMatrix0to1[7][5] = -0.570436631354959;
   fWeightMatrix0to1[8][5] = -0.527511306318558;
   fWeightMatrix0to1[9][5] = 0.823801929556194;
   fWeightMatrix0to1[10][5] = -0.18848856902848;
   fWeightMatrix0to1[11][5] = -1.60433270681978;
   fWeightMatrix0to1[0][6] = -1.29084169764006;
   fWeightMatrix0to1[1][6] = -0.454405206720752;
   fWeightMatrix0to1[2][6] = 1.87137099410697;
   fWeightMatrix0to1[3][6] = -1.3002037616701;
   fWeightMatrix0to1[4][6] = -0.438387784549448;
   fWeightMatrix0to1[5][6] = -0.285498185854723;
   fWeightMatrix0to1[6][6] = -1.28440451367659;
   fWeightMatrix0to1[7][6] = 1.09429857178614;
   fWeightMatrix0to1[8][6] = -0.0735405486251169;
   fWeightMatrix0to1[9][6] = 0.405604665615793;
   fWeightMatrix0to1[10][6] = 1.32666270235374;
   fWeightMatrix0to1[11][6] = 1.12859317187366;
   fWeightMatrix0to1[0][7] = -0.293485966062245;
   fWeightMatrix0to1[1][7] = -0.155532804419513;
   fWeightMatrix0to1[2][7] = 1.33522605429953;
   fWeightMatrix0to1[3][7] = -0.321402097308763;
   fWeightMatrix0to1[4][7] = 1.73633217738679;
   fWeightMatrix0to1[5][7] = -0.19459731466423;
   fWeightMatrix0to1[6][7] = 0.66640596905167;
   fWeightMatrix0to1[7][7] = -0.677671996568586;
   fWeightMatrix0to1[8][7] = -0.640738880700378;
   fWeightMatrix0to1[9][7] = 2.10759655020123;
   fWeightMatrix0to1[10][7] = -1.51131910273598;
   fWeightMatrix0to1[11][7] = 1.63983468890712;
   fWeightMatrix0to1[0][8] = 0.000281788068855807;
   fWeightMatrix0to1[1][8] = -1.41887267887073;
   fWeightMatrix0to1[2][8] = 2.15206959289121;
   fWeightMatrix0to1[3][8] = -1.75023854572396;
   fWeightMatrix0to1[4][8] = 0.437451262052743;
   fWeightMatrix0to1[5][8] = 0.562726068619443;
   fWeightMatrix0to1[6][8] = 0.246352581970749;
   fWeightMatrix0to1[7][8] = -0.414590186653334;
   fWeightMatrix0to1[8][8] = 2.06962276932526;
   fWeightMatrix0to1[9][8] = 0.256146400355865;
   fWeightMatrix0to1[10][8] = 0.239500935769585;
   fWeightMatrix0to1[11][8] = -1.58492959703701;
   fWeightMatrix0to1[0][9] = -0.0522454373278077;
   fWeightMatrix0to1[1][9] = -2.05763379793288;
   fWeightMatrix0to1[2][9] = 1.5859927233195;
   fWeightMatrix0to1[3][9] = -1.07010558536526;
   fWeightMatrix0to1[4][9] = 0.0701147722504132;
   fWeightMatrix0to1[5][9] = 0.531604693028227;
   fWeightMatrix0to1[6][9] = -0.433565092839553;
   fWeightMatrix0to1[7][9] = 0.690502300858756;
   fWeightMatrix0to1[8][9] = -1.29837092802056;
   fWeightMatrix0to1[9][9] = -2.24191200711996;
   fWeightMatrix0to1[10][9] = -1.23887399704555;
   fWeightMatrix0to1[11][9] = 0.373215951522473;
   fWeightMatrix0to1[0][10] = -0.897658732948844;
   fWeightMatrix0to1[1][10] = 0.439953983306433;
   fWeightMatrix0to1[2][10] = 1.2534558640764;
   fWeightMatrix0to1[3][10] = -1.72575060797909;
   fWeightMatrix0to1[4][10] = 0.961837004154293;
   fWeightMatrix0to1[5][10] = -1.51164487904294;
   fWeightMatrix0to1[6][10] = -1.23026529261941;
   fWeightMatrix0to1[7][10] = 0.741087031991318;
   fWeightMatrix0to1[8][10] = -0.617199897527471;
   fWeightMatrix0to1[9][10] = -0.210046874393532;
   fWeightMatrix0to1[10][10] = -0.631020364736678;
   fWeightMatrix0to1[11][10] = -1.36993628524005;
   fWeightMatrix0to1[0][11] = -1.31198962802525;
   fWeightMatrix0to1[1][11] = -0.0417159035193038;
   fWeightMatrix0to1[2][11] = -0.165321151560166;
   fWeightMatrix0to1[3][11] = 0.406910838319176;
   fWeightMatrix0to1[4][11] = 0.679438129940624;
   fWeightMatrix0to1[5][11] = 1.45629672496869;
   fWeightMatrix0to1[6][11] = -0.0481718219663419;
   fWeightMatrix0to1[7][11] = -1.48067373060391;
   fWeightMatrix0to1[8][11] = 2.16147366365734;
   fWeightMatrix0to1[9][11] = -2.616389259031;
   fWeightMatrix0to1[10][11] = -0.519732966284467;
   fWeightMatrix0to1[11][11] = -0.162615898643527;

   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -1.71872838464772;
   fWeightMatrix1to2[1][0] = 0.00681896639457651;
   fWeightMatrix1to2[2][0] = -0.893030943526098;
   fWeightMatrix1to2[3][0] = -0.815347606218253;
   fWeightMatrix1to2[4][0] = -1.55848121688902;
   fWeightMatrix1to2[5][0] = 0.175192612328059;
   fWeightMatrix1to2[6][0] = 1.89387699593054;
   fWeightMatrix1to2[7][0] = 0.901110551713271;
   fWeightMatrix1to2[8][0] = 1.57342117177967;
   fWeightMatrix1to2[9][0] = -2.18327721696925;
   fWeightMatrix1to2[10][0] = -0.178107847557894;
   fWeightMatrix1to2[0][1] = 0.752668169073905;
   fWeightMatrix1to2[1][1] = -1.77733599707027;
   fWeightMatrix1to2[2][1] = -0.782339356014479;
   fWeightMatrix1to2[3][1] = -0.011385910001889;
   fWeightMatrix1to2[4][1] = 1.54528470348543;
   fWeightMatrix1to2[5][1] = -2.43510255296474;
   fWeightMatrix1to2[6][1] = -0.947644263447778;
   fWeightMatrix1to2[7][1] = -0.279134489678471;
   fWeightMatrix1to2[8][1] = -1.37309764889297;
   fWeightMatrix1to2[9][1] = -1.24768277227113;
   fWeightMatrix1to2[10][1] = -0.446389550129488;
   fWeightMatrix1to2[0][2] = 0.487213013175192;
   fWeightMatrix1to2[1][2] = -0.248037649189283;
   fWeightMatrix1to2[2][2] = -0.920344923037634;
   fWeightMatrix1to2[3][2] = -0.00121375920445563;
   fWeightMatrix1to2[4][2] = 1.52581020178261;
   fWeightMatrix1to2[5][2] = -0.633153402510593;
   fWeightMatrix1to2[6][2] = 1.2350739663797;
   fWeightMatrix1to2[7][2] = -2.21623064196065;
   fWeightMatrix1to2[8][2] = 0.316531923088552;
   fWeightMatrix1to2[9][2] = -0.400959279579671;
   fWeightMatrix1to2[10][2] = -0.022172166628122;
   fWeightMatrix1to2[0][3] = -1.54740998736963;
   fWeightMatrix1to2[1][3] = 0.102605812655679;
   fWeightMatrix1to2[2][3] = 0.892459900138467;
   fWeightMatrix1to2[3][3] = -2.35873371820945;
   fWeightMatrix1to2[4][3] = 0.853940655872569;
   fWeightMatrix1to2[5][3] = 0.140474250974906;
   fWeightMatrix1to2[6][3] = 1.23794292555137;
   fWeightMatrix1to2[7][3] = 0.213296182894952;
   fWeightMatrix1to2[8][3] = -1.74252450992404;
   fWeightMatrix1to2[9][3] = -1.87881757233646;
   fWeightMatrix1to2[10][3] = -0.928239715774229;
   fWeightMatrix1to2[0][4] = 0.312159226199657;
   fWeightMatrix1to2[1][4] = 0.136673388660527;
   fWeightMatrix1to2[2][4] = -0.0443044029868572;
   fWeightMatrix1to2[3][4] = -1.01978095989112;
   fWeightMatrix1to2[4][4] = 0.550797714557982;
   fWeightMatrix1to2[5][4] = 0.262547276071448;
   fWeightMatrix1to2[6][4] = -1.27549182578611;
   fWeightMatrix1to2[7][4] = -0.53905039228941;
   fWeightMatrix1to2[8][4] = 0.81154783115659;
   fWeightMatrix1to2[9][4] = 0.80988208344329;
   fWeightMatrix1to2[10][4] = -0.788272091119694;
   fWeightMatrix1to2[0][5] = -0.850481179242301;
   fWeightMatrix1to2[1][5] = -0.170008409713353;
   fWeightMatrix1to2[2][5] = -0.416361919395077;
   fWeightMatrix1to2[3][5] = -0.804376119281284;
   fWeightMatrix1to2[4][5] = -2.0937032895512;
   fWeightMatrix1to2[5][5] = -1.51165889238655;
   fWeightMatrix1to2[6][5] = -0.81439771823417;
   fWeightMatrix1to2[7][5] = 0.106555116678542;
   fWeightMatrix1to2[8][5] = -2.09912996940905;
   fWeightMatrix1to2[9][5] = 1.55180723500744;
   fWeightMatrix1to2[10][5] = -0.755654513139985;
   fWeightMatrix1to2[0][6] = -1.21903179169395;
   fWeightMatrix1to2[1][6] = 1.10901580193308;
   fWeightMatrix1to2[2][6] = -1.32942426903596;
   fWeightMatrix1to2[3][6] = -1.88418904716011;
   fWeightMatrix1to2[4][6] = 0.887477824604427;
   fWeightMatrix1to2[5][6] = -0.63546351026751;
   fWeightMatrix1to2[6][6] = 0.497551987483262;
   fWeightMatrix1to2[7][6] = -2.44358723828672;
   fWeightMatrix1to2[8][6] = -0.952489395894645;
   fWeightMatrix1to2[9][6] = 0.789113329979111;
   fWeightMatrix1to2[10][6] = -3.01897589585839;
   fWeightMatrix1to2[0][7] = 0.722595097523398;
   fWeightMatrix1to2[1][7] = 0.0584832035546435;
   fWeightMatrix1to2[2][7] = -0.980204356385037;
   fWeightMatrix1to2[3][7] = -1.37147952558853;
   fWeightMatrix1to2[4][7] = -1.2202412604078;
   fWeightMatrix1to2[5][7] = -1.07959915623576;
   fWeightMatrix1to2[6][7] = -0.480868892656051;
   fWeightMatrix1to2[7][7] = -1.91239912477056;
   fWeightMatrix1to2[8][7] = -0.152264489073025;
   fWeightMatrix1to2[9][7] = -1.2507324604474;
   fWeightMatrix1to2[10][7] = 1.24114656067732;
   fWeightMatrix1to2[0][8] = -1.56441916010942;
   fWeightMatrix1to2[1][8] = -0.646199091985705;
   fWeightMatrix1to2[2][8] = 1.05772292389528;
   fWeightMatrix1to2[3][8] = 0.584592995828214;
   fWeightMatrix1to2[4][8] = -1.75704192377107;
   fWeightMatrix1to2[5][8] = 1.51209593605402;
   fWeightMatrix1to2[6][8] = -1.83549026089717;
   fWeightMatrix1to2[7][8] = 0.636964274781543;
   fWeightMatrix1to2[8][8] = -0.746098488239318;
   fWeightMatrix1to2[9][8] = -1.64738649018105;
   fWeightMatrix1to2[10][8] = 1.60542332916382;
   fWeightMatrix1to2[0][9] = -0.540696004838309;
   fWeightMatrix1to2[1][9] = 1.12716930001639;
   fWeightMatrix1to2[2][9] = 1.27784409681559;
   fWeightMatrix1to2[3][9] = 0.0344973498980295;
   fWeightMatrix1to2[4][9] = 1.12108883871905;
   fWeightMatrix1to2[5][9] = -1.21393632945652;
   fWeightMatrix1to2[6][9] = -1.17221378528003;
   fWeightMatrix1to2[7][9] = -0.884248224434228;
   fWeightMatrix1to2[8][9] = -0.537078951070785;
   fWeightMatrix1to2[9][9] = -1.13013561828007;
   fWeightMatrix1to2[10][9] = -0.907177986727912;
   fWeightMatrix1to2[0][10] = 0.734845086736402;
   fWeightMatrix1to2[1][10] = -0.984971592615846;
   fWeightMatrix1to2[2][10] = -1.94304400303567;
   fWeightMatrix1to2[3][10] = 1.06969883076261;
   fWeightMatrix1to2[4][10] = -1.29629285537926;
   fWeightMatrix1to2[5][10] = -1.08185374438838;
   fWeightMatrix1to2[6][10] = -1.08128842422598;
   fWeightMatrix1to2[7][10] = 0.14859655579043;
   fWeightMatrix1to2[8][10] = -0.363623972281679;
   fWeightMatrix1to2[9][10] = -0.461770241632295;
   fWeightMatrix1to2[10][10] = -1.27544686951188;
   fWeightMatrix1to2[0][11] = 0.747852426862887;
   fWeightMatrix1to2[1][11] = -0.645409924979703;
   fWeightMatrix1to2[2][11] = 0.315277743822886;
   fWeightMatrix1to2[3][11] = -1.40536341467642;
   fWeightMatrix1to2[4][11] = -0.682632009981741;
   fWeightMatrix1to2[5][11] = -1.1657951591496;
   fWeightMatrix1to2[6][11] = 0.8875389635865;
   fWeightMatrix1to2[7][11] = 1.32121199713091;
   fWeightMatrix1to2[8][11] = -0.310851983166818;
   fWeightMatrix1to2[9][11] = 0.875157334599399;
   fWeightMatrix1to2[10][11] = 0.514513537819907;
   fWeightMatrix1to2[0][12] = -0.350378240785207;
   fWeightMatrix1to2[1][12] = -0.305957278816555;
   fWeightMatrix1to2[2][12] = -0.627651439402512;
   fWeightMatrix1to2[3][12] = -0.591259287407712;
   fWeightMatrix1to2[4][12] = 1.59418447515226;
   fWeightMatrix1to2[5][12] = 0.901035863653301;
   fWeightMatrix1to2[6][12] = -2.08125714041752;
   fWeightMatrix1to2[7][12] = -2.25070969848801;
   fWeightMatrix1to2[8][12] = -0.0169959355374658;
   fWeightMatrix1to2[9][12] = -0.130909906042466;
   fWeightMatrix1to2[10][12] = 0.86301824135679;

   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.424084550266778;
   fWeightMatrix2to3[0][1] = -0.0838696657063759;
   fWeightMatrix2to3[0][2] = -0.836238550933598;
   fWeightMatrix2to3[0][3] = 0.236185483210546;
   fWeightMatrix2to3[0][4] = 0.617507627850387;
   fWeightMatrix2to3[0][5] = -1.37988173662972;
   fWeightMatrix2to3[0][6] = -1.3601258740386;
   fWeightMatrix2to3[0][7] = -0.771420915715125;
   fWeightMatrix2to3[0][8] = 0.472600954754963;
   fWeightMatrix2to3[0][9] = -0.474831637841767;
   fWeightMatrix2to3[0][10] = -0.726630996950985;
   fWeightMatrix2to3[0][11] = 0.975397725271345;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   // NOTE: the generated file was truncated at this point; the forward pass
   // below is a reconstruction from the members declared above (layer sizes,
   // weight matrices, sigmoid NeuronType) following the usual TMVA MakeClass
   // layout, not the original generated text.

   // reset node buffers and set the bias nodes to 1
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i] = 0;
   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1] = 1;

   // copy the (normalised) input variables into layer 0
   for (int ivar=0; ivar<fLayerSize[0]-1; ivar++)
      fWeights[0][ivar] = inputValues[ivar];

   // layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++)
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      fWeights[1][o] = ActivationFnc( fWeights[1][o] );
   }

   // layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++)
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      fWeights[2][o] = ActivationFnc( fWeights[2][o] );
   }

   // layer 2 to 3 (output node, assumed linear as in MakeClass MLP output)
   for (int i=0; i<fLayerSize[2]; i++)
      fWeights[3][0] += fWeightMatrix2to3[0][i] * fWeights[2][i];

   return fWeights[3][0];
}

// neuron activation function (NeuronType: "sigmoid")
inline double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   return 1.0/(1.0 + exp(-x));
}

// method-specific destructor: release the node buffers
inline void ReadH6AONN5MEMLP::Clear()
{
   for (int l=0; l<fLayers; l++) {
      delete[] fWeights[l];
      fWeights[l] = 0;
   }
}