// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6         [198662]
ROOT Release   : 5.10/00       [330240]
Creator        : thompson
Date           : Tue Jan 22 16:29:38 2008
Host           : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events: 22656

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables
*-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW        LRHWW        'F' [0,1]
LRWW         LRWW         'F' [0,1.001734972]
LRWg         LRWg         'F' [0,0.99988079071]
LRWj         LRWj         'F' [-0.00544115481898,1]
LRZZ         LRZZ         'F' [0,1]
Met          Met          'F' [15.1351919174,265.280822754]
MetDelPhi    MetDelPhi    'F' [0.13507989049,3.12887883186]
MetSpec      MetSpec      'F' [15.0015153885,196.811447144]
dPhiLeptons  dPhiLeptons  'F' [8.4400177002e-05,3.14152622223]
dRLeptons    dRLeptons    'F' [0.400388747454,4.41000127792]
dimass       dimass       'F' [16.045343399,573.807128906]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ", "Met",
                                  "MetDelPhi", "MetSpec", "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " <<
inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;
      fVmax[0] = 1;
      fVmin[1] = 0;
      fVmax[1] = 1.00173497200012;
      fVmin[2] = 0;
      fVmax[2] = 0.999880790710449;
      fVmin[3] = -0.00544115481898189;
      fVmax[3] = 1;
      fVmin[4] = 0;
      fVmax[4] = 1;
      fVmin[5] = 15.1351919174194;
      fVmax[5] = 265.280822753906;
      fVmin[6] = 0.135079890489578;
      fVmax[6] = 3.1288788318634;
      fVmin[7] = 15.0015153884888;
      fVmax[7] = 196.811447143555;
      fVmin[8] = 8.44001770019531e-05;
      fVmax[8] = 3.141526222229;
      fVmin[9] = 0.40038874745369;
      fVmax[9] = 4.41000127792358;
      fVmin[10] = 16.0453433990479;
      fVmax[10] = 573.80712890625;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType(
int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }
   double fVmin[11];
   double fVmax[11];

   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
   fLayerSize[2] = 12; fWeights[2] = new double[12];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];

   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = 1.65462752053532;
   fWeightMatrix0to1[1][0] = 1.50055596496089;
   fWeightMatrix0to1[2][0] = -0.081020777685934;
   fWeightMatrix0to1[3][0] = 2.59363860051387;
   fWeightMatrix0to1[4][0] = 0.581397525299068;
   fWeightMatrix0to1[5][0] = 1.62425893312884;
   fWeightMatrix0to1[6][0] = 1.74154795226871;
   fWeightMatrix0to1[7][0] = 0.0436196802854812;
   fWeightMatrix0to1[8][0] = 0.867860098482994;
   fWeightMatrix0to1[9][0] = 1.33543230050641;
   fWeightMatrix0to1[10][0] = 0.328785895203889;
   fWeightMatrix0to1[11][0] = 0.200620747806885;
   fWeightMatrix0to1[0][1] = 1.43478695469036;
   fWeightMatrix0to1[1][1] = 0.845153664619776;
   fWeightMatrix0to1[2][1] = 0.570911520618172;
   fWeightMatrix0to1[3][1] = -1.12228133725389;
   fWeightMatrix0to1[4][1] = 1.67976912831272;
   fWeightMatrix0to1[5][1] = 0.0240340344385212;
   fWeightMatrix0to1[6][1] =
-1.77398711686796;
   fWeightMatrix0to1[7][1] = 2.16128266223172;
   fWeightMatrix0to1[8][1] = 0.798337005715035;
   fWeightMatrix0to1[9][1] = 0.41817882654666;
   fWeightMatrix0to1[10][1] = 0.524981814349193;
   fWeightMatrix0to1[11][1] = 0.231921711609104;
   fWeightMatrix0to1[0][2] = 0.351128756329362;
   fWeightMatrix0to1[1][2] = 0.196545377799027;
   fWeightMatrix0to1[2][2] = 0.787189615494244;
   fWeightMatrix0to1[3][2] = -1.45871252303135;
   fWeightMatrix0to1[4][2] = 1.80692914589635;
   fWeightMatrix0to1[5][2] = 0.154472445954465;
   fWeightMatrix0to1[6][2] = -0.426421331477031;
   fWeightMatrix0to1[7][2] = -0.759197746015242;
   fWeightMatrix0to1[8][2] = -0.119043416841108;
   fWeightMatrix0to1[9][2] = -0.953165591445868;
   fWeightMatrix0to1[10][2] = 0.0558922614204443;
   fWeightMatrix0to1[11][2] = 1.43721575154796;
   fWeightMatrix0to1[0][3] = 0.656579930183529;
   fWeightMatrix0to1[1][3] = 0.826267133588804;
   fWeightMatrix0to1[2][3] = 0.794871983214229;
   fWeightMatrix0to1[3][3] = 1.7967375169749;
   fWeightMatrix0to1[4][3] = -1.19357858572002;
   fWeightMatrix0to1[5][3] = -0.454667626641627;
   fWeightMatrix0to1[6][3] = -1.27006104026966;
   fWeightMatrix0to1[7][3] = -1.45686656457336;
   fWeightMatrix0to1[8][3] = -0.739301191903545;
   fWeightMatrix0to1[9][3] = -1.05068631035288;
   fWeightMatrix0to1[10][3] = -0.129553254009058;
   fWeightMatrix0to1[11][3] = 0.697701678188062;
   fWeightMatrix0to1[0][4] = 0.890116897353889;
   fWeightMatrix0to1[1][4] = 0.493565092013818;
   fWeightMatrix0to1[2][4] = 0.893340602337309;
   fWeightMatrix0to1[3][4] = -1.50822947414657;
   fWeightMatrix0to1[4][4] = -0.626751686330046;
   fWeightMatrix0to1[5][4] = -1.42509499488746;
   fWeightMatrix0to1[6][4] = -0.768795751488734;
   fWeightMatrix0to1[7][4] = -0.382037790546287;
   fWeightMatrix0to1[8][4] = -0.22572164238816;
   fWeightMatrix0to1[9][4] = 1.19520484464063;
   fWeightMatrix0to1[10][4] = -0.775843967563798;
   fWeightMatrix0to1[11][4] = 0.129935056584747;
   fWeightMatrix0to1[0][5] = -0.697534388476242;
   fWeightMatrix0to1[1][5] = -0.594695158276456;
   fWeightMatrix0to1[2][5] =
-2.31792832744733;
   fWeightMatrix0to1[3][5] = 0.782655186924677;
   fWeightMatrix0to1[4][5] = 1.63086334081814;
   fWeightMatrix0to1[5][5] = -1.82866581600017;
   fWeightMatrix0to1[6][5] = 1.58636414501692;
   fWeightMatrix0to1[7][5] = -0.708797902305562;
   fWeightMatrix0to1[8][5] = -0.260611490690343;
   fWeightMatrix0to1[9][5] = -0.0111169929716495;
   fWeightMatrix0to1[10][5] = -0.850430390514723;
   fWeightMatrix0to1[11][5] = -0.861445989419876;
   fWeightMatrix0to1[0][6] = -1.38320526930367;
   fWeightMatrix0to1[1][6] = -0.320088701064298;
   fWeightMatrix0to1[2][6] = 2.12878023211734;
   fWeightMatrix0to1[3][6] = -0.59348073639677;
   fWeightMatrix0to1[4][6] = -0.913579507145665;
   fWeightMatrix0to1[5][6] = -0.270908137493945;
   fWeightMatrix0to1[6][6] = -1.83542867646241;
   fWeightMatrix0to1[7][6] = 2.08050973990422;
   fWeightMatrix0to1[8][6] = 0.40389112798039;
   fWeightMatrix0to1[9][6] = 0.282011582461464;
   fWeightMatrix0to1[10][6] = 1.49703328545836;
   fWeightMatrix0to1[11][6] = 0.167663196613952;
   fWeightMatrix0to1[0][7] = -0.219398581341964;
   fWeightMatrix0to1[1][7] = 0.403294463393997;
   fWeightMatrix0to1[2][7] = 1.56824088402654;
   fWeightMatrix0to1[3][7] = -0.455244411669851;
   fWeightMatrix0to1[4][7] = 1.89815014483702;
   fWeightMatrix0to1[5][7] = 0.0730995945161036;
   fWeightMatrix0to1[6][7] = 1.03273814516562;
   fWeightMatrix0to1[7][7] = -0.554692890190167;
   fWeightMatrix0to1[8][7] = -0.31403370825409;
   fWeightMatrix0to1[9][7] = 0.887219315692729;
   fWeightMatrix0to1[10][7] = -2.23092832155935;
   fWeightMatrix0to1[11][7] = 2.19894476933682;
   fWeightMatrix0to1[0][8] = 0.119103718319129;
   fWeightMatrix0to1[1][8] = -0.392418652071513;
   fWeightMatrix0to1[2][8] = 2.29570167714072;
   fWeightMatrix0to1[3][8] = -2.25780954773032;
   fWeightMatrix0to1[4][8] = 0.2643091475201;
   fWeightMatrix0to1[5][8] = 0.931136360047457;
   fWeightMatrix0to1[6][8] = -0.778551711746665;
   fWeightMatrix0to1[7][8] = -1.11167022069844;
   fWeightMatrix0to1[8][8] = 2.14637547990156;
   fWeightMatrix0to1[9][8] = -0.00875640049431432;
   fWeightMatrix0to1[10][8] =
0.665016826874382;
   fWeightMatrix0to1[11][8] = -1.46466365998647;
   fWeightMatrix0to1[0][9] = 0.0920142672898901;
   fWeightMatrix0to1[1][9] = -1.71787193625765;
   fWeightMatrix0to1[2][9] = 1.24863877441835;
   fWeightMatrix0to1[3][9] = -1.21824619152771;
   fWeightMatrix0to1[4][9] = -0.0783194214464598;
   fWeightMatrix0to1[5][9] = 0.727941188404314;
   fWeightMatrix0to1[6][9] = -1.09482767669652;
   fWeightMatrix0to1[7][9] = 0.705154383867254;
   fWeightMatrix0to1[8][9] = -1.13039541609727;
   fWeightMatrix0to1[9][9] = -2.02329622797012;
   fWeightMatrix0to1[10][9] = -1.33599065166845;
   fWeightMatrix0to1[11][9] = 0.0713123369812594;
   fWeightMatrix0to1[0][10] = -1.43416261663367;
   fWeightMatrix0to1[1][10] = 1.96539446780134;
   fWeightMatrix0to1[2][10] = 1.19626365259069;
   fWeightMatrix0to1[3][10] = -1.4736955173512;
   fWeightMatrix0to1[4][10] = 1.18087323059182;
   fWeightMatrix0to1[5][10] = -1.90353554522983;
   fWeightMatrix0to1[6][10] = -1.30837995499381;
   fWeightMatrix0to1[7][10] = 0.531155326589459;
   fWeightMatrix0to1[8][10] = -0.652549640300436;
   fWeightMatrix0to1[9][10] = -0.480276540668643;
   fWeightMatrix0to1[10][10] = -0.925574111207797;
   fWeightMatrix0to1[11][10] = -1.45795093760673;
   fWeightMatrix0to1[0][11] = -1.20440678252613;
   fWeightMatrix0to1[1][11] = -0.439719466838807;
   fWeightMatrix0to1[2][11] = -0.272220858141388;
   fWeightMatrix0to1[3][11] = 0.374081025837368;
   fWeightMatrix0to1[4][11] = 0.815866563775096;
   fWeightMatrix0to1[5][11] = 1.92954796367767;
   fWeightMatrix0to1[6][11] = 0.364156710470051;
   fWeightMatrix0to1[7][11] = -1.26741129470475;
   fWeightMatrix0to1[8][11] = 2.09859250065691;
   fWeightMatrix0to1[9][11] = -1.75476793887125;
   fWeightMatrix0to1[10][11] = 0.202952757069341;
   fWeightMatrix0to1[11][11] = -0.936168776789437;

   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -1.55294198726105;
   fWeightMatrix1to2[1][0] = -0.163857429268185;
   fWeightMatrix1to2[2][0] = -0.720842184461318;
   fWeightMatrix1to2[3][0] = -0.860428486565548;
   fWeightMatrix1to2[4][0] = -2.04742398253115;
   fWeightMatrix1to2[5][0] = -1.40703592494359;
   fWeightMatrix1to2[6][0] = 2.20872334895562;
   fWeightMatrix1to2[7][0] = 1.5585446142832;
   fWeightMatrix1to2[8][0] = 1.52921770003674;
   fWeightMatrix1to2[9][0] = -2.20504735635683;
   fWeightMatrix1to2[10][0] = -0.677441572191487;
   fWeightMatrix1to2[0][1] = 0.882624614699222;
   fWeightMatrix1to2[1][1] = -1.74224690723333;
   fWeightMatrix1to2[2][1] = -0.818124942357026;
   fWeightMatrix1to2[3][1] = -0.0869894864572992;
   fWeightMatrix1to2[4][1] = 2.16304578450479;
   fWeightMatrix1to2[5][1] = -2.13194912076021;
   fWeightMatrix1to2[6][1] = -0.378021146799247;
   fWeightMatrix1to2[7][1] = 0.0619272479462484;
   fWeightMatrix1to2[8][1] = -1.43302105756854;
   fWeightMatrix1to2[9][1] = -1.35556797969038;
   fWeightMatrix1to2[10][1] = 0.113954687507802;
   fWeightMatrix1to2[0][2] = 0.536237005610397;
   fWeightMatrix1to2[1][2] = -0.260618589601343;
   fWeightMatrix1to2[2][2] = -1.01523954788847;
   fWeightMatrix1to2[3][2] = -0.135197372318285;
   fWeightMatrix1to2[4][2] = 1.3649550402742;
   fWeightMatrix1to2[5][2] = -0.825325886910388;
   fWeightMatrix1to2[6][2] = 0.829370785257768;
   fWeightMatrix1to2[7][2] = -1.93444137737606;
   fWeightMatrix1to2[8][2] = 0.236761252561594;
   fWeightMatrix1to2[9][2] = -0.385717384930608;
   fWeightMatrix1to2[10][2] = -0.0905539136707084;
   fWeightMatrix1to2[0][3] = -1.42589107340764;
   fWeightMatrix1to2[1][3] = 0.00443374469229319;
   fWeightMatrix1to2[2][3] = 0.74035296321574;
   fWeightMatrix1to2[3][3] = -2.30238399920916;
   fWeightMatrix1to2[4][3] = 1.36256024042586;
   fWeightMatrix1to2[5][3] = 0.0063913749502068;
   fWeightMatrix1to2[6][3] = 1.40558137335528;
   fWeightMatrix1to2[7][3] = 0.123850413861893;
   fWeightMatrix1to2[8][3] = -1.55116036575614;
   fWeightMatrix1to2[9][3] = -1.86982912588484;
   fWeightMatrix1to2[10][3] = -0.70162514547721;
   fWeightMatrix1to2[0][4] = 0.343764547674075;
   fWeightMatrix1to2[1][4] = 0.153547153295855;
   fWeightMatrix1to2[2][4] = 0.384308793809121;
   fWeightMatrix1to2[3][4] = -1.14713183693652;
   fWeightMatrix1to2[4][4] = 1.39541236451235;
   fWeightMatrix1to2[5][4] = 0.893159302994126;
   fWeightMatrix1to2[6][4] = -1.35713472105539;
   fWeightMatrix1to2[7][4] = -0.0070776326504347;
   fWeightMatrix1to2[8][4] = 0.645850442396416;
   fWeightMatrix1to2[9][4] = 0.797947340308029;
   fWeightMatrix1to2[10][4] = -1.15202736952151;
   fWeightMatrix1to2[0][5] = -0.625576089234281;
   fWeightMatrix1to2[1][5] = -0.214438776396643;
   fWeightMatrix1to2[2][5] = -0.230940584275387;
   fWeightMatrix1to2[3][5] = -0.824965416981024;
   fWeightMatrix1to2[4][5] = -2.34004198997755;
   fWeightMatrix1to2[5][5] = -1.16086897381527;
   fWeightMatrix1to2[6][5] = -0.748266445515644;
   fWeightMatrix1to2[7][5] = 0.284139532236603;
   fWeightMatrix1to2[8][5] = -2.18167695835874;
   fWeightMatrix1to2[9][5] = 1.56153646521072;
   fWeightMatrix1to2[10][5] = -0.61937622834778;
   fWeightMatrix1to2[0][6] = -1.2073582837893;
   fWeightMatrix1to2[1][6] = 0.978561334614753;
   fWeightMatrix1to2[2][6] = -1.07154638613837;
   fWeightMatrix1to2[3][6] = -1.77448254839759;
   fWeightMatrix1to2[4][6] = 1.79753602818816;
   fWeightMatrix1to2[5][6] = -1.76859239579776;
   fWeightMatrix1to2[6][6] = 1.00209945630467;
   fWeightMatrix1to2[7][6] = -2.44113430028198;
   fWeightMatrix1to2[8][6] = -0.901062289385142;
   fWeightMatrix1to2[9][6] = 0.893172086163808;
   fWeightMatrix1to2[10][6] = -3.27981372053931;
   fWeightMatrix1to2[0][7] = 0.893961395588699;
   fWeightMatrix1to2[1][7] = -0.0693843577310622;
   fWeightMatrix1to2[2][7] = -0.686677669174746;
   fWeightMatrix1to2[3][7] = -1.41780571652859;
   fWeightMatrix1to2[4][7] = -1.53919067269536;
   fWeightMatrix1to2[5][7] = -1.45025279400589;
   fWeightMatrix1to2[6][7] = -0.516301227556979;
   fWeightMatrix1to2[7][7] = -2.0441318449828;
   fWeightMatrix1to2[8][7] = -0.203234398208421;
   fWeightMatrix1to2[9][7] = -1.15252492828019;
   fWeightMatrix1to2[10][7] = 1.11491855545589;
   fWeightMatrix1to2[0][8] = -1.58209444284861;
   fWeightMatrix1to2[1][8] = -0.632185358479382;
   fWeightMatrix1to2[2][8] = 1.16325932498823;
   fWeightMatrix1to2[3][8] = 0.593008027826484;
   fWeightMatrix1to2[4][8] = -1.54373978749226;
   fWeightMatrix1to2[5][8] = 1.53427357395076;
   fWeightMatrix1to2[6][8] = -1.96274131494169;
   fWeightMatrix1to2[7][8] = -0.00432818972496059;
   fWeightMatrix1to2[8][8] = -0.829076652138249;
   fWeightMatrix1to2[9][8] = -1.77134444125618;
   fWeightMatrix1to2[10][8] = 1.17826796436102;
   fWeightMatrix1to2[0][9] = -0.424047094239922;
   fWeightMatrix1to2[1][9] = 1.07454860980701;
   fWeightMatrix1to2[2][9] = 1.68194101866178;
   fWeightMatrix1to2[3][9] = 0.057550582538185;
   fWeightMatrix1to2[4][9] = 1.16303544328372;
   fWeightMatrix1to2[5][9] = -1.17268915623797;
   fWeightMatrix1to2[6][9] = -1.15297146399999;
   fWeightMatrix1to2[7][9] = -0.615762550297186;
   fWeightMatrix1to2[8][9] = -0.608279138618356;
   fWeightMatrix1to2[9][9] = -1.18428181239186;
   fWeightMatrix1to2[10][9] = -0.192562814029631;
   fWeightMatrix1to2[0][10] = 0.746970060260827;
   fWeightMatrix1to2[1][10] = -1.15698457091705;
   fWeightMatrix1to2[2][10] = -2.28601018602562;
   fWeightMatrix1to2[3][10] = 1.10759355586146;
   fWeightMatrix1to2[4][10] = -0.654023370870165;
   fWeightMatrix1to2[5][10] = -1.10487827594082;
   fWeightMatrix1to2[6][10] = -1.10961126893047;
   fWeightMatrix1to2[7][10] = 0.287211933288145;
   fWeightMatrix1to2[8][10] = -0.35842819135862;
   fWeightMatrix1to2[9][10] = -0.787553382676484;
   fWeightMatrix1to2[10][10] = -1.37527792775421;
   fWeightMatrix1to2[0][11] = 0.672790518515214;
   fWeightMatrix1to2[1][11] = -0.351141924840851;
   fWeightMatrix1to2[2][11] = 0.294047588281213;
   fWeightMatrix1to2[3][11] = -1.35152901396517;
   fWeightMatrix1to2[4][11] = -1.56230919316523;
   fWeightMatrix1to2[5][11] = -1.33298278234047;
   fWeightMatrix1to2[6][11] = 0.852154458294011;
   fWeightMatrix1to2[7][11] = 0.971005434381828;
   fWeightMatrix1to2[8][11] = -0.23287190530254;
   fWeightMatrix1to2[9][11] = 0.924399676833471;
   fWeightMatrix1to2[10][11] = 0.33461785350976;
   fWeightMatrix1to2[0][12] = -0.212261832511062;
   fWeightMatrix1to2[1][12] = -0.352726780861324;
   fWeightMatrix1to2[2][12] = -0.54461885143431;
   fWeightMatrix1to2[3][12] = -0.591064756449399;
   fWeightMatrix1to2[4][12] =
1.40752130052956;
   fWeightMatrix1to2[5][12] = 0.927887828367839;
   fWeightMatrix1to2[6][12] = -2.05850109144922;
   fWeightMatrix1to2[7][12] = -2.18849336963214;
   fWeightMatrix1to2[8][12] = -0.0759362223672682;
   fWeightMatrix1to2[9][12] = -0.176251877644131;
   fWeightMatrix1to2[10][12] = 0.45957754740988;

   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = 0.00177632635345756;
   fWeightMatrix2to3[0][1] = -0.389405093586756;
   fWeightMatrix2to3[0][2] = -1.11851274693464;
   fWeightMatrix2to3[0][3] = 0.286975170133167;
   fWeightMatrix2to3[0][4] = 1.25099174064888;
   fWeightMatrix2to3[0][5] = -1.77578080714313;
   fWeightMatrix2to3[0][6] = -1.44483413158511;
   fWeightMatrix2to3[0][7] = 0.405213472770598;
   fWeightMatrix2to3[0][8] = 0.175004213734385;
   fWeightMatrix2to3[0][9] = -0.641453868008828;
   fWeightMatrix2to3[0][10] = -0.762026433162177;
   fWeightMatrix2to3[0][11] = 0.705939387208391;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l