// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6  [198662]
ROOT Release   : 5.26/00  [334336]
Creator        : stdenis
Date           : Sun Jan 1 10:03:43 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run20572/job404
Training events: 48964

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
RandomSeed: "1" [Random Number Seed for TRandom3]
RandomFile: "None" [Random Number input file for TRandom3]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                 Ht                 'F' [47.4487342834,803.330627441]
LepAPt             LepAPt             'F' [20.0000915527,158.247711182]
LepBPt             LepBPt             'F' [10.000123024,71.4863357544]
MetSigLeptonsJets  MetSigLeptonsJets  'F' [0.978936254978,16.7354717255]
MetSpec            MetSpec            'F' [15.0048427582,229.921310425]
SumEtLeptonsJets   SumEtLeptonsJets   'F' [30.1199874878,498.085754395]
VSumJetLeptonsPt   VSumJetLeptonsPt   'F' [1.28708541393,323.660003662]
addEt              addEt              'F' [47.4487342834,409.19140625]
dPhiLepSumMet      dPhiLepSumMet      'F' [0.0808083936572,3.14158177376]
dPhiLeptons        dPhiLeptons        'F' [1.1922662452e-05,1.11774706841]
dRLeptons          dRLeptons          'F' [0.200001657009,1.15372478962]
lep1_E             lep1_E             'F' [20.0080432892,211.499725342]
lep2_E             lep2_E             'F' [10.0116767883,106.10823822]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 13 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "Ht", "LepAPt", "LepBPt",
                                  "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets",
                                  "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet",
                                  "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 47.4487342834473;
      fVmax[0] = 803.330627441406;
      fVmin[1] = 20.0000915527344;
      fVmax[1] = 158.247711181641;
      fVmin[2] = 10.0001230239868;
      fVmax[2] = 71.4863357543945;
      fVmin[3] = 0.97893625497818;
      fVmax[3] = 16.7354717254639;
      fVmin[4] = 15.0048427581787;
      fVmax[4] = 229.921310424805;
      fVmin[5] = 30.119987487793;
      fVmax[5] = 498.085754394531;
      fVmin[6] = 1.2870854139328;
      fVmax[6] = 323.660003662109;
      fVmin[7] = 47.4487342834473;
      fVmax[7] = 409.19140625;
      fVmin[8] = 0.0808083936572075;
      fVmax[8] = 3.14158177375793;
      fVmin[9] = 1.19226624519797e-05;
      fVmax[9] = 1.11774706840515;
      fVmin[10] = 0.200001657009125;
      fVmax[10] = 1.15372478961945;
      fVmin[11] = 20.0080432891846;
      fVmax[11] = 211.499725341797;
      fVmin[12] = 10.0116767883301;
      fVmax[12] = 106.108238220215;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';
      fType[11] = 'F';
      fType[12] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }

   double fVmin[13];
   double fVmax[13];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[13];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.384518168453924;
   fWeightMatrix0to1[1][0] = 2.05017750073871;
   fWeightMatrix0to1[2][0] = 0.895286446998211;
   fWeightMatrix0to1[3][0] = 1.3358842354018;
   fWeightMatrix0to1[4][0] = -0.931453085614545;
   fWeightMatrix0to1[5][0] = -0.839074849224794;
   fWeightMatrix0to1[6][0] = 0.212688979023933;
   fWeightMatrix0to1[7][0] = 1.30365119229911;
   fWeightMatrix0to1[8][0] = -0.85953302064419;
   fWeightMatrix0to1[9][0] = -0.631330893121013;
   fWeightMatrix0to1[10][0] = -1.44872260541735;
   fWeightMatrix0to1[11][0] = -0.046126828310631;
   fWeightMatrix0to1[12][0] = -0.6345956224984;
   fWeightMatrix0to1[13][0] = 0.0789310980344921;
   fWeightMatrix0to1[0][1] = -0.590209410875582;
   fWeightMatrix0to1[1][1] = 0.905108183646924;
   fWeightMatrix0to1[2][1] = 0.767523413061242;
   fWeightMatrix0to1[3][1] = -0.406893026164488;
   fWeightMatrix0to1[4][1] = 0.220471736620598;
   fWeightMatrix0to1[5][1] = 1.31616051359166;
   fWeightMatrix0to1[6][1] = -3.1749908642317;
   fWeightMatrix0to1[7][1] = -1.41829312111282;
   fWeightMatrix0to1[8][1] = 1.14240601288346;
   fWeightMatrix0to1[9][1] = 3.46466266270761;
   fWeightMatrix0to1[10][1] = -1.64558999117655;
   fWeightMatrix0to1[11][1] = -0.741869920270923;
   fWeightMatrix0to1[12][1] = -2.0809412675778;
   fWeightMatrix0to1[13][1] = -4.1801621288158;
   fWeightMatrix0to1[0][2] = -1.84901434875357;
   fWeightMatrix0to1[1][2] = 0.663723829877462;
   fWeightMatrix0to1[2][2] = 1.09883218407162;
   fWeightMatrix0to1[3][2] = -2.2758453904506;
   fWeightMatrix0to1[4][2] = -0.953772798514075;
   fWeightMatrix0to1[5][2] = 0.63123221113543;
   fWeightMatrix0to1[6][2] = -4.20164561437447;
   fWeightMatrix0to1[7][2] = -0.395989255843365;
   fWeightMatrix0to1[8][2] = -1.60401305388288;
   fWeightMatrix0to1[9][2] = 2.84747731186386;
   fWeightMatrix0to1[10][2] = 1.13237907579509;
   fWeightMatrix0to1[11][2] = 1.87664207299766;
   fWeightMatrix0to1[12][2] = -0.99764553675846;
   fWeightMatrix0to1[13][2] = -3.43947102539275;
   fWeightMatrix0to1[0][3] = 1.92309772457875;
   fWeightMatrix0to1[1][3] = 1.2056389568564;
   fWeightMatrix0to1[2][3] = -0.0751739260086241;
   fWeightMatrix0to1[3][3] = -1.79006344352199;
   fWeightMatrix0to1[4][3] = 1.07531579826215;
   fWeightMatrix0to1[5][3] = -0.414106706336321;
   fWeightMatrix0to1[6][3] = 4.30443326533421;
   fWeightMatrix0to1[7][3] = 1.37936971077964;
   fWeightMatrix0to1[8][3] = 2.38311930390652;
   fWeightMatrix0to1[9][3] = 1.25140467592866;
   fWeightMatrix0to1[10][3] = -1.82099099182075;
   fWeightMatrix0to1[11][3] = 1.53787833699531;
   fWeightMatrix0to1[12][3] = -3.41075443165993;
   fWeightMatrix0to1[13][3] = -1.59133486849284;
   fWeightMatrix0to1[0][4] = -1.29458858421469;
   fWeightMatrix0to1[1][4] = -1.60194620724724;
   fWeightMatrix0to1[2][4] = 1.87447775396859;
   fWeightMatrix0to1[3][4] = 0.0902258495219737;
   fWeightMatrix0to1[4][4] = -0.0464668839548279;
   fWeightMatrix0to1[5][4] = 1.16398671946133;
   fWeightMatrix0to1[6][4] = 1.4046541758254;
fWeightMatrix0to1[7][4] = 0.192305344086452; fWeightMatrix0to1[8][4] = 2.99057805646921; fWeightMatrix0to1[9][4] = 1.34539801262473; fWeightMatrix0to1[10][4] = -0.278549723000138; fWeightMatrix0to1[11][4] = -1.36886621793119; fWeightMatrix0to1[12][4] = -1.28112549510166; fWeightMatrix0to1[13][4] = -2.76146848227894; fWeightMatrix0to1[0][5] = -0.800474878767562; fWeightMatrix0to1[1][5] = -1.35242433072672; fWeightMatrix0to1[2][5] = 0.629282351246921; fWeightMatrix0to1[3][5] = 0.651030649164804; fWeightMatrix0to1[4][5] = 2.46938429109778; fWeightMatrix0to1[5][5] = 0.430493094312855; fWeightMatrix0to1[6][5] = -1.52567902213586; fWeightMatrix0to1[7][5] = -2.55088350278025; fWeightMatrix0to1[8][5] = 1.44698542267657; fWeightMatrix0to1[9][5] = 0.868965276871185; fWeightMatrix0to1[10][5] = 2.18622528905614; fWeightMatrix0to1[11][5] = 1.09566735170052; fWeightMatrix0to1[12][5] = 2.03545164738102; fWeightMatrix0to1[13][5] = 0.0176186121171554; fWeightMatrix0to1[0][6] = -0.916137434471433; fWeightMatrix0to1[1][6] = -0.0456389255607185; fWeightMatrix0to1[2][6] = 1.21650398612006; fWeightMatrix0to1[3][6] = -1.04162067298446; fWeightMatrix0to1[4][6] = -0.302842282183567; fWeightMatrix0to1[5][6] = -0.838900495420895; fWeightMatrix0to1[6][6] = -0.270738444910457; fWeightMatrix0to1[7][6] = -0.757034479134646; fWeightMatrix0to1[8][6] = 2.15372210828543; fWeightMatrix0to1[9][6] = -0.372927808409513; fWeightMatrix0to1[10][6] = -0.59860608234627; fWeightMatrix0to1[11][6] = 1.78474900061493; fWeightMatrix0to1[12][6] = -1.50296880759296; fWeightMatrix0to1[13][6] = -2.85561468830216; fWeightMatrix0to1[0][7] = -1.42212473717041; fWeightMatrix0to1[1][7] = 0.43669005588157; fWeightMatrix0to1[2][7] = -0.872706571714756; fWeightMatrix0to1[3][7] = -5.57963444712366; fWeightMatrix0to1[4][7] = 1.37313918893548; fWeightMatrix0to1[5][7] = 2.21699975564022; fWeightMatrix0to1[6][7] = -4.57977925396691; fWeightMatrix0to1[7][7] = 1.25773949355957; fWeightMatrix0to1[8][7] = 0.0305716013388548; fWeightMatrix0to1[9][7] = 3.35290555011612; fWeightMatrix0to1[10][7] = 0.365988201411556; fWeightMatrix0to1[11][7] = 0.299137797771323; fWeightMatrix0to1[12][7] = -6.30793975647747; fWeightMatrix0to1[13][7] = -5.21634351086644; fWeightMatrix0to1[0][8] = 0.421502804735491; fWeightMatrix0to1[1][8] = -0.123722805728233; fWeightMatrix0to1[2][8] = -0.906042883309314; fWeightMatrix0to1[3][8] = 2.39655903857478; fWeightMatrix0to1[4][8] = -0.63837875213711; fWeightMatrix0to1[5][8] = -1.76148790290032; fWeightMatrix0to1[6][8] = 3.62268118950737; fWeightMatrix0to1[7][8] = -0.368846947746275; fWeightMatrix0to1[8][8] = -1.58497335313057; fWeightMatrix0to1[9][8] = 0.764614019018317; fWeightMatrix0to1[10][8] = -1.32769682424972; fWeightMatrix0to1[11][8] = 0.848586270367534; fWeightMatrix0to1[12][8] = -0.43283748665905; fWeightMatrix0to1[13][8] = -0.146232994792004; fWeightMatrix0to1[0][9] = -0.548551389108057; fWeightMatrix0to1[1][9] = -0.052539268873441; fWeightMatrix0to1[2][9] = -1.9143977791228; fWeightMatrix0to1[3][9] = -1.58168740273028; fWeightMatrix0to1[4][9] = 0.162721739803899; fWeightMatrix0to1[5][9] = -0.679493136449734; fWeightMatrix0to1[6][9] = 1.17793476841165; fWeightMatrix0to1[7][9] = -2.00378929518923; fWeightMatrix0to1[8][9] = -0.755035488218932; fWeightMatrix0to1[9][9] = -0.946606426546533; fWeightMatrix0to1[10][9] = 1.2651022783707; fWeightMatrix0to1[11][9] = -0.99177694668202; fWeightMatrix0to1[12][9] = 0.229737591579874; fWeightMatrix0to1[13][9] = -0.358046015439717; fWeightMatrix0to1[0][10] = 1.40333398060347; 
fWeightMatrix0to1[1][10] = -0.375302813712139; fWeightMatrix0to1[2][10] = -2.30295472296512; fWeightMatrix0to1[3][10] = -2.82317487170346; fWeightMatrix0to1[4][10] = -1.35063804080125; fWeightMatrix0to1[5][10] = 2.07782713988607; fWeightMatrix0to1[6][10] = -0.650438270115102; fWeightMatrix0to1[7][10] = -1.52202100609818; fWeightMatrix0to1[8][10] = -1.44286965848468; fWeightMatrix0to1[9][10] = -0.53975858939036; fWeightMatrix0to1[10][10] = -2.17366175712953; fWeightMatrix0to1[11][10] = 1.09978338941767; fWeightMatrix0to1[12][10] = 1.78887256102816; fWeightMatrix0to1[13][10] = -0.957126388565485; fWeightMatrix0to1[0][11] = -0.510657492891367; fWeightMatrix0to1[1][11] = -0.230402219528048; fWeightMatrix0to1[2][11] = 1.71253440238144; fWeightMatrix0to1[3][11] = 2.01173947576686; fWeightMatrix0to1[4][11] = 1.42339056944207; fWeightMatrix0to1[5][11] = -1.56334146080138; fWeightMatrix0to1[6][11] = 1.00690834823915; fWeightMatrix0to1[7][11] = 0.523603664536145; fWeightMatrix0to1[8][11] = 1.15331656603322; fWeightMatrix0to1[9][11] = 1.76342219260058; fWeightMatrix0to1[10][11] = -0.804872275470562; fWeightMatrix0to1[11][11] = -0.094637269597526; fWeightMatrix0to1[12][11] = -0.732575472927072; fWeightMatrix0to1[13][11] = 2.12302361873437; fWeightMatrix0to1[0][12] = -0.857560680703985; fWeightMatrix0to1[1][12] = -0.736081290818457; fWeightMatrix0to1[2][12] = 1.51112430213824; fWeightMatrix0to1[3][12] = 2.81588943932078; fWeightMatrix0to1[4][12] = -0.164473595583183; fWeightMatrix0to1[5][12] = 2.08768592534905; fWeightMatrix0to1[6][12] = 2.19367888277467; fWeightMatrix0to1[7][12] = -1.7533426025299; fWeightMatrix0to1[8][12] = 0.61603546640121; fWeightMatrix0to1[9][12] = -0.0384169712368255; fWeightMatrix0to1[10][12] = 0.940664370034667; fWeightMatrix0to1[11][12] = -0.311046378965734; fWeightMatrix0to1[12][12] = -1.52999650500456; fWeightMatrix0to1[13][12] = 1.72555224128734; fWeightMatrix0to1[0][13] = 1.99719026393579; fWeightMatrix0to1[1][13] = -1.35339864325843; fWeightMatrix0to1[2][13] = 0.800217122884789; fWeightMatrix0to1[3][13] = -6.98173870847522; fWeightMatrix0to1[4][13] = 0.226890793190961; fWeightMatrix0to1[5][13] = 2.03827906276293; fWeightMatrix0to1[6][13] = -9.46362819465555; fWeightMatrix0to1[7][13] = 0.815411147602361; fWeightMatrix0to1[8][13] = -0.291621684784185; fWeightMatrix0to1[9][13] = 7.61735427386472; fWeightMatrix0to1[10][13] = 1.894098636709; fWeightMatrix0to1[11][13] = 0.646306641303357; fWeightMatrix0to1[12][13] = -7.53882014974491; fWeightMatrix0to1[13][13] = -11.1594061005174; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -0.269112827657256; fWeightMatrix1to2[1][0] = -0.303730860843752; fWeightMatrix1to2[2][0] = -0.437481057575315; fWeightMatrix1to2[3][0] = -1.74165205613865; fWeightMatrix1to2[4][0] = -0.622545030538181; fWeightMatrix1to2[5][0] = -1.49448063805734; fWeightMatrix1to2[6][0] = -0.312868052439175; fWeightMatrix1to2[7][0] = -2.09554798800038; fWeightMatrix1to2[8][0] = 0.0249437157490187; fWeightMatrix1to2[9][0] = -0.120188089588333; fWeightMatrix1to2[10][0] = -1.3964461103225; fWeightMatrix1to2[11][0] = -0.113668005179371; fWeightMatrix1to2[12][0] = -0.304394701233186; fWeightMatrix1to2[0][1] = 1.56770272786944; fWeightMatrix1to2[1][1] = 0.556405944740383; fWeightMatrix1to2[2][1] = -0.516548548181066; fWeightMatrix1to2[3][1] = -2.00180575497043; fWeightMatrix1to2[4][1] = -0.896760589422227; fWeightMatrix1to2[5][1] = 1.74954879714548; fWeightMatrix1to2[6][1] = 0.889078941656323; fWeightMatrix1to2[7][1] = 0.896672689711104; fWeightMatrix1to2[8][1] = 
0.139663696442014; fWeightMatrix1to2[9][1] = 2.01064449130109; fWeightMatrix1to2[10][1] = -1.6332984015684; fWeightMatrix1to2[11][1] = -1.18279886454101; fWeightMatrix1to2[12][1] = 0.151572420440069; fWeightMatrix1to2[0][2] = -1.6860320974317; fWeightMatrix1to2[1][2] = -0.103867249606972; fWeightMatrix1to2[2][2] = 2.38992008592804; fWeightMatrix1to2[3][2] = -1.17614479956194; fWeightMatrix1to2[4][2] = 0.572711232928636; fWeightMatrix1to2[5][2] = -0.11476734355857; fWeightMatrix1to2[6][2] = -2.88042039521434; fWeightMatrix1to2[7][2] = -1.17796563346328; fWeightMatrix1to2[8][2] = 0.992241627194496; fWeightMatrix1to2[9][2] = -0.400469712153945; fWeightMatrix1to2[10][2] = 0.642308687047926; fWeightMatrix1to2[11][2] = -1.47960217276273; fWeightMatrix1to2[12][2] = 1.04053983746391; fWeightMatrix1to2[0][3] = -3.0604644239133; fWeightMatrix1to2[1][3] = 1.00109794594111; fWeightMatrix1to2[2][3] = 0.00473760332093702; fWeightMatrix1to2[3][3] = -1.52079323258921; fWeightMatrix1to2[4][3] = -1.28252973954045; fWeightMatrix1to2[5][3] = -2.06762600160037; fWeightMatrix1to2[6][3] = -0.40274015450527; fWeightMatrix1to2[7][3] = -1.44891286917384; fWeightMatrix1to2[8][3] = -1.53512542492403; fWeightMatrix1to2[9][3] = -1.35987964495015; fWeightMatrix1to2[10][3] = 0.316640385783149; fWeightMatrix1to2[11][3] = -0.491429211060966; fWeightMatrix1to2[12][3] = -1.80940029756989; fWeightMatrix1to2[0][4] = 1.74634896368214; fWeightMatrix1to2[1][4] = -1.97544408283607; fWeightMatrix1to2[2][4] = -0.35157399011528; fWeightMatrix1to2[3][4] = -1.67458990038529; fWeightMatrix1to2[4][4] = 0.171338721685468; fWeightMatrix1to2[5][4] = 0.667330903947874; fWeightMatrix1to2[6][4] = -0.238986721270938; fWeightMatrix1to2[7][4] = -0.385427612665709; fWeightMatrix1to2[8][4] = -2.86041302096779; fWeightMatrix1to2[9][4] = -1.24374757465134; fWeightMatrix1to2[10][4] = -1.40176087882028; fWeightMatrix1to2[11][4] = -1.46876383957493; fWeightMatrix1to2[12][4] = -0.646649824070511; fWeightMatrix1to2[0][5] = 2.06378523793428; fWeightMatrix1to2[1][5] = 0.51557966913576; fWeightMatrix1to2[2][5] = -1.1528086804759; fWeightMatrix1to2[3][5] = -0.219987830337705; fWeightMatrix1to2[4][5] = -0.623968607156926; fWeightMatrix1to2[5][5] = -2.79324058675188; fWeightMatrix1to2[6][5] = -0.507075946246815; fWeightMatrix1to2[7][5] = 0.797681857895551; fWeightMatrix1to2[8][5] = -2.12390661007057; fWeightMatrix1to2[9][5] = -1.61165868323013; fWeightMatrix1to2[10][5] = 0.108146092466755; fWeightMatrix1to2[11][5] = 1.05433536779896; fWeightMatrix1to2[12][5] = -1.08045036551558; fWeightMatrix1to2[0][6] = -3.17810008891168; fWeightMatrix1to2[1][6] = -0.736122437447294; fWeightMatrix1to2[2][6] = 1.52479415405427; fWeightMatrix1to2[3][6] = 0.480108056358458; fWeightMatrix1to2[4][6] = 1.05781275594428; fWeightMatrix1to2[5][6] = -1.62394021575524; fWeightMatrix1to2[6][6] = -1.02254667112255; fWeightMatrix1to2[7][6] = -0.0170305581558509; fWeightMatrix1to2[8][6] = -2.83174382917379; fWeightMatrix1to2[9][6] = 0.547764332069215; fWeightMatrix1to2[10][6] = -2.80833075121169; fWeightMatrix1to2[11][6] = -1.24014148776926; fWeightMatrix1to2[12][6] = -3.45020298204048; fWeightMatrix1to2[0][7] = 4.15962206123133; fWeightMatrix1to2[1][7] = -2.50258682856847; fWeightMatrix1to2[2][7] = 0.709670408090245; fWeightMatrix1to2[3][7] = -1.30826918528815; fWeightMatrix1to2[4][7] = 1.20581937212897; fWeightMatrix1to2[5][7] = -1.7347410190165; fWeightMatrix1to2[6][7] = 0.0889832849199333; fWeightMatrix1to2[7][7] = 0.694916608581676; fWeightMatrix1to2[8][7] = -1.87937446520266; 
fWeightMatrix1to2[9][7] = -0.352859362418951; fWeightMatrix1to2[10][7] = -0.0564748506442067; fWeightMatrix1to2[11][7] = -1.92238057775796; fWeightMatrix1to2[12][7] = -3.0102055162004; fWeightMatrix1to2[0][8] = -2.12858780856866; fWeightMatrix1to2[1][8] = -1.50811435359166; fWeightMatrix1to2[2][8] = 2.687503354458; fWeightMatrix1to2[3][8] = -0.56226504088113; fWeightMatrix1to2[4][8] = 0.0502844028980358; fWeightMatrix1to2[5][8] = -0.718470541483956; fWeightMatrix1to2[6][8] = -1.99840394964911; fWeightMatrix1to2[7][8] = -1.97221903450606; fWeightMatrix1to2[8][8] = -1.01252183078778; fWeightMatrix1to2[9][8] = -1.93413283973042; fWeightMatrix1to2[10][8] = 0.571180208122544; fWeightMatrix1to2[11][8] = -1.73474029524351; fWeightMatrix1to2[12][8] = -1.06597238088037; fWeightMatrix1to2[0][9] = 5.49801188355206; fWeightMatrix1to2[1][9] = -0.431544863020875; fWeightMatrix1to2[2][9] = -0.436214954619637; fWeightMatrix1to2[3][9] = 2.02277296561514; fWeightMatrix1to2[4][9] = -1.79290966765282; fWeightMatrix1to2[5][9] = 1.37547914113295; fWeightMatrix1to2[6][9] = 2.5653312818085; fWeightMatrix1to2[7][9] = -0.834067080128595; fWeightMatrix1to2[8][9] = -1.25965815755966; fWeightMatrix1to2[9][9] = -0.788916109085427; fWeightMatrix1to2[10][9] = -1.62870584653442; fWeightMatrix1to2[11][9] = 0.413556050560529; fWeightMatrix1to2[12][9] = 1.86552580610711; fWeightMatrix1to2[0][10] = 2.44017287090161; fWeightMatrix1to2[1][10] = 0.0881852589898708; fWeightMatrix1to2[2][10] = -2.59952722345126; fWeightMatrix1to2[3][10] = 0.694958354204109; fWeightMatrix1to2[4][10] = -1.74623144101404; fWeightMatrix1to2[5][10] = -1.25878446780839; fWeightMatrix1to2[6][10] = -1.72326757776544; fWeightMatrix1to2[7][10] = -1.55217615990605; fWeightMatrix1to2[8][10] = -1.49667124530103; fWeightMatrix1to2[9][10] = 0.611498306037643; fWeightMatrix1to2[10][10] = -0.35717989401563; fWeightMatrix1to2[11][10] = 1.0984740261888; fWeightMatrix1to2[12][10] = -1.51127671882008; fWeightMatrix1to2[0][11] = 1.15624955590535; fWeightMatrix1to2[1][11] = -1.950968670765; fWeightMatrix1to2[2][11] = -1.07483359147246; fWeightMatrix1to2[3][11] = 1.57437131049451; fWeightMatrix1to2[4][11] = -1.59358933620428; fWeightMatrix1to2[5][11] = -0.860393149568731; fWeightMatrix1to2[6][11] = -1.82470937741621; fWeightMatrix1to2[7][11] = -0.752370520065714; fWeightMatrix1to2[8][11] = -1.26569478773758; fWeightMatrix1to2[9][11] = -0.482428508527012; fWeightMatrix1to2[10][11] = -1.60470992589843; fWeightMatrix1to2[11][11] = 1.01910009336439; fWeightMatrix1to2[12][11] = 0.222427829440432; fWeightMatrix1to2[0][12] = -4.38631647581322; fWeightMatrix1to2[1][12] = -0.852791737617273; fWeightMatrix1to2[2][12] = -1.70120039463888; fWeightMatrix1to2[3][12] = -2.32751129055311; fWeightMatrix1to2[4][12] = -1.76477624842108; fWeightMatrix1to2[5][12] = -0.577555055734616; fWeightMatrix1to2[6][12] = -3.57003009188193; fWeightMatrix1to2[7][12] = -1.54996339621223; fWeightMatrix1to2[8][12] = -0.887365268394173; fWeightMatrix1to2[9][12] = -2.82466813161805; fWeightMatrix1to2[10][12] = -1.26549270970736; fWeightMatrix1to2[11][12] = -1.4729044642718; fWeightMatrix1to2[12][12] = -0.498679388090051; fWeightMatrix1to2[0][13] = -5.61550655149999; fWeightMatrix1to2[1][13] = -2.3867990655003; fWeightMatrix1to2[2][13] = -1.43318125492643; fWeightMatrix1to2[3][13] = -1.68681685049964; fWeightMatrix1to2[4][13] = -1.57692771556456; fWeightMatrix1to2[5][13] = 0.103659460853635; fWeightMatrix1to2[6][13] = -4.62116504506919; fWeightMatrix1to2[7][13] = -1.06973885106692; fWeightMatrix1to2[8][13] = 
   0.596108424170737;
   fWeightMatrix1to2[9][13] = 0.226540116510001;
   fWeightMatrix1to2[10][13] = 0.714786659933145;
   fWeightMatrix1to2[11][13] = -0.00488901972592253;
   fWeightMatrix1to2[12][13] = 1.28721752852172;
   fWeightMatrix1to2[0][14] = 3.29242011610731;
   fWeightMatrix1to2[1][14] = -1.19394210774177;
   fWeightMatrix1to2[2][14] = -2.25572158403812;
   fWeightMatrix1to2[3][14] = -1.06301782061666;
   fWeightMatrix1to2[4][14] = -1.67127603829715;
   fWeightMatrix1to2[5][14] = -0.838819315430104;
   fWeightMatrix1to2[6][14] = 0.28345217810911;
   fWeightMatrix1to2[7][14] = -0.067786532415075;
   fWeightMatrix1to2[8][14] = -0.432375858755326;
   fWeightMatrix1to2[9][14] = -0.0981388969024984;
   fWeightMatrix1to2[10][14] = 0.780603883622621;
   fWeightMatrix1to2[11][14] = -1.40483496795638;
   fWeightMatrix1to2[12][14] = -0.256624697024381;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = 0.736582114887075;
   fWeightMatrix2to3[0][1] = 7.89216552664885;
   fWeightMatrix2to3[0][2] = -1.48035063133148;
   fWeightMatrix2to3[0][3] = -0.930505294808271;
   fWeightMatrix2to3[0][4] = 0.998437815544713;
   fWeightMatrix2to3[0][5] = 9.88319189896665;
   fWeightMatrix2to3[0][6] = -1.63216282276711;
   fWeightMatrix2to3[0][7] = 1.67369236323486;
   fWeightMatrix2to3[0][8] = 3.52338949870743;
   fWeightMatrix2to3[0][9] = 1.09122283296275;
   fWeightMatrix2to3[0][10] = 1.06926280828436;
   fWeightMatrix2to3[0][11] = 0.779363972364402;
   fWeightMatrix2to3[0][12] = 6.48816292393782;
   fWeightMatrix2to3[0][13] = -0.0828216241489724;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   // reset the per-layer node values; the last node of the input and of each
   // hidden layer is the bias node and is set to 1
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i] = 0;

   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1] = 1;

   for (int i=0; i<fLayerSize[0]-1; i++)
      fWeights[0][i] = inputValues[i];

   // layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++) {
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      }
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++) {
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      }
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // layer 2 to 3 (output node; no activation applied here)
   for (int o=0; o<fLayerSize[3]; o++) {
      for (int i=0; i<fLayerSize[2]; i++) {
         fWeights[3][o] += fWeightMatrix2to3[o][i] * fWeights[2][i];
      }
   }

   return fWeights[3][0];
}

inline double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   // sigmoid (NeuronType: "sigmoid")
   return 1.0/(1.0+exp(-x));
}

inline void ReadH6AONN5MEMLP::Clear()
{
   // clean up the layer buffers allocated in Initialize()
   for (int l=0; l<fLayers; l++) delete[] fWeights[l];
}
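// --------------------------------------------------------------------------
// Usage sketch (not part of the generated class). A minimal example of how a
// standalone TMVA reader like this one is typically driven: the variable
// names must be supplied in the exact training order listed in the header,
// and the reader then normalises each input to [-1, 1] internally via
// NormVariable() before evaluating the network. The event values below are
// hypothetical placeholders, and the preprocessor guard is added here only so
// that the example does not interfere with normal use of this file.
// --------------------------------------------------------------------------
#ifdef READH6AONN5MEMLP_EXAMPLE_MAIN
int main()
{
   // training variables, in the order listed in the header above
   const char* names[13] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets",
                             "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt",
                             "addEt", "dPhiLepSumMet", "dPhiLeptons",
                             "dRLeptons", "lep1_E", "lep2_E" };
   std::vector<std::string> inputVars( names, names + 13 );

   // the constructor cross-checks the names against the training setup
   ReadH6AONN5MEMLP reader( inputVars );

   // hypothetical event, one value per variable in the same order
   double event[13] = { 150.0, 45.0, 20.0, 3.5, 40.0, 120.0, 60.0,
                        110.0, 2.0, 0.5, 0.8, 60.0, 25.0 };
   std::vector<double> inputValues( event, event + 13 );

   // classifier response for this event
   std::cout << "H6AONN5MEMLP response: " << reader.GetMvaValue( inputValues ) << std::endl;
   return 0;
}
#endif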