// Class: ReadH6AONN5MEMLP // Automatically generated by MethodBase::MakeClass // /* configuration options ===================================================== #GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*- Method : MLP::H6AONN5MEMLP TMVA Release : 3.8.6 [198662] ROOT Release : 5.26/00 [334336] Creator : stdenis Date : Sun Jan 1 10:04:21 2012 Host : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux Dir : /data/cdf04/stdenis/batch/run20572/job403 Training events: 49290 #OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*- # Set by User: V: "False" [Verbose mode] NCycles: "1000" [Number of training cycles] HiddenLayers: "N+1,N" [Specification of hidden layer architecture] # Default: D: "False" [use-decorrelated-variables flag (depreciated)] Normalise: "True" [Normalise input variables] VarTransform: "None" [Variable transformation method] VarTransformType: "Signal" [Use signal or background events for var transform] NbinsMVAPdf: "60" [Number of bins used to create MVA PDF] NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF] VerboseLevel: "Info" [Verbosity level] H: "False" [Print classifier-specific help message] CreateMVAPdfs: "False" [Create PDFs for classifier outputs] TxtWeightFilesOnly: "True" [if True, write all weights as text files] NeuronType: "sigmoid" [Neuron activation function type] NeuronInputType: "sum" [Neuron input function type] RandomSeed: "1" [Random Number Seed for TRandom3] RandomFile: "None" [Random Number input file for TRandom3] TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)] LearningRate: "0.02" [ANN learning rate parameter] DecayRate: "0.01" [Decay rate for learning parameter] TestRate: "10" [Test for overtraining performed at each #th epochs] BPMode: "sequential" [Back-propagation learning mode: sequential or batch] BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events] ## #VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*- NVar 13 Ht Ht 'F' [46.4561080933,776.884460449] LepAPt LepAPt 'F' [20.0001068115,174.934341431] LepBPt LepBPt 'F' [10.0002098083,71.4863357544] MetSigLeptonsJets MetSigLeptonsJets 'F' [0.976250171661,16.9346866608] MetSpec MetSpec 'F' [15.0218133926,218.598831177] SumEtLeptonsJets SumEtLeptonsJets 'F' [30.113155365,520.010314941] VSumJetLeptonsPt VSumJetLeptonsPt 'F' [2.80723452568,299.029327393] addEt addEt 'F' [46.4561080933,388.558624268] dPhiLepSumMet dPhiLepSumMet 'F' [0.0435492135584,3.14158964157] dPhiLeptons dPhiLeptons 'F' [1.1922662452e-05,1.11657130718] dRLeptons dRLeptons 'F' [0.200001657009,1.15372478962] lep1_E lep1_E 'F' [20.0262546539,248.743759155] lep2_E lep2_E 'F' [10.0065498352,114.225990295] ============================================================================ */ #include #include #include #include #ifndef IClassifierReader__def #define IClassifierReader__def class IClassifierReader { public: // constructor IClassifierReader() {} virtual ~IClassifierReader() {} // return classifier response virtual double GetMvaValue( const std::vector& inputValues ) const = 0; }; #endif class ReadH6AONN5MEMLP : public IClassifierReader { public: // constructor ReadH6AONN5MEMLP( std::vector& theInputVars ) : IClassifierReader(), fClassName( "ReadH6AONN5MEMLP" ), fStatusIsClean( true ), fNvars( 13 ), fIsNormalised( true ) { // the training input variables const char* inputVars[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 46.4561080932617; fVmax[0] = 776.884460449219; fVmin[1] = 20.0001068115234; fVmax[1] = 174.934341430664; fVmin[2] = 10.0002098083496; fVmax[2] = 71.4863357543945; fVmin[3] = 0.976250171661377; fVmax[3] = 16.9346866607666; fVmin[4] = 15.0218133926392; fVmax[4] = 218.598831176758; fVmin[5] = 30.1131553649902; fVmax[5] = 520.010314941406; fVmin[6] = 2.80723452568054; fVmax[6] = 299.029327392578; fVmin[7] = 46.4561080932617; fVmax[7] = 388.558624267578; fVmin[8] = 0.0435492135584354; fVmax[8] = 3.14158964157104; fVmin[9] = 1.19226624519797e-05; fVmax[9] = 1.11657130718231; fVmin[10] = 0.200001657009125; fVmax[10] = 1.15372478961945; fVmin[11] = 20.0262546539307; fVmax[11] = 248.743759155273; fVmin[12] = 10.0065498352051; fVmax[12] = 114.22599029541; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void Initialize(); double GetMvaValue__( const std::vector& inputValues ) const; // private members (method specific) double ActivationFnc(double x) const; int fLayers; int fLayerSize[4]; double fWeightMatrix0to1[15][14]; // weight matrix from layer 0 to 1 double fWeightMatrix1to2[14][15]; // weight matrix from layer 1 to 2 double fWeightMatrix2to3[1][14]; // weight matrix from layer 2 to 3 double * fWeights[4]; }; inline void ReadH6AONN5MEMLP::Initialize() { // build network structure fLayers = 4; fLayerSize[0] = 14; fWeights[0] = new double[14]; fLayerSize[1] = 15; fWeights[1] = new double[15]; fLayerSize[2] = 14; fWeights[2] = new double[14]; fLayerSize[3] = 1; fWeights[3] = new double[1]; // weight matrix from layer 0 to 1 fWeightMatrix0to1[0][0] = -0.369124769736657; fWeightMatrix0to1[1][0] = 2.11273049002197; fWeightMatrix0to1[2][0] = 0.0519990498394871; fWeightMatrix0to1[3][0] = 1.77342034881389; fWeightMatrix0to1[4][0] = -2.37697583910147; fWeightMatrix0to1[5][0] = -1.06178109263406; fWeightMatrix0to1[6][0] = 0.0642534669437345; fWeightMatrix0to1[7][0] = 1.29649363363074; fWeightMatrix0to1[8][0] = -1.52365823456812; fWeightMatrix0to1[9][0] = -0.529252229245596; fWeightMatrix0to1[10][0] = -0.951091602622116; fWeightMatrix0to1[11][0] = -0.257985179504682; fWeightMatrix0to1[12][0] = -1.46961722865426; fWeightMatrix0to1[13][0] = -0.820685661039019; fWeightMatrix0to1[0][1] = -0.566636010196901; fWeightMatrix0to1[1][1] = 1.44439111424839; fWeightMatrix0to1[2][1] = -0.557029275685777; fWeightMatrix0to1[3][1] = 1.84075526080494; fWeightMatrix0to1[4][1] = -0.195703740000762; fWeightMatrix0to1[5][1] = 1.43907628514872; fWeightMatrix0to1[6][1] = -3.3850954224814; fWeightMatrix0to1[7][1] = 3.43230620021318; fWeightMatrix0to1[8][1] = 1.73198130038814; fWeightMatrix0to1[9][1] = 2.20557403592191; fWeightMatrix0to1[10][1] = -4.3551685301018; fWeightMatrix0to1[11][1] = 0.796674691836282; fWeightMatrix0to1[12][1] = 1.05525836430286; fWeightMatrix0to1[13][1] = -1.21378327619649; fWeightMatrix0to1[0][2] = -1.88803193193427; fWeightMatrix0to1[1][2] = 1.07761578093068; fWeightMatrix0to1[2][2] = -0.311453018341957; fWeightMatrix0to1[3][2] = 1.28980523699198; fWeightMatrix0to1[4][2] = -0.420493872975624; fWeightMatrix0to1[5][2] = -0.553380356357214; fWeightMatrix0to1[6][2] = -4.3529851660895; fWeightMatrix0to1[7][2] = 2.98338468036319; fWeightMatrix0to1[8][2] = -2.10216208076194; fWeightMatrix0to1[9][2] = 3.9775304588028; fWeightMatrix0to1[10][2] = -2.02662441748187; fWeightMatrix0to1[11][2] = 1.77059355086353; fWeightMatrix0to1[12][2] = 0.615179617743778; fWeightMatrix0to1[13][2] = 0.454402219682577; fWeightMatrix0to1[0][3] = 1.87499999674399; fWeightMatrix0to1[1][3] = 1.37179918406355; fWeightMatrix0to1[2][3] = -0.660455817822156; fWeightMatrix0to1[3][3] = -1.45323574452187; fWeightMatrix0to1[4][3] = 0.5687759777665; fWeightMatrix0to1[5][3] = 0.164208474907549; fWeightMatrix0to1[6][3] = 6.01712212905597; fWeightMatrix0to1[7][3] = 1.17735756260684; fWeightMatrix0to1[8][3] = 1.55202331796003; fWeightMatrix0to1[9][3] = 1.88112322188792; fWeightMatrix0to1[10][3] = -0.279020427365459; fWeightMatrix0to1[11][3] = 1.36100720531134; fWeightMatrix0to1[12][3] = -1.81275722551071; fWeightMatrix0to1[13][3] = -0.871552060542297; fWeightMatrix0to1[0][4] = -1.32601201941471; fWeightMatrix0to1[1][4] = -1.27312713425256; fWeightMatrix0to1[2][4] = 1.36970633619396; fWeightMatrix0to1[3][4] = 0.843132087147121; fWeightMatrix0to1[4][4] = -1.87928110516742; fWeightMatrix0to1[5][4] = 0.182914918867212; fWeightMatrix0to1[6][4] = 1.63239847158164; fWeightMatrix0to1[7][4] = 2.90392003260975; fWeightMatrix0to1[8][4] = 2.3672694488504; fWeightMatrix0to1[9][4] = 0.892799938306604; fWeightMatrix0to1[10][4] = -1.08531791808797; fWeightMatrix0to1[11][4] = -0.923367972532844; fWeightMatrix0to1[12][4] = 0.504950324949527; fWeightMatrix0to1[13][4] = -0.922687338655; fWeightMatrix0to1[0][5] = -0.777445697910653; fWeightMatrix0to1[1][5] = -1.37979725614983; fWeightMatrix0to1[2][5] = -0.0949861601943723; fWeightMatrix0to1[3][5] = 1.12664315994727; fWeightMatrix0to1[4][5] = 0.935271598664855; fWeightMatrix0to1[5][5] = 0.166047278044636; fWeightMatrix0to1[6][5] = -1.1973663355251; fWeightMatrix0to1[7][5] = -3.70290007726989; fWeightMatrix0to1[8][5] = 0.665235021330902; fWeightMatrix0to1[9][5] = 0.626117397119349; fWeightMatrix0to1[10][5] = 3.15085613330479; fWeightMatrix0to1[11][5] = 0.745873472250576; fWeightMatrix0to1[12][5] = 0.714171199470985; fWeightMatrix0to1[13][5] = -1.28002122186739; fWeightMatrix0to1[0][6] = -0.937229245532783; fWeightMatrix0to1[1][6] = 0.242143563151242; fWeightMatrix0to1[2][6] = 1.26516141837096; fWeightMatrix0to1[3][6] = -1.04572897487917; fWeightMatrix0to1[4][6] = -1.8367626615107; fWeightMatrix0to1[5][6] = -2.04221784998238; fWeightMatrix0to1[6][6] = -2.12802651227451; fWeightMatrix0to1[7][6] = 2.44081382030586; fWeightMatrix0to1[8][6] = 2.324637817279; fWeightMatrix0to1[9][6] = -0.509477781292046; fWeightMatrix0to1[10][6] = -2.52757534218735; fWeightMatrix0to1[11][6] = 2.11572534083939; fWeightMatrix0to1[12][6] = -0.982201042938349; fWeightMatrix0to1[13][6] = -1.70529188021802; fWeightMatrix0to1[0][7] = -1.37165776983369; fWeightMatrix0to1[1][7] = 1.12174924004398; fWeightMatrix0to1[2][7] = -3.14390417141309; fWeightMatrix0to1[3][7] = -1.68324870362782; fWeightMatrix0to1[4][7] = 0.539923772883308; fWeightMatrix0to1[5][7] = 1.57981840010138; fWeightMatrix0to1[6][7] = -5.40448039556291; fWeightMatrix0to1[7][7] = 10.0795345775745; fWeightMatrix0to1[8][7] = 0.294877781436345; fWeightMatrix0to1[9][7] = 3.26009490852935; fWeightMatrix0to1[10][7] = -3.96332853282777; fWeightMatrix0to1[11][7] = 1.28935010238217; fWeightMatrix0to1[12][7] = -1.8740960325453; fWeightMatrix0to1[13][7] = 0.333067033626807; fWeightMatrix0to1[0][8] = 0.435360580909271; fWeightMatrix0to1[1][8] = 0.218005384561077; fWeightMatrix0to1[2][8] = -0.187600102108533; fWeightMatrix0to1[3][8] = 1.89917221704236; fWeightMatrix0to1[4][8] = 0.541805793617915; fWeightMatrix0to1[5][8] = -1.30326059873569; fWeightMatrix0to1[6][8] = 4.61165403942266; fWeightMatrix0to1[7][8] = -1.63264360665831; fWeightMatrix0to1[8][8] = -1.22379715557464; fWeightMatrix0to1[9][8] = -1.10901142214242; fWeightMatrix0to1[10][8] = -3.18505274210428; fWeightMatrix0to1[11][8] = 1.53374264598472; fWeightMatrix0to1[12][8] = 0.687366564457636; fWeightMatrix0to1[13][8] = 0.625215928068296; fWeightMatrix0to1[0][9] = -0.610605739631899; fWeightMatrix0to1[1][9] = 1.33739972330002; fWeightMatrix0to1[2][9] = -3.05538794393567; fWeightMatrix0to1[3][9] = 1.23964471517711; fWeightMatrix0to1[4][9] = -0.223307931862887; fWeightMatrix0to1[5][9] = -2.81702259568312; fWeightMatrix0to1[6][9] = 0.534918669922984; fWeightMatrix0to1[7][9] = -0.0498785134785078; fWeightMatrix0to1[8][9] = -0.292657894692225; fWeightMatrix0to1[9][9] = -0.223739967612964; fWeightMatrix0to1[10][9] = 0.843820516191448; fWeightMatrix0to1[11][9] = -1.0349860197782; fWeightMatrix0to1[12][9] = -0.153767618001765; fWeightMatrix0to1[13][9] = 0.92578323623622; fWeightMatrix0to1[0][10] = 1.30363039862737; fWeightMatrix0to1[1][10] = -0.792490542692807; fWeightMatrix0to1[2][10] = -2.11607101761458; fWeightMatrix0to1[3][10] = 0.178141952258475; fWeightMatrix0to1[4][10] = -1.95227096872658; fWeightMatrix0to1[5][10] = 1.33040901422609; fWeightMatrix0to1[6][10] = -1.31443527259528; fWeightMatrix0to1[7][10] = -0.591170932757081; fWeightMatrix0to1[8][10] = -1.16929657306257; fWeightMatrix0to1[9][10] = 0.504119871679195; fWeightMatrix0to1[10][10] = -0.609820716409745; fWeightMatrix0to1[11][10] = 0.827441387091758; fWeightMatrix0to1[12][10] = 0.883029924690135; fWeightMatrix0to1[13][10] = -1.345412880679; fWeightMatrix0to1[0][11] = -0.594857994996274; fWeightMatrix0to1[1][11] = -1.19392503469729; fWeightMatrix0to1[2][11] = 1.50316715718521; fWeightMatrix0to1[3][11] = 1.02561574188902; fWeightMatrix0to1[4][11] = 0.324672203169931; fWeightMatrix0to1[5][11] = -1.44196726137382; fWeightMatrix0to1[6][11] = 2.33716042550349; fWeightMatrix0to1[7][11] = 0.398447451960753; fWeightMatrix0to1[8][11] = 0.511184857602448; fWeightMatrix0to1[9][11] = -1.01834002310868; fWeightMatrix0to1[10][11] = 0.334254030051117; fWeightMatrix0to1[11][11] = 0.689561098727758; fWeightMatrix0to1[12][11] = -1.16026974952991; fWeightMatrix0to1[13][11] = 0.607462111459831; fWeightMatrix0to1[0][12] = -0.970595905883447; fWeightMatrix0to1[1][12] = -1.5217358663896; fWeightMatrix0to1[2][12] = 2.42951274077098; fWeightMatrix0to1[3][12] = 1.202996309923; fWeightMatrix0to1[4][12] = -0.705873269940821; fWeightMatrix0to1[5][12] = 0.817676381742316; fWeightMatrix0to1[6][12] = 2.72202064024513; fWeightMatrix0to1[7][12] = -0.568212191764599; fWeightMatrix0to1[8][12] = -0.235579923137185; fWeightMatrix0to1[9][12] = -1.9231986135033; fWeightMatrix0to1[10][12] = -0.000192703686318633; fWeightMatrix0to1[11][12] = -0.82566697986956; fWeightMatrix0to1[12][12] = -2.23537045157435; fWeightMatrix0to1[13][12] = 0.567933101921277; fWeightMatrix0to1[0][13] = 2.11885393166827; fWeightMatrix0to1[1][13] = -0.799707262219314; fWeightMatrix0to1[2][13] = -1.42774586076452; fWeightMatrix0to1[3][13] = -0.872131093580711; fWeightMatrix0to1[4][13] = 0.994298235402145; fWeightMatrix0to1[5][13] = 1.13010682435351; fWeightMatrix0to1[6][13] = -8.78540101467732; fWeightMatrix0to1[7][13] = 13.9005581875467; fWeightMatrix0to1[8][13] = 0.915610427679238; fWeightMatrix0to1[9][13] = 6.34343851678622; fWeightMatrix0to1[10][13] = -4.60779704787834; fWeightMatrix0to1[11][13] = 1.64154968357513; fWeightMatrix0to1[12][13] = 1.14436653908183; fWeightMatrix0to1[13][13] = 0.157588766452518; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -2.71014413217372; fWeightMatrix1to2[1][0] = -0.119758803218322; fWeightMatrix1to2[2][0] = 0.0185533895187079; fWeightMatrix1to2[3][0] = -1.53917009554774; fWeightMatrix1to2[4][0] = -0.627547323783328; fWeightMatrix1to2[5][0] = -0.838415520432288; fWeightMatrix1to2[6][0] = -0.612987557987505; fWeightMatrix1to2[7][0] = -1.78326938610522; fWeightMatrix1to2[8][0] = 1.0102804561036; fWeightMatrix1to2[9][0] = 0.0360835228545731; fWeightMatrix1to2[10][0] = -1.48018851621905; fWeightMatrix1to2[11][0] = 0.000762679053727007; fWeightMatrix1to2[12][0] = 0.223628903545043; fWeightMatrix1to2[0][1] = 1.07790640948373; fWeightMatrix1to2[1][1] = -0.776914831888875; fWeightMatrix1to2[2][1] = -1.03195112204461; fWeightMatrix1to2[3][1] = -2.3368197115081; fWeightMatrix1to2[4][1] = -0.878030928867498; fWeightMatrix1to2[5][1] = 1.72768593691283; fWeightMatrix1to2[6][1] = 1.28862692006763; fWeightMatrix1to2[7][1] = 0.812769710706401; fWeightMatrix1to2[8][1] = -0.178244552717227; fWeightMatrix1to2[9][1] = 2.44265511497138; fWeightMatrix1to2[10][1] = -1.69342853046398; fWeightMatrix1to2[11][1] = -1.21490915747777; fWeightMatrix1to2[12][1] = -0.504184132564489; fWeightMatrix1to2[0][2] = -1.75711716498401; fWeightMatrix1to2[1][2] = 1.80838556235053; fWeightMatrix1to2[2][2] = -0.205102021945251; fWeightMatrix1to2[3][2] = -0.8885048724634; fWeightMatrix1to2[4][2] = 0.865707810916806; fWeightMatrix1to2[5][2] = -0.200600159666531; fWeightMatrix1to2[6][2] = -1.89182123483099; fWeightMatrix1to2[7][2] = -1.0884307175672; fWeightMatrix1to2[8][2] = 1.69986038865912; fWeightMatrix1to2[9][2] = 0.17087096607417; fWeightMatrix1to2[10][2] = 0.461024057081928; fWeightMatrix1to2[11][2] = -1.2434749278323; fWeightMatrix1to2[12][2] = 2.25499052564802; fWeightMatrix1to2[0][3] = -1.82982488667869; fWeightMatrix1to2[1][3] = 0.440566630567059; fWeightMatrix1to2[2][3] = -0.145646011729336; fWeightMatrix1to2[3][3] = -1.54936122303578; fWeightMatrix1to2[4][3] = -1.31965057104211; fWeightMatrix1to2[5][3] = -1.92928991628604; fWeightMatrix1to2[6][3] = -0.368305519090621; fWeightMatrix1to2[7][3] = -1.844172948214; fWeightMatrix1to2[8][3] = -1.63001748464154; fWeightMatrix1to2[9][3] = -1.6597716338823; fWeightMatrix1to2[10][3] = 0.147541052746128; fWeightMatrix1to2[11][3] = -0.97755516612706; fWeightMatrix1to2[12][3] = -1.68758126766978; fWeightMatrix1to2[0][4] = 0.303404204828851; fWeightMatrix1to2[1][4] = -1.51774340872587; fWeightMatrix1to2[2][4] = -0.778092773392149; fWeightMatrix1to2[3][4] = -1.05417512019494; fWeightMatrix1to2[4][4] = 0.276473458331677; fWeightMatrix1to2[5][4] = 0.727135840664374; fWeightMatrix1to2[6][4] = 0.58609105772606; fWeightMatrix1to2[7][4] = -0.616107098482405; fWeightMatrix1to2[8][4] = -1.88625227311917; fWeightMatrix1to2[9][4] = -1.6034488389263; fWeightMatrix1to2[10][4] = -1.8958963609085; fWeightMatrix1to2[11][4] = -1.55126957347157; fWeightMatrix1to2[12][4] = -1.128571725105; fWeightMatrix1to2[0][5] = -0.51050701406626; fWeightMatrix1to2[1][5] = -0.0853352338326277; fWeightMatrix1to2[2][5] = -0.212121236011941; fWeightMatrix1to2[3][5] = 0.561797404325436; fWeightMatrix1to2[4][5] = -0.498673512941615; fWeightMatrix1to2[5][5] = -3.07445581275204; fWeightMatrix1to2[6][5] = 0.665697932532015; fWeightMatrix1to2[7][5] = 1.59118249304854; fWeightMatrix1to2[8][5] = -1.97629541770493; fWeightMatrix1to2[9][5] = -0.894769202180158; fWeightMatrix1to2[10][5] = 0.093898754072112; fWeightMatrix1to2[11][5] = 1.35389023002799; fWeightMatrix1to2[12][5] = -1.16817843248734; fWeightMatrix1to2[0][6] = -2.58166894972109; fWeightMatrix1to2[1][6] = 4.23514303225027; fWeightMatrix1to2[2][6] = 0.694723415098111; fWeightMatrix1to2[3][6] = 3.3070919876309; fWeightMatrix1to2[4][6] = 1.02855872115475; fWeightMatrix1to2[5][6] = -1.4239017329597; fWeightMatrix1to2[6][6] = -0.784916042204884; fWeightMatrix1to2[7][6] = -0.508850620988409; fWeightMatrix1to2[8][6] = -2.13529577967007; fWeightMatrix1to2[9][6] = -1.37108413630222; fWeightMatrix1to2[10][6] = -2.46720973974399; fWeightMatrix1to2[11][6] = -1.9726742661977; fWeightMatrix1to2[12][6] = -2.19015725586756; fWeightMatrix1to2[0][7] = 3.06511444377435; fWeightMatrix1to2[1][7] = -10.4139194229766; fWeightMatrix1to2[2][7] = 5.38490224630045; fWeightMatrix1to2[3][7] = -3.05874740897295; fWeightMatrix1to2[4][7] = 1.28002695959347; fWeightMatrix1to2[5][7] = -0.992495852991972; fWeightMatrix1to2[6][7] = 0.221968559778405; fWeightMatrix1to2[7][7] = 1.45417604254372; fWeightMatrix1to2[8][7] = -2.77307314423384; fWeightMatrix1to2[9][7] = 1.48620432324994; fWeightMatrix1to2[10][7] = 0.0823267232611311; fWeightMatrix1to2[11][7] = -1.6662414738285; fWeightMatrix1to2[12][7] = -2.61964560014655; fWeightMatrix1to2[0][8] = -1.58893655466838; fWeightMatrix1to2[1][8] = -2.10713049657431; fWeightMatrix1to2[2][8] = -0.121867356274563; fWeightMatrix1to2[3][8] = -0.27017482274208; fWeightMatrix1to2[4][8] = 0.309070355991774; fWeightMatrix1to2[5][8] = -0.877913888519733; fWeightMatrix1to2[6][8] = -1.31589045259484; fWeightMatrix1to2[7][8] = -1.65209413916559; fWeightMatrix1to2[8][8] = -1.11302495436151; fWeightMatrix1to2[9][8] = -1.24344220197486; fWeightMatrix1to2[10][8] = 0.662168379296217; fWeightMatrix1to2[11][8] = -1.4004679185503; fWeightMatrix1to2[12][8] = -1.47148705830062; fWeightMatrix1to2[0][9] = 2.94462248839313; fWeightMatrix1to2[1][9] = -4.69740287071641; fWeightMatrix1to2[2][9] = 0.490941287116793; fWeightMatrix1to2[3][9] = 1.37353637443985; fWeightMatrix1to2[4][9] = -1.76095952029931; fWeightMatrix1to2[5][9] = 1.41780089741459; fWeightMatrix1to2[6][9] = 1.07033824492547; fWeightMatrix1to2[7][9] = -0.713723914007221; fWeightMatrix1to2[8][9] = -1.49110046108204; fWeightMatrix1to2[9][9] = 0.833156269087126; fWeightMatrix1to2[10][9] = -0.777848531820109; fWeightMatrix1to2[11][9] = 0.8046201190073; fWeightMatrix1to2[12][9] = 0.528578897793083; fWeightMatrix1to2[0][10] = -0.00207990068928734; fWeightMatrix1to2[1][10] = 3.54629072407105; fWeightMatrix1to2[2][10] = -5.49657841798172; fWeightMatrix1to2[3][10] = -0.00785507659855783; fWeightMatrix1to2[4][10] = -1.8070382293954; fWeightMatrix1to2[5][10] = -0.646118216964103; fWeightMatrix1to2[6][10] = -1.95247277673052; fWeightMatrix1to2[7][10] = -1.18342432223607; fWeightMatrix1to2[8][10] = -1.79094825676775; fWeightMatrix1to2[9][10] = -0.0147760115145612; fWeightMatrix1to2[10][10] = -0.419703552217428; fWeightMatrix1to2[11][10] = 1.04165859614851; fWeightMatrix1to2[12][10] = -1.18034175760637; fWeightMatrix1to2[0][11] = -1.10929916522203; fWeightMatrix1to2[1][11] = -2.21605441222958; fWeightMatrix1to2[2][11] = -1.55304437710985; fWeightMatrix1to2[3][11] = 2.36096469508608; fWeightMatrix1to2[4][11] = -1.6110418371369; fWeightMatrix1to2[5][11] = -0.532629334432296; fWeightMatrix1to2[6][11] = -1.98488047644671; fWeightMatrix1to2[7][11] = -0.474199693590428; fWeightMatrix1to2[8][11] = -0.236489377394198; fWeightMatrix1to2[9][11] = 1.117759111188; fWeightMatrix1to2[10][11] = -1.492709398252; fWeightMatrix1to2[11][11] = 0.976667742989692; fWeightMatrix1to2[12][11] = 0.438200145309904; fWeightMatrix1to2[0][12] = 0.119489128335542; fWeightMatrix1to2[1][12] = -0.175394803744928; fWeightMatrix1to2[2][12] = -1.98917967879819; fWeightMatrix1to2[3][12] = -1.28692058904756; fWeightMatrix1to2[4][12] = -1.63071048084185; fWeightMatrix1to2[5][12] = -0.0585823175497676; fWeightMatrix1to2[6][12] = -1.01467583166438; fWeightMatrix1to2[7][12] = -0.845491970125863; fWeightMatrix1to2[8][12] = 0.161630994739513; fWeightMatrix1to2[9][12] = -1.76049418376472; fWeightMatrix1to2[10][12] = -1.73876324391173; fWeightMatrix1to2[11][12] = -1.52818441780636; fWeightMatrix1to2[12][12] = -0.379002425425732; fWeightMatrix1to2[0][13] = -2.0186497172941; fWeightMatrix1to2[1][13] = -1.4774693215807; fWeightMatrix1to2[2][13] = -0.484748796573586; fWeightMatrix1to2[3][13] = -0.615992875392997; fWeightMatrix1to2[4][13] = -1.54303122954972; fWeightMatrix1to2[5][13] = 0.276487118034678; fWeightMatrix1to2[6][13] = -1.87879592273842; fWeightMatrix1to2[7][13] = -0.955390391776685; fWeightMatrix1to2[8][13] = 1.65724150458683; fWeightMatrix1to2[9][13] = 0.00442329156457383; fWeightMatrix1to2[10][13] = -0.296393878497413; fWeightMatrix1to2[11][13] = 0.126512244855067; fWeightMatrix1to2[12][13] = -0.0371553119757794; fWeightMatrix1to2[0][14] = 0.724977237623978; fWeightMatrix1to2[1][14] = -0.746362167366195; fWeightMatrix1to2[2][14] = -2.03630333606265; fWeightMatrix1to2[3][14] = -0.864070410927774; fWeightMatrix1to2[4][14] = -1.66967412789414; fWeightMatrix1to2[5][14] = -0.231681267155942; fWeightMatrix1to2[6][14] = 0.217376458378444; fWeightMatrix1to2[7][14] = 0.412724184571141; fWeightMatrix1to2[8][14] = 0.545023026234367; fWeightMatrix1to2[9][14] = 0.564651844200317; fWeightMatrix1to2[10][14] = 0.723886839752841; fWeightMatrix1to2[11][14] = -1.20796426434828; fWeightMatrix1to2[12][14] = 0.209204302742073; // weight matrix from layer 2 to 3 fWeightMatrix2to3[0][0] = -0.571386393868319; fWeightMatrix2to3[0][1] = -0.662701536792165; fWeightMatrix2to3[0][2] = -2.17484286367959; fWeightMatrix2to3[0][3] = -1.54622208833026; fWeightMatrix2to3[0][4] = 0.985150037630431; fWeightMatrix2to3[0][5] = -0.524118991414609; fWeightMatrix2to3[0][6] = 0.378097968239391; fWeightMatrix2to3[0][7] = 1.97978050259678; fWeightMatrix2to3[0][8] = -1.65958346877119; fWeightMatrix2to3[0][9] = 1.07782396851115; fWeightMatrix2to3[0][10] = 0.799457786296207; fWeightMatrix2to3[0][11] = 0.842567106164807; fWeightMatrix2to3[0][12] = -1.04681914153326; fWeightMatrix2to3[0][13] = 0.894715893839749; } inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector& inputValues ) const { if (inputValues.size() != (unsigned int)fLayerSize[0]-1) { std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl; return 0; } for (int l=0; l