// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6         [198662]
ROOT Release   : 5.26/00       [334336]
Creator        : stdenis
Date           : Sun Jan 1 11:01:16 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run20572/job411
Training events: 57130

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V:                  "False"      [Verbose mode]
NCycles:            "1000"       [Number of training cycles]
HiddenLayers:       "N+1,N"      [Specification of hidden layer architecture]
# Default:
D:                  "False"      [use-decorrelated-variables flag (deprecated)]
Normalise:          "True"       [Normalise input variables]
VarTransform:       "None"       [Variable transformation method]
VarTransformType:   "Signal"     [Use signal or background events for var transform]
NbinsMVAPdf:        "60"         [Number of bins used to create MVA PDF]
NsmoothMVAPdf:      "2"          [Number of smoothing iterations for MVA PDF]
VerboseLevel:       "Info"       [Verbosity level]
H:                  "False"      [Print classifier-specific help message]
CreateMVAPdfs:      "False"      [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True"       [if True, write all weights as text files]
NeuronType:         "sigmoid"    [Neuron activation function type]
NeuronInputType:    "sum"        [Neuron input function type]
RandomSeed:         "1"          [Random Number Seed for TRandom3]
RandomFile:         "None"       [Random Number input file for TRandom3]
TrainingMethod:     "BP"         [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate:       "0.02"       [ANN learning rate parameter]
DecayRate:          "0.01"       [Decay rate for learning parameter]
TestRate:           "10"         [Test for overtraining performed at each #th epochs]
BPMode:             "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize:          "-1"         [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                 Ht                 'F' [47.1748542786,746.708984375]
LepAPt             LepAPt             'F' [20.0000362396,155.865875244]
LepBPt             LepBPt             'F' [10.0002250671,70.5846633911]
MetSigLeptonsJets  MetSigLeptonsJets  'F' [1.13770258427,19.6634273529]
MetSpec            MetSpec            'F' [15.0022125244,289.142150879]
SumEtLeptonsJets   SumEtLeptonsJets   'F' [30.1877574921,490.248291016]
VSumJetLeptonsPt   VSumJetLeptonsPt   'F' [0.172416463494,301.426879883]
addEt              addEt              'F' [47.1748542786,378.260345459]
dPhiLepSumMet      dPhiLepSumMet      'F' [0.0448210090399,3.14158248901]
dPhiLeptons        dPhiLeptons        'F' [4.53725397165e-06,1.11007142067]
dRLeptons          dRLeptons          'F' [0.200007483363,1.15372478962]
lep1_E             lep1_E             'F' [20.0413379669,232.717926025]
lep2_E             lep2_E             'F' [10.0094184875,117.131721497]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 13 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets",
                                  "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt",
                                  "addEt", "dPhiLepSumMet", "dPhiLeptons",
                                  "dRLeptons", "lep1_E", "lep2_E" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str()
                      << " != " << inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 47.1748542785645;
      fVmax[0] = 746.708984375;
      fVmin[1] = 20.000036239624;
      fVmax[1] = 155.865875244141;
      fVmin[2] = 10.0002250671387;
      fVmax[2] = 70.5846633911133;
      fVmin[3] = 1.13770258426666;
      fVmax[3] = 19.6634273529053;
      fVmin[4] = 15.0022125244141;
      fVmax[4] = 289.142150878906;
      fVmin[5] = 30.1877574920654;
      fVmax[5] = 490.248291015625;
      fVmin[6] = 0.172416463494301;
      fVmax[6] = 301.426879882812;
      fVmin[7] = 47.1748542785645;
      fVmax[7] = 378.260345458984;
      fVmin[8] = 0.0448210090398788;
      fVmax[8] = 3.14158248901367;
      fVmin[9] = 4.53725397164817e-06;
      fVmax[9] = 1.11007142066956;
      fVmin[10] = 0.200007483363152;
      fVmax[10] = 1.15372478961945;
      fVmin[11] = 20.0413379669189;
      fVmax[11] = 232.717926025391;
      fVmin[12] = 10.0094184875488;
      fVmax[12] = 117.131721496582;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';
      fType[11] = 'F';
      fType[12] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }
   double fVmin[13];
   double fVmax[13];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[13];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1; fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.309161361505785;
   fWeightMatrix0to1[1][0] = 2.48553199227676;
   fWeightMatrix0to1[2][0] = 0.835761889138511;
   fWeightMatrix0to1[3][0] = 2.05417487787591;
   fWeightMatrix0to1[4][0] = -1.46783219714868;
   fWeightMatrix0to1[5][0] = -1.39760734884692;
   fWeightMatrix0to1[6][0] = -0.686346076983892;
   fWeightMatrix0to1[7][0] = 1.33239088696603;
   fWeightMatrix0to1[8][0] = -2.54073753852519;
   fWeightMatrix0to1[9][0] = -0.845136919250332;
   fWeightMatrix0to1[10][0] = -1.98472226943448;
   fWeightMatrix0to1[11][0] = -0.017207396162552;
   fWeightMatrix0to1[12][0] = -1.18020029684166;
   fWeightMatrix0to1[13][0] = -0.11022856902894;
   fWeightMatrix0to1[0][1] = -0.775342262759908;
   fWeightMatrix0to1[1][1] = 0.906826705328663;
   fWeightMatrix0to1[2][1] = 0.728557333243376;
   fWeightMatrix0to1[3][1] = 1.93638458062724;
   fWeightMatrix0to1[4][1] = 2.95982872905594;
   fWeightMatrix0to1[5][1] = 1.20166372516331;
   fWeightMatrix0to1[6][1] = -3.54374305828604;
   fWeightMatrix0to1[7][1] = 0.124043232608476;
   fWeightMatrix0to1[8][1] = 0.381225044994388;
   fWeightMatrix0to1[9][1] = 2.27369438441505;
   fWeightMatrix0to1[10][1] = -2.74093294343059;
   fWeightMatrix0to1[11][1] = 1.15232560844561;
   fWeightMatrix0to1[12][1] = -0.966087089381129;
   fWeightMatrix0to1[13][1] = 1.69453632091956;
   fWeightMatrix0to1[0][2] = -1.62293560810194;
   fWeightMatrix0to1[1][2] = 0.634293538082589;
   fWeightMatrix0to1[2][2] = 0.562791566733037;
   fWeightMatrix0to1[3][2] = 1.83596635093652;
   fWeightMatrix0to1[4][2] = -1.5435982960731;
   fWeightMatrix0to1[5][2] = 0.528933318802388;
   fWeightMatrix0to1[6][2] = -1.3098621735784;
   fWeightMatrix0to1[7][2] = 0.619184396868645;
   fWeightMatrix0to1[8][2] = -4.5012046418963;
   fWeightMatrix0to1[9][2] = 1.67250469466984;
   fWeightMatrix0to1[10][2] = -2.5358723145123;
   fWeightMatrix0to1[11][2] = 1.23678771206758;
   fWeightMatrix0to1[12][2] = 2.25055879552469;
   fWeightMatrix0to1[13][2] = 2.70091995905981;
   fWeightMatrix0to1[0][3] = 2.19625911901698;
   fWeightMatrix0to1[1][3] = 0.904600927478355;
   fWeightMatrix0to1[2][3] = -0.352726607640935;
   fWeightMatrix0to1[3][3] = -1.18005045663468;
   fWeightMatrix0to1[4][3] = 1.5067038061643;
   fWeightMatrix0to1[5][3] = 0.994639229841828;
   fWeightMatrix0to1[6][3] = 2.36889101987191;
   fWeightMatrix0to1[7][3] = 0.680409454030087;
   fWeightMatrix0to1[8][3] = 0.509148120179484;
   fWeightMatrix0to1[9][3] = 2.37749435130477;
   fWeightMatrix0to1[10][3] = -1.99900240911522;
   fWeightMatrix0to1[11][3] = 0.960752938988177;
   fWeightMatrix0to1[12][3] = -3.46300618861891;
   fWeightMatrix0to1[13][3] = 0.118541034172024;
   fWeightMatrix0to1[0][4] = -1.04788546758042;
   fWeightMatrix0to1[1][4] = -1.90978518360148;
   fWeightMatrix0to1[2][4] = 1.89822806122031;
   fWeightMatrix0to1[3][4] = 1.18977427637362;
   fWeightMatrix0to1[4][4] = -0.698700480655489;
   fWeightMatrix0to1[5][4] = 0.849701813650169;
   fWeightMatrix0to1[6][4] = 0.542358676797014;
fWeightMatrix0to1[7][4] = -0.276257699421633; fWeightMatrix0to1[8][4] = 1.1390468649414; fWeightMatrix0to1[9][4] = 0.0591113575808294; fWeightMatrix0to1[10][4] = -0.279422988940961; fWeightMatrix0to1[11][4] = -0.694026199593186; fWeightMatrix0to1[12][4] = 0.550988263242605; fWeightMatrix0to1[13][4] = -0.67915111272365; fWeightMatrix0to1[0][5] = -0.820501414593758; fWeightMatrix0to1[1][5] = -0.868261129807098; fWeightMatrix0to1[2][5] = 0.570521966755753; fWeightMatrix0to1[3][5] = 1.38841318551197; fWeightMatrix0to1[4][5] = 1.59507112612399; fWeightMatrix0to1[5][5] = -0.602594617884464; fWeightMatrix0to1[6][5] = -0.583250015365405; fWeightMatrix0to1[7][5] = -2.92037556221747; fWeightMatrix0to1[8][5] = 0.485353869659855; fWeightMatrix0to1[9][5] = -0.104127408574508; fWeightMatrix0to1[10][5] = 1.88264675011805; fWeightMatrix0to1[11][5] = 0.87378809957432; fWeightMatrix0to1[12][5] = 2.07052376724717; fWeightMatrix0to1[13][5] = -1.425395241166; fWeightMatrix0to1[0][6] = -0.684712889427731; fWeightMatrix0to1[1][6] = 0.0356836675723958; fWeightMatrix0to1[2][6] = 1.35832121751207; fWeightMatrix0to1[3][6] = -0.707989334681623; fWeightMatrix0to1[4][6] = -0.147804483312567; fWeightMatrix0to1[5][6] = -0.808025008506791; fWeightMatrix0to1[6][6] = -3.24342314146727; fWeightMatrix0to1[7][6] = -0.191364564172522; fWeightMatrix0to1[8][6] = 0.171409491090982; fWeightMatrix0to1[9][6] = 0.264283791275431; fWeightMatrix0to1[10][6] = -1.58227388717709; fWeightMatrix0to1[11][6] = 3.18901642976096; fWeightMatrix0to1[12][6] = -2.21546924630623; fWeightMatrix0to1[13][6] = 0.66153181754326; fWeightMatrix0to1[0][7] = -1.37949520065137; fWeightMatrix0to1[1][7] = 0.845083516626663; fWeightMatrix0to1[2][7] = -1.51011420688732; fWeightMatrix0to1[3][7] = -1.36840526344005; fWeightMatrix0to1[4][7] = 2.58503876853472; fWeightMatrix0to1[5][7] = 2.41764658315305; fWeightMatrix0to1[6][7] = -7.930990152166; fWeightMatrix0to1[7][7] = 4.18786719916771; fWeightMatrix0to1[8][7] = -4.24760248296697; fWeightMatrix0to1[9][7] = 2.99014853071759; fWeightMatrix0to1[10][7] = -2.2922251930991; fWeightMatrix0to1[11][7] = 1.79478996497728; fWeightMatrix0to1[12][7] = -5.78710155654568; fWeightMatrix0to1[13][7] = 5.39290781360714; fWeightMatrix0to1[0][8] = 0.0946668910191321; fWeightMatrix0to1[1][8] = -0.409662499048427; fWeightMatrix0to1[2][8] = -0.861120351212414; fWeightMatrix0to1[3][8] = 1.39239805531812; fWeightMatrix0to1[4][8] = 0.646740759890656; fWeightMatrix0to1[5][8] = -2.29350010863407; fWeightMatrix0to1[6][8] = 2.34806472480626; fWeightMatrix0to1[7][8] = 0.21217059748623; fWeightMatrix0to1[8][8] = -1.52608075919895; fWeightMatrix0to1[9][8] = -1.19635338726695; fWeightMatrix0to1[10][8] = -0.746980219639147; fWeightMatrix0to1[11][8] = 1.40646381950096; fWeightMatrix0to1[12][8] = 0.522729956825433; fWeightMatrix0to1[13][8] = 0.945792903341784; fWeightMatrix0to1[0][9] = -0.282528391326737; fWeightMatrix0to1[1][9] = -0.0863881107280049; fWeightMatrix0to1[2][9] = -1.80151130096654; fWeightMatrix0to1[3][9] = 1.25415895740263; fWeightMatrix0to1[4][9] = 1.47935302536687; fWeightMatrix0to1[5][9] = -0.788821806300031; fWeightMatrix0to1[6][9] = -0.118679164372; fWeightMatrix0to1[7][9] = -0.455377157438534; fWeightMatrix0to1[8][9] = -0.551433720696695; fWeightMatrix0to1[9][9] = 0.367402292009387; fWeightMatrix0to1[10][9] = 0.329958257337021; fWeightMatrix0to1[11][9] = -1.46700109640803; fWeightMatrix0to1[12][9] = 0.537323342340867; fWeightMatrix0to1[13][9] = 0.496322832797924; fWeightMatrix0to1[0][10] = 1.7014143457314; fWeightMatrix0to1[1][10] 
      = -0.0603159161962117;
   fWeightMatrix0to1[2][10] = -1.26031964262262;
   fWeightMatrix0to1[3][10] = 0.0679421906958131;
   fWeightMatrix0to1[4][10] = 0.222769829553307;
   fWeightMatrix0to1[5][10] = 1.46126234284848;
   fWeightMatrix0to1[6][10] = -1.03187949635709;
   fWeightMatrix0to1[7][10] = -0.123607497492478;
   fWeightMatrix0to1[8][10] = -0.847796499322317;
   fWeightMatrix0to1[9][10] = -0.357102272658919;
   fWeightMatrix0to1[10][10] = -1.67602597086571;
   fWeightMatrix0to1[11][10] = 0.69230488035862;
   fWeightMatrix0to1[12][10] = 0.63529331463283;
   fWeightMatrix0to1[13][10] = -0.0600653984085459;
   fWeightMatrix0to1[0][11] = -0.58455117817324;
   fWeightMatrix0to1[1][11] = -0.397377153379667;
   fWeightMatrix0to1[2][11] = 2.657480196421;
   fWeightMatrix0to1[3][11] = 0.858839746270873;
   fWeightMatrix0to1[4][11] = 2.27637632901263;
   fWeightMatrix0to1[5][11] = -1.92160351204622;
   fWeightMatrix0to1[6][11] = 2.8428437862363;
   fWeightMatrix0to1[7][11] = -0.601649563201199;
   fWeightMatrix0to1[8][11] = 1.92753481930754;
   fWeightMatrix0to1[9][11] = 0.719370878096387;
   fWeightMatrix0to1[10][11] = 0.738689890105682;
   fWeightMatrix0to1[11][11] = -0.00613928307115115;
   fWeightMatrix0to1[12][11] = -0.873663400463527;
   fWeightMatrix0to1[13][11] = 0.0957398502022389;
   fWeightMatrix0to1[0][12] = -0.671294691127897;
   fWeightMatrix0to1[1][12] = -0.718300802377503;
   fWeightMatrix0to1[2][12] = 1.99777796505676;
   fWeightMatrix0to1[3][12] = 1.27325620479606;
   fWeightMatrix0to1[4][12] = -1.40288723679041;
   fWeightMatrix0to1[5][12] = 1.67527386053675;
   fWeightMatrix0to1[6][12] = 1.37824303715659;
   fWeightMatrix0to1[7][12] = -2.24788598770452;
   fWeightMatrix0to1[8][12] = -1.32412235515901;
   fWeightMatrix0to1[9][12] = -0.00372501473352094;
   fWeightMatrix0to1[10][12] = -0.289250052039508;
   fWeightMatrix0to1[11][12] = -1.76115833305924;
   fWeightMatrix0to1[12][12] = -1.11533777935601;
   fWeightMatrix0to1[13][12] = 0.602353949441422;
   fWeightMatrix0to1[0][13] = 1.64998011577964;
   fWeightMatrix0to1[1][13] = -0.810227462935508;
   fWeightMatrix0to1[2][13] = -0.776701186147301;
   fWeightMatrix0to1[3][13] = -1.03977132892019;
   fWeightMatrix0to1[4][13] = 0.864476308026809;
   fWeightMatrix0to1[5][13] = 2.26266130604972;
   fWeightMatrix0to1[6][13] = -11.5449180468263;
   fWeightMatrix0to1[7][13] = 5.72566655783713;
   fWeightMatrix0to1[8][13] = -4.01493891788782;
   fWeightMatrix0to1[9][13] = 6.24379734607784;
   fWeightMatrix0to1[10][13] = -0.696499901198751;
   fWeightMatrix0to1[11][13] = 2.2040228710248;
   fWeightMatrix0to1[12][13] = -5.52220411813621;
   fWeightMatrix0to1[13][13] = 5.79387474232045;
   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -3.18696089840553;
   fWeightMatrix1to2[1][0] = -0.369887570713712;
   fWeightMatrix1to2[2][0] = -0.911649512669999;
   fWeightMatrix1to2[3][0] = -1.7950719599122;
   fWeightMatrix1to2[4][0] = -0.851978677778648;
   fWeightMatrix1to2[5][0] = -0.831676536731427;
   fWeightMatrix1to2[6][0] = -0.484149292643708;
   fWeightMatrix1to2[7][0] = -1.69951685792005;
   fWeightMatrix1to2[8][0] = 0.791810329083156;
   fWeightMatrix1to2[9][0] = 0.244674493954214;
   fWeightMatrix1to2[10][0] = -1.03645328985927;
   fWeightMatrix1to2[11][0] = 0.37904443062145;
   fWeightMatrix1to2[12][0] = 0.135812906733499;
   fWeightMatrix1to2[0][1] = 1.05729354628446;
   fWeightMatrix1to2[1][1] = -0.242737195408096;
   fWeightMatrix1to2[2][1] = -1.18578808050246;
   fWeightMatrix1to2[3][1] = -2.13282315387718;
   fWeightMatrix1to2[4][1] = -0.930034529719974;
   fWeightMatrix1to2[5][1] = 1.76843858835004;
   fWeightMatrix1to2[6][1] = 1.25686836862564;
   fWeightMatrix1to2[7][1] = 0.765905061915702;
   fWeightMatrix1to2[8][1] = 0.179436782078653;
fWeightMatrix1to2[9][1] = 1.92299921612696; fWeightMatrix1to2[10][1] = -1.82810085142056; fWeightMatrix1to2[11][1] = -1.32793239258155; fWeightMatrix1to2[12][1] = -0.77447772629489; fWeightMatrix1to2[0][2] = -1.38643895835751; fWeightMatrix1to2[1][2] = 0.637824697448444; fWeightMatrix1to2[2][2] = 1.96296319718652; fWeightMatrix1to2[3][2] = -1.33676295956759; fWeightMatrix1to2[4][2] = 0.75695860825756; fWeightMatrix1to2[5][2] = -0.215045728269517; fWeightMatrix1to2[6][2] = -2.12987307830118; fWeightMatrix1to2[7][2] = -1.1621376247143; fWeightMatrix1to2[8][2] = 0.94241036059317; fWeightMatrix1to2[9][2] = -0.221628726223253; fWeightMatrix1to2[10][2] = 0.620482778713582; fWeightMatrix1to2[11][2] = -1.12598754116855; fWeightMatrix1to2[12][2] = 1.93805172473742; fWeightMatrix1to2[0][3] = -1.58614349978699; fWeightMatrix1to2[1][3] = 0.468770822964648; fWeightMatrix1to2[2][3] = 0.00309368827053539; fWeightMatrix1to2[3][3] = -1.62042255185056; fWeightMatrix1to2[4][3] = -1.31949995868265; fWeightMatrix1to2[5][3] = -1.8608336986332; fWeightMatrix1to2[6][3] = -0.471193151300359; fWeightMatrix1to2[7][3] = -1.86001192751781; fWeightMatrix1to2[8][3] = -1.60881413684478; fWeightMatrix1to2[9][3] = -1.70860403423016; fWeightMatrix1to2[10][3] = 0.0695797387470703; fWeightMatrix1to2[11][3] = -0.923214260359788; fWeightMatrix1to2[12][3] = -1.78594352072837; fWeightMatrix1to2[0][4] = 0.647821450532571; fWeightMatrix1to2[1][4] = -1.3197714769479; fWeightMatrix1to2[2][4] = -2.59111409130698; fWeightMatrix1to2[3][4] = -1.21572116302095; fWeightMatrix1to2[4][4] = 0.158363561476658; fWeightMatrix1to2[5][4] = 0.675473520025215; fWeightMatrix1to2[6][4] = 0.608938741555354; fWeightMatrix1to2[7][4] = -0.154305603168138; fWeightMatrix1to2[8][4] = -1.88492394349779; fWeightMatrix1to2[9][4] = -1.74672428723559; fWeightMatrix1to2[10][4] = -1.65302805747448; fWeightMatrix1to2[11][4] = -1.71776573467235; fWeightMatrix1to2[12][4] = 1.76886078075307; fWeightMatrix1to2[0][5] = -0.700638781004677; fWeightMatrix1to2[1][5] = 0.24904280183921; fWeightMatrix1to2[2][5] = -2.22910236660073; fWeightMatrix1to2[3][5] = -0.344515797201557; fWeightMatrix1to2[4][5] = -0.596812821828582; fWeightMatrix1to2[5][5] = -2.57165544842643; fWeightMatrix1to2[6][5] = 0.605485036498606; fWeightMatrix1to2[7][5] = 0.932819663162974; fWeightMatrix1to2[8][5] = -1.41380507654865; fWeightMatrix1to2[9][5] = -1.6112111153967; fWeightMatrix1to2[10][5] = 0.104562647470508; fWeightMatrix1to2[11][5] = 0.742395920609784; fWeightMatrix1to2[12][5] = -2.21703508997693; fWeightMatrix1to2[0][6] = -1.58749297769034; fWeightMatrix1to2[1][6] = 2.26661948393361; fWeightMatrix1to2[2][6] = 6.11559842992363; fWeightMatrix1to2[3][6] = 1.83669645989812; fWeightMatrix1to2[4][6] = 0.733885132791049; fWeightMatrix1to2[5][6] = -1.42537052564487; fWeightMatrix1to2[6][6] = -0.488130296240788; fWeightMatrix1to2[7][6] = -0.743331096229699; fWeightMatrix1to2[8][6] = -2.2317090786445; fWeightMatrix1to2[9][6] = -1.71603658392837; fWeightMatrix1to2[10][6] = -2.34048326639356; fWeightMatrix1to2[11][6] = -1.52216059741979; fWeightMatrix1to2[12][6] = -2.70231697746021; fWeightMatrix1to2[0][7] = 1.19611109419301; fWeightMatrix1to2[1][7] = -4.90558729031982; fWeightMatrix1to2[2][7] = 0.824688423499875; fWeightMatrix1to2[3][7] = -2.00126380196519; fWeightMatrix1to2[4][7] = 0.966954370777087; fWeightMatrix1to2[5][7] = -1.50186195049638; fWeightMatrix1to2[6][7] = -0.0827430879134946; fWeightMatrix1to2[7][7] = 0.851191755361947; fWeightMatrix1to2[8][7] = -2.20662433303642; fWeightMatrix1to2[9][7] = 
-0.298610581605254; fWeightMatrix1to2[10][7] = -0.34844378736171; fWeightMatrix1to2[11][7] = -1.22973365487719; fWeightMatrix1to2[12][7] = -2.64901261804381; fWeightMatrix1to2[0][8] = -2.03942109986193; fWeightMatrix1to2[1][8] = -1.08846003646049; fWeightMatrix1to2[2][8] = 3.43342837034425; fWeightMatrix1to2[3][8] = -0.55479039942869; fWeightMatrix1to2[4][8] = 0.175012912036924; fWeightMatrix1to2[5][8] = -1.54471649913829; fWeightMatrix1to2[6][8] = -1.27714015488403; fWeightMatrix1to2[7][8] = -1.18135272139094; fWeightMatrix1to2[8][8] = -1.30258525126445; fWeightMatrix1to2[9][8] = -0.616300391474697; fWeightMatrix1to2[10][8] = 1.49402531816219; fWeightMatrix1to2[11][8] = -0.383517276318282; fWeightMatrix1to2[12][8] = 1.4679008799597; fWeightMatrix1to2[0][9] = 1.52252323512279; fWeightMatrix1to2[1][9] = -2.24497456374568; fWeightMatrix1to2[2][9] = -4.86173354264001; fWeightMatrix1to2[3][9] = 1.0307341696195; fWeightMatrix1to2[4][9] = -1.80205966214555; fWeightMatrix1to2[5][9] = 1.19675966246274; fWeightMatrix1to2[6][9] = 1.05800160618957; fWeightMatrix1to2[7][9] = -0.871444193146884; fWeightMatrix1to2[8][9] = -1.19877282079251; fWeightMatrix1to2[9][9] = -0.468042081377394; fWeightMatrix1to2[10][9] = -0.972554596664381; fWeightMatrix1to2[11][9] = 0.26960152402363; fWeightMatrix1to2[12][9] = 0.961728265816232; fWeightMatrix1to2[0][10] = 0.085995275288342; fWeightMatrix1to2[1][10] = 2.40679091192554; fWeightMatrix1to2[2][10] = -1.02651633337214; fWeightMatrix1to2[3][10] = 1.25545915856414; fWeightMatrix1to2[4][10] = -2.11328916931556; fWeightMatrix1to2[5][10] = -0.480248907541068; fWeightMatrix1to2[6][10] = -2.62512113740406; fWeightMatrix1to2[7][10] = -2.16650899757148; fWeightMatrix1to2[8][10] = -0.623321736229176; fWeightMatrix1to2[9][10] = 0.272102669057359; fWeightMatrix1to2[10][10] = -0.0313237985626393; fWeightMatrix1to2[11][10] = 1.24168353581393; fWeightMatrix1to2[12][10] = -0.901590862154356; fWeightMatrix1to2[0][11] = -0.930185752348762; fWeightMatrix1to2[1][11] = -2.09555052014947; fWeightMatrix1to2[2][11] = -3.25020466989804; fWeightMatrix1to2[3][11] = 1.35776893371228; fWeightMatrix1to2[4][11] = -1.60700773638768; fWeightMatrix1to2[5][11] = -0.61797773906526; fWeightMatrix1to2[6][11] = -2.07291994112785; fWeightMatrix1to2[7][11] = 0.0248495436239624; fWeightMatrix1to2[8][11] = -0.381248554390308; fWeightMatrix1to2[9][11] = -0.0747903265936171; fWeightMatrix1to2[10][11] = -1.49681004763408; fWeightMatrix1to2[11][11] = 0.915194379085032; fWeightMatrix1to2[12][11] = 1.29908079477098; fWeightMatrix1to2[0][12] = 0.240007222700155; fWeightMatrix1to2[1][12] = 0.445117808308245; fWeightMatrix1to2[2][12] = 4.02712254316596; fWeightMatrix1to2[3][12] = -1.2260091899762; fWeightMatrix1to2[4][12] = -1.82148209669478; fWeightMatrix1to2[5][12] = 0.542074607161229; fWeightMatrix1to2[6][12] = -1.35557983754891; fWeightMatrix1to2[7][12] = -1.17720749942852; fWeightMatrix1to2[8][12] = 0.148946083433088; fWeightMatrix1to2[9][12] = -1.95654977736379; fWeightMatrix1to2[10][12] = -2.20917576924756; fWeightMatrix1to2[11][12] = -0.988652364862073; fWeightMatrix1to2[12][12] = -0.321602263217931; fWeightMatrix1to2[0][13] = -1.57064950079969; fWeightMatrix1to2[1][13] = -3.50230189529793; fWeightMatrix1to2[2][13] = -4.7296055911817; fWeightMatrix1to2[3][13] = -1.27970720972281; fWeightMatrix1to2[4][13] = -1.67133554019761; fWeightMatrix1to2[5][13] = 0.92906250551477; fWeightMatrix1to2[6][13] = -1.54670775298444; fWeightMatrix1to2[7][13] = -1.17639280195743; fWeightMatrix1to2[8][13] = 1.48699020635118; 
   fWeightMatrix1to2[9][13] = -0.101612490053248;
   fWeightMatrix1to2[10][13] = 0.215911047888285;
   fWeightMatrix1to2[11][13] = 0.0205476936868831;
   fWeightMatrix1to2[12][13] = 0.375467126440925;
   fWeightMatrix1to2[0][14] = 0.529487639812294;
   fWeightMatrix1to2[1][14] = -1.025276648472;
   fWeightMatrix1to2[2][14] = -2.20758825522165;
   fWeightMatrix1to2[3][14] = -1.10963611568104;
   fWeightMatrix1to2[4][14] = -1.90322230060686;
   fWeightMatrix1to2[5][14] = -0.18423960050427;
   fWeightMatrix1to2[6][14] = 0.20932378240388;
   fWeightMatrix1to2[7][14] = 0.13596424940851;
   fWeightMatrix1to2[8][14] = 0.331529090562655;
   fWeightMatrix1to2[9][14] = 0.232834207235992;
   fWeightMatrix1to2[10][14] = 0.956734334099026;
   fWeightMatrix1to2[11][14] = -0.915554707522788;
   fWeightMatrix1to2[12][14] = -0.274641513851684;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = 0.542864946932207;
   fWeightMatrix2to3[0][1] = 0.948739469253447;
   fWeightMatrix2to3[0][2] = -0.562001456386443;
   fWeightMatrix2to3[0][3] = -0.968311287913995;
   fWeightMatrix2to3[0][4] = 0.711871874348933;
   fWeightMatrix2to3[0][5] = 0.630610967135141;
   fWeightMatrix2to3[0][6] = 0.0752452963771209;
   fWeightMatrix2to3[0][7] = 1.71285134451682;
   fWeightMatrix2to3[0][8] = 0.802254656707185;
   fWeightMatrix2to3[0][9] = 0.478543628224723;
   fWeightMatrix2to3[0][10] = 1.13330235313703;
   fWeightMatrix2to3[0][11] = 0.398831970760573;
   fWeightMatrix2to3[0][12] = -0.962782840286864;
   fWeightMatrix2to3[0][13] = 0.475693275967369;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }
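   // NOTE: the generated file is truncated at this point (it ends in the
   // middle of the first loop). The remainder below -- the forward pass,
   // ActivationFnc and Clear -- is a reconstruction of the standard code
   // that MethodBase::MakeClass emits for an MLP with "sum" neuron input
   // and "sigmoid" activation (see the options block above), not the
   // verbatim original. In particular, whether the single output node also
   // receives the sigmoid activation is an assumption of this sketch; here
   // the output node is left linear.

   // reset the node buffers and set the bias nodes to 1
   for (int l = 0; l < fLayers; l++)
      for (int i = 0; i < fLayerSize[l]; i++) fWeights[l][i] = 0;
   for (int l = 0; l < fLayers - 1; l++)
      fWeights[l][fLayerSize[l]-1] = 1;

   // copy the (normalised) input variables into layer 0
   for (int i = 0; i < fLayerSize[0]-1; i++)
      fWeights[0][i] = inputValues[i];

   // layer 0 to 1: weighted sum followed by the activation function
   for (int o = 0; o < fLayerSize[1]-1; o++) {
      for (int i = 0; i < fLayerSize[0]; i++)
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // layer 1 to 2
   for (int o = 0; o < fLayerSize[2]-1; o++) {
      for (int i = 0; i < fLayerSize[1]; i++)
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // layer 2 to 3: single output node
   for (int o = 0; o < fLayerSize[3]; o++) {
      for (int i = 0; i < fLayerSize[2]; i++)
         fWeights[3][o] += fWeightMatrix2to3[o][i] * fWeights[2][i];
   }

   return fWeights[3][0];
}

// sigmoid activation, as selected by NeuronType: "sigmoid"
inline double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   return 1.0/(1.0 + exp(-x));
}

// method-specific cleanup; releasing the node buffers here is part of the
// reconstruction sketch above, not necessarily the original generated body
inline void ReadH6AONN5MEMLP::Clear()
{
   for (int l = 0; l < fLayers; l++) {
      delete[] fWeights[l];
   }
}

/* Example usage (illustrative, not part of the generated file): the reader
   is constructed with the training variable names in the order listed in
   the header, and evaluated on raw (unnormalised) values; normalisation to
   [-1,1] happens internally. The names "varNames", "vars" and "mva" are
   placeholders for this sketch.

   std::vector<std::string> varNames;
   varNames.push_back("Ht");
   varNames.push_back("LepAPt");
   // ... push the remaining 11 variable names in the order listed above ...

   ReadH6AONN5MEMLP reader(varNames);

   std::vector<double> vars(13);
   // fill vars with the event's variable values, same order as varNames
   double mva = reader.GetMvaValue(vars);
*/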