// Class: ReadH6AONN5MEMLP // Automatically generated by MethodBase::MakeClass // /* configuration options ===================================================== #GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*- Method : MLP::H6AONN5MEMLP TMVA Release : 3.8.6 [198662] ROOT Release : 5.26/00 [334336] Creator : stdenis Date : Sun Jan 1 10:17:36 2012 Host : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux Dir : /data/cdf04/stdenis/batch/run20572/job406 Training events: 48966 #OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*- # Set by User: V: "False" [Verbose mode] NCycles: "1000" [Number of training cycles] HiddenLayers: "N+1,N" [Specification of hidden layer architecture] # Default: D: "False" [use-decorrelated-variables flag (depreciated)] Normalise: "True" [Normalise input variables] VarTransform: "None" [Variable transformation method] VarTransformType: "Signal" [Use signal or background events for var transform] NbinsMVAPdf: "60" [Number of bins used to create MVA PDF] NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF] VerboseLevel: "Info" [Verbosity level] H: "False" [Print classifier-specific help message] CreateMVAPdfs: "False" [Create PDFs for classifier outputs] TxtWeightFilesOnly: "True" [if True, write all weights as text files] NeuronType: "sigmoid" [Neuron activation function type] NeuronInputType: "sum" [Neuron input function type] RandomSeed: "1" [Random Number Seed for TRandom3] RandomFile: "None" [Random Number input file for TRandom3] TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)] LearningRate: "0.02" [ANN learning rate parameter] DecayRate: "0.01" [Decay rate for learning parameter] TestRate: "10" [Test for overtraining performed at each #th epochs] BPMode: "sequential" [Back-propagation learning mode: sequential or batch] BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events] ## #VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*- NVar 13 Ht Ht 'F' [49.6229476929,809.992675781] LepAPt LepAPt 'F' [20.0012817383,166.771118164] LepBPt LepBPt 'F' [10.0008592606,71.5196914673] MetSigLeptonsJets MetSigLeptonsJets 'F' [0.845276415348,15.6614294052] MetSpec MetSpec 'F' [15.0038528442,236.562576294] SumEtLeptonsJets SumEtLeptonsJets 'F' [30.1877574921,477.770996094] VSumJetLeptonsPt VSumJetLeptonsPt 'F' [3.10723280907,327.821258545] addEt addEt 'F' [48.5138130188,408.254943848] dPhiLepSumMet dPhiLepSumMet 'F' [0.0498042888939,3.14158654213] dPhiLeptons dPhiLeptons 'F' [1.71661376953e-05,1.11657130718] dRLeptons dRLeptons 'F' [0.200001657009,1.13453125954] lep1_E lep1_E 'F' [20.0063362122,232.066116333] lep2_E lep2_E 'F' [10.0073814392,118.581718445] ============================================================================ */ #include #include #include #include #ifndef IClassifierReader__def #define IClassifierReader__def class IClassifierReader { public: // constructor IClassifierReader() {} virtual ~IClassifierReader() {} // return classifier response virtual double GetMvaValue( const std::vector& inputValues ) const = 0; }; #endif class ReadH6AONN5MEMLP : public IClassifierReader { public: // constructor ReadH6AONN5MEMLP( std::vector& theInputVars ) : IClassifierReader(), fClassName( "ReadH6AONN5MEMLP" ), fStatusIsClean( true ), fNvars( 13 ), fIsNormalised( true ) { // the training input variables const char* inputVars[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 49.6229476928711; fVmax[0] = 809.99267578125; fVmin[1] = 20.0012817382812; fVmax[1] = 166.771118164062; fVmin[2] = 10.0008592605591; fVmax[2] = 71.5196914672852; fVmin[3] = 0.845276415348053; fVmax[3] = 15.6614294052124; fVmin[4] = 15.0038528442383; fVmax[4] = 236.562576293945; fVmin[5] = 30.1877574920654; fVmax[5] = 477.77099609375; fVmin[6] = 3.10723280906677; fVmax[6] = 327.821258544922; fVmin[7] = 48.5138130187988; fVmax[7] = 408.254943847656; fVmin[8] = 0.0498042888939381; fVmax[8] = 3.14158654212952; fVmin[9] = 1.71661376953125e-05; fVmax[9] = 1.11657130718231; fVmin[10] = 0.200001657009125; fVmax[10] = 1.13453125953674; fVmin[11] = 20.0063362121582; fVmax[11] = 232.066116333008; fVmin[12] = 10.007381439209; fVmax[12] = 118.581718444824; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void Initialize(); double GetMvaValue__( const std::vector& inputValues ) const; // private members (method specific) double ActivationFnc(double x) const; int fLayers; int fLayerSize[4]; double fWeightMatrix0to1[15][14]; // weight matrix from layer 0 to 1 double fWeightMatrix1to2[14][15]; // weight matrix from layer 1 to 2 double fWeightMatrix2to3[1][14]; // weight matrix from layer 2 to 3 double * fWeights[4]; }; inline void ReadH6AONN5MEMLP::Initialize() { // build network structure fLayers = 4; fLayerSize[0] = 14; fWeights[0] = new double[14]; fLayerSize[1] = 15; fWeights[1] = new double[15]; fLayerSize[2] = 14; fWeights[2] = new double[14]; fLayerSize[3] = 1; fWeights[3] = new double[1]; // weight matrix from layer 0 to 1 fWeightMatrix0to1[0][0] = -0.203177906208747; fWeightMatrix0to1[1][0] = 1.84388501002089; fWeightMatrix0to1[2][0] = 0.322074826319119; fWeightMatrix0to1[3][0] = 1.57699478066967; fWeightMatrix0to1[4][0] = -2.18667498610914; fWeightMatrix0to1[5][0] = -1.34703560282802; fWeightMatrix0to1[6][0] = -0.539742512468084; fWeightMatrix0to1[7][0] = 0.86629919660827; fWeightMatrix0to1[8][0] = -2.11455006364045; fWeightMatrix0to1[9][0] = -0.404475511905677; fWeightMatrix0to1[10][0] = -0.139706888502763; fWeightMatrix0to1[11][0] = -0.428347544932016; fWeightMatrix0to1[12][0] = -1.51944668520097; fWeightMatrix0to1[13][0] = 0.0585856278416443; fWeightMatrix0to1[0][1] = -0.593702376244674; fWeightMatrix0to1[1][1] = 1.08908746223699; fWeightMatrix0to1[2][1] = 0.302282132788436; fWeightMatrix0to1[3][1] = 1.82436435954997; fWeightMatrix0to1[4][1] = -0.0234334370776433; fWeightMatrix0to1[5][1] = 0.349045751791691; fWeightMatrix0to1[6][1] = -3.66642586778695; fWeightMatrix0to1[7][1] = 3.02153971756079; fWeightMatrix0to1[8][1] = 2.53985819829585; fWeightMatrix0to1[9][1] = 0.853098029224559; fWeightMatrix0to1[10][1] = -4.24830584125262; fWeightMatrix0to1[11][1] = 2.12977281812647; fWeightMatrix0to1[12][1] = 1.06917876378969; fWeightMatrix0to1[13][1] = 1.90253527929283; fWeightMatrix0to1[0][2] = -1.90191664069259; fWeightMatrix0to1[1][2] = 0.321516825198459; fWeightMatrix0to1[2][2] = -1.03784878873604; fWeightMatrix0to1[3][2] = 0.747543225217878; fWeightMatrix0to1[4][2] = -0.31113185905483; fWeightMatrix0to1[5][2] = -0.444457166629867; fWeightMatrix0to1[6][2] = -2.27019386365812; fWeightMatrix0to1[7][2] = 4.10549793760209; fWeightMatrix0to1[8][2] = -1.7917942911025; fWeightMatrix0to1[9][2] = 2.33493543488891; fWeightMatrix0to1[10][2] = -3.82052692421358; fWeightMatrix0to1[11][2] = 1.07625529518556; fWeightMatrix0to1[12][2] = 0.888142922914126; fWeightMatrix0to1[13][2] = 3.63891196473307; fWeightMatrix0to1[0][3] = 1.77046230297367; fWeightMatrix0to1[1][3] = 1.38746721717755; fWeightMatrix0to1[2][3] = -1.82066335473268; fWeightMatrix0to1[3][3] = -1.07769586334469; fWeightMatrix0to1[4][3] = 0.930913111947392; fWeightMatrix0to1[5][3] = -0.417947437389516; fWeightMatrix0to1[6][3] = 4.86458225845737; fWeightMatrix0to1[7][3] = 2.11117629220589; fWeightMatrix0to1[8][3] = 2.02483308557757; fWeightMatrix0to1[9][3] = 1.54996731557251; fWeightMatrix0to1[10][3] = -1.32949048234347; fWeightMatrix0to1[11][3] = 1.62940826490085; fWeightMatrix0to1[12][3] = -2.05468322810343; fWeightMatrix0to1[13][3] = -0.0289238619393601; fWeightMatrix0to1[0][4] = -1.34835768175866; fWeightMatrix0to1[1][4] = -1.69935186583859; fWeightMatrix0to1[2][4] = 0.848948558435706; fWeightMatrix0to1[3][4] = 0.900189723960755; fWeightMatrix0to1[4][4] = -1.61857924961153; fWeightMatrix0to1[5][4] = 0.0887573706424754; fWeightMatrix0to1[6][4] = 0.625089901169597; fWeightMatrix0to1[7][4] = 1.66645217012656; fWeightMatrix0to1[8][4] = 1.85307263598454; fWeightMatrix0to1[9][4] = 0.274597132236994; fWeightMatrix0to1[10][4] = -0.265043241184939; fWeightMatrix0to1[11][4] = -0.567857219480576; fWeightMatrix0to1[12][4] = 0.365013976329711; fWeightMatrix0to1[13][4] = -0.43909205725507; fWeightMatrix0to1[0][5] = -0.568000077527474; fWeightMatrix0to1[1][5] = -1.62158402935751; fWeightMatrix0to1[2][5] = 0.375879981149772; fWeightMatrix0to1[3][5] = 0.889754979101955; fWeightMatrix0to1[4][5] = 1.14387852294981; fWeightMatrix0to1[5][5] = -0.129242575917782; fWeightMatrix0to1[6][5] = -2.85796522476118; fWeightMatrix0to1[7][5] = -2.26411084325549; fWeightMatrix0to1[8][5] = 0.207793534918254; fWeightMatrix0to1[9][5] = 1.13822499450006; fWeightMatrix0to1[10][5] = 3.22758339925506; fWeightMatrix0to1[11][5] = 0.544381667582144; fWeightMatrix0to1[12][5] = 0.788331116024712; fWeightMatrix0to1[13][5] = 0.162440435195559; fWeightMatrix0to1[0][6] = -0.834620417308735; fWeightMatrix0to1[1][6] = -0.235768977034238; fWeightMatrix0to1[2][6] = 0.619448399339037; fWeightMatrix0to1[3][6] = -1.26196168339277; fWeightMatrix0to1[4][6] = -1.70945822200446; fWeightMatrix0to1[5][6] = -1.64627306430156; fWeightMatrix0to1[6][6] = -0.454636060652929; fWeightMatrix0to1[7][6] = 1.66504220447474; fWeightMatrix0to1[8][6] = 1.77438716105022; fWeightMatrix0to1[9][6] = -1.01079475844487; fWeightMatrix0to1[10][6] = -2.05331610949724; fWeightMatrix0to1[11][6] = 2.09480482119975; fWeightMatrix0to1[12][6] = -1.16198794013643; fWeightMatrix0to1[13][6] = -1.43087603131507; fWeightMatrix0to1[0][7] = -1.50896968859071; fWeightMatrix0to1[1][7] = 0.525281917839478; fWeightMatrix0to1[2][7] = -2.64989170133295; fWeightMatrix0to1[3][7] = -1.6776162568275; fWeightMatrix0to1[4][7] = 0.76669785316041; fWeightMatrix0to1[5][7] = 0.814234126296017; fWeightMatrix0to1[6][7] = -5.12393993276397; fWeightMatrix0to1[7][7] = 8.37094730564979; fWeightMatrix0to1[8][7] = 0.309281558772361; fWeightMatrix0to1[9][7] = -0.185081082240875; fWeightMatrix0to1[10][7] = -3.95169652942091; fWeightMatrix0to1[11][7] = 1.84502127790151; fWeightMatrix0to1[12][7] = -2.08607131196013; fWeightMatrix0to1[13][7] = 3.95551690681406; fWeightMatrix0to1[0][8] = -0.0689272292982465; fWeightMatrix0to1[1][8] = 0.78607065459673; fWeightMatrix0to1[2][8] = -0.622391118910082; fWeightMatrix0to1[3][8] = 2.07212996865936; fWeightMatrix0to1[4][8] = 0.478948152184712; fWeightMatrix0to1[5][8] = -1.84694259828172; fWeightMatrix0to1[6][8] = 4.32076009912195; fWeightMatrix0to1[7][8] = -1.60088765522275; fWeightMatrix0to1[8][8] = 0.00491647050527874; fWeightMatrix0to1[9][8] = -0.468014068920578; fWeightMatrix0to1[10][8] = -0.826421992955256; fWeightMatrix0to1[11][8] = 2.38197326785595; fWeightMatrix0to1[12][8] = 0.565764607529987; fWeightMatrix0to1[13][8] = -0.172884205284089; fWeightMatrix0to1[0][9] = -0.106783851733235; fWeightMatrix0to1[1][9] = 0.396011220424497; fWeightMatrix0to1[2][9] = -2.51360706400806; fWeightMatrix0to1[3][9] = 1.82222630108434; fWeightMatrix0to1[4][9] = -0.219536892060542; fWeightMatrix0to1[5][9] = -2.38667832449236; fWeightMatrix0to1[6][9] = 1.64881881209543; fWeightMatrix0to1[7][9] = -0.284023625207631; fWeightMatrix0to1[8][9] = 1.31818452379464; fWeightMatrix0to1[9][9] = 1.30443765128045; fWeightMatrix0to1[10][9] = 1.14871460924864; fWeightMatrix0to1[11][9] = -1.8501003662415; fWeightMatrix0to1[12][9] = -0.117517808799764; fWeightMatrix0to1[13][9] = 2.93296305659172; fWeightMatrix0to1[0][10] = 1.85538720963201; fWeightMatrix0to1[1][10] = 0.396951202624672; fWeightMatrix0to1[2][10] = -1.8241123523093; fWeightMatrix0to1[3][10] = 0.94604911073423; fWeightMatrix0to1[4][10] = -1.93292557084631; fWeightMatrix0to1[5][10] = 1.31596740911864; fWeightMatrix0to1[6][10] = -0.674481115668832; fWeightMatrix0to1[7][10] = -0.0819309201857582; fWeightMatrix0to1[8][10] = -0.0666575396770562; fWeightMatrix0to1[9][10] = 0.281530192570115; fWeightMatrix0to1[10][10] = -1.13126649690572; fWeightMatrix0to1[11][10] = 1.67475056288123; fWeightMatrix0to1[12][10] = 1.43302036394478; fWeightMatrix0to1[13][10] = -1.36918877981967; fWeightMatrix0to1[0][11] = -0.596806321936018; fWeightMatrix0to1[1][11] = 0.141837953316867; fWeightMatrix0to1[2][11] = 2.87009346201148; fWeightMatrix0to1[3][11] = -0.442537985493634; fWeightMatrix0to1[4][11] = 0.402230606465421; fWeightMatrix0to1[5][11] = -1.84308359837707; fWeightMatrix0to1[6][11] = 4.11586824432155; fWeightMatrix0to1[7][11] = 0.443712075114197; fWeightMatrix0to1[8][11] = -0.240371874694708; fWeightMatrix0to1[9][11] = 1.37436511633911; fWeightMatrix0to1[10][11] = -0.115484811608527; fWeightMatrix0to1[11][11] = 0.267240998228103; fWeightMatrix0to1[12][11] = -1.00236468361665; fWeightMatrix0to1[13][11] = -0.997547796045987; fWeightMatrix0to1[0][12] = -0.877921060471469; fWeightMatrix0to1[1][12] = -0.91536423529617; fWeightMatrix0to1[2][12] = 2.02482426619871; fWeightMatrix0to1[3][12] = 0.00352997717780687; fWeightMatrix0to1[4][12] = -0.588647400197913; fWeightMatrix0to1[5][12] = 1.59056769308958; fWeightMatrix0to1[6][12] = 1.61894932779227; fWeightMatrix0to1[7][12] = -1.03645387357549; fWeightMatrix0to1[8][12] = 0.178404697550253; fWeightMatrix0to1[9][12] = 0.6705938226286; fWeightMatrix0to1[10][12] = -0.644892111502951; fWeightMatrix0to1[11][12] = -0.444050609221117; fWeightMatrix0to1[12][12] = -1.71407018358671; fWeightMatrix0to1[13][12] = -0.858268050598084; fWeightMatrix0to1[0][13] = 1.5827840679324; fWeightMatrix0to1[1][13] = -1.08892802635188; fWeightMatrix0to1[2][13] = -0.208209482981332; fWeightMatrix0to1[3][13] = -0.290835887454926; fWeightMatrix0to1[4][13] = 0.918243757375295; fWeightMatrix0to1[5][13] = 0.87196386146203; fWeightMatrix0to1[6][13] = -8.89235250628641; fWeightMatrix0to1[7][13] = 13.4777226148167; fWeightMatrix0to1[8][13] = 1.30715143514332; fWeightMatrix0to1[9][13] = 0.246662971461288; fWeightMatrix0to1[10][13] = -6.73904888104522; fWeightMatrix0to1[11][13] = 2.10382366972634; fWeightMatrix0to1[12][13] = 0.823779411721893; fWeightMatrix0to1[13][13] = 3.90824320684149; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -2.86164179283439; fWeightMatrix1to2[1][0] = -0.491352530980201; fWeightMatrix1to2[2][0] = -0.187609170582158; fWeightMatrix1to2[3][0] = -1.79830499279324; fWeightMatrix1to2[4][0] = -0.412772557493691; fWeightMatrix1to2[5][0] = -0.810433279824456; fWeightMatrix1to2[6][0] = -0.552991184987898; fWeightMatrix1to2[7][0] = -1.99469360296982; fWeightMatrix1to2[8][0] = 0.865862311896509; fWeightMatrix1to2[9][0] = 0.178962275466169; fWeightMatrix1to2[10][0] = -1.48836300557368; fWeightMatrix1to2[11][0] = -0.243319668452136; fWeightMatrix1to2[12][0] = 0.0873387721570177; fWeightMatrix1to2[0][1] = 1.13065806639296; fWeightMatrix1to2[1][1] = -0.656392006584575; fWeightMatrix1to2[2][1] = 0.0450039682132215; fWeightMatrix1to2[3][1] = -2.26559854682168; fWeightMatrix1to2[4][1] = -0.78084811302276; fWeightMatrix1to2[5][1] = 2.00053574538085; fWeightMatrix1to2[6][1] = 1.31991458710978; fWeightMatrix1to2[7][1] = 0.763781677745042; fWeightMatrix1to2[8][1] = 0.254632985999171; fWeightMatrix1to2[9][1] = 1.95251020270474; fWeightMatrix1to2[10][1] = -1.57923119588323; fWeightMatrix1to2[11][1] = -1.3051028438292; fWeightMatrix1to2[12][1] = -0.247945152088795; fWeightMatrix1to2[0][2] = -1.71936694473145; fWeightMatrix1to2[1][2] = 2.15136761310557; fWeightMatrix1to2[2][2] = 2.11824829474285; fWeightMatrix1to2[3][2] = -0.858611914297178; fWeightMatrix1to2[4][2] = 0.177627233687181; fWeightMatrix1to2[5][2] = 0.354034479082066; fWeightMatrix1to2[6][2] = -1.90240087669038; fWeightMatrix1to2[7][2] = -1.48336768250367; fWeightMatrix1to2[8][2] = 0.903916853828954; fWeightMatrix1to2[9][2] = -0.131222004175619; fWeightMatrix1to2[10][2] = 0.856278535003496; fWeightMatrix1to2[11][2] = -1.1547856515058; fWeightMatrix1to2[12][2] = 1.82491626480866; fWeightMatrix1to2[0][3] = -1.6937752259364; fWeightMatrix1to2[1][3] = -0.797925945786209; fWeightMatrix1to2[2][3] = 0.0661056007267253; fWeightMatrix1to2[3][3] = -2.09811368899183; fWeightMatrix1to2[4][3] = -1.55453163809122; fWeightMatrix1to2[5][3] = -1.47038622711851; fWeightMatrix1to2[6][3] = -0.361798340306594; fWeightMatrix1to2[7][3] = -1.86224781358731; fWeightMatrix1to2[8][3] = -1.53944423307712; fWeightMatrix1to2[9][3] = -1.75049883623813; fWeightMatrix1to2[10][3] = 0.4925422351533; fWeightMatrix1to2[11][3] = -1.00002668826062; fWeightMatrix1to2[12][3] = -1.76735588954727; fWeightMatrix1to2[0][4] = 0.532559720800312; fWeightMatrix1to2[1][4] = -1.96519559629824; fWeightMatrix1to2[2][4] = -0.684183803558042; fWeightMatrix1to2[3][4] = -1.15150460205263; fWeightMatrix1to2[4][4] = 0.521119002143306; fWeightMatrix1to2[5][4] = 0.976218787269438; fWeightMatrix1to2[6][4] = 0.727331434320359; fWeightMatrix1to2[7][4] = -0.928944574794568; fWeightMatrix1to2[8][4] = -2.12407233902435; fWeightMatrix1to2[9][4] = -1.6394830342446; fWeightMatrix1to2[10][4] = -1.69869700720239; fWeightMatrix1to2[11][4] = -1.76477460583102; fWeightMatrix1to2[12][4] = -1.19582530918123; fWeightMatrix1to2[0][5] = -0.555001889306932; fWeightMatrix1to2[1][5] = 0.632707362512397; fWeightMatrix1to2[2][5] = -1.42056701508519; fWeightMatrix1to2[3][5] = -0.34180055726525; fWeightMatrix1to2[4][5] = -0.193752238622828; fWeightMatrix1to2[5][5] = -2.78094543681339; fWeightMatrix1to2[6][5] = 0.646847501072024; fWeightMatrix1to2[7][5] = 1.11081357971076; fWeightMatrix1to2[8][5] = -1.46594365850914; fWeightMatrix1to2[9][5] = -1.5317582047184; fWeightMatrix1to2[10][5] = 0.020305833479775; fWeightMatrix1to2[11][5] = 1.11474958644736; fWeightMatrix1to2[12][5] = -1.14420384698045; fWeightMatrix1to2[0][6] = -1.08008152088617; fWeightMatrix1to2[1][6] = 4.2061137734203; fWeightMatrix1to2[2][6] = -0.771424760727387; fWeightMatrix1to2[3][6] = 2.90427556041643; fWeightMatrix1to2[4][6] = 1.22998491494454; fWeightMatrix1to2[5][6] = -1.7521993776923; fWeightMatrix1to2[6][6] = -0.502723448474948; fWeightMatrix1to2[7][6] = -0.188030063018189; fWeightMatrix1to2[8][6] = -2.05441786527418; fWeightMatrix1to2[9][6] = -1.08571930713643; fWeightMatrix1to2[10][6] = -1.98160774776308; fWeightMatrix1to2[11][6] = -1.60041964061871; fWeightMatrix1to2[12][6] = -2.33652952165306; fWeightMatrix1to2[0][7] = 1.42677939965168; fWeightMatrix1to2[1][7] = -8.624973940611; fWeightMatrix1to2[2][7] = 2.82811503336499; fWeightMatrix1to2[3][7] = -3.1366183127591; fWeightMatrix1to2[4][7] = 2.36234824084843; fWeightMatrix1to2[5][7] = -0.569727352475973; fWeightMatrix1to2[6][7] = 0.538679013317931; fWeightMatrix1to2[7][7] = 0.879910247543549; fWeightMatrix1to2[8][7] = -1.68256866557665; fWeightMatrix1to2[9][7] = 0.34133464259841; fWeightMatrix1to2[10][7] = 0.180335697773841; fWeightMatrix1to2[11][7] = -1.90104515347049; fWeightMatrix1to2[12][7] = -2.10784388514164; fWeightMatrix1to2[0][8] = -1.57935933979664; fWeightMatrix1to2[1][8] = -2.21735693859522; fWeightMatrix1to2[2][8] = 3.08071350815683; fWeightMatrix1to2[3][8] = -0.822326263566615; fWeightMatrix1to2[4][8] = 0.708632086798517; fWeightMatrix1to2[5][8] = -0.510578920803126; fWeightMatrix1to2[6][8] = -1.02786652384974; fWeightMatrix1to2[7][8] = -1.54040012547414; fWeightMatrix1to2[8][8] = -1.02583494870823; fWeightMatrix1to2[9][8] = -1.21764333695298; fWeightMatrix1to2[10][8] = 1.03377292119192; fWeightMatrix1to2[11][8] = -1.35211820503034; fWeightMatrix1to2[12][8] = -1.30513871805314; fWeightMatrix1to2[0][9] = 1.69984271946664; fWeightMatrix1to2[1][9] = -0.283234695707734; fWeightMatrix1to2[2][9] = -0.754855139117217; fWeightMatrix1to2[3][9] = 1.40769847656112; fWeightMatrix1to2[4][9] = -1.81613037880803; fWeightMatrix1to2[5][9] = 1.41531078775494; fWeightMatrix1to2[6][9] = 0.790442104136667; fWeightMatrix1to2[7][9] = -0.994739885430082; fWeightMatrix1to2[8][9] = -1.05760955802803; fWeightMatrix1to2[9][9] = -0.352846425051511; fWeightMatrix1to2[10][9] = -0.796226110294562; fWeightMatrix1to2[11][9] = 0.507421724020599; fWeightMatrix1to2[12][9] = 1.84944230256971; fWeightMatrix1to2[0][10] = 0.388101516731085; fWeightMatrix1to2[1][10] = 5.21294098670272; fWeightMatrix1to2[2][10] = -4.43130257502301; fWeightMatrix1to2[3][10] = 1.4574856520349; fWeightMatrix1to2[4][10] = -1.90033544203373; fWeightMatrix1to2[5][10] = -0.998441322053016; fWeightMatrix1to2[6][10] = -1.97936411552881; fWeightMatrix1to2[7][10] = -1.38897400562975; fWeightMatrix1to2[8][10] = -0.926953419164378; fWeightMatrix1to2[9][10] = 0.269631465305083; fWeightMatrix1to2[10][10] = -0.818763297044196; fWeightMatrix1to2[11][10] = 0.892368462051386; fWeightMatrix1to2[12][10] = -0.574714253948279; fWeightMatrix1to2[0][11] = -0.934537017180423; fWeightMatrix1to2[1][11] = -2.63689880957696; fWeightMatrix1to2[2][11] = 1.62385549778643; fWeightMatrix1to2[3][11] = 2.27101605528969; fWeightMatrix1to2[4][11] = -1.55499434766832; fWeightMatrix1to2[5][11] = 0.446053356162741; fWeightMatrix1to2[6][11] = -2.18337964967131; fWeightMatrix1to2[7][11] = -0.966372578447254; fWeightMatrix1to2[8][11] = -0.357437924884725; fWeightMatrix1to2[9][11] = 0.000623404817882182; fWeightMatrix1to2[10][11] = -1.21984879688067; fWeightMatrix1to2[11][11] = 0.868676207122149; fWeightMatrix1to2[12][11] = 0.399877607925335; fWeightMatrix1to2[0][12] = -0.155851781489356; fWeightMatrix1to2[1][12] = -1.08526220153358; fWeightMatrix1to2[2][12] = -1.2334601821111; fWeightMatrix1to2[3][12] = -1.59699520026343; fWeightMatrix1to2[4][12] = -1.42402199139361; fWeightMatrix1to2[5][12] = -0.339612382703504; fWeightMatrix1to2[6][12] = -1.07580244160081; fWeightMatrix1to2[7][12] = -1.09297887819816; fWeightMatrix1to2[8][12] = -0.0850541737641012; fWeightMatrix1to2[9][12] = -2.34676964830531; fWeightMatrix1to2[10][12] = -1.99182760868784; fWeightMatrix1to2[11][12] = -1.76407737896583; fWeightMatrix1to2[12][12] = -0.457643223642512; fWeightMatrix1to2[0][13] = -1.89233741813076; fWeightMatrix1to2[1][13] = -4.72930026459622; fWeightMatrix1to2[2][13] = -1.01326897439575; fWeightMatrix1to2[3][13] = -1.42780971853242; fWeightMatrix1to2[4][13] = -1.76016837440637; fWeightMatrix1to2[5][13] = 1.67232789881741; fWeightMatrix1to2[6][13] = -1.91328903078377; fWeightMatrix1to2[7][13] = -1.41713055268429; fWeightMatrix1to2[8][13] = 1.67207119220577; fWeightMatrix1to2[9][13] = 0.502023625141437; fWeightMatrix1to2[10][13] = 0.779799264091447; fWeightMatrix1to2[11][13] = 0.0727176731501533; fWeightMatrix1to2[12][13] = 0.443909101862277; fWeightMatrix1to2[0][14] = 0.774078814451386; fWeightMatrix1to2[1][14] = -1.15061694747777; fWeightMatrix1to2[2][14] = -1.90684864723944; fWeightMatrix1to2[3][14] = -1.13540690862158; fWeightMatrix1to2[4][14] = -1.43462788100014; fWeightMatrix1to2[5][14] = -0.027510637294321; fWeightMatrix1to2[6][14] = 0.267150595760439; fWeightMatrix1to2[7][14] = 0.0606249917266861; fWeightMatrix1to2[8][14] = 0.424113341924041; fWeightMatrix1to2[9][14] = 0.344661259190421; fWeightMatrix1to2[10][14] = 0.848064241928466; fWeightMatrix1to2[11][14] = -1.46260546604145; fWeightMatrix1to2[12][14] = 0.130187708134657; // weight matrix from layer 2 to 3 fWeightMatrix2to3[0][0] = 0.619766552366718; fWeightMatrix2to3[0][1] = -0.747515257309341; fWeightMatrix2to3[0][2] = -0.998042085985617; fWeightMatrix2to3[0][3] = -0.812454062989043; fWeightMatrix2to3[0][4] = 1.34063887717605; fWeightMatrix2to3[0][5] = 1.13225530474582; fWeightMatrix2to3[0][6] = 0.934502893863892; fWeightMatrix2to3[0][7] = 1.78900913647695; fWeightMatrix2to3[0][8] = 0.364327495362465; fWeightMatrix2to3[0][9] = 1.07793666258533; fWeightMatrix2to3[0][10] = 0.890890189180118; fWeightMatrix2to3[0][11] = 0.161095034261692; fWeightMatrix2to3[0][12] = -0.40855332485787; fWeightMatrix2to3[0][13] = 0.679117345953067; } inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector& inputValues ) const { if (inputValues.size() != (unsigned int)fLayerSize[0]-1) { std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl; return 0; } for (int l=0; l