// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6  [198662]
ROOT Release   : 5.26/00  [334336]
Creator        : stdenis
Date           : Sun Jan 1 10:53:47 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run20572/job412
Training events: 51330

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
RandomSeed: "1" [Random Number Seed for TRandom3]
RandomFile: "None" [Random Number input file for TRandom3]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                   Ht                   'F'   [48.6484184265,844.079467773]
LepAPt               LepAPt               'F'   [20.0002613068,215.671981812]
LepBPt               LepBPt               'F'   [10.000038147,74.0718307495]
MetSigLeptonsJets    MetSigLeptonsJets    'F'   [1.14757633209,20.3265991211]
MetSpec              MetSpec              'F'   [15.0046319962,320.548706055]
SumEtLeptonsJets     SumEtLeptonsJets     'F'   [30.1895332336,527.566101074]
VSumJetLeptonsPt     VSumJetLeptonsPt     'F'   [2.83339142799,373.748504639]
addEt                addEt                'F'   [48.6484184265,438.620880127]
dPhiLepSumMet        dPhiLepSumMet        'F'   [0.0173770282418,3.1415913105]
dPhiLeptons          dPhiLeptons          'F'   [4.88758087158e-06,1.12033128738]
dRLeptons            dRLeptons            'F'   [0.200001657009,1.13469779491]
lep1_E               lep1_E               'F'   [20.0293254852,232.066116333]
lep2_E               lep2_E               'F'   [10.025138855,130.657516479]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {

 public:

   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {

 public:

   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 13 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets",
                                  "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt",
                                  "addEt", "dPhiLepSumMet", "dPhiLeptons",
                                  "dRLeptons", "lep1_E", "lep2_E" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 48.6484184265137;     fVmax[0] = 844.079467773438;
      fVmin[1] = 20.0002613067627;     fVmax[1] = 215.671981811523;
      fVmin[2] = 10.0000381469727;     fVmax[2] = 74.0718307495117;
      fVmin[3] = 1.14757633209229;     fVmax[3] = 20.3265991210938;
      fVmin[4] = 15.0046319961548;     fVmax[4] = 320.548706054688;
      fVmin[5] = 30.1895332336426;     fVmax[5] = 527.566101074219;
      fVmin[6] = 2.83339142799377;     fVmax[6] = 373.748504638672;
      fVmin[7] = 48.6484184265137;     fVmax[7] = 438.620880126953;
      fVmin[8] = 0.0173770282417536;   fVmax[8] = 3.1415913105011;
      fVmin[9] = 4.88758087158203e-06; fVmax[9] = 1.12033128738403;
      fVmin[10] = 0.200001657009125;   fVmax[10] = 1.13469779491425;
      fVmin[11] = 20.0293254852295;    fVmax[11] = 232.066116333008;
      fVmin[12] = 10.0251388549805;    fVmax[12] = 130.657516479492;

      // initialize input variable types
      fType[0] = 'F';  fType[1] = 'F';  fType[2] = 'F';  fType[3] = 'F';
      fType[4] = 'F';  fType[5] = 'F';  fType[6] = 'F';  fType[7] = 'F';
      fType[8] = 'F';  fType[9] = 'F';  fType[10] = 'F'; fType[11] = 'F';
      fType[12] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }
   double fVmin[13];
   double fVmax[13];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[13];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double* fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.338403365498049;
   fWeightMatrix0to1[1][0] = 2.07910566172066;
   fWeightMatrix0to1[2][0] = 0.78304596292102;
   fWeightMatrix0to1[3][0] = 1.76188941792176;
   fWeightMatrix0to1[4][0] = -0.539600390761776;
   fWeightMatrix0to1[5][0] = -1.40975446067076;
   fWeightMatrix0to1[6][0] = -0.898753048288148;
   fWeightMatrix0to1[7][0] = 2.52122653826607;
   fWeightMatrix0to1[8][0] = -1.39650433354355;
   fWeightMatrix0to1[9][0] = -0.03812125676929;
   fWeightMatrix0to1[10][0] = -2.37533822190362;
   fWeightMatrix0to1[11][0] = -0.65508947641749;
   fWeightMatrix0to1[12][0] = -1.27486979378665;
   fWeightMatrix0to1[13][0] = -0.0361411856923238;
   fWeightMatrix0to1[0][1] = -0.555053515383;
   fWeightMatrix0to1[1][1] = 1.55547297461573;
   fWeightMatrix0to1[2][1] = -0.584571566768937;
   fWeightMatrix0to1[3][1] = 1.4046041495815;
   fWeightMatrix0to1[4][1] = 3.29731078210795;
   fWeightMatrix0to1[5][1] = 1.82679742742509;
   fWeightMatrix0to1[6][1] = -3.04554355092002;
   fWeightMatrix0to1[7][1] = 1.33645468034814;
   fWeightMatrix0to1[8][1] = 0.681375901221537;
   fWeightMatrix0to1[9][1] = 2.3857557576139;
   fWeightMatrix0to1[10][1] = -0.320913360034079;
   fWeightMatrix0to1[11][1] = -0.694227453014752;
   fWeightMatrix0to1[12][1] = 1.21856831849332;
   fWeightMatrix0to1[13][1] = -3.30216593988806;
   fWeightMatrix0to1[0][2] = -1.94740121302606;
   fWeightMatrix0to1[1][2] = 0.105874257715461;
   fWeightMatrix0to1[2][2] = -0.149804859489555;
   fWeightMatrix0to1[3][2] = 1.54486530410947;
   fWeightMatrix0to1[4][2] = 1.58943181510289;
   fWeightMatrix0to1[5][2] = -0.728244382622538;
   fWeightMatrix0to1[6][2] = 0.675108208785691;
   fWeightMatrix0to1[7][2] = 0.0985397872526952;
   fWeightMatrix0to1[8][2] = -1.45034518459992;
   fWeightMatrix0to1[9][2] = 3.07694138370407;
   fWeightMatrix0to1[10][2] = -3.95272246779967;
   fWeightMatrix0to1[11][2] = 0.68891312705334;
   fWeightMatrix0to1[12][2] = 1.16054843490798;
   fWeightMatrix0to1[13][2] = 1.59129658486319;
   fWeightMatrix0to1[0][3] = 1.88990536905723;
   fWeightMatrix0to1[1][3] = 1.2652369930041;
   fWeightMatrix0to1[2][3] = -1.33731210850303;
   fWeightMatrix0to1[3][3] = -1.64135201197841;
   fWeightMatrix0to1[4][3] = 0.787864457326191;
   fWeightMatrix0to1[5][3] = 0.333188116923088;
   fWeightMatrix0to1[6][3] = 4.22593474849644;
   fWeightMatrix0to1[7][3] = 1.45930998093117;
   fWeightMatrix0to1[8][3] = 1.49747098987676;
   fWeightMatrix0to1[9][3] = 0.551478070828729;
   fWeightMatrix0to1[10][3] = -0.503857666937226;
   fWeightMatrix0to1[11][3] = 0.951321438169782;
   fWeightMatrix0to1[12][3] = -2.05580775697346;
   fWeightMatrix0to1[13][3] = -1.70823178058446;
   fWeightMatrix0to1[0][4] = -1.31301490581122;
   fWeightMatrix0to1[1][4] = -1.597035936628;
   fWeightMatrix0to1[2][4] = 1.1739257850134;
   fWeightMatrix0to1[3][4] = 0.652093665769929;
   fWeightMatrix0to1[4][4] = -0.782591355203157;
   fWeightMatrix0to1[5][4] = 0.592994857065257;
   fWeightMatrix0to1[6][4] = 0.367802200286337;
fWeightMatrix0to1[7][4] = 1.19520585798053; fWeightMatrix0to1[8][4] = 1.798839139901; fWeightMatrix0to1[9][4] = 0.000441842412862734; fWeightMatrix0to1[10][4] = 0.275271597789073; fWeightMatrix0to1[11][4] = -1.61259895815905; fWeightMatrix0to1[12][4] = 0.578075110176336; fWeightMatrix0to1[13][4] = -1.11831394823995; fWeightMatrix0to1[0][5] = -0.749281684413026; fWeightMatrix0to1[1][5] = -1.43222679731621; fWeightMatrix0to1[2][5] = 0.833078790481909; fWeightMatrix0to1[3][5] = 1.17966539832112; fWeightMatrix0to1[4][5] = 2.61909751629; fWeightMatrix0to1[5][5] = -0.348601478989483; fWeightMatrix0to1[6][5] = -2.33950931332049; fWeightMatrix0to1[7][5] = -1.65056779216391; fWeightMatrix0to1[8][5] = 1.04817824389887; fWeightMatrix0to1[9][5] = 1.53785422217877; fWeightMatrix0to1[10][5] = 1.23018074768507; fWeightMatrix0to1[11][5] = 0.474452490624599; fWeightMatrix0to1[12][5] = 1.02443990441119; fWeightMatrix0to1[13][5] = -0.08309761708454; fWeightMatrix0to1[0][6] = -0.884216919488876; fWeightMatrix0to1[1][6] = 0.143456866326622; fWeightMatrix0to1[2][6] = 0.784406658023227; fWeightMatrix0to1[3][6] = -1.23503619272047; fWeightMatrix0to1[4][6] = 0.673655082890951; fWeightMatrix0to1[5][6] = -1.34875277892985; fWeightMatrix0to1[6][6] = -1.56721765478665; fWeightMatrix0to1[7][6] = 1.2070314876572; fWeightMatrix0to1[8][6] = 1.58778028392039; fWeightMatrix0to1[9][6] = 0.0382470761531128; fWeightMatrix0to1[10][6] = -1.58406069117111; fWeightMatrix0to1[11][6] = 1.20766333162022; fWeightMatrix0to1[12][6] = -0.910131857275196; fWeightMatrix0to1[13][6] = -2.12032019411481; fWeightMatrix0to1[0][7] = -1.43893040129426; fWeightMatrix0to1[1][7] = 1.40304038838806; fWeightMatrix0to1[2][7] = -2.47677227232601; fWeightMatrix0to1[3][7] = -1.81099509014634; fWeightMatrix0to1[4][7] = 8.25639972338; fWeightMatrix0to1[5][7] = 1.97153915079736; fWeightMatrix0to1[6][7] = -4.27778181106719; fWeightMatrix0to1[7][7] = 8.02364099632658; fWeightMatrix0to1[8][7] = -0.985726126054839; fWeightMatrix0to1[9][7] = 4.37972658880336; fWeightMatrix0to1[10][7] = -4.57493203548672; fWeightMatrix0to1[11][7] = -0.194446125572749; fWeightMatrix0to1[12][7] = -1.97405997843153; fWeightMatrix0to1[13][7] = -0.71709876981318; fWeightMatrix0to1[0][8] = 0.393763629505511; fWeightMatrix0to1[1][8] = 0.650332460070772; fWeightMatrix0to1[2][8] = -2.00789846425941; fWeightMatrix0to1[3][8] = 1.47682070940326; fWeightMatrix0to1[4][8] = 0.233960339463266; fWeightMatrix0to1[5][8] = -1.16276639120451; fWeightMatrix0to1[6][8] = 1.3784644345111; fWeightMatrix0to1[7][8] = -1.33887607402634; fWeightMatrix0to1[8][8] = -1.73388001581639; fWeightMatrix0to1[9][8] = -0.655319507901851; fWeightMatrix0to1[10][8] = 2.04465505696908; fWeightMatrix0to1[11][8] = 1.57822625564663; fWeightMatrix0to1[12][8] = 0.32854181906725; fWeightMatrix0to1[13][8] = -1.88729176646399; fWeightMatrix0to1[0][9] = -0.389828913846259; fWeightMatrix0to1[1][9] = -0.0406710107962281; fWeightMatrix0to1[2][9] = -2.94164439724713; fWeightMatrix0to1[3][9] = 1.7599605768434; fWeightMatrix0to1[4][9] = 0.875339729265815; fWeightMatrix0to1[5][9] = -0.98981147053609; fWeightMatrix0to1[6][9] = -0.0277439515208419; fWeightMatrix0to1[7][9] = -0.281675595131277; fWeightMatrix0to1[8][9] = 0.0500335574885874; fWeightMatrix0to1[9][9] = -0.197416127400758; fWeightMatrix0to1[10][9] = 1.68929557453881; fWeightMatrix0to1[11][9] = -1.74296421916659; fWeightMatrix0to1[12][9] = 0.386167709041426; fWeightMatrix0to1[13][9] = 2.07827661880476; fWeightMatrix0to1[0][10] = 1.56227454385569; fWeightMatrix0to1[1][10] = 
0.280593101129399; fWeightMatrix0to1[2][10] = -3.57507072001248; fWeightMatrix0to1[3][10] = 0.03331849976759; fWeightMatrix0to1[4][10] = -0.687550899865712; fWeightMatrix0to1[5][10] = 1.10321247233779; fWeightMatrix0to1[6][10] = -0.252316446661973; fWeightMatrix0to1[7][10] = 0.0110176564844894; fWeightMatrix0to1[8][10] = -0.410154223566456; fWeightMatrix0to1[9][10] = -0.274877515284328; fWeightMatrix0to1[10][10] = -0.374721912503827; fWeightMatrix0to1[11][10] = 0.546638936051339; fWeightMatrix0to1[12][10] = 1.72120857532127; fWeightMatrix0to1[13][10] = -0.277437389737019; fWeightMatrix0to1[0][11] = -0.577597286554482; fWeightMatrix0to1[1][11] = 0.573459932686167; fWeightMatrix0to1[2][11] = 1.48674751299562; fWeightMatrix0to1[3][11] = -0.58594314767047; fWeightMatrix0to1[4][11] = -1.00023804679322; fWeightMatrix0to1[5][11] = -1.6006918034109; fWeightMatrix0to1[6][11] = 1.9691547419709; fWeightMatrix0to1[7][11] = -0.2696254940745; fWeightMatrix0to1[8][11] = 0.443650361454661; fWeightMatrix0to1[9][11] = -0.746704655000889; fWeightMatrix0to1[10][11] = 3.99841341934633; fWeightMatrix0to1[11][11] = -0.178224600357526; fWeightMatrix0to1[12][11] = -1.01047934555901; fWeightMatrix0to1[13][11] = -1.13196259193207; fWeightMatrix0to1[0][12] = -0.995005803094937; fWeightMatrix0to1[1][12] = -0.359025807090611; fWeightMatrix0to1[2][12] = 1.16436417571609; fWeightMatrix0to1[3][12] = 0.136703769324678; fWeightMatrix0to1[4][12] = -0.142215475596043; fWeightMatrix0to1[5][12] = 0.663066588827567; fWeightMatrix0to1[6][12] = 1.77431611034043; fWeightMatrix0to1[7][12] = -0.413999393996409; fWeightMatrix0to1[8][12] = 0.650881250737758; fWeightMatrix0to1[9][12] = -0.0366420376668817; fWeightMatrix0to1[10][12] = -2.30192642042299; fWeightMatrix0to1[11][12] = -1.45986135335768; fWeightMatrix0to1[12][12] = -1.79325177638698; fWeightMatrix0to1[13][12] = 1.03525442056687; fWeightMatrix0to1[0][13] = 1.83899806276839; fWeightMatrix0to1[1][13] = -0.15061254044082; fWeightMatrix0to1[2][13] = -0.246959300254253; fWeightMatrix0to1[3][13] = -0.493221170254604; fWeightMatrix0to1[4][13] = 10.3383318607212; fWeightMatrix0to1[5][13] = 1.8070978015278; fWeightMatrix0to1[6][13] = -5.34333654448878; fWeightMatrix0to1[7][13] = 9.94148103222321; fWeightMatrix0to1[8][13] = -1.02625160629228; fWeightMatrix0to1[9][13] = 8.20720098424674; fWeightMatrix0to1[10][13] = -7.53547788309784; fWeightMatrix0to1[11][13] = 1.20430379758868; fWeightMatrix0to1[12][13] = 0.406668447002372; fWeightMatrix0to1[13][13] = -0.57306776256962; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -3.21427789716905; fWeightMatrix1to2[1][0] = -0.110556940087381; fWeightMatrix1to2[2][0] = 0.174065398565828; fWeightMatrix1to2[3][0] = -1.80364757157959; fWeightMatrix1to2[4][0] = -0.702103820359516; fWeightMatrix1to2[5][0] = -0.446406749191095; fWeightMatrix1to2[6][0] = 0.99708373288054; fWeightMatrix1to2[7][0] = -1.68185142486757; fWeightMatrix1to2[8][0] = 0.545995966834925; fWeightMatrix1to2[9][0] = 0.053638298897463; fWeightMatrix1to2[10][0] = -1.00646639732235; fWeightMatrix1to2[11][0] = -0.549860934204775; fWeightMatrix1to2[12][0] = 0.00896600355518385; fWeightMatrix1to2[0][1] = 1.18047085513416; fWeightMatrix1to2[1][1] = -0.0673093535704948; fWeightMatrix1to2[2][1] = -0.213591695359003; fWeightMatrix1to2[3][1] = -2.09020988126294; fWeightMatrix1to2[4][1] = -0.92480447005071; fWeightMatrix1to2[5][1] = 1.7533110039; fWeightMatrix1to2[6][1] = 2.04657169933816; fWeightMatrix1to2[7][1] = 1.23533084339573; fWeightMatrix1to2[8][1] = 0.0815508394492264; 
fWeightMatrix1to2[9][1] = 1.97268383234185; fWeightMatrix1to2[10][1] = -1.76598357215422; fWeightMatrix1to2[11][1] = -1.35867618977819; fWeightMatrix1to2[12][1] = -0.368259602075818; fWeightMatrix1to2[0][2] = -1.41804403080436; fWeightMatrix1to2[1][2] = 0.731275874598875; fWeightMatrix1to2[2][2] = 1.53464803886894; fWeightMatrix1to2[3][2] = -0.856963302676735; fWeightMatrix1to2[4][2] = 0.774604711999014; fWeightMatrix1to2[5][2] = -0.0186102810816668; fWeightMatrix1to2[6][2] = -3.43379659676234; fWeightMatrix1to2[7][2] = -1.26251627658996; fWeightMatrix1to2[8][2] = 0.924949377214744; fWeightMatrix1to2[9][2] = -0.207249140124453; fWeightMatrix1to2[10][2] = 0.993833912618642; fWeightMatrix1to2[11][2] = -1.14415334838871; fWeightMatrix1to2[12][2] = 1.5526954253442; fWeightMatrix1to2[0][3] = -2.29840082536955; fWeightMatrix1to2[1][3] = 0.158496469707481; fWeightMatrix1to2[2][3] = 0.151581445318085; fWeightMatrix1to2[3][3] = -1.51804351931961; fWeightMatrix1to2[4][3] = -1.30992502163534; fWeightMatrix1to2[5][3] = -1.43773227995318; fWeightMatrix1to2[6][3] = 0.0185495092678514; fWeightMatrix1to2[7][3] = -1.84452023723537; fWeightMatrix1to2[8][3] = -1.87093690789642; fWeightMatrix1to2[9][3] = -1.7968221857936; fWeightMatrix1to2[10][3] = 0.263344677156823; fWeightMatrix1to2[11][3] = -1.07624034133366; fWeightMatrix1to2[12][3] = -1.68327222038983; fWeightMatrix1to2[0][4] = 2.11222762293957; fWeightMatrix1to2[1][4] = -3.43504925127164; fWeightMatrix1to2[2][4] = 0.127363976282828; fWeightMatrix1to2[3][4] = -1.70204021311871; fWeightMatrix1to2[4][4] = 0.210777198675759; fWeightMatrix1to2[5][4] = 1.28204619455921; fWeightMatrix1to2[6][4] = 6.50305260932372; fWeightMatrix1to2[7][4] = -0.448529371447873; fWeightMatrix1to2[8][4] = -2.01845260564232; fWeightMatrix1to2[9][4] = -1.6553645688732; fWeightMatrix1to2[10][4] = -1.70699894514297; fWeightMatrix1to2[11][4] = -1.82097610621577; fWeightMatrix1to2[12][4] = -0.968481023365921; fWeightMatrix1to2[0][5] = -0.392971505059039; fWeightMatrix1to2[1][5] = 0.735092162372647; fWeightMatrix1to2[2][5] = -0.631785706852974; fWeightMatrix1to2[3][5] = -0.0543763915445966; fWeightMatrix1to2[4][5] = -0.535706990601696; fWeightMatrix1to2[5][5] = -2.81164314842623; fWeightMatrix1to2[6][5] = 0.998573232569831; fWeightMatrix1to2[7][5] = 1.35262432660238; fWeightMatrix1to2[8][5] = -1.70755452963812; fWeightMatrix1to2[9][5] = -1.67401513322986; fWeightMatrix1to2[10][5] = 0.38138031467397; fWeightMatrix1to2[11][5] = 0.850369967433899; fWeightMatrix1to2[12][5] = -0.796082485018584; fWeightMatrix1to2[0][6] = -1.77911321722687; fWeightMatrix1to2[1][6] = 2.08580351252655; fWeightMatrix1to2[2][6] = 1.23321193521644; fWeightMatrix1to2[3][6] = 2.09345611298425; fWeightMatrix1to2[4][6] = 0.985982051923456; fWeightMatrix1to2[5][6] = -1.10034094304775; fWeightMatrix1to2[6][6] = -3.34434607343995; fWeightMatrix1to2[7][6] = -0.186586548142927; fWeightMatrix1to2[8][6] = -1.39093803245298; fWeightMatrix1to2[9][6] = -1.28235819429055; fWeightMatrix1to2[10][6] = -2.35342556334108; fWeightMatrix1to2[11][6] = -1.24240975406078; fWeightMatrix1to2[12][6] = -2.33874983669797; fWeightMatrix1to2[0][7] = 2.50183116702127; fWeightMatrix1to2[1][7] = -3.78609007402418; fWeightMatrix1to2[2][7] = 1.5152504001153; fWeightMatrix1to2[3][7] = -2.26429813113965; fWeightMatrix1to2[4][7] = 1.1243428432032; fWeightMatrix1to2[5][7] = -1.21169749307051; fWeightMatrix1to2[6][7] = 6.03150021650417; fWeightMatrix1to2[7][7] = 1.29711413803392; fWeightMatrix1to2[8][7] = -2.08375719556401; fWeightMatrix1to2[9][7] = 
0.228297342101835; fWeightMatrix1to2[10][7] = 0.396865499296773; fWeightMatrix1to2[11][7] = -2.15139582687429; fWeightMatrix1to2[12][7] = -2.96289407438905; fWeightMatrix1to2[0][8] = -1.75727956022693; fWeightMatrix1to2[1][8] = -0.995141311269823; fWeightMatrix1to2[2][8] = 1.80359731058948; fWeightMatrix1to2[3][8] = -0.625091108441336; fWeightMatrix1to2[4][8] = 0.269055826812815; fWeightMatrix1to2[5][8] = -0.990629527465335; fWeightMatrix1to2[6][8] = -0.94919657852305; fWeightMatrix1to2[7][8] = -1.74843528293729; fWeightMatrix1to2[8][8] = -1.01854623201611; fWeightMatrix1to2[9][8] = -1.4112959632864; fWeightMatrix1to2[10][8] = 0.920439102352116; fWeightMatrix1to2[11][8] = -1.368388365342; fWeightMatrix1to2[12][8] = -1.15110272953275; fWeightMatrix1to2[0][9] = 0.870772622354631; fWeightMatrix1to2[1][9] = -2.72594576266483; fWeightMatrix1to2[2][9] = -0.334895895181529; fWeightMatrix1to2[3][9] = 1.12933078196308; fWeightMatrix1to2[4][9] = -1.79469583985449; fWeightMatrix1to2[5][9] = 2.06697761014314; fWeightMatrix1to2[6][9] = 4.96749056842609; fWeightMatrix1to2[7][9] = -0.516215046036291; fWeightMatrix1to2[8][9] = -1.51773727069505; fWeightMatrix1to2[9][9] = -0.156074878843945; fWeightMatrix1to2[10][9] = -1.64759492202562; fWeightMatrix1to2[11][9] = 0.293552384843288; fWeightMatrix1to2[12][9] = 1.10487717293153; fWeightMatrix1to2[0][10] = 2.26519690879883; fWeightMatrix1to2[1][10] = 2.14500348375208; fWeightMatrix1to2[2][10] = -1.84871864149082; fWeightMatrix1to2[3][10] = 0.764632383701558; fWeightMatrix1to2[4][10] = -1.93813354679681; fWeightMatrix1to2[5][10] = -1.07803687234296; fWeightMatrix1to2[6][10] = -5.31282553123184; fWeightMatrix1to2[7][10] = -1.73099268416184; fWeightMatrix1to2[8][10] = -1.08566316245877; fWeightMatrix1to2[9][10] = 0.486196424689029; fWeightMatrix1to2[10][10] = 0.504024841938334; fWeightMatrix1to2[11][10] = 0.705708592841757; fWeightMatrix1to2[12][10] = -1.42416828054133; fWeightMatrix1to2[0][11] = -1.0646051528052; fWeightMatrix1to2[1][11] = -1.56802865774478; fWeightMatrix1to2[2][11] = -0.214739055634728; fWeightMatrix1to2[3][11] = 1.47653756799228; fWeightMatrix1to2[4][11] = -1.57296359160514; fWeightMatrix1to2[5][11] = -0.51202801327645; fWeightMatrix1to2[6][11] = -0.435612273244908; fWeightMatrix1to2[7][11] = -0.183538860221221; fWeightMatrix1to2[8][11] = -0.893057451701866; fWeightMatrix1to2[9][11] = -0.527924888864393; fWeightMatrix1to2[10][11] = -1.47722861293729; fWeightMatrix1to2[11][11] = 0.608776810060031; fWeightMatrix1to2[12][11] = -0.0201331222796263; fWeightMatrix1to2[0][12] = -0.105493390824501; fWeightMatrix1to2[1][12] = -1.13975686294114; fWeightMatrix1to2[2][12] = -1.13080410040727; fWeightMatrix1to2[3][12] = -1.82859403842363; fWeightMatrix1to2[4][12] = -1.73964639944405; fWeightMatrix1to2[5][12] = 0.367873114729075; fWeightMatrix1to2[6][12] = 0.755568722065353; fWeightMatrix1to2[7][12] = -1.25166508837233; fWeightMatrix1to2[8][12] = -0.4483007543656; fWeightMatrix1to2[9][12] = -2.34725630081068; fWeightMatrix1to2[10][12] = -1.34518458507033; fWeightMatrix1to2[11][12] = -2.04192207193123; fWeightMatrix1to2[12][12] = -0.606266904206548; fWeightMatrix1to2[0][13] = -4.47719199061371; fWeightMatrix1to2[1][13] = -2.2350583119254; fWeightMatrix1to2[2][13] = -0.95199180437118; fWeightMatrix1to2[3][13] = -1.01154697129126; fWeightMatrix1to2[4][13] = -1.62754970181038; fWeightMatrix1to2[5][13] = 0.111437816093736; fWeightMatrix1to2[6][13] = -1.01237447342721; fWeightMatrix1to2[7][13] = -1.74842024151056; fWeightMatrix1to2[8][13] = 1.00707061901615; 
   fWeightMatrix1to2[9][13] = -0.0839581144660939;
   fWeightMatrix1to2[10][13] = 0.0598447770591504;
   fWeightMatrix1to2[11][13] = -0.229975328538304;
   fWeightMatrix1to2[12][13] = 0.011093701190935;
   fWeightMatrix1to2[0][14] = 0.483190134870788;
   fWeightMatrix1to2[1][14] = -0.77579713529602;
   fWeightMatrix1to2[2][14] = -1.30969276955546;
   fWeightMatrix1to2[3][14] = -1.13821442932804;
   fWeightMatrix1to2[4][14] = -1.74703005165379;
   fWeightMatrix1to2[5][14] = 0.0301173438414762;
   fWeightMatrix1to2[6][14] = 1.84316727875247;
   fWeightMatrix1to2[7][14] = 0.355006984303203;
   fWeightMatrix1to2[8][14] = 0.0751212425714602;
   fWeightMatrix1to2[9][14] = 0.150588384896541;
   fWeightMatrix1to2[10][14] = 1.15108146589287;
   fWeightMatrix1to2[11][14] = -1.77028024067691;
   fWeightMatrix1to2[12][14] = 0.102296748591715;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -1.3454901685381;
   fWeightMatrix2to3[0][1] = 0.106002208106845;
   fWeightMatrix2to3[0][2] = 0.0643198457535069;
   fWeightMatrix2to3[0][3] = -1.05553198339461;
   fWeightMatrix2to3[0][4] = 0.902206262485521;
   fWeightMatrix2to3[0][5] = 0.530595605691912;
   fWeightMatrix2to3[0][6] = 0.754878683498826;
   fWeightMatrix2to3[0][7] = 2.03438583935942;
   fWeightMatrix2to3[0][8] = -0.350975061168376;
   fWeightMatrix2to3[0][9] = 0.893601160774086;
   fWeightMatrix2to3[0][10] = 1.38367968460371;
   fWeightMatrix2to3[0][11] = -0.190234263222648;
   fWeightMatrix2to3[0][12] = -1.3248840088611;
   fWeightMatrix2to3[0][13] = 0.0488246596331313;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l
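// ---------------------------------------------------------------------------
// Usage sketch (illustrative only, not part of the generated class). It shows
// how a standalone TMVA reader of this kind is typically driven; the input
// values below are placeholders, and it assumes the remainder of the generated
// file (ActivationFnc, the forward propagation in GetMvaValue__, Clear) is
// present as produced by TMVA's MethodBase::MakeClass.
//
//    #include <string>
//    #include <vector>
//
//    int main() {
//       const char* names[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets",
//                               "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt",
//                               "addEt", "dPhiLepSumMet", "dPhiLeptons",
//                               "dRLeptons", "lep1_E", "lep2_E" };
//       std::vector<std::string> inputVars( names, names + 13 );
//       ReadH6AONN5MEMLP reader( inputVars );      // constructor checks variable names and count
//
//       std::vector<double> event( 13, 0.0 );      // one event, same variable order (placeholder values)
//       double mva = reader.GetMvaValue( event );  // inputs are normalised to [-1,1], then fed to the MLP
//       return 0;
//    }
// ---------------------------------------------------------------------------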