// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//
/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6  [198662]
ROOT Release   : 5.26/00  [334336]
Creator        : stdenis
Date           : Sun Jan 1 10:48:54 2012
Host           : Linux psr-lts309-32bit-kvm 2.4.21-63.ELsmp #1 SMP Wed Nov 4 04:34:43 CST 2009 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/stdenis/batch/run20572/job413
Training events: 45948

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V:                  "False"      [Verbose mode]
NCycles:            "1000"       [Number of training cycles]
HiddenLayers:       "N+1,N"      [Specification of hidden layer architecture]
# Default:
D:                  "False"      [use-decorrelated-variables flag (deprecated)]
Normalise:          "True"       [Normalise input variables]
VarTransform:       "None"       [Variable transformation method]
VarTransformType:   "Signal"     [Use signal or background events for var transform]
NbinsMVAPdf:        "60"         [Number of bins used to create MVA PDF]
NsmoothMVAPdf:      "2"          [Number of smoothing iterations for MVA PDF]
VerboseLevel:       "Info"       [Verbosity level]
H:                  "False"      [Print classifier-specific help message]
CreateMVAPdfs:      "False"      [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True"       [if True, write all weights as text files]
NeuronType:         "sigmoid"    [Neuron activation function type]
NeuronInputType:    "sum"        [Neuron input function type]
RandomSeed:         "1"          [Random Number Seed for TRandom3]
RandomFile:         "None"       [Random Number input file for TRandom3]
TrainingMethod:     "BP"         [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate:       "0.02"       [ANN learning rate parameter]
DecayRate:          "0.01"       [Decay rate for learning parameter]
TestRate:           "10"         [Test for overtraining performed at each #th epochs]
BPMode:             "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize:          "-1"         [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 13
Ht                 Ht                 'F' [46.6659545898,824.779174805]
LepAPt             LepAPt             'F' [20.0003662109,155.865875244]
LepBPt             LepBPt             'F' [10.0010261536,72.9998397827]
MetSigLeptonsJets  MetSigLeptonsJets  'F' [1.00477731228,19.6708087921]
MetSpec            MetSpec            'F' [15.0119848251,262.380706787]
SumEtLeptonsJets   SumEtLeptonsJets   'F' [30.1484737396,533.700561523]
VSumJetLeptonsPt   VSumJetLeptonsPt   'F' [0.678276479244,374.426086426]
addEt              addEt              'F' [46.6659545898,422.574371338]
dPhiLepSumMet      dPhiLepSumMet      'F' [0.0517990663648,3.14159059525]
dPhiLeptons        dPhiLeptons        'F' [1.81440645974e-05,1.13430726528]
dRLeptons          dRLeptons          'F' [0.200005456805,1.13770997524]
lep1_E             lep1_E             'F' [20.0262546539,232.717926025]
lep2_E             lep2_E             'F' [10.0074262619,136.060577393]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {

 public:

   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {

 public:

   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 13 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "Ht", "LepAPt",
"LepBPt", "MetSigLeptonsJets", "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E" }; // sanity checks if (theInputVars.size() <= 0) { std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl; fStatusIsClean = false; } if (theInputVars.size() != fNvars) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: " << theInputVars.size() << " != " << fNvars << std::endl; fStatusIsClean = false; } // validate input variables for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) { if (theInputVars[ivar] != inputVars[ivar]) { std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] << std::endl; fStatusIsClean = false; } } // initialize min and max vectors (for normalisation) fVmin[0] = 46.6659545898438; fVmax[0] = 824.779174804688; fVmin[1] = 20.0003662109375; fVmax[1] = 155.865875244141; fVmin[2] = 10.0010261535645; fVmax[2] = 72.9998397827148; fVmin[3] = 1.00477731227875; fVmax[3] = 19.6708087921143; fVmin[4] = 15.0119848251343; fVmax[4] = 262.380706787109; fVmin[5] = 30.148473739624; fVmax[5] = 533.700561523438; fVmin[6] = 0.678276479244232; fVmax[6] = 374.426086425781; fVmin[7] = 46.6659545898438; fVmax[7] = 422.574371337891; fVmin[8] = 0.0517990663647652; fVmax[8] = 3.14159059524536; fVmin[9] = 1.81440645974362e-05; fVmax[9] = 1.13430726528168; fVmin[10] = 0.200005456805229; fVmax[10] = 1.13770997524261; fVmin[11] = 20.0262546539307; fVmax[11] = 232.717926025391; fVmin[12] = 10.0074262619019; fVmax[12] = 136.060577392578; // initialize input variable types fType[0] = 'F'; fType[1] = 'F'; fType[2] = 'F'; fType[3] = 'F'; fType[4] = 'F'; fType[5] = 'F'; fType[6] = 'F'; fType[7] = 'F'; fType[8] = 'F'; fType[9] = 'F'; fType[10] = 'F'; fType[11] = 'F'; fType[12] = 'F'; // initialize constants Initialize(); } // destructor virtual ~ReadH6AONN5MEMLP() { Clear(); // method-specific } // the classifier response // "inputValues" is a vector of input values in the same order as the // variables given to the constructor double GetMvaValue( const std::vector& inputValues ) const { // classifier response value double retval = 0; // classifier response, sanity check first if (!fStatusIsClean) { std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response" << " because status is dirty" << std::endl; retval = 0; } else { if (IsNormalised()) { // normalise variables std::vector iV; int ivar = 0; for (std::vector::const_iterator varIt = inputValues.begin(); varIt != inputValues.end(); varIt++, ivar++) { iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] )); } retval = GetMvaValue__( iV ); } else { retval = GetMvaValue__( inputValues ); } } return retval; } private: // method-specific destructor void Clear(); // common member variables const char* fClassName; bool fStatusIsClean; const size_t fNvars; size_t GetNvar() const { return fNvars; } char GetType( int ivar ) const { return fType[ivar]; } // normalisation of input variables const bool fIsNormalised; bool IsNormalised() const { return fIsNormalised; } double fVmin[13]; double fVmax[13]; double NormVariable( double x, double xmin, double xmax ) const { // normalise to output range: [-1, 1] return 2*(x - xmin)/(xmax - xmin) - 1.0; } // type of input variable: 'F' or 'I' char fType[13]; // initialize internal variables void 
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[15][14];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[14][15];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][14];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 14; fWeights[0] = new double[14];
   fLayerSize[1] = 15; fWeights[1] = new double[15];
   fLayerSize[2] = 14; fWeights[2] = new double[14];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];
   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = -0.3836848012946;
   fWeightMatrix0to1[1][0] = 2.2518691892693;
   fWeightMatrix0to1[2][0] = 0.93660689612196;
   fWeightMatrix0to1[3][0] = 2.64924379309184;
   fWeightMatrix0to1[4][0] = -0.995133592509868;
   fWeightMatrix0to1[5][0] = -0.342309891831039;
   fWeightMatrix0to1[6][0] = -1.26209870371135;
   fWeightMatrix0to1[7][0] = 2.71409122566991;
   fWeightMatrix0to1[8][0] = -1.57239199720498;
   fWeightMatrix0to1[9][0] = -0.423658306675235;
   fWeightMatrix0to1[10][0] = -1.03551008610244;
   fWeightMatrix0to1[11][0] = -0.570731331540889;
   fWeightMatrix0to1[12][0] = -0.99331718193085;
   fWeightMatrix0to1[13][0] = 0.0702628859954197;
   fWeightMatrix0to1[0][1] = -0.542629559948844;
   fWeightMatrix0to1[1][1] = 0.805812138472497;
   fWeightMatrix0to1[2][1] = 0.56300823260029;
   fWeightMatrix0to1[3][1] = 5.15748787447817;
   fWeightMatrix0to1[4][1] = 2.9830416263208;
   fWeightMatrix0to1[5][1] = 2.36002187421364;
   fWeightMatrix0to1[6][1] = -0.982712911975555;
   fWeightMatrix0to1[7][1] = 0.501933579335956;
   fWeightMatrix0to1[8][1] = 5.05551452590287;
   fWeightMatrix0to1[9][1] = 1.23026397573671;
   fWeightMatrix0to1[10][1] = -1.25712817799482;
   fWeightMatrix0to1[11][1] = -0.877466116381388;
   fWeightMatrix0to1[12][1] = 0.275576934585473;
   fWeightMatrix0to1[13][1] = -1.76147575870885;
   fWeightMatrix0to1[0][2] = -1.9314464540686;
   fWeightMatrix0to1[1][2] = 0.587035805684592;
   fWeightMatrix0to1[2][2] = 0.64418195568299;
   fWeightMatrix0to1[3][2] = 4.34057380416733;
   fWeightMatrix0to1[4][2] = 1.60649825652992;
   fWeightMatrix0to1[5][2] = 0.818268738821063;
   fWeightMatrix0to1[6][2] = -0.157404210315196;
   fWeightMatrix0to1[7][2] = 2.92861544557032;
   fWeightMatrix0to1[8][2] = -3.69659651750831;
   fWeightMatrix0to1[9][2] = 2.65842508251287;
   fWeightMatrix0to1[10][2] = -1.73680252648058;
   fWeightMatrix0to1[11][2] = 1.21236900781672;
   fWeightMatrix0to1[12][2] = 2.48741748249942;
   fWeightMatrix0to1[13][2] = 0.865383896891655;
   fWeightMatrix0to1[0][3] = 1.94688325886493;
   fWeightMatrix0to1[1][3] = 1.26047137208844;
   fWeightMatrix0to1[2][3] = -0.416290885095006;
   fWeightMatrix0to1[3][3] = -2.99864282306345;
   fWeightMatrix0to1[4][3] = 0.863121515248786;
   fWeightMatrix0to1[5][3] = -0.677582416184844;
   fWeightMatrix0to1[6][3] = 0.977171906416694;
   fWeightMatrix0to1[7][3] = 2.06410560839693;
   fWeightMatrix0to1[8][3] = 0.622949502375384;
   fWeightMatrix0to1[9][3] = 1.72818512840489;
   fWeightMatrix0to1[10][3] = -2.09458386490266;
   fWeightMatrix0to1[11][3] = 0.897301440174855;
   fWeightMatrix0to1[12][3] = -1.33654185196555;
   fWeightMatrix0to1[13][3] = -0.174568588734221;
   fWeightMatrix0to1[0][4] = -1.27157572188327;
   fWeightMatrix0to1[1][4] = -1.51597206410306;
   fWeightMatrix0to1[2][4] = 1.8709166389021;
   fWeightMatrix0to1[3][4] = 1.25120913904614;
   fWeightMatrix0to1[4][4] = -0.419602250505652;
   fWeightMatrix0to1[5][4] = 1.24446126000877;
   fWeightMatrix0to1[6][4] =
-0.874672065827326; fWeightMatrix0to1[7][4] = 3.0078305470042; fWeightMatrix0to1[8][4] = 1.19971923717939; fWeightMatrix0to1[9][4] = 0.466477944579443; fWeightMatrix0to1[10][4] = 0.249524893171852; fWeightMatrix0to1[11][4] = -1.57520485141695; fWeightMatrix0to1[12][4] = 1.23809171551973; fWeightMatrix0to1[13][4] = -0.729968604152009; fWeightMatrix0to1[0][5] = -0.831397897388997; fWeightMatrix0to1[1][5] = -1.1634943698842; fWeightMatrix0to1[2][5] = 0.698881797830915; fWeightMatrix0to1[3][5] = 1.89140901958867; fWeightMatrix0to1[4][5] = 1.98400505632496; fWeightMatrix0to1[5][5] = 0.746308310462067; fWeightMatrix0to1[6][5] = -2.3239175481227; fWeightMatrix0to1[7][5] = -2.13361030835674; fWeightMatrix0to1[8][5] = 0.963502896341681; fWeightMatrix0to1[9][5] = 1.02671814380412; fWeightMatrix0to1[10][5] = 2.82079435319331; fWeightMatrix0to1[11][5] = 0.656561555219648; fWeightMatrix0to1[12][5] = 1.11288912325307; fWeightMatrix0to1[13][5] = -0.157150141434878; fWeightMatrix0to1[0][6] = -0.900714700702311; fWeightMatrix0to1[1][6] = 0.0128702810536989; fWeightMatrix0to1[2][6] = 1.39868722409408; fWeightMatrix0to1[3][6] = -0.58848907636556; fWeightMatrix0to1[4][6] = -0.649865383224078; fWeightMatrix0to1[5][6] = -0.871560241245066; fWeightMatrix0to1[6][6] = -0.714904149893681; fWeightMatrix0to1[7][6] = 0.786758824967519; fWeightMatrix0to1[8][6] = 1.61727240594602; fWeightMatrix0to1[9][6] = -1.20897840548498; fWeightMatrix0to1[10][6] = -0.467418756938591; fWeightMatrix0to1[11][6] = 1.426174830283; fWeightMatrix0to1[12][6] = -0.277803165396177; fWeightMatrix0to1[13][6] = -1.05407250186075; fWeightMatrix0to1[0][7] = -1.39966800855146; fWeightMatrix0to1[1][7] = 0.417354270046944; fWeightMatrix0to1[2][7] = -1.48691500549691; fWeightMatrix0to1[3][7] = 2.89012797588907; fWeightMatrix0to1[4][7] = 4.63168989360781; fWeightMatrix0to1[5][7] = 4.03790226860258; fWeightMatrix0to1[6][7] = -1.80484659794067; fWeightMatrix0to1[7][7] = 7.83314481407647; fWeightMatrix0to1[8][7] = 0.323105433030038; fWeightMatrix0to1[9][7] = 0.667567381610711; fWeightMatrix0to1[10][7] = -0.393987358438729; fWeightMatrix0to1[11][7] = -0.467249241775829; fWeightMatrix0to1[12][7] = -1.69789820591542; fWeightMatrix0to1[13][7] = -0.752200261396362; fWeightMatrix0to1[0][8] = 0.483135875438848; fWeightMatrix0to1[1][8] = 0.0232899535897208; fWeightMatrix0to1[2][8] = -1.02161875102144; fWeightMatrix0to1[3][8] = 3.13661225825974; fWeightMatrix0to1[4][8] = 0.140912144197025; fWeightMatrix0to1[5][8] = -3.67366680255426; fWeightMatrix0to1[6][8] = 0.567673889304915; fWeightMatrix0to1[7][8] = -2.52497083764368; fWeightMatrix0to1[8][8] = -0.596657834322117; fWeightMatrix0to1[9][8] = -0.676587153948539; fWeightMatrix0to1[10][8] = -2.37830212511208; fWeightMatrix0to1[11][8] = 1.22663712563053; fWeightMatrix0to1[12][8] = 0.36806933117025; fWeightMatrix0to1[13][8] = -0.315445794557825; fWeightMatrix0to1[0][9] = -0.502990961614226; fWeightMatrix0to1[1][9] = 0.41736292074345; fWeightMatrix0to1[2][9] = -1.70259483713529; fWeightMatrix0to1[3][9] = -0.968776739955264; fWeightMatrix0to1[4][9] = 1.41108105992982; fWeightMatrix0to1[5][9] = -0.839475453613016; fWeightMatrix0to1[6][9] = 0.487112153925665; fWeightMatrix0to1[7][9] = 0.136440631689655; fWeightMatrix0to1[8][9] = -0.572376059521909; fWeightMatrix0to1[9][9] = 0.652474132385849; fWeightMatrix0to1[10][9] = 0.86177472481568; fWeightMatrix0to1[11][9] = -1.42822157858131; fWeightMatrix0to1[12][9] = 1.87862405050971; fWeightMatrix0to1[13][9] = 0.462682266614886; fWeightMatrix0to1[0][10] = 1.46902183529058; 
   fWeightMatrix0to1[1][10] = 0.0554098003598859;
   fWeightMatrix0to1[2][10] = -1.26705354012425;
   fWeightMatrix0to1[3][10] = 0.953664297611743;
   fWeightMatrix0to1[4][10] = -0.140526915542139;
   fWeightMatrix0to1[5][10] = -0.111021900369911;
   fWeightMatrix0to1[6][10] = 1.10879177646593;
   fWeightMatrix0to1[7][10] = 0.353723260091193;
   fWeightMatrix0to1[8][10] = 0.900630261387243;
   fWeightMatrix0to1[9][10] = -0.392545279662278;
   fWeightMatrix0to1[10][10] = -1.91295447437846;
   fWeightMatrix0to1[11][10] = 1.15076593064952;
   fWeightMatrix0to1[12][10] = 1.3482971355912;
   fWeightMatrix0to1[13][10] = -0.641177467376843;
   fWeightMatrix0to1[0][11] = -0.489184600864143;
   fWeightMatrix0to1[1][11] = 0.0530015430130339;
   fWeightMatrix0to1[2][11] = 2.66253335962637;
   fWeightMatrix0to1[3][11] = -0.229767670446227;
   fWeightMatrix0to1[4][11] = -0.614841875458809;
   fWeightMatrix0to1[5][11] = -2.66370998085788;
   fWeightMatrix0to1[6][11] = 1.05392338303568;
   fWeightMatrix0to1[7][11] = -0.621092226106726;
   fWeightMatrix0to1[8][11] = 1.79925129303994;
   fWeightMatrix0to1[9][11] = 0.465956403875616;
   fWeightMatrix0to1[10][11] = 1.51440862877363;
   fWeightMatrix0to1[11][11] = -0.0595956818664869;
   fWeightMatrix0to1[12][11] = -2.42040658386099;
   fWeightMatrix0to1[13][11] = 1.6557668168565;
   fWeightMatrix0to1[0][12] = -0.929448359966939;
   fWeightMatrix0to1[1][12] = -0.362828158459499;
   fWeightMatrix0to1[2][12] = 2.24158076213355;
   fWeightMatrix0to1[3][12] = -0.953056378908008;
   fWeightMatrix0to1[4][12] = -1.90324095672154;
   fWeightMatrix0to1[5][12] = 0.0337483284446154;
   fWeightMatrix0to1[6][12] = 1.52083969336672;
   fWeightMatrix0to1[7][12] = -0.378447638297506;
   fWeightMatrix0to1[8][12] = -1.94388984737953;
   fWeightMatrix0to1[9][12] = -0.502364330372355;
   fWeightMatrix0to1[10][12] = 0.530078725877214;
   fWeightMatrix0to1[11][12] = -0.932796721567991;
   fWeightMatrix0to1[12][12] = -1.08061962490929;
   fWeightMatrix0to1[13][12] = 2.55607947396179;
   fWeightMatrix0to1[0][13] = 1.95998603805449;
   fWeightMatrix0to1[1][13] = -1.69975402494822;
   fWeightMatrix0to1[2][13] = -0.542193456241121;
   fWeightMatrix0to1[3][13] = 7.09446749440447;
   fWeightMatrix0to1[4][13] = 5.55352943162853;
   fWeightMatrix0to1[5][13] = 5.44120216870956;
   fWeightMatrix0to1[6][13] = 0.00628551832701973;
   fWeightMatrix0to1[7][13] = 12.4894719059759;
   fWeightMatrix0to1[8][13] = 0.340933696287454;
   fWeightMatrix0to1[9][13] = 2.26618934129791;
   fWeightMatrix0to1[10][13] = 0.387001877070671;
   fWeightMatrix0to1[11][13] = 0.150313856736602;
   fWeightMatrix0to1[12][13] = 0.0623762569914268;
   fWeightMatrix0to1[13][13] = -4.56329161373943;
   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -2.9760850664808;
   fWeightMatrix1to2[1][0] = 0.339144134832334;
   fWeightMatrix1to2[2][0] = -1.05833434104562;
   fWeightMatrix1to2[3][0] = -0.504151688523718;
   fWeightMatrix1to2[4][0] = -0.830951611568158;
   fWeightMatrix1to2[5][0] = -0.742487599306499;
   fWeightMatrix1to2[6][0] = -0.676104283917202;
   fWeightMatrix1to2[7][0] = -2.49653547664643;
   fWeightMatrix1to2[8][0] = -0.277326661455326;
   fWeightMatrix1to2[9][0] = 0.0018253972091458;
   fWeightMatrix1to2[10][0] = -1.29286451124148;
   fWeightMatrix1to2[11][0] = -0.174680358261179;
   fWeightMatrix1to2[12][0] = 0.0403667438646007;
   fWeightMatrix1to2[0][1] = 0.935352065159797;
   fWeightMatrix1to2[1][1] = 0.542075239639722;
   fWeightMatrix1to2[2][1] = -0.291344460563205;
   fWeightMatrix1to2[3][1] = -2.21734735688997;
   fWeightMatrix1to2[4][1] = -0.916791989825558;
   fWeightMatrix1to2[5][1] = 1.53986599589911;
   fWeightMatrix1to2[6][1] = 1.28051266814212;
   fWeightMatrix1to2[7][1] = 0.740687610556697;
fWeightMatrix1to2[8][1] = -0.297750317732971; fWeightMatrix1to2[9][1] = 1.9283047674653; fWeightMatrix1to2[10][1] = -1.65198081516393; fWeightMatrix1to2[11][1] = -1.34742974872081; fWeightMatrix1to2[12][1] = -0.147749141231006; fWeightMatrix1to2[0][2] = -1.61721934139388; fWeightMatrix1to2[1][2] = 0.885342893958349; fWeightMatrix1to2[2][2] = 2.29180934921692; fWeightMatrix1to2[3][2] = -1.41999943964575; fWeightMatrix1to2[4][2] = 0.778807913769942; fWeightMatrix1to2[5][2] = -0.394759435107158; fWeightMatrix1to2[6][2] = -1.73631323847209; fWeightMatrix1to2[7][2] = -1.09047835620223; fWeightMatrix1to2[8][2] = 0.506116006848949; fWeightMatrix1to2[9][2] = 0.040883997800122; fWeightMatrix1to2[10][2] = 1.1381113858844; fWeightMatrix1to2[11][2] = -1.22217094770606; fWeightMatrix1to2[12][2] = 1.85167931242822; fWeightMatrix1to2[0][3] = -2.0645116367488; fWeightMatrix1to2[1][3] = -0.424384074079561; fWeightMatrix1to2[2][3] = 1.2215445139714; fWeightMatrix1to2[3][3] = -5.5577652023201; fWeightMatrix1to2[4][3] = -1.44473391081602; fWeightMatrix1to2[5][3] = -1.92991851813272; fWeightMatrix1to2[6][3] = 0.78585830194846; fWeightMatrix1to2[7][3] = -1.81160704725847; fWeightMatrix1to2[8][3] = -4.14640246270479; fWeightMatrix1to2[9][3] = -1.91261074022906; fWeightMatrix1to2[10][3] = 0.388910581304943; fWeightMatrix1to2[11][3] = -1.32922593797994; fWeightMatrix1to2[12][3] = -2.03549033393455; fWeightMatrix1to2[0][4] = 0.480497530422449; fWeightMatrix1to2[1][4] = -0.718897030683717; fWeightMatrix1to2[2][4] = 0.407401020931573; fWeightMatrix1to2[3][4] = -3.87560975177925; fWeightMatrix1to2[4][4] = 0.239300295256163; fWeightMatrix1to2[5][4] = 1.97208070687697; fWeightMatrix1to2[6][4] = 1.0658474758468; fWeightMatrix1to2[7][4] = -0.929312552159101; fWeightMatrix1to2[8][4] = -3.30560968222224; fWeightMatrix1to2[9][4] = -1.79595061839718; fWeightMatrix1to2[10][4] = -1.23121604517771; fWeightMatrix1to2[11][4] = -1.67582976726099; fWeightMatrix1to2[12][4] = 0.198669522924729; fWeightMatrix1to2[0][5] = -0.654402008776087; fWeightMatrix1to2[1][5] = 2.11036186046025; fWeightMatrix1to2[2][5] = -0.852818766329667; fWeightMatrix1to2[3][5] = -2.73417626034429; fWeightMatrix1to2[4][5] = -0.513201660879847; fWeightMatrix1to2[5][5] = -2.96861887542505; fWeightMatrix1to2[6][5] = 0.794284941266973; fWeightMatrix1to2[7][5] = 0.994467862287293; fWeightMatrix1to2[8][5] = -2.99094556605721; fWeightMatrix1to2[9][5] = -1.72276720370541; fWeightMatrix1to2[10][5] = 0.566427210794386; fWeightMatrix1to2[11][5] = 1.27081638956144; fWeightMatrix1to2[12][5] = -0.100257109049515; fWeightMatrix1to2[0][6] = -1.32911049802021; fWeightMatrix1to2[1][6] = -0.969573435884539; fWeightMatrix1to2[2][6] = 0.990173974687072; fWeightMatrix1to2[3][6] = 2.24566305072055; fWeightMatrix1to2[4][6] = 0.850072838230916; fWeightMatrix1to2[5][6] = -0.940932778216237; fWeightMatrix1to2[6][6] = -0.35901375076325; fWeightMatrix1to2[7][6] = -0.795451816019074; fWeightMatrix1to2[8][6] = -2.43745190857089; fWeightMatrix1to2[9][6] = -1.20276537043989; fWeightMatrix1to2[10][6] = -2.52217390630005; fWeightMatrix1to2[11][6] = -1.42529548928399; fWeightMatrix1to2[12][6] = -2.9578345859044; fWeightMatrix1to2[0][7] = 1.16788251515732; fWeightMatrix1to2[1][7] = -2.46431088332665; fWeightMatrix1to2[2][7] = 0.89386293143963; fWeightMatrix1to2[3][7] = -6.86115565607146; fWeightMatrix1to2[4][7] = 1.48823479299305; fWeightMatrix1to2[5][7] = -1.56166336301417; fWeightMatrix1to2[6][7] = 0.792576351505038; fWeightMatrix1to2[7][7] = 1.07523185232183; fWeightMatrix1to2[8][7] = 
-4.97921788771729; fWeightMatrix1to2[9][7] = 0.0424812443626037; fWeightMatrix1to2[10][7] = 0.737349053099672; fWeightMatrix1to2[11][7] = -2.19378859437174; fWeightMatrix1to2[12][7] = -1.95917843410156; fWeightMatrix1to2[0][8] = -1.77042107307391; fWeightMatrix1to2[1][8] = -0.47667477485927; fWeightMatrix1to2[2][8] = 4.53881847618623; fWeightMatrix1to2[3][8] = -1.0089498287632; fWeightMatrix1to2[4][8] = 0.245304297440055; fWeightMatrix1to2[5][8] = -0.775387546587268; fWeightMatrix1to2[6][8] = -1.55870688909151; fWeightMatrix1to2[7][8] = -1.63811188097806; fWeightMatrix1to2[8][8] = -1.54751408519916; fWeightMatrix1to2[9][8] = -1.47962280701914; fWeightMatrix1to2[10][8] = 1.0386584602724; fWeightMatrix1to2[11][8] = -1.26593248474659; fWeightMatrix1to2[12][8] = -1.14536809187155; fWeightMatrix1to2[0][9] = 1.27472143741786; fWeightMatrix1to2[1][9] = -0.467853543199665; fWeightMatrix1to2[2][9] = -1.9379322799878; fWeightMatrix1to2[3][9] = 0.119965796900763; fWeightMatrix1to2[4][9] = -1.72966854598602; fWeightMatrix1to2[5][9] = 0.650813889328664; fWeightMatrix1to2[6][9] = 0.824426684551297; fWeightMatrix1to2[7][9] = -0.980254015632288; fWeightMatrix1to2[8][9] = -3.26561248533631; fWeightMatrix1to2[9][9] = -0.0485083078807329; fWeightMatrix1to2[10][9] = -0.805072321497814; fWeightMatrix1to2[11][9] = 0.1505774104593; fWeightMatrix1to2[12][9] = 1.38934501660847; fWeightMatrix1to2[0][10] = 0.55602541480606; fWeightMatrix1to2[1][10] = 1.61585976136274; fWeightMatrix1to2[2][10] = -2.17259356185159; fWeightMatrix1to2[3][10] = 1.10450884608189; fWeightMatrix1to2[4][10] = -1.84502434803534; fWeightMatrix1to2[5][10] = -1.2536662515987; fWeightMatrix1to2[6][10] = -1.97667027368929; fWeightMatrix1to2[7][10] = -1.89406624751147; fWeightMatrix1to2[8][10] = -0.735790693055911; fWeightMatrix1to2[9][10] = 0.718635428456985; fWeightMatrix1to2[10][10] = -0.160386855435911; fWeightMatrix1to2[11][10] = 0.182454005662906; fWeightMatrix1to2[12][10] = -1.17443890974324; fWeightMatrix1to2[0][11] = -1.2308986840986; fWeightMatrix1to2[1][11] = -1.59091195781333; fWeightMatrix1to2[2][11] = -0.848352968322665; fWeightMatrix1to2[3][11] = 2.14598145603811; fWeightMatrix1to2[4][11] = -1.73547527392307; fWeightMatrix1to2[5][11] = -0.207759406730791; fWeightMatrix1to2[6][11] = -2.21552743888872; fWeightMatrix1to2[7][11] = -0.927837205808519; fWeightMatrix1to2[8][11] = -1.41416334413167; fWeightMatrix1to2[9][11] = -0.625823773966801; fWeightMatrix1to2[10][11] = -1.66641131526355; fWeightMatrix1to2[11][11] = 0.958307114893832; fWeightMatrix1to2[12][11] = 0.537798275383428; fWeightMatrix1to2[0][12] = 0.0515192741722576; fWeightMatrix1to2[1][12] = -1.7001554858362; fWeightMatrix1to2[2][12] = -2.06511228193365; fWeightMatrix1to2[3][12] = -1.33532518983091; fWeightMatrix1to2[4][12] = -1.94694830266452; fWeightMatrix1to2[5][12] = 0.675610190864074; fWeightMatrix1to2[6][12] = -1.54542889132883; fWeightMatrix1to2[7][12] = -1.58068795031017; fWeightMatrix1to2[8][12] = -0.181889170531258; fWeightMatrix1to2[9][12] = -2.46391023351835; fWeightMatrix1to2[10][12] = -1.90054761234032; fWeightMatrix1to2[11][12] = -1.95441910012662; fWeightMatrix1to2[12][12] = -0.803346101705706; fWeightMatrix1to2[0][13] = -1.79833326451869; fWeightMatrix1to2[1][13] = -2.37755830051686; fWeightMatrix1to2[2][13] = -1.49705015760769; fWeightMatrix1to2[3][13] = -0.812417484357534; fWeightMatrix1to2[4][13] = -1.64724915916464; fWeightMatrix1to2[5][13] = 0.600222159082656; fWeightMatrix1to2[6][13] = -1.65450176297879; fWeightMatrix1to2[7][13] = -1.37002867559512; 
   fWeightMatrix1to2[8][13] = 3.05102696933462;
   fWeightMatrix1to2[9][13] = -0.0191604682327339;
   fWeightMatrix1to2[10][13] = -0.0143992465689926;
   fWeightMatrix1to2[11][13] = 0.252021038525072;
   fWeightMatrix1to2[12][13] = -0.00853711008583627;
   fWeightMatrix1to2[0][14] = 0.652634729794951;
   fWeightMatrix1to2[1][14] = -0.283551133546097;
   fWeightMatrix1to2[2][14] = -2.58494494947833;
   fWeightMatrix1to2[3][14] = 0.160350321800144;
   fWeightMatrix1to2[4][14] = -1.87489758289501;
   fWeightMatrix1to2[5][14] = -0.193326777632255;
   fWeightMatrix1to2[6][14] = 0.116899087306823;
   fWeightMatrix1to2[7][14] = -0.454717430394286;
   fWeightMatrix1to2[8][14] = -0.775709906645858;
   fWeightMatrix1to2[9][14] = 0.105973144878512;
   fWeightMatrix1to2[10][14] = 0.987720692072872;
   fWeightMatrix1to2[11][14] = -1.40215985983749;
   fWeightMatrix1to2[12][14] = 0.110516244995739;
   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.660619167134384;
   fWeightMatrix2to3[0][1] = -0.877492535647532;
   fWeightMatrix2to3[0][2] = -1.25979577989908;
   fWeightMatrix2to3[0][3] = -0.785764986642388;
   fWeightMatrix2to3[0][4] = 0.811584962293852;
   fWeightMatrix2to3[0][5] = -1.31194480835043;
   fWeightMatrix2to3[0][6] = 0.254821418887782;
   fWeightMatrix2to3[0][7] = 1.46370226946802;
   fWeightMatrix2to3[0][8] = -1.60248209502115;
   fWeightMatrix2to3[0][9] = 0.921283753028341;
   fWeightMatrix2to3[0][10] = 1.28537254536381;
   fWeightMatrix2to3[0][11] = -0.833637539942783;
   fWeightMatrix2to3[0][12] = -0.850298546513987;
   fWeightMatrix2to3[0][13] = 0.938376000164091;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   // NOTE: the generated text was truncated from this point on; the remainder
   // of the forward propagation below is reconstructed following the standard
   // layout of TMVA MethodBase::MakeClass output for an MLP classifier.
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i] = 0;

   // bias nodes of all but the output layer are fixed to 1
   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1] = 1;

   // fill the input layer with the (normalised) input values
   for (int i=0; i<fLayerSize[0]-1; i++)
      fWeights[0][i] = inputValues[i];

   // layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++) {
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      }
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++) {
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      }
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // layer 2 to 3 (output node)
   for (int o=0; o<fLayerSize[3]; o++) {
      for (int i=0; i<fLayerSize[2]; i++) {
         fWeights[3][o] += fWeightMatrix2to3[o][i] * fWeights[2][i];
      }
      fWeights[3][o] = ActivationFnc(fWeights[3][o]);
   }

   return fWeights[3][0];
}

double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   // sigmoid neuron activation function (NeuronType: "sigmoid")
   return 1.0/(1.0+exp(-x));
}

// method-specific cleanup
inline void ReadH6AONN5MEMLP::Clear()
{
   // delete the per-layer node arrays allocated in Initialize()
   for (int lIdx = 0; lIdx < 4; lIdx++) {
      delete[] fWeights[lIdx];
   }
}
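
// ---------------------------------------------------------------------------
// Example usage (not part of the generated class): a minimal sketch of how a
// standalone TMVA reader like this one is typically driven. The guard macro
// READH6AONN5MEMLP_EXAMPLE_MAIN and the placeholder input values are
// illustrative only; in a real analysis the values would be filled from the
// event record, in exactly the order of the training variables listed above.
// ---------------------------------------------------------------------------
#ifdef READH6AONN5MEMLP_EXAMPLE_MAIN
int main()
{
   // variable names must match the training variables, in the same order
   const char* names[] = { "Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets",
                           "MetSpec", "SumEtLeptonsJets", "VSumJetLeptonsPt",
                           "addEt", "dPhiLepSumMet", "dPhiLeptons",
                           "dRLeptons", "lep1_E", "lep2_E" };
   std::vector<std::string> inputVars( names, names + 13 );

   ReadH6AONN5MEMLP reader( inputVars );

   // one event's worth of placeholder values, same order as the names above
   std::vector<double> inputValues( 13, 0.0 );
   inputValues[0] = 120.0;  // Ht
   inputValues[1] = 45.0;   // LepAPt
   // ... fill the remaining entries from the event ...

   double mva = reader.GetMvaValue( inputValues );
   std::cout << "MLP response: " << mva << std::endl;
   return 0;
}
#endif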