// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6  [198662]
ROOT Release   : 5.10/00  [330240]
Creator        : thompson
Date           : Tue Jan 22 16:21:51 2008
Host           : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events: 18312

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables
*-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW        LRHWW        'F' [0,0.999921858311]
LRWW         LRWW         'F' [0,1]
LRWg         LRWg         'F' [0,0.999933362007]
LRWj         LRWj         'F' [0,1]
LRZZ         LRZZ         'F' [0,1]
Met          Met          'F' [15.0806331635,233.529418945]
MetDelPhi    MetDelPhi    'F' [0.118904195726,3.13627409935]
MetSpec      MetSpec      'F' [15.0246295929,171.627700806]
dPhiLeptons  dPhiLeptons  'F' [9.91821289062e-05,3.1400179863]
dRLeptons    dRLeptons    'F' [0.400271952152,4.6129732132]
dimass       dimass       'F' [16.0121879578,578.400939941]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {

 public:

   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {

 public:

   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ", "Met",
                                  "MetDelPhi", "MetSpec", "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " << inputVars[ivar] <<
std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;
      fVmax[0] = 0.999921858310699;
      fVmin[1] = 0;
      fVmax[1] = 1;
      fVmin[2] = 0;
      fVmax[2] = 0.999933362007141;
      fVmin[3] = 0;
      fVmax[3] = 1;
      fVmin[4] = 0;
      fVmax[4] = 1;
      fVmin[5] = 15.0806331634521;
      fVmax[5] = 233.529418945312;
      fVmin[6] = 0.118904195725918;
      fVmax[6] = 3.13627409934998;
      fVmin[7] = 15.0246295928955;
      fVmax[7] = 171.627700805664;
      fVmin[8] = 9.918212890625e-05;
      fVmax[8] = 3.14001798629761;
      fVmin[9] = 0.400271952152252;
      fVmax[9] = 4.6129732131958;
      fVmin[10] = 16.0121879577637;
      fVmax[10] = 578.400939941406;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {

      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return
fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }

   double fVmin[11];
   double fVmax[11];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
   fLayerSize[2] = 12; fWeights[2] = new double[12];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];

   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = 1.49882997664669;
   fWeightMatrix0to1[1][0] = 1.50273710141075;
   fWeightMatrix0to1[2][0] = -0.566712404807425;
   fWeightMatrix0to1[3][0] = 1.85559039923621;
   fWeightMatrix0to1[4][0] = 0.0285150841052317;
   fWeightMatrix0to1[5][0] = 3.55631158623919;
   fWeightMatrix0to1[6][0] = 1.44337981189335;
   fWeightMatrix0to1[7][0] = -0.846735229018476;
   fWeightMatrix0to1[8][0] = 0.295843178716473;
   fWeightMatrix0to1[9][0] = 1.66171683223033;
   fWeightMatrix0to1[10][0] = 0.693211283355864;
   fWeightMatrix0to1[11][0] = 0.280073695424815;
   fWeightMatrix0to1[0][1] = 1.28337542703234;
   fWeightMatrix0to1[1][1] = 1.54308249680581;
   fWeightMatrix0to1[2][1] = 1.20138240261653;
   fWeightMatrix0to1[3][1] = -2.05786930473246;
   fWeightMatrix0to1[4][1] = 1.26554175304051;
   fWeightMatrix0to1[5][1] = -0.436480487648247;
   fWeightMatrix0to1[6][1] = -1.52659980348027;
   fWeightMatrix0to1[7][1] = 1.60300020057399;
   fWeightMatrix0to1[8][1] = 2.25617307372046;
   fWeightMatrix0to1[9][1] = 1.41463401702772;
   fWeightMatrix0to1[10][1] = 1.08054149678757;
   fWeightMatrix0to1[11][1] = -0.336655626386065;
   fWeightMatrix0to1[0][2] = 0.725860867242902;
   fWeightMatrix0to1[1][2] = 0.938223072559066;
   fWeightMatrix0to1[2][2] = -0.10523811279057;
   fWeightMatrix0to1[3][2] = -1.40017003757819;
   fWeightMatrix0to1[4][2] = 2.09986915537873;
   fWeightMatrix0to1[5][2] = -0.440553870221425;
   fWeightMatrix0to1[6][2] = -0.257413744216161;
   fWeightMatrix0to1[7][2] = -0.986783795507718;
   fWeightMatrix0to1[8][2] = -0.568461995154445;
   fWeightMatrix0to1[9][2] = -0.595557185286166;
   fWeightMatrix0to1[10][2] = -1.01435534446386;
   fWeightMatrix0to1[11][2] = 0.485663505944747;
   fWeightMatrix0to1[0][3] = 0.268991965444482;
   fWeightMatrix0to1[1][3] = 0.571588809363051;
   fWeightMatrix0to1[2][3] = 1.26914028150529;
   fWeightMatrix0to1[3][3] = 1.26002456986963;
   fWeightMatrix0to1[4][3] = -1.46006270696601;
   fWeightMatrix0to1[5][3] = 0.113058657332768;
   fWeightMatrix0to1[6][3] = -0.988145417834562;
   fWeightMatrix0to1[7][3] = -1.4527662255869;
   fWeightMatrix0to1[8][3] = 0.017154334087645;
   fWeightMatrix0to1[9][3] = -1.81848464917984;
   fWeightMatrix0to1[10][3] = -0.759416096782015;
   fWeightMatrix0to1[11][3] = 0.852685587997105;
   fWeightMatrix0to1[0][4] = 0.378095033735342;
   fWeightMatrix0to1[1][4] = -0.366026719852706;
   fWeightMatrix0to1[2][4] = 1.21538005384837;
   fWeightMatrix0to1[3][4] = -1.21091340603423;
   fWeightMatrix0to1[4][4] = 0.0989574845296566;
   fWeightMatrix0to1[5][4] = -0.832023144301221;
   fWeightMatrix0to1[6][4] = -0.875283742899635;
   fWeightMatrix0to1[7][4] = 0.000455309443619238;
   fWeightMatrix0to1[8][4] = -0.254726956891825;
   fWeightMatrix0to1[9][4] = 0.750966957604668;
   fWeightMatrix0to1[10][4] = 0.0881784797097952;
   fWeightMatrix0to1[11][4] = -0.140724949433035;
   fWeightMatrix0to1[0][5] = -1.04659224569258;
   fWeightMatrix0to1[1][5] = -0.604407238741956;
   fWeightMatrix0to1[2][5] = -1.90244159426466;
   fWeightMatrix0to1[3][5] = 0.00390185624476204;
   fWeightMatrix0to1[4][5] = 1.12415138836943;
   fWeightMatrix0to1[5][5] = -1.96621470381464;
   fWeightMatrix0to1[6][5] = 0.724670559868719;
   fWeightMatrix0to1[7][5] = -0.873736376056419;
   fWeightMatrix0to1[8][5] = -0.245496883481544;
   fWeightMatrix0to1[9][5] = 0.261308255544317;
   fWeightMatrix0to1[10][5] = 0.199167721647133;
   fWeightMatrix0to1[11][5] = -0.717684723943563;
   fWeightMatrix0to1[0][6] = -0.991635178578782;
   fWeightMatrix0to1[1][6] = -0.623209113168749;
   fWeightMatrix0to1[2][6] = 1.096792812569;
   fWeightMatrix0to1[3][6] = -0.633195215779744;
   fWeightMatrix0to1[4][6] = -0.305402337242563;
   fWeightMatrix0to1[5][6] = -0.58488825021215;
   fWeightMatrix0to1[6][6] = -2.35932966546663;
   fWeightMatrix0to1[7][6] = 1.82540398710115;
   fWeightMatrix0to1[8][6] = -0.237142013182783;
   fWeightMatrix0to1[9][6] = 0.686649025062828;
   fWeightMatrix0to1[10][6] = 1.59430028579304;
   fWeightMatrix0to1[11][6] = 0.901183405273222;
   fWeightMatrix0to1[0][7] = -0.710720590765317;
   fWeightMatrix0to1[1][7] = 0.213880226582643;
   fWeightMatrix0to1[2][7] = 1.99045931283655;
   fWeightMatrix0to1[3][7] = -1.17686636602443;
   fWeightMatrix0to1[4][7] = 1.38676745508859;
   fWeightMatrix0to1[5][7] = -0.315236777835268;
   fWeightMatrix0to1[6][7] = -0.591385433484942;
   fWeightMatrix0to1[7][7] = -0.75708838101247;
   fWeightMatrix0to1[8][7] = -0.36155327101873;
   fWeightMatrix0to1[9][7] = 1.53966147699526;
   fWeightMatrix0to1[10][7] = -0.971059663087132;
   fWeightMatrix0to1[11][7] = 2.82900252711097;
   fWeightMatrix0to1[0][8] = -1.07049816490453;
   fWeightMatrix0to1[1][8] = -0.780581667078818;
   fWeightMatrix0to1[2][8] = 1.86187482799198;
   fWeightMatrix0to1[3][8] = -2.07794489327706;
   fWeightMatrix0to1[4][8] = -0.246069302374294;
   fWeightMatrix0to1[5][8] = 0.301491671718585;
   fWeightMatrix0to1[6][8] = -0.815835166447342;
   fWeightMatrix0to1[7][8] = -0.406843472113614;
   fWeightMatrix0to1[8][8] = 1.57006197808986;
   fWeightMatrix0to1[9][8] = -0.724831627784546;
   fWeightMatrix0to1[10][8] = 0.963168768153087;
   fWeightMatrix0to1[11][8] = -1.04725865621709;
   fWeightMatrix0to1[0][9] = -0.547844250302443;
   fWeightMatrix0to1[1][9] = -1.65996173754803;
   fWeightMatrix0to1[2][9] = 1.80779179000745;
   fWeightMatrix0to1[3][9] = -1.1536208994887;
   fWeightMatrix0to1[4][9] = -0.225077042993424;
   fWeightMatrix0to1[5][9] = 0.457288131788178;
   fWeightMatrix0to1[6][9] = -0.820269118535312;
   fWeightMatrix0to1[7][9] = 0.363342216786131;
   fWeightMatrix0to1[8][9] = -0.673044464535034;
   fWeightMatrix0to1[9][9] = -2.10904226206229;
   fWeightMatrix0to1[10][9] = -1.41802637522869;
   fWeightMatrix0to1[11][9] = 0.206702906237675;
   fWeightMatrix0to1[0][10] = -0.62770409092396;
   fWeightMatrix0to1[1][10] = 0.491810563173041;
   fWeightMatrix0to1[2][10] = 1.69190310225487;
   fWeightMatrix0to1[3][10] = -2.23763677173658;
   fWeightMatrix0to1[4][10] = 0.920294267966316;
   fWeightMatrix0to1[5][10] = -1.30473125402436;
   fWeightMatrix0to1[6][10] = -2.35899063823407;
   fWeightMatrix0to1[7][10] = 0.30553319167342;
   fWeightMatrix0to1[8][10] = -0.0534074784135472;
   fWeightMatrix0to1[9][10] = -1.1096784478478;
   fWeightMatrix0to1[10][10] = -0.934267336068076;
   fWeightMatrix0to1[11][10] = -1.89440801542306;
   fWeightMatrix0to1[0][11] = -1.04113048611061;
   fWeightMatrix0to1[1][11] = -0.176102866321574;
   fWeightMatrix0to1[2][11] = -0.120290728994856;
   fWeightMatrix0to1[3][11] = 0.949557036625155;
   fWeightMatrix0to1[4][11] = 0.9484084307774;
   fWeightMatrix0to1[5][11] = 1.14956084287334;
   fWeightMatrix0to1[6][11] = -1.98484275078501;
   fWeightMatrix0to1[7][11] = -0.500344573640921;
   fWeightMatrix0to1[8][11] = 2.47811703493995;
   fWeightMatrix0to1[9][11] = -1.39155916559325;
   fWeightMatrix0to1[10][11] = 0.0407340779267307;
   fWeightMatrix0to1[11][11] = 0.136089244118214;

   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -1.47322868127709;
   fWeightMatrix1to2[1][0] = 0.243335770106648;
   fWeightMatrix1to2[2][0] = -1.05279835635934;
   fWeightMatrix1to2[3][0] = -0.909381337971985;
   fWeightMatrix1to2[4][0] = -2.01168376824092;
   fWeightMatrix1to2[5][0] =
-0.0641519619261345;
   fWeightMatrix1to2[6][0] = 1.58149054268347;
   fWeightMatrix1to2[7][0] = 1.34694459623603;
   fWeightMatrix1to2[8][0] = 1.48146271918256;
   fWeightMatrix1to2[9][0] = -2.59559564551705;
   fWeightMatrix1to2[10][0] = -0.312279153721929;
   fWeightMatrix1to2[0][1] = 0.973548287090749;
   fWeightMatrix1to2[1][1] = -1.57743675257487;
   fWeightMatrix1to2[2][1] = -0.832497119057555;
   fWeightMatrix1to2[3][1] = -0.0597184130100237;
   fWeightMatrix1to2[4][1] = 1.21878260068101;
   fWeightMatrix1to2[5][1] = -1.7181603340952;
   fWeightMatrix1to2[6][1] = -0.744777723933667;
   fWeightMatrix1to2[7][1] = 0.0181159063003982;
   fWeightMatrix1to2[8][1] = -1.50338398341431;
   fWeightMatrix1to2[9][1] = -1.73516657560281;
   fWeightMatrix1to2[10][1] = 0.291161449642245;
   fWeightMatrix1to2[0][2] = 0.56131240495917;
   fWeightMatrix1to2[1][2] = -0.176899554882005;
   fWeightMatrix1to2[2][2] = -1.21693065729406;
   fWeightMatrix1to2[3][2] = -0.15843585336398;
   fWeightMatrix1to2[4][2] = 1.4360424713243;
   fWeightMatrix1to2[5][2] = -0.952945823553856;
   fWeightMatrix1to2[6][2] = 0.911588929168844;
   fWeightMatrix1to2[7][2] = -2.04275390930761;
   fWeightMatrix1to2[8][2] = 0.272059592393452;
   fWeightMatrix1to2[9][2] = -0.111231250252059;
   fWeightMatrix1to2[10][2] = 0.0707389540035337;
   fWeightMatrix1to2[0][3] = -1.33982764015074;
   fWeightMatrix1to2[1][3] = 0.679245903878496;
   fWeightMatrix1to2[2][3] = 1.17418679295537;
   fWeightMatrix1to2[3][3] = -2.12122750863859;
   fWeightMatrix1to2[4][3] = 0.580027995593341;
   fWeightMatrix1to2[5][3] = 0.479894659850369;
   fWeightMatrix1to2[6][3] = 1.04264474406692;
   fWeightMatrix1to2[7][3] = 0.108256668065457;
   fWeightMatrix1to2[8][3] = -1.49827510003813;
   fWeightMatrix1to2[9][3] = -1.89095737441837;
   fWeightMatrix1to2[10][3] = -0.137829137240786;
   fWeightMatrix1to2[0][4] = 0.465165890499185;
   fWeightMatrix1to2[1][4] = 0.344539979186118;
   fWeightMatrix1to2[2][4] = 0.208236736458946;
   fWeightMatrix1to2[3][4] = -0.901541171976458;
   fWeightMatrix1to2[4][4] = 0.618206631376148;
   fWeightMatrix1to2[5][4] =
0.523149422171104;
   fWeightMatrix1to2[6][4] = -1.28516282317132;
   fWeightMatrix1to2[7][4] = 0.0223707312691244;
   fWeightMatrix1to2[8][4] = 0.96846605925089;
   fWeightMatrix1to2[9][4] = 0.074553215692005;
   fWeightMatrix1to2[10][4] = 1.34934959977001;
   fWeightMatrix1to2[0][5] = -0.582451988181287;
   fWeightMatrix1to2[1][5] = 0.609343710724083;
   fWeightMatrix1to2[2][5] = -0.711513067502285;
   fWeightMatrix1to2[3][5] = -0.703453693945482;
   fWeightMatrix1to2[4][5] = -2.19758775137845;
   fWeightMatrix1to2[5][5] = -1.4191232689091;
   fWeightMatrix1to2[6][5] = -1.00019653976958;
   fWeightMatrix1to2[7][5] = 0.142431968585196;
   fWeightMatrix1to2[8][5] = -2.13810324725952;
   fWeightMatrix1to2[9][5] = 1.53233267044637;
   fWeightMatrix1to2[10][5] = -0.820220284791748;
   fWeightMatrix1to2[0][6] = -1.17424267611;
   fWeightMatrix1to2[1][6] = 1.24311778712367;
   fWeightMatrix1to2[2][6] = -1.48701605234596;
   fWeightMatrix1to2[3][6] = -1.93314613838533;
   fWeightMatrix1to2[4][6] = 0.660328282238106;
   fWeightMatrix1to2[5][6] = -0.305711551031584;
   fWeightMatrix1to2[6][6] = 0.0217084824839746;
   fWeightMatrix1to2[7][6] = -2.72168754823776;
   fWeightMatrix1to2[8][6] = -1.10764703983074;
   fWeightMatrix1to2[9][6] = 1.23938192405941;
   fWeightMatrix1to2[10][6] = -3.38262868527497;
   fWeightMatrix1to2[0][7] = 1.09460329279828;
   fWeightMatrix1to2[1][7] = 0.348989947311748;
   fWeightMatrix1to2[2][7] = -0.781242102531765;
   fWeightMatrix1to2[3][7] = -1.29649728870197;
   fWeightMatrix1to2[4][7] = -1.10060969469968;
   fWeightMatrix1to2[5][7] = -0.466667352562041;
   fWeightMatrix1to2[6][7] = -0.746504034328976;
   fWeightMatrix1to2[7][7] = -1.90748801225051;
   fWeightMatrix1to2[8][7] = -0.0427815082590192;
   fWeightMatrix1to2[9][7] = -1.38310254505749;
   fWeightMatrix1to2[10][7] = 1.39223746211525;
   fWeightMatrix1to2[0][8] = -1.6718032863627;
   fWeightMatrix1to2[1][8] = 0.256309208856497;
   fWeightMatrix1to2[2][8] = 0.563945169517476;
   fWeightMatrix1to2[3][8] = 0.702938325824029;
   fWeightMatrix1to2[4][8] = -1.46856261538611;
   fWeightMatrix1to2[5][8] =
1.33242980661474;
   fWeightMatrix1to2[6][8] = -2.23924967341314;
   fWeightMatrix1to2[7][8] = -0.0975215758998997;
   fWeightMatrix1to2[8][8] = -0.379900280045777;
   fWeightMatrix1to2[9][8] = -2.20748895774881;
   fWeightMatrix1to2[10][8] = 1.67494846681109;
   fWeightMatrix1to2[0][9] = -0.266478862654552;
   fWeightMatrix1to2[1][9] = 1.16967966706024;
   fWeightMatrix1to2[2][9] = 1.34690582400945;
   fWeightMatrix1to2[3][9] = -0.000593929063721493;
   fWeightMatrix1to2[4][9] = 1.05881985390228;
   fWeightMatrix1to2[5][9] = -0.923171812082349;
   fWeightMatrix1to2[6][9] = -1.30071469325092;
   fWeightMatrix1to2[7][9] = -0.690033643764885;
   fWeightMatrix1to2[8][9] = -0.726740165762221;
   fWeightMatrix1to2[9][9] = -1.47338546499684;
   fWeightMatrix1to2[10][9] = -1.49213654700723;
   fWeightMatrix1to2[0][10] = 0.900802148584175;
   fWeightMatrix1to2[1][10] = -0.249301911197944;
   fWeightMatrix1to2[2][10] = -1.9497150980127;
   fWeightMatrix1to2[3][10] = 1.1753661875004;
   fWeightMatrix1to2[4][10] = -1.47708171011089;
   fWeightMatrix1to2[5][10] = -0.269721542383481;
   fWeightMatrix1to2[6][10] = -1.35281019621987;
   fWeightMatrix1to2[7][10] = 0.202101861220668;
   fWeightMatrix1to2[8][10] = -0.281393799963119;
   fWeightMatrix1to2[9][10] = -0.482187702017658;
   fWeightMatrix1to2[10][10] = -0.783435797428055;
   fWeightMatrix1to2[0][11] = 0.80309064897014;
   fWeightMatrix1to2[1][11] = -0.292791769683607;
   fWeightMatrix1to2[2][11] = 0.485307387819673;
   fWeightMatrix1to2[3][11] = -1.31531482721169;
   fWeightMatrix1to2[4][11] = -0.717300216978003;
   fWeightMatrix1to2[5][11] = -1.92416007244475;
   fWeightMatrix1to2[6][11] = 0.768007667762561;
   fWeightMatrix1to2[7][11] = 0.948634815157944;
   fWeightMatrix1to2[8][11] = -0.330927312715524;
   fWeightMatrix1to2[9][11] = 1.75553185904996;
   fWeightMatrix1to2[10][11] = 0.123760234216206;
   fWeightMatrix1to2[0][12] = -0.103541959941552;
   fWeightMatrix1to2[1][12] = 0.450387444128983;
   fWeightMatrix1to2[2][12] = -0.700824106313548;
   fWeightMatrix1to2[3][12] = -0.511366336284469;
   fWeightMatrix1to2[4][12] = 1.57658356182719;
   fWeightMatrix1to2[5][12] = 1.2685659628615;
   fWeightMatrix1to2[6][12] = -2.28412929220638;
   fWeightMatrix1to2[7][12] = -2.24500004583237;
   fWeightMatrix1to2[8][12] = 0.231876020019525;
   fWeightMatrix1to2[9][12] = -0.207997711113955;
   fWeightMatrix1to2[10][12] = 1.62289147261988;

   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.385380682847755;
   fWeightMatrix2to3[0][1] = 0.27103842709366;
   fWeightMatrix2to3[0][2] = -0.863200440767901;
   fWeightMatrix2to3[0][3] = 0.0559365943420652;
   fWeightMatrix2to3[0][4] = -0.0823087305728585;
   fWeightMatrix2to3[0][5] = -0.801273068771117;
   fWeightMatrix2to3[0][6] = -1.19359837021953;
   fWeightMatrix2to3[0][7] = -0.221760485691592;
   fWeightMatrix2to3[0][8] = 0.906963398647611;
   fWeightMatrix2to3[0][9] = 1.44239093453722;
   fWeightMatrix2to3[0][10] = -0.572789483718738;
   fWeightMatrix2to3[0][11] = 0.813883696526084;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   // NOTE: the source file breaks off at the start of the loop below.
   // The rest of this method, together with ActivationFnc() and Clear(),
   // is reconstructed following the standard layout of TMVA 3.8.x
   // standalone MLP readers ("sum" neuron input, "sigmoid" activation,
   // last neuron of each non-output layer acting as bias); verify against
   // a freshly generated copy of the class before relying on it.

   // reset all neuron outputs
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i] = 0;

   // bias nodes
   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1] = 1;

   // feed the (normalised) input variables into layer 0
   for (int i=0; i<fLayerSize[0]-1; i++)
      fWeights[0][i] = inputValues[i];

   // layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++)
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++)
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // layer 2 to 3 (linear output node)
   for (int o=0; o<fLayerSize[3]; o++)
      for (int i=0; i<fLayerSize[2]; i++)
         fWeights[3][o] += fWeightMatrix2to3[o][i] * fWeights[2][i];

   return fWeights[3][0];
}

double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   // sigmoid
   return 1.0/(1.0+exp(-x));
}

// Clean up
inline void ReadH6AONN5MEMLP::Clear()
{
   // delete the arrays allocated in Initialize()
   for (int lIdx = 0; lIdx < 4; lIdx++) {
      delete[] fWeights[lIdx];
   }
}