// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method          : MLP::H6AONN5MEMLP
TMVA Release    : 3.8.6 [198662]
ROOT Release    : 5.10/00 [330240]
Creator         : thompson
Date            : Tue Jan 22 16:08:49 2008
Host            : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir             : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events: 10764

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables *-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW          LRHWW          'F'  [0,0.999995052814]
LRWW           LRWW           'F'  [2.17662395269e-15,1.0015938282]
LRWg           LRWg           'F'  [0,0.999949336052]
LRWj           LRWj           'F'  [-0.00159388501197,1]
LRZZ           LRZZ           'F'  [0,0.995219707489]
Met            Met            'F'  [15.1351919174,275.269592285]
MetDelPhi      MetDelPhi      'F'  [0.126148730516,3.13938593864]
MetSpec        MetSpec        'F'  [15.0015153885,147.88381958]
dPhiLeptons    dPhiLeptons    'F'  [0.000421762466431,3.13967418671]
dRLeptons      dRLeptons      'F'  [0.407130092382,4.52171230316]
dimass         dimass         'F'  [16.0061912537,453.994476318]

============================================================================ */

#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {

 public:

   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {

 public:

   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ", "Met",
                                  "MetDelPhi", "MetSpec", "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str()
                      << " != " << inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;
      fVmax[0] = 0.999995052814484;
      fVmin[1] = 2.17662395269116e-15;
      fVmax[1] = 1.00159382820129;
      fVmin[2] = 0;
      fVmax[2] = 0.999949336051941;
      fVmin[3] = -0.001593885011971;
      fVmax[3] = 1;
      fVmin[4] = 0;
      fVmax[4] = 0.995219707489014;
      fVmin[5] = 15.1351919174194;
      fVmax[5] = 275.269592285156;
      fVmin[6] = 0.126148730516434;
      fVmax[6] = 3.13938593864441;
      fVmin[7] = 15.0015153884888;
      fVmax[7] = 147.883819580078;
      fVmin[8] = 0.000421762466430664;
      fVmax[8] = 3.13967418670654;
      fVmin[9] = 0.407130092382431;
      fVmax[9] = 4.52171230316162;
      fVmin[10] = 16.0061912536621;
      fVmax[10] = 453.994476318359;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {

      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }

   double fVmin[11];
   double fVmax[11];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
fLayerSize[2] = 12; fWeights[2] = new double[12]; fLayerSize[3] = 1; fWeights[3] = new double[1]; // weight matrix from layer 0 to 1 fWeightMatrix0to1[0][0] = 1.68947161687445; fWeightMatrix0to1[1][0] = 2.46811847137345; fWeightMatrix0to1[2][0] = -0.423152830379053; fWeightMatrix0to1[3][0] = 1.6996389781137; fWeightMatrix0to1[4][0] = 0.446585746389936; fWeightMatrix0to1[5][0] = 2.81125094115633; fWeightMatrix0to1[6][0] = 1.41282349430458; fWeightMatrix0to1[7][0] = -1.05771622015871; fWeightMatrix0to1[8][0] = -0.1470176789744; fWeightMatrix0to1[9][0] = 1.41765000357; fWeightMatrix0to1[10][0] = 1.22595288764892; fWeightMatrix0to1[11][0] = 1.01872438902945; fWeightMatrix0to1[0][1] = 1.69492039319763; fWeightMatrix0to1[1][1] = 1.38960340589374; fWeightMatrix0to1[2][1] = 1.32069998834214; fWeightMatrix0to1[3][1] = -2.01801991441964; fWeightMatrix0to1[4][1] = 1.09930204618278; fWeightMatrix0to1[5][1] = 0.0577161940140724; fWeightMatrix0to1[6][1] = -1.45102093812467; fWeightMatrix0to1[7][1] = 1.07987931450352; fWeightMatrix0to1[8][1] = 1.99076231886993; fWeightMatrix0to1[9][1] = 1.87492660465731; fWeightMatrix0to1[10][1] = 0.515121015332939; fWeightMatrix0to1[11][1] = -0.171338996869393; fWeightMatrix0to1[0][2] = 0.367772800094802; fWeightMatrix0to1[1][2] = 1.06070577029185; fWeightMatrix0to1[2][2] = -0.415939877334002; fWeightMatrix0to1[3][2] = -1.43367176358581; fWeightMatrix0to1[4][2] = 2.04739248232911; fWeightMatrix0to1[5][2] = 0.0278830496318481; fWeightMatrix0to1[6][2] = -0.0745996992554088; fWeightMatrix0to1[7][2] = -0.537697174962606; fWeightMatrix0to1[8][2] = -0.602874366043938; fWeightMatrix0to1[9][2] = -0.41454113231626; fWeightMatrix0to1[10][2] = -1.31916752653096; fWeightMatrix0to1[11][2] = 0.512091501683748; fWeightMatrix0to1[0][3] = 0.493118744661838; fWeightMatrix0to1[1][3] = 0.939094760469839; fWeightMatrix0to1[2][3] = 1.23234569220017; fWeightMatrix0to1[3][3] = 1.63459132910022; fWeightMatrix0to1[4][3] = -1.2341952382589; fWeightMatrix0to1[5][3] = 0.0207828084134658; fWeightMatrix0to1[6][3] = -0.716466572014773; fWeightMatrix0to1[7][3] = -1.60339636511081; fWeightMatrix0to1[8][3] = -0.827529150522545; fWeightMatrix0to1[9][3] = -1.53694140177274; fWeightMatrix0to1[10][3] = -0.305312331539596; fWeightMatrix0to1[11][3] = 1.01367913176179; fWeightMatrix0to1[0][4] = 0.251421688771617; fWeightMatrix0to1[1][4] = -0.607429091848256; fWeightMatrix0to1[2][4] = 1.14115692728095; fWeightMatrix0to1[3][4] = -1.31513393851102; fWeightMatrix0to1[4][4] = -0.0408543796026979; fWeightMatrix0to1[5][4] = -0.484164568344092; fWeightMatrix0to1[6][4] = -0.332152196275794; fWeightMatrix0to1[7][4] = 0.977672462646503; fWeightMatrix0to1[8][4] = 0.0258241271912341; fWeightMatrix0to1[9][4] = 0.677407977029712; fWeightMatrix0to1[10][4] = -0.0863185979077634; fWeightMatrix0to1[11][4] = -0.0669044089066977; fWeightMatrix0to1[0][5] = -0.894786887116269; fWeightMatrix0to1[1][5] = -0.639302455819293; fWeightMatrix0to1[2][5] = -2.04460780260642; fWeightMatrix0to1[3][5] = 0.430452227569155; fWeightMatrix0to1[4][5] = 1.17353111611932; fWeightMatrix0to1[5][5] = -1.7354892973108; fWeightMatrix0to1[6][5] = 1.49001559106989; fWeightMatrix0to1[7][5] = -0.0982667284913572; fWeightMatrix0to1[8][5] = -0.187589697843569; fWeightMatrix0to1[9][5] = 0.216471725831547; fWeightMatrix0to1[10][5] = -0.166248770446373; fWeightMatrix0to1[11][5] = -0.966267509743991; fWeightMatrix0to1[0][6] = -1.14296519346094; fWeightMatrix0to1[1][6] = -0.100639965260505; fWeightMatrix0to1[2][6] = 1.14100397323501; fWeightMatrix0to1[3][6] = 
-0.681796545232712; fWeightMatrix0to1[4][6] = 0.0622589454017567; fWeightMatrix0to1[5][6] = -0.466778592619129; fWeightMatrix0to1[6][6] = -1.22068592983984; fWeightMatrix0to1[7][6] = 1.65404128284595; fWeightMatrix0to1[8][6] = 0.36568922436881; fWeightMatrix0to1[9][6] = 1.60840462916675; fWeightMatrix0to1[10][6] = 1.43284256756872; fWeightMatrix0to1[11][6] = 0.670170525851989; fWeightMatrix0to1[0][7] = -0.532948904735527; fWeightMatrix0to1[1][7] = 0.278413730030205; fWeightMatrix0to1[2][7] = 1.85759424048165; fWeightMatrix0to1[3][7] = -0.828322963396193; fWeightMatrix0to1[4][7] = 1.58338488155531; fWeightMatrix0to1[5][7] = -0.676549466363819; fWeightMatrix0to1[6][7] = -0.370525217295628; fWeightMatrix0to1[7][7] = 0.18111983914872; fWeightMatrix0to1[8][7] = 0.00401875111016684; fWeightMatrix0to1[9][7] = 1.46085918945001; fWeightMatrix0to1[10][7] = -1.47611118547378; fWeightMatrix0to1[11][7] = 2.24030698018001; fWeightMatrix0to1[0][8] = -0.300457172118767; fWeightMatrix0to1[1][8] = -0.630262739338355; fWeightMatrix0to1[2][8] = 1.62869958851522; fWeightMatrix0to1[3][8] = -1.86833005835286; fWeightMatrix0to1[4][8] = -0.0597147926926143; fWeightMatrix0to1[5][8] = 0.89021857816515; fWeightMatrix0to1[6][8] = -0.573692765380444; fWeightMatrix0to1[7][8] = -0.0593194966288885; fWeightMatrix0to1[8][8] = 1.74575792662837; fWeightMatrix0to1[9][8] = -0.0115206220043303; fWeightMatrix0to1[10][8] = 1.3760769350417; fWeightMatrix0to1[11][8] = -1.04589448457288; fWeightMatrix0to1[0][9] = -0.183484061639826; fWeightMatrix0to1[1][9] = -1.66890673007129; fWeightMatrix0to1[2][9] = 1.74456096663868; fWeightMatrix0to1[3][9] = -1.1319316980566; fWeightMatrix0to1[4][9] = -0.123858744191721; fWeightMatrix0to1[5][9] = 0.64557115292348; fWeightMatrix0to1[6][9] = -1.46057855180332; fWeightMatrix0to1[7][9] = 1.21035337925966; fWeightMatrix0to1[8][9] = -1.30209058900664; fWeightMatrix0to1[9][9] = -1.89606467654604; fWeightMatrix0to1[10][9] = -0.970749837823885; fWeightMatrix0to1[11][9] = 0.483657791202562; fWeightMatrix0to1[0][10] = -0.62965977219929; fWeightMatrix0to1[1][10] = 0.956483859624464; fWeightMatrix0to1[2][10] = 1.84448494100534; fWeightMatrix0to1[3][10] = -2.02560280301935; fWeightMatrix0to1[4][10] = 0.871100694796679; fWeightMatrix0to1[5][10] = -1.27451436415254; fWeightMatrix0to1[6][10] = -2.9701261815216; fWeightMatrix0to1[7][10] = 1.31689525339896; fWeightMatrix0to1[8][10] = -0.399895643951945; fWeightMatrix0to1[9][10] = -1.18425791557395; fWeightMatrix0to1[10][10] = -1.04903214185664; fWeightMatrix0to1[11][10] = -1.10322828766433; fWeightMatrix0to1[0][11] = -1.07156372215608; fWeightMatrix0to1[1][11] = 0.148537067899626; fWeightMatrix0to1[2][11] = -0.229060144427753; fWeightMatrix0to1[3][11] = 0.939342936802946; fWeightMatrix0to1[4][11] = 1.04673995619725; fWeightMatrix0to1[5][11] = 1.43588231714994; fWeightMatrix0to1[6][11] = -2.05800989753775; fWeightMatrix0to1[7][11] = -1.49010124288818; fWeightMatrix0to1[8][11] = 2.2036451899239; fWeightMatrix0to1[9][11] = -1.36770861395954; fWeightMatrix0to1[10][11] = 0.496093157599756; fWeightMatrix0to1[11][11] = -0.191259378012063; // weight matrix from layer 1 to 2 fWeightMatrix1to2[0][0] = -1.62026632471849; fWeightMatrix1to2[1][0] = 0.372087454187249; fWeightMatrix1to2[2][0] = -0.836598772031547; fWeightMatrix1to2[3][0] = -0.814843878849933; fWeightMatrix1to2[4][0] = -2.21779873734479; fWeightMatrix1to2[5][0] = -0.21333748447208; fWeightMatrix1to2[6][0] = 1.58944388957467; fWeightMatrix1to2[7][0] = 1.65919707073936; fWeightMatrix1to2[8][0] = 1.61085409112487; 
fWeightMatrix1to2[9][0] = -1.94054211124858; fWeightMatrix1to2[10][0] = -0.377439690828793; fWeightMatrix1to2[0][1] = 0.720781463380943; fWeightMatrix1to2[1][1] = -1.5525026404778; fWeightMatrix1to2[2][1] = -0.944260135388997; fWeightMatrix1to2[3][1] = -0.0592950762342603; fWeightMatrix1to2[4][1] = 1.44529565767567; fWeightMatrix1to2[5][1] = -1.91829715865128; fWeightMatrix1to2[6][1] = -0.77204577753434; fWeightMatrix1to2[7][1] = 0.0608678820917878; fWeightMatrix1to2[8][1] = -1.37076557204046; fWeightMatrix1to2[9][1] = -2.02892542288919; fWeightMatrix1to2[10][1] = 1.21725204627027; fWeightMatrix1to2[0][2] = 0.533339541419098; fWeightMatrix1to2[1][2] = -0.113042335879419; fWeightMatrix1to2[2][2] = -0.933195949205692; fWeightMatrix1to2[3][2] = -0.144312905385284; fWeightMatrix1to2[4][2] = 1.03847673928725; fWeightMatrix1to2[5][2] = -1.44957876836251; fWeightMatrix1to2[6][2] = 1.00627288691113; fWeightMatrix1to2[7][2] = -2.00149571179075; fWeightMatrix1to2[8][2] = 0.25973768888349; fWeightMatrix1to2[9][2] = -0.114584347258752; fWeightMatrix1to2[10][2] = -0.471001528604517; fWeightMatrix1to2[0][3] = -1.24264878811548; fWeightMatrix1to2[1][3] = 1.00396420278131; fWeightMatrix1to2[2][3] = 0.909049690437961; fWeightMatrix1to2[3][3] = -2.1097969807082; fWeightMatrix1to2[4][3] = 0.594035806627104; fWeightMatrix1to2[5][3] = 0.293255648567016; fWeightMatrix1to2[6][3] = 1.61509860237825; fWeightMatrix1to2[7][3] = 0.27676385533113; fWeightMatrix1to2[8][3] = -1.23842027849883; fWeightMatrix1to2[9][3] = -1.56665766788586; fWeightMatrix1to2[10][3] = 0.141770253825407; fWeightMatrix1to2[0][4] = 0.622551358757395; fWeightMatrix1to2[1][4] = 0.426458335673572; fWeightMatrix1to2[2][4] = 0.380497935751377; fWeightMatrix1to2[3][4] = -0.865286756364957; fWeightMatrix1to2[4][4] = 0.331157451241065; fWeightMatrix1to2[5][4] = -0.485141225589176; fWeightMatrix1to2[6][4] = -1.58046599348042; fWeightMatrix1to2[7][4] = 0.278150472348462; fWeightMatrix1to2[8][4] = 0.923537394177008; fWeightMatrix1to2[9][4] = -0.0939593468247158; fWeightMatrix1to2[10][4] = 0.623266383668055; fWeightMatrix1to2[0][5] = -0.149745497593903; fWeightMatrix1to2[1][5] = 0.899193093299354; fWeightMatrix1to2[2][5] = -0.334595408361763; fWeightMatrix1to2[3][5] = -0.574162642714199; fWeightMatrix1to2[4][5] = -2.29416246490348; fWeightMatrix1to2[5][5] = -1.40408998350983; fWeightMatrix1to2[6][5] = -0.915244140365508; fWeightMatrix1to2[7][5] = 0.528315202661693; fWeightMatrix1to2[8][5] = -2.2805045111585; fWeightMatrix1to2[9][5] = 1.41733005097026; fWeightMatrix1to2[10][5] = -0.56525282191677; fWeightMatrix1to2[0][6] = -1.07095556789927; fWeightMatrix1to2[1][6] = 1.39905326022826; fWeightMatrix1to2[2][6] = -1.27665969483918; fWeightMatrix1to2[3][6] = -1.86225415715224; fWeightMatrix1to2[4][6] = 0.835652224321386; fWeightMatrix1to2[5][6] = -0.907710765300982; fWeightMatrix1to2[6][6] = -0.0572914165390077; fWeightMatrix1to2[7][6] = -2.34451595149624; fWeightMatrix1to2[8][6] = -0.892000036886155; fWeightMatrix1to2[9][6] = 1.88578874508975; fWeightMatrix1to2[10][6] = -3.7967866594485; fWeightMatrix1to2[0][7] = 1.48625443624101; fWeightMatrix1to2[1][7] = 0.645113921808004; fWeightMatrix1to2[2][7] = -0.364622994803198; fWeightMatrix1to2[3][7] = -1.10644833503668; fWeightMatrix1to2[4][7] = -1.37038239462621; fWeightMatrix1to2[5][7] = -0.867636875213182; fWeightMatrix1to2[6][7] = -0.89459776990971; fWeightMatrix1to2[7][7] = -1.62415720557937; fWeightMatrix1to2[8][7] = -0.0169522138991975; fWeightMatrix1to2[9][7] = -1.22410359695801; fWeightMatrix1to2[10][7] = 
1.7649009444086; fWeightMatrix1to2[0][8] = -1.30849134993285; fWeightMatrix1to2[1][8] = 0.568273141941294; fWeightMatrix1to2[2][8] = 0.790925949863102; fWeightMatrix1to2[3][8] = 0.775698804790897; fWeightMatrix1to2[4][8] = -1.85566818945017; fWeightMatrix1to2[5][8] = 0.71441396023427; fWeightMatrix1to2[6][8] = -2.25079020103165; fWeightMatrix1to2[7][8] = 0.254734126110145; fWeightMatrix1to2[8][8] = -0.557552881558574; fWeightMatrix1to2[9][8] = -2.16774220681125; fWeightMatrix1to2[10][8] = 1.81875553226921; fWeightMatrix1to2[0][9] = 0.0379375156500776; fWeightMatrix1to2[1][9] = 1.29207718991557; fWeightMatrix1to2[2][9] = 1.51013575289106; fWeightMatrix1to2[3][9] = 0.127270250880133; fWeightMatrix1to2[4][9] = 1.02316284297644; fWeightMatrix1to2[5][9] = -1.21724487367421; fWeightMatrix1to2[6][9] = -1.28360236575166; fWeightMatrix1to2[7][9] = -0.517124505658746; fWeightMatrix1to2[8][9] = -0.437396175358191; fWeightMatrix1to2[9][9] = -1.63495764911954; fWeightMatrix1to2[10][9] = -0.978373772342783; fWeightMatrix1to2[0][10] = 1.14155609591925; fWeightMatrix1to2[1][10] = 0.0473711332507376; fWeightMatrix1to2[2][10] = -1.95716839975317; fWeightMatrix1to2[3][10] = 1.29040199776563; fWeightMatrix1to2[4][10] = -1.67495038263125; fWeightMatrix1to2[5][10] = -0.65756745927251; fWeightMatrix1to2[6][10] = -1.37114607512359; fWeightMatrix1to2[7][10] = 0.652014382395629; fWeightMatrix1to2[8][10] = -0.343047405770572; fWeightMatrix1to2[9][10] = -0.355088448420129; fWeightMatrix1to2[10][10] = -0.834137137529395; fWeightMatrix1to2[0][11] = 0.649399238483675; fWeightMatrix1to2[1][11] = -0.0855145384110076; fWeightMatrix1to2[2][11] = 0.142060698010815; fWeightMatrix1to2[3][11] = -1.417204885596; fWeightMatrix1to2[4][11] = -0.632779109846491; fWeightMatrix1to2[5][11] = -0.775869902340652; fWeightMatrix1to2[6][11] = 0.918950899508775; fWeightMatrix1to2[7][11] = 1.10274849810322; fWeightMatrix1to2[8][11] = 0.0296925498296716; fWeightMatrix1to2[9][11] = 1.1344697013168; fWeightMatrix1to2[10][11] = 0.115485684443586; fWeightMatrix1to2[0][12] = 0.101773151799514; fWeightMatrix1to2[1][12] = 0.809255382495267; fWeightMatrix1to2[2][12] = -0.724195105554074; fWeightMatrix1to2[3][12] = -0.442465836579917; fWeightMatrix1to2[4][12] = 1.26938576802467; fWeightMatrix1to2[5][12] = 0.649919349863825; fWeightMatrix1to2[6][12] = -1.78474438512697; fWeightMatrix1to2[7][12] = -2.01041989901341; fWeightMatrix1to2[8][12] = 0.197815567571606; fWeightMatrix1to2[9][12] = -0.245331404816149; fWeightMatrix1to2[10][12] = 1.78803124451618; // weight matrix from layer 2 to 3 fWeightMatrix2to3[0][0] = -0.253301425445233; fWeightMatrix2to3[0][1] = 0.466607184542433; fWeightMatrix2to3[0][2] = -0.250370048128024; fWeightMatrix2to3[0][3] = -0.361160904792614; fWeightMatrix2to3[0][4] = -0.765270572710058; fWeightMatrix2to3[0][5] = -1.45423561895882; fWeightMatrix2to3[0][6] = -1.79892366433004; fWeightMatrix2to3[0][7] = 0.234447070363376; fWeightMatrix2to3[0][8] = 0.859657012334496; fWeightMatrix2to3[0][9] = 1.33940526199404; fWeightMatrix2to3[0][10] = -0.995900695494867; fWeightMatrix2to3[0][11] = 0.795970024288521; } inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector& inputValues ) const { if (inputValues.size() != (unsigned int)fLayerSize[0]-1) { std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl; return 0; } for (int l=0; l