// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method          : MLP::H6AONN5MEMLP
TMVA Release    : 3.8.6  [198662]
ROOT Release    : 5.10/00  [330240]
Creator         : thompson
Date            : Tue Jan 22 16:29:09 2008
Host            : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir             : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events : 22164

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables
*-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW        LRHWW        'F' [0,1]
LRWW         LRWW         'F' [0,1.00408148766]
LRWg         LRWg         'F' [0,0.99993622303]
LRWj         LRWj         'F' [-0.00408143829554,1]
LRZZ         LRZZ         'F' [0,1]
Met          Met          'F' [15.0160713196,264.722930908]
MetDelPhi    MetDelPhi    'F' [0.143834546208,3.13655138016]
MetSpec      MetSpec      'F' [15.0028028488,188.142364502]
dPhiLeptons  dPhiLeptons  'F' [3.69548797607e-05,3.14073061943]
dRLeptons    dRLeptons    'F' [0.341997534037,4.52171230316]
dimass       dimass       'F' [16.0177326202,652.247131348]

============================================================================ */

#include <vector>
#include <string>
#include <cmath>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ", "Met",
                                  "MetDelPhi", "MetSpec", "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }
      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str() << " != " <<
                      inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;  fVmax[0] = 1;
      fVmin[1] = 0;  fVmax[1] = 1.00408148765564;
      fVmin[2] = 0;  fVmax[2] = 0.99993622303009;
      fVmin[3] = -0.00408143829554319;  fVmax[3] = 1;
      fVmin[4] = 0;  fVmax[4] = 1;
      fVmin[5] = 15.0160713195801;  fVmax[5] = 264.722930908203;
      fVmin[6] = 0.143834546208382;  fVmax[6] = 3.13655138015747;
      fVmin[7] = 15.0028028488159;  fVmax[7] = 188.142364501953;
      fVmin[8] = 3.69548797607422e-05;  fVmax[8] = 3.14073061943054;
      fVmin[9] = 0.341997534036636;  fVmax[9] = 4.52171230316162;
      fVmin[10] = 16.0177326202393;  fVmax[10] = 652.247131347656;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;

   const size_t fNvars;
   size_t GetNvar() const { return fNvars; }
   char
        GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }

   double fVmin[11];
   double fVmax[11];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
   fLayerSize[2] = 12; fWeights[2] = new double[12];
   fLayerSize[3] = 1;  fWeights[3] = new double[1];

   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = 0.613211447974565;
   fWeightMatrix0to1[1][0] = 1.19735516581711;
   fWeightMatrix0to1[2][0] = -0.284950773417979;
   fWeightMatrix0to1[3][0] = 2.02144896293521;
   fWeightMatrix0to1[4][0] = 0.526111480481461;
   fWeightMatrix0to1[5][0] = 2.06492356266353;
   fWeightMatrix0to1[6][0] = 2.32350540552265;
   fWeightMatrix0to1[7][0] = -0.127370742046864;
   fWeightMatrix0to1[8][0] = 0.534331513549976;
   fWeightMatrix0to1[9][0] = 1.41205858377097;
   fWeightMatrix0to1[10][0] = 1.52538981549126;
   fWeightMatrix0to1[11][0] = 0.493122096237733;
   fWeightMatrix0to1[0][1] = 1.84974108362766;
   fWeightMatrix0to1[1][1] = 0.607164692544204;
   fWeightMatrix0to1[2][1] = 1.11661798695218;
   fWeightMatrix0to1[3][1] = -1.37991731681077;
   fWeightMatrix0to1[4][1] = 1.35187033483004;
   fWeightMatrix0to1[5][1] = 0.0117102772851026;
   fWeightMatrix0to1[6][1] = -1.80587655580226;
   fWeightMatrix0to1[7][1] = 1.36216519095313;
   fWeightMatrix0to1[8][1] = 0.973902826937963;
   fWeightMatrix0to1[9][1] = 0.849953442059277;
   fWeightMatrix0to1[10][1] = 0.379670944850597;
   fWeightMatrix0to1[11][1] = -0.306962033395781;
   fWeightMatrix0to1[0][2] = 0.399610238744364;
   fWeightMatrix0to1[1][2] = 0.428430320365192;
   fWeightMatrix0to1[2][2] = 0.594175248012253;
   fWeightMatrix0to1[3][2] = -1.73371023772627;
   fWeightMatrix0to1[4][2] = 1.97215174750525;
   fWeightMatrix0to1[5][2] = 0.19059502836454;
   fWeightMatrix0to1[6][2] = -0.716052475019457;
   fWeightMatrix0to1[7][2] = -0.794407687673304;
   fWeightMatrix0to1[8][2] = -0.674840147095099;
   fWeightMatrix0to1[9][2] = -1.31996136902879;
   fWeightMatrix0to1[10][2] = -0.243628611619097;
   fWeightMatrix0to1[11][2] = 1.05899523435877;
   fWeightMatrix0to1[0][3] = 0.753997349618224;
   fWeightMatrix0to1[1][3] = 0.601533965732406;
   fWeightMatrix0to1[2][3] = 0.831262810531367;
   fWeightMatrix0to1[3][3] = 2.1146202325515;
   fWeightMatrix0to1[4][3] = -1.15030759062056;
   fWeightMatrix0to1[5][3] = -0.141248278227161;
   fWeightMatrix0to1[6][3] = -0.395360446329566;
   fWeightMatrix0to1[7][3] = -1.6829191205804;
   fWeightMatrix0to1[8][3] = -0.152786568771739;
   fWeightMatrix0to1[9][3] = -1.2122904995859;
   fWeightMatrix0to1[10][3] = -0.313415818273734;
   fWeightMatrix0to1[11][3] = 0.695698330850134;
   fWeightMatrix0to1[0][4] = 0.628028645647161;
   fWeightMatrix0to1[1][4] = 0.117681721214366;
   fWeightMatrix0to1[2][4] = 0.731592232922859;
   fWeightMatrix0to1[3][4] = -1.54404309253471;
   fWeightMatrix0to1[4][4] = -0.0715655741092681;
   fWeightMatrix0to1[5][4] = -1.49351053484489;
   fWeightMatrix0to1[6][4] = -0.235531936255798;
   fWeightMatrix0to1[7][4] = 0.283351892191331;
   fWeightMatrix0to1[8][4] = -0.343548842038701;
   fWeightMatrix0to1[9][4] = 0.532949446497238;
   fWeightMatrix0to1[10][4] = -0.530597754187149;
   fWeightMatrix0to1[11][4] = -0.0640064446847375;
   fWeightMatrix0to1[0][5] = -0.873592751255708;
   fWeightMatrix0to1[1][5] = -0.3022608856101;
   fWeightMatrix0to1[2][5] = -2.06230863140504;
   fWeightMatrix0to1[3][5] = 0.710933711047074;
   fWeightMatrix0to1[4][5] = 1.13514159782997;
   fWeightMatrix0to1[5][5] = -1.8936092635865;
   fWeightMatrix0to1[6][5] = 1.38575740152642;
   fWeightMatrix0to1[7][5] = -0.555099530999534;
   fWeightMatrix0to1[8][5] = -0.002414459197943;
   fWeightMatrix0to1[9][5] = 1.08065143529171;
   fWeightMatrix0to1[10][5] = -0.714398987929779;
   fWeightMatrix0to1[11][5] = -1.19192783024985;
   fWeightMatrix0to1[0][6] = -1.29568226279354;
   fWeightMatrix0to1[1][6] = -0.854528030293082;
   fWeightMatrix0to1[2][6] = 1.47058148877292;
   fWeightMatrix0to1[3][6] = -0.756176698128333;
   fWeightMatrix0to1[4][6] = -0.575581759839858;
   fWeightMatrix0to1[5][6] = -0.180505603448863;
   fWeightMatrix0to1[6][6] = -1.95936810110296;
   fWeightMatrix0to1[7][6] = 2.31656661601497;
   fWeightMatrix0to1[8][6] = -0.484163689345917;
   fWeightMatrix0to1[9][6] = -0.192101836225084;
   fWeightMatrix0to1[10][6] = 1.01691857498247;
   fWeightMatrix0to1[11][6] = 1.10245128175167;
   fWeightMatrix0to1[0][7] = -0.177823370439616;
   fWeightMatrix0to1[1][7] = 0.613962660046592;
   fWeightMatrix0to1[2][7] = 1.798679903031;
   fWeightMatrix0to1[3][7] = -0.468492026762919;
   fWeightMatrix0to1[4][7] = 1.29587451335024;
   fWeightMatrix0to1[5][7] = -0.0405904865757386;
   fWeightMatrix0to1[6][7] = 0.773335162750068;
   fWeightMatrix0to1[7][7] = -0.362793577711158;
   fWeightMatrix0to1[8][7] = -0.0522299535828192;
   fWeightMatrix0to1[9][7] = 1.91482587014237;
   fWeightMatrix0to1[10][7] = -2.11480681209309;
   fWeightMatrix0to1[11][7] = 2.12232070938685;
   fWeightMatrix0to1[0][8] = -0.464369613731379;
   fWeightMatrix0to1[1][8] = -0.829457337265395;
   fWeightMatrix0to1[2][8] = 2.08334686776067;
   fWeightMatrix0to1[3][8] = -2.0695967915939;
   fWeightMatrix0to1[4][8] = 0.638409531585905;
   fWeightMatrix0to1[5][8] = 0.539673366381435;
   fWeightMatrix0to1[6][8] = 0.547483744678437;
   fWeightMatrix0to1[7][8] = -1.5147734489714;
   fWeightMatrix0to1[8][8] = 2.80012165969375;
   fWeightMatrix0to1[9][8] = -0.336072113904179;
   fWeightMatrix0to1[10][8] = 0.582729844289055;
   fWeightMatrix0to1[11][8] = -0.778775457602973;
   fWeightMatrix0to1[0][9] = 0.404066275613526;
   fWeightMatrix0to1[1][9] = -2.95644467690105;
   fWeightMatrix0to1[2][9] = 1.79850461743867;
   fWeightMatrix0to1[3][9] = -1.23835233137084;
   fWeightMatrix0to1[4][9] = 0.585583206721445;
   fWeightMatrix0to1[5][9] = 0.512605124747172;
   fWeightMatrix0to1[6][9] = -0.156446608355927;
   fWeightMatrix0to1[7][9] = 0.016068656806824;
   fWeightMatrix0to1[8][9] = -0.476989590763905;
   fWeightMatrix0to1[9][9] = -2.22291625600311;
   fWeightMatrix0to1[10][9] = -0.89380528786807;
   fWeightMatrix0to1[11][9] = 0.331305708122069;
   fWeightMatrix0to1[0][10] = -0.788527307488516;
   fWeightMatrix0to1[1][10] = 0.697946493982259;
   fWeightMatrix0to1[2][10] = 1.58765059188586;
   fWeightMatrix0to1[3][10] = -1.66852208769396;
   fWeightMatrix0to1[4][10] = 1.14304144469938;
   fWeightMatrix0to1[5][10] = -1.74402271083579;
   fWeightMatrix0to1[6][10] = -1.8414916712973;
   fWeightMatrix0to1[7][10] = 0.183960529224865;
   fWeightMatrix0to1[8][10] = -0.427029194140254;
   fWeightMatrix0to1[9][10] = -0.504442466394742;
   fWeightMatrix0to1[10][10] = -1.19069091699479;
   fWeightMatrix0to1[11][10] = -1.61554772928758;
   fWeightMatrix0to1[0][11] = -1.38472974348894;
   fWeightMatrix0to1[1][11] = -0.142956180263834;
   fWeightMatrix0to1[2][11] = -0.624830824812566;
   fWeightMatrix0to1[3][11] = 0.988326596332867;
   fWeightMatrix0to1[4][11] = 0.884184294171006;
   fWeightMatrix0to1[5][11] = 1.53384927957735;
   fWeightMatrix0to1[6][11] = -0.144685889412332;
   fWeightMatrix0to1[7][11] = -0.917633068958881;
   fWeightMatrix0to1[8][11] = 2.20816487275575;
   fWeightMatrix0to1[9][11] = -1.94439837229924;
   fWeightMatrix0to1[10][11] = 0.0288639992994691;
   fWeightMatrix0to1[11][11] = -0.279190310800803;

   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -1.62065223669388;
   fWeightMatrix1to2[1][0] = 0.108307387426092;
   fWeightMatrix1to2[2][0] = -0.880033062639659;
   fWeightMatrix1to2[3][0] = -0.871390091032661;
   fWeightMatrix1to2[4][0] =
      -1.9415738735622;
   fWeightMatrix1to2[5][0] = 0.850582028676234;
   fWeightMatrix1to2[6][0] = 1.56463733525027;
   fWeightMatrix1to2[7][0] = 1.62514442521981;
   fWeightMatrix1to2[8][0] = 1.66488231001733;
   fWeightMatrix1to2[9][0] = -2.07229722212258;
   fWeightMatrix1to2[10][0] = -0.681503815517453;
   fWeightMatrix1to2[0][1] = 0.819163889576295;
   fWeightMatrix1to2[1][1] = -1.85601080264611;
   fWeightMatrix1to2[2][1] = -1.03456235515161;
   fWeightMatrix1to2[3][1] = -0.0530571273269996;
   fWeightMatrix1to2[4][1] = 1.36953310861049;
   fWeightMatrix1to2[5][1] = -2.31313582851696;
   fWeightMatrix1to2[6][1] = -0.792069285656398;
   fWeightMatrix1to2[7][1] = 0.0405370848170548;
   fWeightMatrix1to2[8][1] = -1.45005213949101;
   fWeightMatrix1to2[9][1] = -1.32771772055874;
   fWeightMatrix1to2[10][1] = -0.649951035141543;
   fWeightMatrix1to2[0][2] = 0.499181422646766;
   fWeightMatrix1to2[1][2] = -0.225830161430845;
   fWeightMatrix1to2[2][2] = -0.868693333704599;
   fWeightMatrix1to2[3][2] = -0.123281995504986;
   fWeightMatrix1to2[4][2] = 1.52251699296038;
   fWeightMatrix1to2[5][2] = -0.269262729416618;
   fWeightMatrix1to2[6][2] = 0.990941550950125;
   fWeightMatrix1to2[7][2] = -1.97179099573198;
   fWeightMatrix1to2[8][2] = 0.55936785880395;
   fWeightMatrix1to2[9][2] = -0.222616364575549;
   fWeightMatrix1to2[10][2] = -0.0599358392800542;
   fWeightMatrix1to2[0][3] = -1.55060332530937;
   fWeightMatrix1to2[1][3] = 0.0745084788727896;
   fWeightMatrix1to2[2][3] = 0.753903415234065;
   fWeightMatrix1to2[3][3] = -2.29208511858721;
   fWeightMatrix1to2[4][3] = 0.9350907803128;
   fWeightMatrix1to2[5][3] = 0.0939830108695377;
   fWeightMatrix1to2[6][3] = 1.01124350689179;
   fWeightMatrix1to2[7][3] = 0.145202222986952;
   fWeightMatrix1to2[8][3] = -1.90444405451868;
   fWeightMatrix1to2[9][3] = -2.12554248970524;
   fWeightMatrix1to2[10][3] = -0.648603034060713;
   fWeightMatrix1to2[0][4] = 0.173894824650667;
   fWeightMatrix1to2[1][4] = 0.0145700454761646;
   fWeightMatrix1to2[2][4] = 0.15925581582096;
   fWeightMatrix1to2[3][4] = -0.945216654461261;
   fWeightMatrix1to2[4][4] =
      1.1293766193435;
   fWeightMatrix1to2[5][4] = 0.79629982988128;
   fWeightMatrix1to2[6][4] = -1.34614573330032;
   fWeightMatrix1to2[7][4] = -0.132519281352992;
   fWeightMatrix1to2[8][4] = 1.13886264827561;
   fWeightMatrix1to2[9][4] = 0.671752612794543;
   fWeightMatrix1to2[10][4] = -0.638518123359932;
   fWeightMatrix1to2[0][5] = -0.793124708375802;
   fWeightMatrix1to2[1][5] = -0.117602012124577;
   fWeightMatrix1to2[2][5] = -0.437498681745087;
   fWeightMatrix1to2[3][5] = -0.790834649238869;
   fWeightMatrix1to2[4][5] = -2.19172654370689;
   fWeightMatrix1to2[5][5] = -1.23221361203798;
   fWeightMatrix1to2[6][5] = -1.01071529897712;
   fWeightMatrix1to2[7][5] = 0.196007443585614;
   fWeightMatrix1to2[8][5] = -1.87127472768271;
   fWeightMatrix1to2[9][5] = 1.4931307899727;
   fWeightMatrix1to2[10][5] = -1.02337662167422;
   fWeightMatrix1to2[0][6] = -1.18934805340973;
   fWeightMatrix1to2[1][6] = 1.00144309024076;
   fWeightMatrix1to2[2][6] = -1.7549709923021;
   fWeightMatrix1to2[3][6] = -1.92656173617329;
   fWeightMatrix1to2[4][6] = 1.06059389012167;
   fWeightMatrix1to2[5][6] = -0.867723776349739;
   fWeightMatrix1to2[6][6] = 0.410508682622565;
   fWeightMatrix1to2[7][6] = -2.50244568356428;
   fWeightMatrix1to2[8][6] = -1.03421777303862;
   fWeightMatrix1to2[9][6] = 0.807992471943973;
   fWeightMatrix1to2[10][6] = -3.05700374118392;
   fWeightMatrix1to2[0][7] = 0.841363645974351;
   fWeightMatrix1to2[1][7] = 0.0399899870335095;
   fWeightMatrix1to2[2][7] = -0.841100780000778;
   fWeightMatrix1to2[3][7] = -1.30648826219342;
   fWeightMatrix1to2[4][7] = -1.02037383373538;
   fWeightMatrix1to2[5][7] = -1.00595648877921;
   fWeightMatrix1to2[6][7] = -0.733190803096036;
   fWeightMatrix1to2[7][7] = -2.02323146567523;
   fWeightMatrix1to2[8][7] = -0.203709906629711;
   fWeightMatrix1to2[9][7] = -1.12974514224146;
   fWeightMatrix1to2[10][7] = 0.896938355877366;
   fWeightMatrix1to2[0][8] = -1.73009442841515;
   fWeightMatrix1to2[1][8] = -0.529596594821289;
   fWeightMatrix1to2[2][8] = 1.07419807131525;
   fWeightMatrix1to2[3][8] = 0.602882287824612;
   fWeightMatrix1to2[4][8] =
      -1.09188138167671;
   fWeightMatrix1to2[5][8] = 1.64986431158832;
   fWeightMatrix1to2[6][8] = -2.25090058672498;
   fWeightMatrix1to2[7][8] = 0.130191555635028;
   fWeightMatrix1to2[8][8] = -0.521951378468118;
   fWeightMatrix1to2[9][8] = -1.8423568521375;
   fWeightMatrix1to2[10][8] = 1.39814006526896;
   fWeightMatrix1to2[0][9] = -0.561376849309435;
   fWeightMatrix1to2[1][9] = 1.03361404898218;
   fWeightMatrix1to2[2][9] = 0.98309440975295;
   fWeightMatrix1to2[3][9] = 0.0523749107753963;
   fWeightMatrix1to2[4][9] = 1.80553442105013;
   fWeightMatrix1to2[5][9] = -1.40454340032232;
   fWeightMatrix1to2[6][9] = -1.23647328524511;
   fWeightMatrix1to2[7][9] = -0.712377458386237;
   fWeightMatrix1to2[8][9] = -0.598875320443104;
   fWeightMatrix1to2[9][9] = -1.25455445890062;
   fWeightMatrix1to2[10][9] = -0.35529210918877;
   fWeightMatrix1to2[0][10] = 0.617905878888438;
   fWeightMatrix1to2[1][10] = -0.98004154415098;
   fWeightMatrix1to2[2][10] = -2.17093375477283;
   fWeightMatrix1to2[3][10] = 1.12600440144648;
   fWeightMatrix1to2[4][10] = -1.02164483718247;
   fWeightMatrix1to2[5][10] = -1.24015508784935;
   fWeightMatrix1to2[6][10] = -1.23767055532727;
   fWeightMatrix1to2[7][10] = 0.193632341496243;
   fWeightMatrix1to2[8][10] = -0.250506091960279;
   fWeightMatrix1to2[9][10] = -0.801532846020733;
   fWeightMatrix1to2[10][10] = -1.54423125670701;
   fWeightMatrix1to2[0][11] = 0.662484960756071;
   fWeightMatrix1to2[1][11] = -0.525213886868657;
   fWeightMatrix1to2[2][11] = 0.231917188392508;
   fWeightMatrix1to2[3][11] = -1.33991818744167;
   fWeightMatrix1to2[4][11] = -0.945985499198927;
   fWeightMatrix1to2[5][11] = -1.45341888865879;
   fWeightMatrix1to2[6][11] = 0.765292018928697;
   fWeightMatrix1to2[7][11] = 0.77812517331726;
   fWeightMatrix1to2[8][11] = -0.390528983881897;
   fWeightMatrix1to2[9][11] = 0.665645310131135;
   fWeightMatrix1to2[10][11] = 0.699465166617412;
   fWeightMatrix1to2[0][12] = -0.319285106160488;
   fWeightMatrix1to2[1][12] = -0.264560935143222;
   fWeightMatrix1to2[2][12] = -0.700309173142946;
   fWeightMatrix1to2[3][12] = -0.591424974588088;
   fWeightMatrix1to2[4][12] = 1.53472666819057;
   fWeightMatrix1to2[5][12] = 1.03859033488307;
   fWeightMatrix1to2[6][12] = -2.28754807757538;
   fWeightMatrix1to2[7][12] = -2.23412176361639;
   fWeightMatrix1to2[8][12] = 0.172822374652678;
   fWeightMatrix1to2[9][12] = -0.243561177580136;
   fWeightMatrix1to2[10][12] = 0.500217589045797;

   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.304380597562649;
   fWeightMatrix2to3[0][1] = -0.186821194346531;
   fWeightMatrix2to3[0][2] = -0.795371676144378;
   fWeightMatrix2to3[0][3] = 0.0457872109290413;
   fWeightMatrix2to3[0][4] = 0.626302241740498;
   fWeightMatrix2to3[0][5] = -1.01093345950281;
   fWeightMatrix2to3[0][6] = -1.19795363458502;
   fWeightMatrix2to3[0][7] = -0.295128108144209;
   fWeightMatrix2to3[0][8] = 0.959445856139188;
   fWeightMatrix2to3[0][9] = -0.243959132781375;
   fWeightMatrix2to3[0][10] = -0.881994132836739;
   fWeightMatrix2to3[0][11] = 0.694185925583831;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   for (int l=0; l