// Class: ReadH6AONN5MEMLP
// Automatically generated by MethodBase::MakeClass
//

/* configuration options =====================================================

#GEN -*-*-*-*-*-*-*-*-*-*-*- general info -*-*-*-*-*-*-*-*-*-*-*-

Method         : MLP::H6AONN5MEMLP
TMVA Release   : 3.8.6 [198662]
ROOT Release   : 5.10/00 [330240]
Creator        : thompson
Date           : Tue Jan 22 16:16:23 2008
Host           : Linux patlx3.fnal.gov 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:13:44 CDT 2005 i686 i686 i386 GNU/Linux
Dir            : /data/cdf04/thompson/hww_1fb/hwwcdf6.1.4v3_43/Hww/TMVAAna
Training events: 15118

#OPT -*-*-*-*-*-*-*-*-*-*-*-*- options -*-*-*-*-*-*-*-*-*-*-*-*-

# Set by User:
V: "False" [Verbose mode]
NCycles: "1000" [Number of training cycles]
HiddenLayers: "N+1,N" [Specification of hidden layer architecture]
# Default:
D: "False" [use-decorrelated-variables flag (deprecated)]
Normalise: "True" [Normalise input variables]
VarTransform: "None" [Variable transformation method]
VarTransformType: "Signal" [Use signal or background events for var transform]
NbinsMVAPdf: "60" [Number of bins used to create MVA PDF]
NsmoothMVAPdf: "2" [Number of smoothing iterations for MVA PDF]
VerboseLevel: "Info" [Verbosity level]
H: "False" [Print classifier-specific help message]
CreateMVAPdfs: "False" [Create PDFs for classifier outputs]
TxtWeightFilesOnly: "True" [if True, write all weights as text files]
NeuronType: "sigmoid" [Neuron activation function type]
NeuronInputType: "sum" [Neuron input function type]
TrainingMethod: "BP" [Train with Back-Propagation (BP - default) or Genetic Algorithm (GA - slower and worse)]
LearningRate: "0.02" [ANN learning rate parameter]
DecayRate: "0.01" [Decay rate for learning parameter]
TestRate: "10" [Test for overtraining performed at each #th epochs]
BPMode: "sequential" [Back-propagation learning mode: sequential or batch]
BatchSize: "-1" [Batch size: number of events/batch, only set if in Batch Mode, -1 for BatchSize=number_of_events]
##

#VAR -*-*-*-*-*-*-*-*-*-*-*-* variables
*-*-*-*-*-*-*-*-*-*-*-*-

NVar 11
LRHWW        LRHWW        'F' [0,1]
LRWW         LRWW         'F' [1.1345506428e-34,1.001734972]
LRWg         LRWg         'F' [0,0.999915719032]
LRWj         LRWj         'F' [-0.00173494115006,1]
LRZZ         LRZZ         'F' [0,1]
Met          Met          'F' [15.1351919174,204.075332642]
MetDelPhi    MetDelPhi    'F' [0.143834546208,3.13630962372]
MetSpec      MetSpec      'F' [15.0015153885,153.344940186]
dPhiLeptons  dPhiLeptons  'F' [7.39097595215e-05,3.14107751846]
dRLeptons    dRLeptons    'F' [0.415245831013,4.52171230316]
dimass       dimass       'F' [16.0030345917,490.262542725]

============================================================================ */

// NOTE: the header names and vector template arguments below were stripped
// from this copy of the file (most likely by HTML extraction eating "<...>");
// they are restored here following the standard TMVA MakeClass output.
#include <vector>
#include <cmath>
#include <string>
#include <iostream>

#ifndef IClassifierReader__def
#define IClassifierReader__def

class IClassifierReader {
 public:
   // constructor
   IClassifierReader() {}
   virtual ~IClassifierReader() {}

   // return classifier response
   virtual double GetMvaValue( const std::vector<double>& inputValues ) const = 0;
};

#endif

class ReadH6AONN5MEMLP : public IClassifierReader {
 public:
   // constructor
   ReadH6AONN5MEMLP( std::vector<std::string>& theInputVars )
      : IClassifierReader(),
        fClassName( "ReadH6AONN5MEMLP" ),
        fStatusIsClean( true ),
        fNvars( 11 ),
        fIsNormalised( true )
   {
      // the training input variables
      const char* inputVars[] = { "LRHWW", "LRWW", "LRWg", "LRWj", "LRZZ", "Met",
                                  "MetDelPhi", "MetSpec", "dPhiLeptons", "dRLeptons", "dimass" };

      // sanity checks
      if (theInputVars.size() <= 0) {
         std::cout << "Problem in class \"" << fClassName << "\": empty input vector" << std::endl;
         fStatusIsClean = false;
      }

      if (theInputVars.size() != fNvars) {
         std::cout << "Problem in class \"" << fClassName << "\": mismatch in number of input values: "
                   << theInputVars.size() << " != " << fNvars << std::endl;
         fStatusIsClean = false;
      }

      // validate input variables
      for (size_t ivar = 0; ivar < theInputVars.size(); ivar++) {
         if (theInputVars[ivar] != inputVars[ivar]) {
            std::cout << "Problem in class \"" << fClassName << "\": mismatch in input variable names" << std::endl
                      << " for variable [" << ivar << "]: " << theInputVars[ivar].c_str()
                      << " != " << inputVars[ivar] << std::endl;
            fStatusIsClean = false;
         }
      }

      // initialize min and max vectors (for normalisation)
      fVmin[0] = 0;
      fVmax[0] = 1;
      fVmin[1] = 1.13455064280395e-34;
      fVmax[1] = 1.00173497200012;
      fVmin[2] = 0;
      fVmax[2] = 0.999915719032288;
      fVmin[3] = -0.00173494115006179;
      fVmax[3] = 1;
      fVmin[4] = 0;
      fVmax[4] = 1;
      fVmin[5] = 15.1351919174194;
      fVmax[5] = 204.075332641602;
      fVmin[6] = 0.143834546208382;
      fVmax[6] = 3.13630962371826;
      fVmin[7] = 15.0015153884888;
      fVmax[7] = 153.344940185547;
      fVmin[8] = 7.39097595214844e-05;
      fVmax[8] = 3.14107751846313;
      fVmin[9] = 0.415245831012726;
      fVmax[9] = 4.52171230316162;
      fVmin[10] = 16.0030345916748;
      fVmax[10] = 490.262542724609;

      // initialize input variable types
      fType[0] = 'F';
      fType[1] = 'F';
      fType[2] = 'F';
      fType[3] = 'F';
      fType[4] = 'F';
      fType[5] = 'F';
      fType[6] = 'F';
      fType[7] = 'F';
      fType[8] = 'F';
      fType[9] = 'F';
      fType[10] = 'F';

      // initialize constants
      Initialize();
   }

   // destructor
   virtual ~ReadH6AONN5MEMLP() {
      Clear(); // method-specific
   }

   // the classifier response
   // "inputValues" is a vector of input values in the same order as the
   // variables given to the constructor
   double GetMvaValue( const std::vector<double>& inputValues ) const {
      // classifier response value
      double retval = 0;

      // classifier response, sanity check first
      if (!fStatusIsClean) {
         std::cout << "Problem in class \"" << fClassName << "\": cannot return classifier response"
                   << " because status is dirty" << std::endl;
         retval = 0;
      }
      else {
         if (IsNormalised()) {
            // normalise variables
            std::vector<double> iV;
            int ivar = 0;
            for (std::vector<double>::const_iterator varIt = inputValues.begin();
                 varIt != inputValues.end(); varIt++, ivar++) {
               iV.push_back(NormVariable( *varIt, fVmin[ivar], fVmax[ivar] ));
            }
            retval = GetMvaValue__( iV );
         }
         else {
            retval = GetMvaValue__( inputValues );
         }
      }

      return retval;
   }

 private:

   // method-specific destructor
   void Clear();

   // common member variables
   const char* fClassName;
   bool fStatusIsClean;
   const size_t fNvars;
   size_t GetNvar() const {
return fNvars; }
   char GetType( int ivar ) const { return fType[ivar]; }

   // normalisation of input variables
   const bool fIsNormalised;
   bool IsNormalised() const { return fIsNormalised; }

   double fVmin[11];
   double fVmax[11];
   double NormVariable( double x, double xmin, double xmax ) const {
      // normalise to output range: [-1, 1]
      return 2*(x - xmin)/(xmax - xmin) - 1.0;
   }

   // type of input variable: 'F' or 'I'
   char fType[11];

   // initialize internal variables
   void Initialize();
   double GetMvaValue__( const std::vector<double>& inputValues ) const;

   // private members (method specific)
   double ActivationFnc(double x) const;

   int fLayers;
   int fLayerSize[4];
   double fWeightMatrix0to1[13][12];   // weight matrix from layer 0 to 1
   double fWeightMatrix1to2[12][13];   // weight matrix from layer 1 to 2
   double fWeightMatrix2to3[1][12];    // weight matrix from layer 2 to 3

   double * fWeights[4];
};

inline void ReadH6AONN5MEMLP::Initialize()
{
   // build network structure
   fLayers = 4;
   fLayerSize[0] = 12; fWeights[0] = new double[12];
   fLayerSize[1] = 13; fWeights[1] = new double[13];
   fLayerSize[2] = 12; fWeights[2] = new double[12];
   fLayerSize[3] = 1; fWeights[3] = new double[1];

   // weight matrix from layer 0 to 1
   fWeightMatrix0to1[0][0] = 1.93203759799319;
   fWeightMatrix0to1[1][0] = 1.91912635636299;
   fWeightMatrix0to1[2][0] = -0.392113958479782;
   fWeightMatrix0to1[3][0] = 1.90938116937255;
   fWeightMatrix0to1[4][0] = 0.610967390316145;
   fWeightMatrix0to1[5][0] = 3.98901145643896;
   fWeightMatrix0to1[6][0] = 1.04713349754688;
   fWeightMatrix0to1[7][0] = -0.956899689819063;
   fWeightMatrix0to1[8][0] = 0.584368982078557;
   fWeightMatrix0to1[9][0] = 1.64473375809677;
   fWeightMatrix0to1[10][0] = 1.06779022329932;
   fWeightMatrix0to1[11][0] = -0.270809143454584;
   fWeightMatrix0to1[0][1] = 1.46964274325752;
   fWeightMatrix0to1[1][1] = 1.19344113113818;
   fWeightMatrix0to1[2][1] = 1.37360108449053;
   fWeightMatrix0to1[3][1] = -2.01361149127138;
   fWeightMatrix0to1[4][1] = 1.13986238153604;
   fWeightMatrix0to1[5][1] = -0.0224775775457241;
fWeightMatrix0to1[6][1] = -1.19321280017954; fWeightMatrix0to1[7][1] = 1.16080191109432; fWeightMatrix0to1[8][1] = 1.78542792871395; fWeightMatrix0to1[9][1] = 1.38404108411049; fWeightMatrix0to1[10][1] = 0.523486664440577; fWeightMatrix0to1[11][1] = -0.527290417011897; fWeightMatrix0to1[0][2] = 0.49844960663685; fWeightMatrix0to1[1][2] = 1.55074378275813; fWeightMatrix0to1[2][2] = -0.073639068536119; fWeightMatrix0to1[3][2] = -1.59282708951474; fWeightMatrix0to1[4][2] = 1.99733963889759; fWeightMatrix0to1[5][2] = -0.42487313575433; fWeightMatrix0to1[6][2] = -0.0841734716844268; fWeightMatrix0to1[7][2] = -0.836107363138932; fWeightMatrix0to1[8][2] = -1.06162923622238; fWeightMatrix0to1[9][2] = -0.976739890599993; fWeightMatrix0to1[10][2] = -1.16616702215691; fWeightMatrix0to1[11][2] = 0.228776462736159; fWeightMatrix0to1[0][3] = 0.412609358737418; fWeightMatrix0to1[1][3] = 0.527446881382355; fWeightMatrix0to1[2][3] = 1.24237402871057; fWeightMatrix0to1[3][3] = 1.50763314827473; fWeightMatrix0to1[4][3] = -1.12357059639701; fWeightMatrix0to1[5][3] = -0.24010257730797; fWeightMatrix0to1[6][3] = -0.46243139907571; fWeightMatrix0to1[7][3] = -1.17901057303199; fWeightMatrix0to1[8][3] = -0.298984698723075; fWeightMatrix0to1[9][3] = -1.03557853595708; fWeightMatrix0to1[10][3] = 0.547826547254072; fWeightMatrix0to1[11][3] = 1.35478763253096; fWeightMatrix0to1[0][4] = 0.325817608715079; fWeightMatrix0to1[1][4] = -0.237726482405437; fWeightMatrix0to1[2][4] = 0.979755959798609; fWeightMatrix0to1[3][4] = -0.606392777100619; fWeightMatrix0to1[4][4] = -0.252586863312885; fWeightMatrix0to1[5][4] = -0.716491389818989; fWeightMatrix0to1[6][4] = -0.177094231973646; fWeightMatrix0to1[7][4] = 0.476685067497817; fWeightMatrix0to1[8][4] = -0.30612121669194; fWeightMatrix0to1[9][4] = 0.488151334752025; fWeightMatrix0to1[10][4] = -0.28139262660489; fWeightMatrix0to1[11][4] = -0.215152781691154; fWeightMatrix0to1[0][5] = -0.740909959824854; fWeightMatrix0to1[1][5] = -0.542742566904397; 
fWeightMatrix0to1[2][5] = -1.84909213983838; fWeightMatrix0to1[3][5] = 0.600611709530786; fWeightMatrix0to1[4][5] = 0.935618041219675; fWeightMatrix0to1[5][5] = -1.51687495022817; fWeightMatrix0to1[6][5] = 1.51652918822746; fWeightMatrix0to1[7][5] = -0.324809537702203; fWeightMatrix0to1[8][5] = -0.407477986298439; fWeightMatrix0to1[9][5] = 0.112193261177526; fWeightMatrix0to1[10][5] = 0.308532888867332; fWeightMatrix0to1[11][5] = -0.500262250228886; fWeightMatrix0to1[0][6] = -1.02416508151962; fWeightMatrix0to1[1][6] = 0.161368935928657; fWeightMatrix0to1[2][6] = 0.802879510848575; fWeightMatrix0to1[3][6] = -1.13455606716027; fWeightMatrix0to1[4][6] = -0.178834123593322; fWeightMatrix0to1[5][6] = -0.306863698258807; fWeightMatrix0to1[6][6] = -1.50607398793705; fWeightMatrix0to1[7][6] = 1.49811768288744; fWeightMatrix0to1[8][6] = -0.169922166761373; fWeightMatrix0to1[9][6] = 1.5822337523983; fWeightMatrix0to1[10][6] = 1.25357713345991; fWeightMatrix0to1[11][6] = 0.910094113161994; fWeightMatrix0to1[0][7] = -0.570587216268508; fWeightMatrix0to1[1][7] = 0.650571458015519; fWeightMatrix0to1[2][7] = 1.89289256678143; fWeightMatrix0to1[3][7] = -0.480251691650232; fWeightMatrix0to1[4][7] = 1.26999742460296; fWeightMatrix0to1[5][7] = -0.655074197106109; fWeightMatrix0to1[6][7] = -0.403463943575848; fWeightMatrix0to1[7][7] = -0.338636167877171; fWeightMatrix0to1[8][7] = -0.112894843486339; fWeightMatrix0to1[9][7] = 1.50221031248101; fWeightMatrix0to1[10][7] = -1.26073740190984; fWeightMatrix0to1[11][7] = 2.89433038450622; fWeightMatrix0to1[0][8] = -0.643775205527979; fWeightMatrix0to1[1][8] = -0.38092938140884; fWeightMatrix0to1[2][8] = 1.90653863631467; fWeightMatrix0to1[3][8] = -1.97009023547058; fWeightMatrix0to1[4][8] = -0.145859253951973; fWeightMatrix0to1[5][8] = -0.125035754237898; fWeightMatrix0to1[6][8] = -0.794175773959813; fWeightMatrix0to1[7][8] = -0.343581778867436; fWeightMatrix0to1[8][8] = 1.43275816850124; fWeightMatrix0to1[9][8] = -0.300443074787136; 
fWeightMatrix0to1[10][8] = 1.61817300181778;
   fWeightMatrix0to1[11][8] = -0.769556970656873;
   fWeightMatrix0to1[0][9] = 0.237553743656175;
   fWeightMatrix0to1[1][9] = -1.45448744852972;
   fWeightMatrix0to1[2][9] = 1.95864849754126;
   fWeightMatrix0to1[3][9] = -1.13362821836721;
   fWeightMatrix0to1[4][9] = -0.20236304158674;
   fWeightMatrix0to1[5][9] = -0.100873457462269;
   fWeightMatrix0to1[6][9] = -0.562916394356626;
   fWeightMatrix0to1[7][9] = 1.07902990020386;
   fWeightMatrix0to1[8][9] = -0.83103257152355;
   fWeightMatrix0to1[9][9] = -1.98453733210181;
   fWeightMatrix0to1[10][9] = -1.08995626322397;
   fWeightMatrix0to1[11][9] = 0.130313616191456;
   fWeightMatrix0to1[0][10] = -0.437932799145754;
   fWeightMatrix0to1[1][10] = 0.708218779363563;
   fWeightMatrix0to1[2][10] = 1.76458731865496;
   fWeightMatrix0to1[3][10] = -1.87237378358298;
   fWeightMatrix0to1[4][10] = 0.883026214261302;
   fWeightMatrix0to1[5][10] = -1.58737611838717;
   fWeightMatrix0to1[6][10] = -3.66620129589852;
   fWeightMatrix0to1[7][10] = 0.998865934715779;
   fWeightMatrix0to1[8][10] = -0.111072088660156;
   fWeightMatrix0to1[9][10] = -1.25727834395552;
   fWeightMatrix0to1[10][10] = -1.26426824219477;
   fWeightMatrix0to1[11][10] = -1.24982527692549;
   fWeightMatrix0to1[0][11] = -1.15894310080138;
   fWeightMatrix0to1[1][11] = -0.410346088541016;
   fWeightMatrix0to1[2][11] = -0.287618459598496;
   fWeightMatrix0to1[3][11] = 0.618089812538713;
   fWeightMatrix0to1[4][11] = 1.18391149096874;
   fWeightMatrix0to1[5][11] = 1.07752015299047;
   fWeightMatrix0to1[6][11] = -2.43894161912399;
   fWeightMatrix0to1[7][11] = -1.18703565321713;
   fWeightMatrix0to1[8][11] = 2.6384177554558;
   fWeightMatrix0to1[9][11] = -1.44301262223822;
   fWeightMatrix0to1[10][11] = 0.543794347940731;
   fWeightMatrix0to1[11][11] = -0.414609410484118;

   // weight matrix from layer 1 to 2
   fWeightMatrix1to2[0][0] = -1.76018492150111;
   fWeightMatrix1to2[1][0] = 0.389414358640214;
   fWeightMatrix1to2[2][0] = -1.00899974081007;
   fWeightMatrix1to2[3][0] = -0.815426006637787;
   fWeightMatrix1to2[4][0] =
-2.25784795374289; fWeightMatrix1to2[5][0] = 0.921066145000764; fWeightMatrix1to2[6][0] = 1.41768165263817; fWeightMatrix1to2[7][0] = 1.38302117170495; fWeightMatrix1to2[8][0] = 1.62474891737382; fWeightMatrix1to2[9][0] = -2.4973191251141; fWeightMatrix1to2[10][0] = -0.54302142505725; fWeightMatrix1to2[0][1] = 0.641026197523823; fWeightMatrix1to2[1][1] = -1.48447834706022; fWeightMatrix1to2[2][1] = -0.713669102073791; fWeightMatrix1to2[3][1] = -0.0402796670707248; fWeightMatrix1to2[4][1] = 1.20028989029587; fWeightMatrix1to2[5][1] = -1.86908141564337; fWeightMatrix1to2[6][1] = -0.93842502301962; fWeightMatrix1to2[7][1] = -0.0450266049999273; fWeightMatrix1to2[8][1] = -1.16922225563281; fWeightMatrix1to2[9][1] = -2.07699535001575; fWeightMatrix1to2[10][1] = 0.303964983913458; fWeightMatrix1to2[0][2] = 0.614282768176175; fWeightMatrix1to2[1][2] = -0.152083252956299; fWeightMatrix1to2[2][2] = -1.07804167001128; fWeightMatrix1to2[3][2] = -0.114541973148773; fWeightMatrix1to2[4][2] = 1.12351978788052; fWeightMatrix1to2[5][2] = -1.07012030222861; fWeightMatrix1to2[6][2] = 0.884086591338557; fWeightMatrix1to2[7][2] = -2.03679444359884; fWeightMatrix1to2[8][2] = 0.190893089851754; fWeightMatrix1to2[9][2] = -0.0719501479458695; fWeightMatrix1to2[10][2] = -0.304448378794587; fWeightMatrix1to2[0][3] = -1.33991539700354; fWeightMatrix1to2[1][3] = 0.876767235482547; fWeightMatrix1to2[2][3] = 1.16285891912031; fWeightMatrix1to2[3][3] = -2.04235102014539; fWeightMatrix1to2[4][3] = 0.65525250258936; fWeightMatrix1to2[5][3] = 0.27957501331953; fWeightMatrix1to2[6][3] = 1.14019236858607; fWeightMatrix1to2[7][3] = 0.196044687720233; fWeightMatrix1to2[8][3] = -1.13349666108454; fWeightMatrix1to2[9][3] = -1.77184661664192; fWeightMatrix1to2[10][3] = 0.01477614157379; fWeightMatrix1to2[0][4] = 0.0748117025756278; fWeightMatrix1to2[1][4] = 0.440318145197076; fWeightMatrix1to2[2][4] = 0.109373321949896; fWeightMatrix1to2[3][4] = -0.94395622341143; fWeightMatrix1to2[4][4] = 
0.611806775768655; fWeightMatrix1to2[5][4] = 0.17460506402621; fWeightMatrix1to2[6][4] = -1.37163649440483; fWeightMatrix1to2[7][4] = -0.105083196919337; fWeightMatrix1to2[8][4] = 0.974362170391051; fWeightMatrix1to2[9][4] = -0.587817942551025; fWeightMatrix1to2[10][4] = 0.494750175740208; fWeightMatrix1to2[0][5] = -0.355868797990522; fWeightMatrix1to2[1][5] = 0.79846637030286; fWeightMatrix1to2[2][5] = -0.301317740464462; fWeightMatrix1to2[3][5] = -0.470774353287799; fWeightMatrix1to2[4][5] = -2.10205945885006; fWeightMatrix1to2[5][5] = -0.941291976517847; fWeightMatrix1to2[6][5] = -1.01517713065258; fWeightMatrix1to2[7][5] = 0.366903301312326; fWeightMatrix1to2[8][5] = -2.30991532948079; fWeightMatrix1to2[9][5] = 1.51449865833333; fWeightMatrix1to2[10][5] = -1.19139712081949; fWeightMatrix1to2[0][6] = -1.14081935226427; fWeightMatrix1to2[1][6] = 1.38350690751553; fWeightMatrix1to2[2][6] = -1.25454546236917; fWeightMatrix1to2[3][6] = -1.82473310850187; fWeightMatrix1to2[4][6] = 0.649274024831305; fWeightMatrix1to2[5][6] = -0.362544145589969; fWeightMatrix1to2[6][6] = -0.28380052836361; fWeightMatrix1to2[7][6] = -2.53681588776429; fWeightMatrix1to2[8][6] = -1.25643135107893; fWeightMatrix1to2[9][6] = 1.78228850153431; fWeightMatrix1to2[10][6] = -3.79000825863181; fWeightMatrix1to2[0][7] = 1.30541655269081; fWeightMatrix1to2[1][7] = 0.609669952767645; fWeightMatrix1to2[2][7] = -0.997894429776477; fWeightMatrix1to2[3][7] = -1.05112890871595; fWeightMatrix1to2[4][7] = -1.32437858770468; fWeightMatrix1to2[5][7] = -0.953717071555572; fWeightMatrix1to2[6][7] = -0.827524604245094; fWeightMatrix1to2[7][7] = -1.78685707522461; fWeightMatrix1to2[8][7] = -0.401958605310761; fWeightMatrix1to2[9][7] = -1.13144432949552; fWeightMatrix1to2[10][7] = 0.982098067108535; fWeightMatrix1to2[0][8] = -1.51681397527687; fWeightMatrix1to2[1][8] = 0.486858955773257; fWeightMatrix1to2[2][8] = 0.554801189685408; fWeightMatrix1to2[3][8] = 0.804493536567536; fWeightMatrix1to2[4][8] = 
-1.73586118300774; fWeightMatrix1to2[5][8] = 0.854864287335035; fWeightMatrix1to2[6][8] = -2.32931387950697; fWeightMatrix1to2[7][8] = 0.0677493731318822; fWeightMatrix1to2[8][8] = -0.325424308099081; fWeightMatrix1to2[9][8] = -2.07776530907807; fWeightMatrix1to2[10][8] = 1.54106282269192; fWeightMatrix1to2[0][9] = 0.108639302535881; fWeightMatrix1to2[1][9] = 1.19541178220462; fWeightMatrix1to2[2][9] = 1.22030470103182; fWeightMatrix1to2[3][9] = 0.175158060471842; fWeightMatrix1to2[4][9] = 1.08248985919414; fWeightMatrix1to2[5][9] = -1.51214619356014; fWeightMatrix1to2[6][9] = -1.25111249851883; fWeightMatrix1to2[7][9] = -0.607143000575685; fWeightMatrix1to2[8][9] = -0.798680428930455; fWeightMatrix1to2[9][9] = -1.69956162014315; fWeightMatrix1to2[10][9] = -0.436039683170901; fWeightMatrix1to2[0][10] = 1.02038062300743; fWeightMatrix1to2[1][10] = -0.0874114651714851; fWeightMatrix1to2[2][10] = -1.9513630966022; fWeightMatrix1to2[3][10] = 1.38187868298096; fWeightMatrix1to2[4][10] = -1.59399543553275; fWeightMatrix1to2[5][10] = -0.66359326293901; fWeightMatrix1to2[6][10] = -1.23844253631972; fWeightMatrix1to2[7][10] = 0.423221849103614; fWeightMatrix1to2[8][10] = -0.245928643708205; fWeightMatrix1to2[9][10] = -0.230892445198297; fWeightMatrix1to2[10][10] = -0.767730918555554; fWeightMatrix1to2[0][11] = 0.919236340540877; fWeightMatrix1to2[1][11] = -0.181012807630265; fWeightMatrix1to2[2][11] = 0.73359701296864; fWeightMatrix1to2[3][11] = -1.28520151566973; fWeightMatrix1to2[4][11] = -0.529107755428057; fWeightMatrix1to2[5][11] = -1.61816966813308; fWeightMatrix1to2[6][11] = 0.857942955182592; fWeightMatrix1to2[7][11] = 0.932366702391555; fWeightMatrix1to2[8][11] = -0.0215924908216973; fWeightMatrix1to2[9][11] = 1.05555548833913; fWeightMatrix1to2[10][11] = 0.829107187626085; fWeightMatrix1to2[0][12] = -0.107281481126566; fWeightMatrix1to2[1][12] = 0.720913998431686; fWeightMatrix1to2[2][12] = -0.75016346021492; fWeightMatrix1to2[3][12] = -0.412157844683154; 
fWeightMatrix1to2[4][12] = 1.43423800590101;
   fWeightMatrix1to2[5][12] = 0.841197894984392;
   fWeightMatrix1to2[6][12] = -2.22033519047144;
   fWeightMatrix1to2[7][12] = -2.18346526138037;
   fWeightMatrix1to2[8][12] = 0.323067961383045;
   fWeightMatrix1to2[9][12] = -0.183165788868951;
   fWeightMatrix1to2[10][12] = 1.6369524561641;

   // weight matrix from layer 2 to 3
   fWeightMatrix2to3[0][0] = -0.621010055337354;
   fWeightMatrix2to3[0][1] = 0.317622869301767;
   fWeightMatrix2to3[0][2] = -0.71282110827552;
   fWeightMatrix2to3[0][3] = -0.153334496289368;
   fWeightMatrix2to3[0][4] = -0.416810972484272;
   fWeightMatrix2to3[0][5] = -1.13069444625229;
   fWeightMatrix2to3[0][6] = -1.29131606340091;
   fWeightMatrix2to3[0][7] = -0.38690707138915;
   fWeightMatrix2to3[0][8] = 1.10629872537339;
   fWeightMatrix2to3[0][9] = 1.33656812807077;
   fWeightMatrix2to3[0][10] = -0.7607902790011;
   fWeightMatrix2to3[0][11] = 0.924709720358759;
}

inline double ReadH6AONN5MEMLP::GetMvaValue__( const std::vector<double>& inputValues ) const
{
   if (inputValues.size() != (unsigned int)fLayerSize[0]-1) {
      std::cout << "Input vector needs to be of size " << fLayerSize[0]-1 << std::endl;
      return 0;
   }

   // NOTE: this copy of the file was truncated at this point; the remainder
   // below is reconstructed following the standard TMVA MakeClass
   // feed-forward template and is an assumption, not the original text.

   // reset the internal node buffers
   for (int l=0; l<fLayers; l++)
      for (int i=0; i<fLayerSize[l]; i++) fWeights[l][i] = 0;

   // set the bias node of each non-output layer to 1
   for (int l=0; l<fLayers-1; l++)
      fWeights[l][fLayerSize[l]-1] = 1;

   // load the input layer
   for (int i=0; i<fLayerSize[0]-1; i++)
      fWeights[0][i] = inputValues[i];

   // propagate: layer 0 to 1
   for (int o=0; o<fLayerSize[1]-1; o++) {
      for (int i=0; i<fLayerSize[0]; i++)
         fWeights[1][o] += fWeightMatrix0to1[o][i] * fWeights[0][i];
      fWeights[1][o] = ActivationFnc(fWeights[1][o]);
   }

   // propagate: layer 1 to 2
   for (int o=0; o<fLayerSize[2]-1; o++) {
      for (int i=0; i<fLayerSize[1]; i++)
         fWeights[2][o] += fWeightMatrix1to2[o][i] * fWeights[1][i];
      fWeights[2][o] = ActivationFnc(fWeights[2][o]);
   }

   // propagate: layer 2 to 3 (output node)
   for (int o=0; o<fLayerSize[3]; o++)
      for (int i=0; i<fLayerSize[2]; i++)
         fWeights[3][o] += fWeightMatrix2to3[o][i] * fWeights[2][i];

   return fWeights[3][0];
}

inline double ReadH6AONN5MEMLP::ActivationFnc(double x) const
{
   // sigmoid (NeuronType: "sigmoid" in the options above)
   return 1.0/(1.0 + exp(-x));
}

inline void ReadH6AONN5MEMLP::Clear()
{
   // delete the work arrays allocated in Initialize()
   for (int l=0; l<fLayers; l++) {
      delete[] fWeights[l];
      fWeights[l] = 0;
   }
}