NNInput NNInputs_130.root
Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters :
Info: No file info in tree
NNAna::CopyTree: entries= 40610 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1
                 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 10166  nbkg = 30444
Bkg Entries: 30444   Sig Entries: 10166   Chosen entries: 10166
Signal fraction: 1   Background fraction: 0.333925
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 30444
Actual Signal Entries: 10166
Entries to split: 10166   Test with : 5083   Train with : 5083
*********************************************
* This product is licenced for educational *
* and scientific use only. Commercial use  *
* is prohibited !                          *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag.
This is only permitted with a developers licence.
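---------------------------------------------------
A minimal PyROOT sketch of the NNAna::CopyTree step reported above: the steering
constraint and the Target condition are applied with TTree::CopyTree using the
quoted copy strings. The tree name "nt" and the output file name are placeholders,
since the log does not name them.

# split the input ntuple into signal and background trees (hypothetical names)
import ROOT

cut  = "lep1_E<400&&lep2_E<400&&"
fin  = ROOT.TFile.Open("NNInputs_130.root")
tree = fin.Get("nt")                       # placeholder tree name

fout = ROOT.TFile("nn_sig_bkg.root", "RECREATE")
sig  = tree.CopyTree(cut + "Target==1")    # 10166 entries in this run
bkg  = tree.CopyTree(cut + "Target==0")    # 30444 entries in this run
print("nsig =", sig.GetEntries(), " nbkg =", bkg.GetEntries())
fout.Write()
fout.Close()

With EqualizeSB set to 0 the background is not resampled against the signal; the
10166 chosen entries are then split in half, 5083 for testing and 5083 for training.
---------------------------------------------------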
NBTreeReportInterval = 10000
NBTreePrepEvPrint = 1
Start Set Individual Variable Preprocessing
  Not touching individual preprocessing in Neurobayes for:
    Ht (0), LepAPt (1), LepBPt (2), MetSigLeptonsJets (3), MetSpec (4),
    SumEtLeptonsJets (5), VSumJetLeptonsPt (6), addEt (7), dPhiLepSumMet (8),
    dPhiLeptons (9), dRLeptons (10), lep1_E (11), lep2_E (12)
End Set Individual Variable Preprocessing
Adding variables to Neurobayes: Ht, LepAPt, LepBPt, MetSigLeptonsJets, MetSpec,
  SumEtLeptonsJets, VSumJetLeptonsPt, addEt, dPhiLepSumMet, dPhiLeptons,
  dRLeptons, lep1_E, lep2_E
NNAna::PrepareNBTraining_: Nent= 10166 for Signal
Prepared event 0 for Signal with 10166 events
====Entry 0
  Variable Ht                : 171.663
  Variable LepAPt            : 34.7432
  Variable LepBPt            : 21.8204
  Variable MetSigLeptonsJets : 5.46879
  Variable MetSpec           : 58.2422
  Variable SumEtLeptonsJets  : 113.421
  Variable VSumJetLeptonsPt  : 68.761
  Variable addEt             : 114.806
  Variable dPhiLepSumMet     : 2.3177
  Variable dPhiLeptons       : 0.165649
  Variable dRLeptons         : 0.508746
  Variable lep1_E            : 42.316
  Variable lep2_E            : 22.1287
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0         DEtaJ1Lep1 = 0       DEtaJ1Lep2 = 0       DEtaJ2Lep1 = 0
  DEtaJ2Lep2 = 0       DPhiJ1J2 = 0         DPhiJ1Lep1 = 0       DPhiJ1Lep2 = 0
  DPhiJ2Lep1 = 0       DPhiJ2Lep2 = 0       DRJ1J2 = 0           DRJ1Lep1 = 0
  DRJ1Lep2 = 0         DRJ2Lep1 = 0         DRJ2Lep2 = 0         DeltaRJet12 = 0
  File = 2130          Ht = 171.664         IsMEBase = 0         LRHWW = 0
  LRWW = 0             LRWg = 0             LRWj = 0             LRZZ = 0
  LepAEt = 34.7434     LepAPt = 34.7432     LepBEt = 21.8209     LepBPt = 21.8204
  LessCentralJetEta = 0    MJ1Lep1 = 0      MJ1Lep2 = 0          MJ2Lep1 = 0
  MJ2Lep2 = 0          NN = 0               Met = 58.2422        MetDelPhi = 2.12969
  MetSig = 5.13999     MetSigLeptonsJets = 5.46879               MetSpec = 58.2422
  Mjj = 0              MostCentralJetEta = -0.070363             MtllMet = 119.488
  Njets = 1            SB = 0               SumEt = 128.396      SumEtJets = 0
  SumEtLeptonsJets = 113.421                Target = 1           TrainWeight = 1
  VSum2JetLeptonsPt = 0                     VSum2JetPt = 0       VSumJetLeptonsPt = 68.761
  addEt = 114.806      dPhiLepSumMet = 2.3177                    dPhiLeptons = 0.165649
  dRLeptons = 0.508746                      diltype = 28         dimass = 14.1303
  event = 477          jet1_Et = 56.8571    jet1_eta = 0         jet2_Et = 0
  jet2_eta = 0         lep1_E = 42.316      lep2_E = 22.1287     rand = 0.999742
  run = 195308         weight = 2.19452e-06
===Show End
Prepared event 10000 for Signal with 10166 events
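---------------------------------------------------
An illustrative sketch of how each training pattern is assembled from an ntuple
entry: the 13 registered inputs are read in the order listed above, together with
the Target flag and the per-event TrainWeight. Here "entry" stands for any object
exposing the branches as attributes (for example a PyROOT TTree inside an event
loop); this is not code from NNAna itself.

import numpy as np

# the 13 inputs, in the order they were added to NeuroBayes
NB_INPUTS = ["Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets", "MetSpec",
             "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt",
             "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E"]

def make_pattern(entry):
    """Return (input vector, target, weight) for one ntuple entry."""
    x = np.array([getattr(entry, name) for name in NB_INPUTS], dtype=np.float32)
    return x, int(entry.Target), float(entry.TrainWeight)
---------------------------------------------------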
Adding variables to Neurobayes: Ht, LepAPt, LepBPt, MetSigLeptonsJets, MetSpec,
  SumEtLeptonsJets, VSumJetLeptonsPt, addEt, dPhiLepSumMet, dPhiLeptons,
  dRLeptons, lep1_E, lep2_E
NNAna::PrepareNBTraining_: Nent= 30444 for Background
Prepared event 0 for Background with 30444 events
====Entry 0
  Variable Ht                : 85.3408
  Variable LepAPt            : 31.1294
  Variable LepBPt            : 10.0664
  Variable MetSigLeptonsJets : 6.87768
  Variable MetSpec           : 44.1437
  Variable SumEtLeptonsJets  : 41.1959
  Variable VSumJetLeptonsPt  : 41.0132
  Variable addEt             : 85.3408
  Variable dPhiLepSumMet     : 3.10536
  Variable dPhiLeptons       : 0.219342
  Variable dRLeptons         : 0.424454
  Variable lep1_E            : 32.3548
  Variable lep2_E            : 10.1027
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0         DEtaJ1Lep1 = 0       DEtaJ1Lep2 = 0       DEtaJ2Lep1 = 0
  DEtaJ2Lep2 = 0       DPhiJ1J2 = 0         DPhiJ1Lep1 = 0       DPhiJ1Lep2 = 0
  DPhiJ2Lep1 = 0       DPhiJ2Lep2 = 0       DRJ1J2 = 0           DRJ1Lep1 = 0
  DRJ1Lep2 = 0         DRJ2Lep1 = 0         DRJ2Lep2 = 0         DeltaRJet12 = 0
  File = 1             Ht = 85.3408         IsMEBase = 0         LRHWW = 0
  LRWW = 0             LRWg = 0             LRWj = 0             LRZZ = 0
  LepAEt = 31.1297     LepAPt = 31.1294     LepBEt = 10.0674     LepBPt = 10.0664
  LessCentralJetEta = 0    MJ1Lep1 = 0      MJ1Lep2 = 0          MJ2Lep1 = 0
  MJ2Lep2 = 0          NN = 0               Met = 44.1437        MetDelPhi = 3.01191
  MetSig = 3.74558     MetSigLeptonsJets = 6.87768               MetSpec = 44.1437
  Mjj = 0              MostCentralJetEta = 0                     MtllMet = 86.2332
  Njets = 0            SB = 0               SumEt = 138.899      SumEtJets = 0
  SumEtLeptonsJets = 41.1959                Target = 0           TrainWeight = 1.24949
  VSum2JetLeptonsPt = 0                     VSum2JetPt = 0       VSumJetLeptonsPt = 41.0132
  addEt = 85.3408      dPhiLepSumMet = 3.10536                   dPhiLeptons = 0.219342
  dRLeptons = 0.424454                      diltype = 17         dimass = 7.54723
  event = 6717520      jet1_Et = 0          jet1_eta = 0         jet2_Et = 0
  jet2_eta = 0         lep1_E = 32.3548     lep2_E = 10.1027     rand = 0.999742
  run = 271566         weight = 0.00296168
===Show End
Prepared event 10000 for Background with 30444 events
Prepared event 20000 for Background with 30444 events
Prepared event 30000 for Background with 30444 events
Warning: found 985 negative weights.
Phi-T(R) NeuroBayes(R) Teacher
Algorithms by Michael Feindt
Implementation by Phi-T Project 2001-2003
Copyright Phi-T GmbH
Version 20080312
Library compiled with: NB_MAXPATTERN= 1500000   NB_MAXNODE = 100
-----------------------------------
found 40610 samples to learn from
preprocessing flags/parameters:
  global preprocessing flag: 812
  individual preprocessing:
now perform preprocessing
*** called with option 12 ***
*** This will do for you:                              ***
***   input variable equalisation                      ***
***   to Gaussian distribution with mean=0 and sigma=1 ***
***   Then variables are decorrelated                  ***
************************************
Warning: found 985 negative weights.
Signal fraction: 62.3161545 %
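---------------------------------------------------
A conceptual sketch of the two transformations preprocessing option 12 announces
above: each input is equalised (flattened via its empirical CDF) and mapped to a
standard Gaussian, and the transformed inputs are then decorrelated. This is a
generic re-implementation for illustration, not the NeuroBayes code; event weights,
including the 985 negative ones it warns about, are ignored here.

import numpy as np
from scipy.stats import norm

def equalise_to_gauss(x):
    """Map one variable to an approximately standard normal distribution."""
    ranks = np.argsort(np.argsort(x))
    u = (ranks + 0.5) / len(x)            # flat in (0, 1)
    return norm.ppf(u)                    # Gaussian with mean 0, sigma 1

def decorrelate(X):
    """Rotate and rescale the columns of X so their covariance is the identity."""
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    return (X - X.mean(axis=0)) @ evecs / np.sqrt(evals)

# X_raw: (n_events, 13) array of the registered inputs
# X_pre = decorrelate(np.column_stack([equalise_to_gauss(c) for c in X_raw.T]))
---------------------------------------------------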
------------------------------
Transdef: Tabs for variables 1-14 (bin boundaries used by the equalisation step;
variable 1 is the training target, variables 2-14 are the 13 inputs in the order
they were added). On the order of 100 boundary values are tabulated per variable,
not reproduced here; each table spans the following range:
  variable  1 (Target)            :  -1      ...   1
  variable  2 (Ht)                :  56.08   ... 679.51
  variable  3 (LepAPt)            :  20.00   ... 162.43
  variable  4 (LepBPt)            :  10.00   ...  68.42
  variable  5 (MetSigLeptonsJets) :   1.55   ...  15.02
  variable  6 (MetSpec)           :  25.00   ... 218.60
  variable  7 (SumEtLeptonsJets)  :  30.12   ... 395.76
  variable  8 (VSumJetLeptonsPt)  :   4.83   ... 289.32
  variable  9 (addEt)             :  56.08   ... 382.73
  variable 10 (dPhiLepSumMet)     :   0.0051 ...   3.14
  variable 11 (dPhiLeptons)       :   0.0001 ...   1.09
  variable 12 (dRLeptons)         :   0.20   ...   1.12
  variable 13 (lep1_E)            :  20.05   ... 199.64
  variable 14 (lep2_E)            :  10.00   ...  88.13
COVARIANCE MATRIX (IN PERCENT)
     0    1.0    2.0    3.0    4.0    5.0    6.0    7.0    8.0    9.0   10.0   11.0   12.0   13.0   14.0
     0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0
     1  100.0   35.9   16.4   15.4   11.1   25.8   31.7   19.0   29.3  -19.9   -6.7  -10.3   13.4   11.9
     2   35.9  100.0   57.3   45.5   21.4   62.8   94.1   62.2   86.8  -46.7  -20.9  -35.8   52.6   39.5
     3   16.4   57.3  100.0   24.5   -1.3   25.5   59.6   42.9   66.0   -8.9  -22.5  -41.0   91.3   19.8
     4   15.4   45.5   24.5  100.0    5.0   24.7   45.4   34.9   52.9   -4.5  -25.8  -46.8   23.2   90.3
     5   11.1   21.4   -1.3    5.0  100.0   82.9   -8.7   51.6   52.4   34.3   -1.8    0.2   -1.3    4.5
     6   25.8   62.8   25.5   24.7   82.9  100.0   37.8   72.8   79.7    3.0  -11.0  -16.4   23.2   21.4
     7   31.7   94.1   59.6   45.4   -8.7   37.8  100.0   51.4   71.8  -56.4  -21.0  -37.3   54.7   39.4
     8   19.0   62.2   42.9   34.9   51.6   72.8   51.4  100.0   76.9    0.7  -18.4  -29.4   39.4   31.0
     9   29.3   86.8   66.0   52.9   52.4   79.7   71.8   76.9  100.0  -12.7  -23.4  -40.9   60.8   46.4
    10  -19.9  -46.7   -8.9   -4.5   34.3    3.0  -56.4    0.7  -12.7  100.0    2.7    4.6   -7.4   -3.2
    11   -6.7  -20.9  -22.5  -25.8   -1.8  -11.0  -21.0  -18.4  -23.4    2.7  100.0   56.0  -19.7  -25.2
    12  -10.3  -35.8  -41.0  -46.8    0.2  -16.4  -37.3  -29.4  -40.9    4.6   56.0  100.0  -37.9  -39.4
    13   13.4   52.6   91.3   23.2   -1.3   23.2   54.7   39.4   60.8   -7.4  -19.7  -37.9  100.0   27.2
    14   11.9   39.5   19.8   90.3    4.5   21.4   39.4   31.0   46.4   -3.2  -25.2  -39.4   27.2  100.0
TOTAL CORRELATION TO TARGET (diagonal)  75.3607543
TOTAL CORRELATION OF ALL VARIABLES      39.0515903
ROUND  1: MAX CORR ( 39.0508645) AFTER KILLING INPUT VARIABLE  8   CONTR 0.238094478
ROUND  2: MAX CORR ( 39.0498594) AFTER KILLING INPUT VARIABLE 11   CONTR 0.280174761
ROUND  3: MAX CORR ( 39.0425686) AFTER KILLING INPUT VARIABLE 12   CONTR 0.754554838
ROUND  4: MAX CORR ( 39.0048431) AFTER KILLING INPUT VARIABLE 14   CONTR 1.71592021
ROUND  5: MAX CORR ( 38.8829441) AFTER KILLING INPUT VARIABLE  5   CONTR 3.08130574
ROUND  6: MAX CORR ( 38.7459166) AFTER KILLING INPUT VARIABLE 13   CONTR 3.26148515
ROUND  7: MAX CORR ( 38.5530896) AFTER KILLING INPUT VARIABLE 10   CONTR 3.86074325
ROUND  8: MAX CORR ( 38.1074525) AFTER KILLING INPUT VARIABLE  6   CONTR 5.84489303
ROUND  9: MAX CORR ( 37.767078)  AFTER KILLING INPUT VARIABLE  3   CONTR 5.08190494
ROUND 10: MAX CORR ( 37.5742321) AFTER KILLING INPUT VARIABLE  4   CONTR 3.81172706
ROUND 11: MAX CORR ( 36.4609031) AFTER KILLING INPUT VARIABLE  9   CONTR 9.07884696
ROUND 12: MAX CORR ( 35.9215109) AFTER KILLING INPUT VARIABLE  7   CONTR 6.24840069
LAST REMAINING VARIABLE: 2
total correlation to target: 39.0515903 %
total significance: 35.373359 sigma
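---------------------------------------------------
The ROUND lines above are a greedy backward elimination: in each round the input
whose removal costs the least total correlation to the target is dropped, and that
cost is quoted as CONTR. A sketch of the procedure, assuming "total correlation"
means the multiple correlation coefficient computed from the matrix above (the
exact NeuroBayes definition is not spelled out in the log).

import numpy as np

def total_corr(C, keep, target=0):
    """Multiple correlation of the target (row 0 of C) with the variables in keep."""
    keep = list(keep)
    c_ts = C[target, keep]                     # target-to-input correlations
    C_ss = C[np.ix_(keep, keep)]               # input-to-input correlations
    return float(np.sqrt(c_ts @ np.linalg.solve(C_ss, c_ts)))

def rank_by_elimination(C):
    """Return the kill order [(variable, lost correlation), ...] and the survivor."""
    keep = list(range(1, C.shape[0]))          # index 0 is the target
    order = []
    while len(keep) > 1:
        base = total_corr(C, keep)
        losses = {v: base - total_corr(C, [k for k in keep if k != v])
                  for v in keep}
        victim = min(losses, key=losses.get)   # cheapest variable to drop
        order.append((victim, losses[victim]))
        keep.remove(victim)
    return order, keep[0]
---------------------------------------------------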
correlations of single variables to target:
  variable  2:  35.9215109 %  , in sigma: 32.5380987
  variable  3:  16.3719748 %  , in sigma: 14.8299144
  variable  4:  15.4165645 %  , in sigma: 13.9644933
  variable  5:  11.0838944 %  , in sigma: 10.0399132
  variable  6:  25.8019018 %  , in sigma: 23.3716458
  variable  7:  31.7009624 %  , in sigma: 28.7150796
  variable  8:  18.9594055 %  , in sigma: 17.1736375
  variable  9:  29.3467595 %  , in sigma: 26.5826168
  variable 10: -19.8931709 %  , in sigma: 18.0194525
  variable 11:  -6.69534859 % , in sigma: 6.06472021
  variable 12: -10.2805728 %  , in sigma: 9.31225569
  variable 13:  13.4439458 %  , in sigma: 12.1776736
  variable 14:  11.9141639 %  , in sigma: 10.7919804
variables sorted by significance:
   1 most relevant: variable  2   corr 35.9215126  , in sigma: 32.5381003
   2 most relevant: variable  7   corr  6.24840069 , in sigma:  5.6598699
   3 most relevant: variable  9   corr  9.07884693 , in sigma:  8.22371917
   4 most relevant: variable  4   corr  3.81172705 , in sigma:  3.45270418
   5 most relevant: variable  3   corr  5.08190489 , in sigma:  4.60324521
   6 most relevant: variable  6   corr  5.84489298 , in sigma:  5.29436819
   7 most relevant: variable 10   corr  3.86074328 , in sigma:  3.49710363
   8 most relevant: variable 13   corr  3.2614851  , in sigma:  2.95428899
   9 most relevant: variable  5   corr  3.08130574 , in sigma:  2.79108055
  10 most relevant: variable 14   corr  1.71592021 , in sigma:  1.55429935
  11 most relevant: variable 12   corr  0.754554868 , in sigma: 0.683484078
  12 most relevant: variable 11   corr  0.280174762 , in sigma: 0.253785374
  13 most relevant: variable  8   corr  0.238094479 , in sigma: 0.215668591
global correlations between input variables:
  variable  2: 99.336283 %     variable  3: 95.9894362 %    variable  4: 94.4536653 %
  variable  5: 96.3312048 %    variable  6: 95.48846 %      variable  7: 99.011888 %
  variable  8: 85.4455584 %    variable  9: 98.8680555 %    variable 10: 75.5646537 %
  variable 11: 57.3655505 %    variable 12: 69.9349167 %    variable 13: 93.4512499 %
  variable 14: 92.7834526 %
significance loss when removing single variables:
  variable  2: corr = 11.9992751 %   , sigma = 10.8690751
  variable  3: corr =  7.00405822 %  , sigma =  6.34435279
  variable  4: corr =  6.33328324 %  , sigma =  5.73675745
  variable  5: corr =  3.07178438 %  , sigma =  2.782456
  variable  6: corr =  3.6334453 %   , sigma =  3.29121462
  variable  7: corr =  6.21910414 %  , sigma =  5.63333276
  variable  8: corr =  0.238094478 % , sigma =  0.21566859
  variable  9: corr =  9.9545658 %   , sigma =  9.01695493
  variable 10: corr =  4.25583228 %  , sigma =  3.85497957
  variable 11: corr =  0.289370952 % , sigma =  0.262115383
  variable 12: corr =  0.786701668 % , sigma =  0.712603003
  variable 13: corr =  1.9289077 %   , sigma =  1.74722576
  variable 14: corr =  1.78264609 %  , sigma =  1.61474039
Keep only 6 most significant input variables
-------------------------------------
Teacher: actual network topology:
  Nodes(1) = 7   Nodes(2) = 15   Nodes(3) = 1
-------------------------------------
---------------------------------------------------
Iteration : 1
SIGNIFICANCE OF OUTPUTS IN LAYER 1
  RANK 1  NODE 5 --> 12.6205082 sigma  out 15 active outputs
  RANK 2  NODE 3 --> 10.1514034 sigma  out 15 active outputs
  RANK 3  NODE 6 --> 9.86330223 sigma  out 15 active outputs
  RANK 4  NODE 2 --> 8.97360325 sigma  out 15 active outputs
  RANK 5  NODE 7 --> 8.92010975 sigma  out 15 active outputs
  RANK 6  NODE 1 --> 8.71567154 sigma  out 15 active outputs
  RANK 7  NODE 4 --> 7.31970263 sigma  out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
  RANK 1  NODE 3 --> 13.2094622 sigma in 7act. ( 13.6534128 sig out 1act.)
  RANK 2  NODE 2 --> 10.1009903 sigma in 7act. ( 10.8106928 sig out 1act.)
RANK 3 NODE 9 --> 8.96173382 sigma in 7act. ( 9.50099468 sig out 1act.) RANK 4 NODE 4 --> 8.23788357 sigma in 7act. ( 7.59608889 sig out 1act.) RANK 5 NODE 15 --> 7.19235468 sigma in 7act. ( 7.51569462 sig out 1act.) RANK 6 NODE 13 --> 6.43458271 sigma in 7act. ( 6.91707563 sig out 1act.) RANK 7 NODE 7 --> 5.9887104 sigma in 7act. ( 6.17441702 sig out 1act.) RANK 8 NODE 14 --> 5.2043786 sigma in 7act. ( 5.31947517 sig out 1act.) RANK 9 NODE 6 --> 4.87198114 sigma in 7act. ( 5.12789011 sig out 1act.) RANK 10 NODE 8 --> 3.91356301 sigma in 7act. ( 4.3511014 sig out 1act.) RANK 11 NODE 5 --> 3.15012312 sigma in 7act. ( 3.27506351 sig out 1act.) RANK 12 NODE 10 --> 3.05952311 sigma in 7act. ( 3.23450351 sig out 1act.) RANK 13 NODE 12 --> 2.46548343 sigma in 7act. ( 2.33379769 sig out 1act.) RANK 14 NODE 1 --> 1.43414152 sigma in 7act. ( 1.42122018 sig out 1act.) RANK 15 NODE 11 --> 1.41317058 sigma in 7act. ( 0.162610039 sig out 1act.) sorted by output significance RANK 1 NODE 3 --> 13.6534128 sigma out 1act.( 13.2094622 sig in 7act.) RANK 2 NODE 2 --> 10.8106928 sigma out 1act.( 10.1009903 sig in 7act.) RANK 3 NODE 9 --> 9.50099468 sigma out 1act.( 8.96173382 sig in 7act.) RANK 4 NODE 4 --> 7.59608889 sigma out 1act.( 8.23788357 sig in 7act.) RANK 5 NODE 15 --> 7.51569462 sigma out 1act.( 7.19235468 sig in 7act.) RANK 6 NODE 13 --> 6.91707563 sigma out 1act.( 6.43458271 sig in 7act.) RANK 7 NODE 7 --> 6.17441702 sigma out 1act.( 5.9887104 sig in 7act.) RANK 8 NODE 14 --> 5.31947517 sigma out 1act.( 5.2043786 sig in 7act.) RANK 9 NODE 6 --> 5.12789011 sigma out 1act.( 4.87198114 sig in 7act.) RANK 10 NODE 8 --> 4.3511014 sigma out 1act.( 3.91356301 sig in 7act.) RANK 11 NODE 5 --> 3.27506351 sigma out 1act.( 3.15012312 sig in 7act.) RANK 12 NODE 10 --> 3.23450351 sigma out 1act.( 3.05952311 sig in 7act.) RANK 13 NODE 12 --> 2.33379769 sigma out 1act.( 2.46548343 sig in 7act.) RANK 14 NODE 1 --> 1.42122018 sigma out 1act.( 1.43414152 sig in 7act.) RANK 15 NODE 11 --> 0.162610039 sigma out 1act.( 1.41317058 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 26.380228 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 5 --> 14.3710299 sigma out 15 active outputs RANK 2 NODE 3 --> 13.6803427 sigma out 15 active outputs RANK 3 NODE 6 --> 12.5940981 sigma out 15 active outputs RANK 4 NODE 1 --> 11.0647821 sigma out 15 active outputs RANK 5 NODE 7 --> 10.7345381 sigma out 15 active outputs RANK 6 NODE 2 --> 10.202733 sigma out 15 active outputs RANK 7 NODE 4 --> 8.29469204 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 3 --> 13.3325529 sigma in 7act. ( 13.4963064 sig out 1act.) RANK 2 NODE 2 --> 11.127532 sigma in 7act. ( 10.7487907 sig out 1act.) RANK 3 NODE 9 --> 10.529315 sigma in 7act. ( 10.0485506 sig out 1act.) RANK 4 NODE 4 --> 10.1743841 sigma in 7act. ( 8.20974541 sig out 1act.) RANK 5 NODE 7 --> 8.5623045 sigma in 7act. ( 6.55700445 sig out 1act.) RANK 6 NODE 13 --> 8.44417191 sigma in 7act. ( 6.545506 sig out 1act.) RANK 7 NODE 15 --> 8.42742825 sigma in 7act. ( 7.23278999 sig out 1act.) RANK 8 NODE 6 --> 7.99772596 sigma in 7act. ( 4.68542814 sig out 1act.) RANK 9 NODE 8 --> 6.57613897 sigma in 7act. ( 4.55004644 sig out 1act.) RANK 10 NODE 5 --> 5.53373051 sigma in 7act. ( 3.47498894 sig out 1act.) RANK 11 NODE 14 --> 5.45591211 sigma in 7act. ( 5.08065605 sig out 1act.) RANK 12 NODE 1 --> 4.68625927 sigma in 7act. ( 1.79039526 sig out 1act.) RANK 13 NODE 11 --> 4.2588973 sigma in 7act. 
( 0.494234025 sig out 1act.) RANK 14 NODE 10 --> 3.46114945 sigma in 7act. ( 3.10690308 sig out 1act.) RANK 15 NODE 12 --> 3.40112185 sigma in 7act. ( 2.0737288 sig out 1act.) sorted by output significance RANK 1 NODE 3 --> 13.4963064 sigma out 1act.( 13.3325529 sig in 7act.) RANK 2 NODE 2 --> 10.7487907 sigma out 1act.( 11.127532 sig in 7act.) RANK 3 NODE 9 --> 10.0485506 sigma out 1act.( 10.529315 sig in 7act.) RANK 4 NODE 4 --> 8.20974541 sigma out 1act.( 10.1743841 sig in 7act.) RANK 5 NODE 15 --> 7.23278999 sigma out 1act.( 8.42742825 sig in 7act.) RANK 6 NODE 7 --> 6.55700445 sigma out 1act.( 8.5623045 sig in 7act.) RANK 7 NODE 13 --> 6.545506 sigma out 1act.( 8.44417191 sig in 7act.) RANK 8 NODE 14 --> 5.08065605 sigma out 1act.( 5.45591211 sig in 7act.) RANK 9 NODE 6 --> 4.68542814 sigma out 1act.( 7.99772596 sig in 7act.) RANK 10 NODE 8 --> 4.55004644 sigma out 1act.( 6.57613897 sig in 7act.) RANK 11 NODE 5 --> 3.47498894 sigma out 1act.( 5.53373051 sig in 7act.) RANK 12 NODE 10 --> 3.10690308 sigma out 1act.( 3.46114945 sig in 7act.) RANK 13 NODE 12 --> 2.0737288 sigma out 1act.( 3.40112185 sig in 7act.) RANK 14 NODE 1 --> 1.79039526 sigma out 1act.( 4.68625927 sig in 7act.) RANK 15 NODE 11 --> 0.494234025 sigma out 1act.( 4.2588973 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 26.4982815 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.416323513 *** contribution from regularisation: 0.0104719196 *** contribution from error: -0.426795423 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.435749352 *** contribution from regularisation: 0.0064123678 *** contribution from error: -0.442161709 *********************************************** -----------------> Test sample ENTER BFGS code START -8849.92123 -0.607049644 0.391548038 EXIT FROM BFGS code FG_START 0. -0.607049644 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.441451222 *** contribution from regularisation: 0.00527369604 *** contribution from error: -0.446724921 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -8963.66704 -0.607049644 -38.4823341 EXIT FROM BFGS code FG_LNSRCH 0. -0.675234079 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.445902884 *** contribution from regularisation: 0.00749391504 *** contribution from error: -0.453396797 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9054.0582 -0.675234079 27.817997 EXIT FROM BFGS code NEW_X -9054.0582 -0.675234079 27.817997 ENTER BFGS code NEW_X -9054.0582 -0.675234079 27.817997 EXIT FROM BFGS code FG_LNSRCH 0. -0.651645541 0. 
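---------------------------------------------------
Each "Learn Path" block splits the minimised quantity into an error term (the
entropy loss chosen in the setup) and a regularisation contribution, and the
ENTER/EXIT lines trace the BFGS driver. A toy sketch of such an objective and its
minimisation, assuming targets coded as +-1, a simple weight-decay regulariser and
SciPy's BFGS; the single-layer model, signs and normalisation are illustrative and
differ from the NeuroBayes printout.

import numpy as np
from scipy.optimize import minimize

def entropy_objective(theta, X, t, w, lam=1e-3):
    o = np.tanh(X @ theta)                          # network output in (-1, 1)
    err = -np.sum(w * np.log(0.5 * (1.0 + t * o) + 1e-12)) / np.sum(w)
    reg = lam * np.sum(theta ** 2)                  # regularisation contribution
    return err + reg                                # reported above as error + regularisation

# X: (n, d) preprocessed inputs, t: targets in {-1, +1}, w: event weights
# res = minimize(entropy_objective, np.zeros(X.shape[1]), args=(X, t, w),
#                method="BFGS", options={"maxiter": 250})
---------------------------------------------------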
--------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.451996773 *** contribution from regularisation: 0.00659431191 *** contribution from error: -0.458591074 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9177.79474 -0.651645541 7.1433301 EXIT FROM BFGS code NEW_X -9177.79474 -0.651645541 7.1433301 ENTER BFGS code NEW_X -9177.79474 -0.651645541 7.1433301 EXIT FROM BFGS code FG_LNSRCH 0. -0.646451712 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.452289432 *** contribution from regularisation: 0.00658574048 *** contribution from error: -0.458875179 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9183.73673 -0.646451712 9.13123703 EXIT FROM BFGS code NEW_X -9183.73673 -0.646451712 9.13123703 ENTER BFGS code NEW_X -9183.73673 -0.646451712 9.13123703 EXIT FROM BFGS code FG_LNSRCH 0. -0.602030635 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.452726066 *** contribution from regularisation: 0.00625179475 *** contribution from error: -0.458977848 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9192.60253 -0.602030635 19.8165493 EXIT FROM BFGS code NEW_X -9192.60253 -0.602030635 19.8165493 ENTER BFGS code NEW_X -9192.60253 -0.602030635 19.8165493 EXIT FROM BFGS code FG_LNSRCH 0. -0.572077096 0. --------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.453558534 *** contribution from regularisation: 0.00598347979 *** contribution from error: -0.459542006 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9209.5058 -0.572077096 11.2429523 EXIT FROM BFGS code NEW_X -9209.5058 -0.572077096 11.2429523 ENTER BFGS code NEW_X -9209.5058 -0.572077096 11.2429523 EXIT FROM BFGS code FG_LNSRCH 0. -0.432045072 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.455842853 *** contribution from regularisation: 0.00515100779 *** contribution from error: -0.460993856 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9255.88924 -0.432045072 1.3444587 EXIT FROM BFGS code NEW_X -9255.88924 -0.432045072 1.3444587 ENTER BFGS code NEW_X -9255.88924 -0.432045072 1.3444587 EXIT FROM BFGS code FG_LNSRCH 0. -0.139041692 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 3 --> 15.964963 sigma out 15 active outputs RANK 2 NODE 1 --> 14.9112434 sigma out 15 active outputs RANK 3 NODE 6 --> 14.5740633 sigma out 15 active outputs RANK 4 NODE 7 --> 11.5554543 sigma out 15 active outputs RANK 5 NODE 2 --> 8.88186169 sigma out 15 active outputs RANK 6 NODE 4 --> 7.11271954 sigma out 15 active outputs RANK 7 NODE 5 --> 3.14925122 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 19.4257431 sigma in 7act. ( 18.1836185 sig out 1act.) RANK 2 NODE 3 --> 19.339901 sigma in 7act. ( 19.4902306 sig out 1act.) 
RANK 3 NODE 11 --> 10.6543884 sigma in 7act. ( 11.8931646 sig out 1act.) RANK 4 NODE 15 --> 4.62158537 sigma in 7act. ( 3.51087046 sig out 1act.) RANK 5 NODE 13 --> 4.16354752 sigma in 7act. ( 3.96713543 sig out 1act.) RANK 6 NODE 9 --> 3.5997889 sigma in 7act. ( 2.26788211 sig out 1act.) RANK 7 NODE 8 --> 3.5248239 sigma in 7act. ( 2.92441559 sig out 1act.) RANK 8 NODE 4 --> 2.87145853 sigma in 7act. ( 1.18629491 sig out 1act.) RANK 9 NODE 2 --> 2.75928879 sigma in 7act. ( 1.33532238 sig out 1act.) RANK 10 NODE 1 --> 2.3963809 sigma in 7act. ( 2.17808104 sig out 1act.) RANK 11 NODE 6 --> 2.20563054 sigma in 7act. ( 1.6760602 sig out 1act.) RANK 12 NODE 5 --> 1.62329924 sigma in 7act. ( 0.705197394 sig out 1act.) RANK 13 NODE 14 --> 1.26898253 sigma in 7act. ( 0.407652497 sig out 1act.) RANK 14 NODE 12 --> 1.23658288 sigma in 7act. ( 1.31362653 sig out 1act.) RANK 15 NODE 10 --> 1.15232491 sigma in 7act. ( 0.324671507 sig out 1act.) sorted by output significance RANK 1 NODE 3 --> 19.4902306 sigma out 1act.( 19.339901 sig in 7act.) RANK 2 NODE 7 --> 18.1836185 sigma out 1act.( 19.4257431 sig in 7act.) RANK 3 NODE 11 --> 11.8931646 sigma out 1act.( 10.6543884 sig in 7act.) RANK 4 NODE 13 --> 3.96713543 sigma out 1act.( 4.16354752 sig in 7act.) RANK 5 NODE 15 --> 3.51087046 sigma out 1act.( 4.62158537 sig in 7act.) RANK 6 NODE 8 --> 2.92441559 sigma out 1act.( 3.5248239 sig in 7act.) RANK 7 NODE 9 --> 2.26788211 sigma out 1act.( 3.5997889 sig in 7act.) RANK 8 NODE 1 --> 2.17808104 sigma out 1act.( 2.3963809 sig in 7act.) RANK 9 NODE 6 --> 1.6760602 sigma out 1act.( 2.20563054 sig in 7act.) RANK 10 NODE 2 --> 1.33532238 sigma out 1act.( 2.75928879 sig in 7act.) RANK 11 NODE 12 --> 1.31362653 sigma out 1act.( 1.23658288 sig in 7act.) RANK 12 NODE 4 --> 1.18629491 sigma out 1act.( 2.87145853 sig in 7act.) RANK 13 NODE 5 --> 0.705197394 sigma out 1act.( 1.62329924 sig in 7act.) RANK 14 NODE 14 --> 0.407652497 sigma out 1act.( 1.26898253 sig in 7act.) RANK 15 NODE 10 --> 0.324671507 sigma out 1act.( 1.15232491 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 30.1157494 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.457495868 *** contribution from regularisation: 0.0058899303 *** contribution from error: -0.463385791 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -9289.4535 -0.139041692 -0.556800842 EXIT FROM BFGS code NEW_X -9289.4535 -0.139041692 -0.556800842 ENTER BFGS code NEW_X -9289.4535 -0.139041692 -0.556800842 EXIT FROM BFGS code FG_LNSRCH 0. -0.147002712 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.458652377 *** contribution from regularisation: 0.0055934852 *** contribution from error: -0.464245856 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9312.9367 -0.147002712 2.21447539 EXIT FROM BFGS code NEW_X -9312.9367 -0.147002712 2.21447539 ENTER BFGS code NEW_X -9312.9367 -0.147002712 2.21447539 EXIT FROM BFGS code FG_LNSRCH 0. -0.130827665 0. 
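---------------------------------------------------
Iteration 10 above (and iteration 20 further below) writes the current network to
"rescue.nb" so an interrupted job can be restarted. A sketch of such periodic
checkpointing; saving a numpy .npz file instead of the proprietary expertise format
is an assumption for illustration.

import numpy as np

class RescueWriter:
    def __init__(self, every=10, path="rescue.npz"):
        self.every, self.path, self.n = every, path, 0

    def __call__(self, theta):                  # usable as a SciPy minimize callback
        self.n += 1
        if self.n % self.every == 0:
            np.savez(self.path, theta=theta, iteration=self.n)

# minimize(entropy_objective, theta0, args=(X, t, w), method="BFGS",
#          callback=RescueWriter(), options={"maxiter": 250})
---------------------------------------------------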
--------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.459396541 *** contribution from regularisation: 0.00551201589 *** contribution from error: -0.46490857 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9328.04658 -0.130827665 1.72431958 EXIT FROM BFGS code NEW_X -9328.04658 -0.130827665 1.72431958 ENTER BFGS code NEW_X -9328.04658 -0.130827665 1.72431958 EXIT FROM BFGS code FG_LNSRCH 0. -0.0748651624 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.460065633 *** contribution from regularisation: 0.00556846987 *** contribution from error: -0.465634108 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9341.63295 -0.0748651624 -1.67320502 EXIT FROM BFGS code NEW_X -9341.63295 -0.0748651624 -1.67320502 ENTER BFGS code NEW_X -9341.63295 -0.0748651624 -1.67320502 EXIT FROM BFGS code FG_LNSRCH 0. 0.00369670033 0. --------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.461113781 *** contribution from regularisation: 0.00593101373 *** contribution from error: -0.467044801 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9362.91543 0.00369670033 -3.0082674 EXIT FROM BFGS code NEW_X -9362.91543 0.00369670033 -3.0082674 ENTER BFGS code NEW_X -9362.91543 0.00369670033 -3.0082674 EXIT FROM BFGS code FG_LNSRCH 0. 0.115111984 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.461984575 *** contribution from regularisation: 0.00664955052 *** contribution from error: -0.468634129 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9380.5965 0.115111984 -25.9516201 EXIT FROM BFGS code FG_LNSRCH 0. 0.0558886454 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.462773919 *** contribution from regularisation: 0.0054773353 *** contribution from error: -0.468251258 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9396.62414 0.0558886454 -11.7936554 EXIT FROM BFGS code NEW_X -9396.62414 0.0558886454 -11.7936554 ENTER BFGS code NEW_X -9396.62414 0.0558886454 -11.7936554 EXIT FROM BFGS code FG_LNSRCH 0. 0.0613686368 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.463130504 *** contribution from regularisation: 0.00613965746 *** contribution from error: -0.46927017 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9403.86504 0.0613686368 -1.08434153 EXIT FROM BFGS code NEW_X -9403.86504 0.0613686368 -1.08434153 ENTER BFGS code NEW_X -9403.86504 0.0613686368 -1.08434153 EXIT FROM BFGS code FG_LNSRCH 0. 0.0444318838 0. 
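---------------------------------------------------
Every iteration ends with a "Test sample" evaluation on the held-out patterns
(50 % of the patterns are used for training, as stated in the setup). A sketch of
using that information as an overtraining monitor; the stopping rule itself is
illustrative, the log only shows that the test sample is evaluated each iteration.

def best_iteration(test_losses):
    """Index of the iteration with the lowest held-out loss."""
    return min(range(len(test_losses)), key=test_losses.__getitem__)

def overtraining_flag(test_losses, patience=5):
    """True if the held-out loss has not improved for `patience` iterations."""
    return len(test_losses) - 1 - best_iteration(test_losses) >= patience
---------------------------------------------------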
--------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.463625729 *** contribution from regularisation: 0.00595530402 *** contribution from error: -0.469581038 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9413.92038 0.0444318838 -2.64054775 EXIT FROM BFGS code NEW_X -9413.92038 0.0444318838 -2.64054775 ENTER BFGS code NEW_X -9413.92038 0.0444318838 -2.64054775 EXIT FROM BFGS code FG_LNSRCH 0. -0.0653123781 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.464178145 *** contribution from regularisation: 0.00551954424 *** contribution from error: -0.469697684 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9425.13752 -0.0653123781 -4.28286123 EXIT FROM BFGS code NEW_X -9425.13752 -0.0653123781 -4.28286123 ENTER BFGS code NEW_X -9425.13752 -0.0653123781 -4.28286123 EXIT FROM BFGS code FG_LNSRCH 0. -0.0769804865 0. --------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 29.7006836 sigma out 15 active outputs RANK 2 NODE 3 --> 25.9030056 sigma out 15 active outputs RANK 3 NODE 6 --> 23.3906116 sigma out 15 active outputs RANK 4 NODE 7 --> 21.1818371 sigma out 15 active outputs RANK 5 NODE 2 --> 11.2900867 sigma out 15 active outputs RANK 6 NODE 4 --> 10.4735079 sigma out 15 active outputs RANK 7 NODE 5 --> 4.03472424 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 3 --> 41.3503685 sigma in 7act. ( 40.1448174 sig out 1act.) RANK 2 NODE 7 --> 30.0563889 sigma in 7act. ( 28.4824467 sig out 1act.) RANK 3 NODE 13 --> 8.84784985 sigma in 7act. ( 9.15846348 sig out 1act.) RANK 4 NODE 11 --> 7.60009956 sigma in 7act. ( 6.73686886 sig out 1act.) RANK 5 NODE 15 --> 4.48809528 sigma in 7act. ( 4.16472912 sig out 1act.) RANK 6 NODE 1 --> 3.93867612 sigma in 7act. ( 3.79598188 sig out 1act.) RANK 7 NODE 12 --> 2.35065651 sigma in 7act. ( 2.03948498 sig out 1act.) RANK 8 NODE 6 --> 2.07083201 sigma in 7act. ( 1.62408721 sig out 1act.) RANK 9 NODE 2 --> 1.50451934 sigma in 7act. ( 0.93020153 sig out 1act.) RANK 10 NODE 9 --> 1.36919272 sigma in 7act. ( 0.701392174 sig out 1act.) RANK 11 NODE 5 --> 1.23032892 sigma in 7act. ( 1.26248288 sig out 1act.) RANK 12 NODE 8 --> 0.888652265 sigma in 7act. ( 0.0771101192 sig out 1act.) RANK 13 NODE 4 --> 0.720929444 sigma in 7act. ( 0.235569537 sig out 1act.) RANK 14 NODE 10 --> 0.647719026 sigma in 7act. ( 0.38647452 sig out 1act.) RANK 15 NODE 14 --> 0.43598175 sigma in 7act. ( 0.116170697 sig out 1act.) sorted by output significance RANK 1 NODE 3 --> 40.1448174 sigma out 1act.( 41.3503685 sig in 7act.) RANK 2 NODE 7 --> 28.4824467 sigma out 1act.( 30.0563889 sig in 7act.) RANK 3 NODE 13 --> 9.15846348 sigma out 1act.( 8.84784985 sig in 7act.) RANK 4 NODE 11 --> 6.73686886 sigma out 1act.( 7.60009956 sig in 7act.) RANK 5 NODE 15 --> 4.16472912 sigma out 1act.( 4.48809528 sig in 7act.) RANK 6 NODE 1 --> 3.79598188 sigma out 1act.( 3.93867612 sig in 7act.) RANK 7 NODE 12 --> 2.03948498 sigma out 1act.( 2.35065651 sig in 7act.) RANK 8 NODE 6 --> 1.62408721 sigma out 1act.( 2.07083201 sig in 7act.) RANK 9 NODE 5 --> 1.26248288 sigma out 1act.( 1.23032892 sig in 7act.) 
RANK 10 NODE 2 --> 0.93020153 sigma out 1act.( 1.50451934 sig in 7act.) RANK 11 NODE 9 --> 0.701392174 sigma out 1act.( 1.36919272 sig in 7act.) RANK 12 NODE 10 --> 0.38647452 sigma out 1act.( 0.647719026 sig in 7act.) RANK 13 NODE 4 --> 0.235569537 sigma out 1act.( 0.720929444 sig in 7act.) RANK 14 NODE 14 --> 0.116170697 sigma out 1act.( 0.43598175 sig in 7act.) RANK 15 NODE 8 --> 0.0771101192 sigma out 1act.( 0.888652265 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 50.9298134 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.464025497 *** contribution from regularisation: 0.0058725276 *** contribution from error: -0.469898015 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -9422.03792 -0.0769804865 0.775581002 EXIT FROM BFGS code FG_LNSRCH 0. -0.0681419745 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.463921875 *** contribution from regularisation: 0.00590591505 *** contribution from error: -0.469827801 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9419.93383 -0.0681419745 -3.04849315 EXIT FROM BFGS code FG_LNSRCH 0. -0.0655042157 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.463971287 *** contribution from regularisation: 0.00573644182 *** contribution from error: -0.469707727 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9420.9367 -0.0655042157 -4.16109133 EXIT FROM BFGS code FG_LNSRCH 0. -0.0653138682 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.463818938 *** contribution from regularisation: 0.00587887596 *** contribution from error: -0.469697803 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9417.84325 -0.0653138682 -4.2373991 EXIT FROM BFGS code FG_LNSRCH 0. -0.0653123781 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.463975996 *** contribution from regularisation: 0.00572170783 *** contribution from error: -0.469697714 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9421.03259 -0.0653123781 -4.23236656 EXIT FROM BFGS code FG_LNSRCH 0. -0.0653123781 0. --------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.463897973 *** contribution from regularisation: 0.00579973683 *** contribution from error: -0.469697714 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9419.44821 -0.0653123781 -4.22727633 EXIT FROM BFGS code FG_LNSRCH 0. -0.0653123781 0. 
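---------------------------------------------------
The loss in iterations 21 through 27 changes only in the sixth decimal place, and
the BFGS driver exits with CONVERGENC in the next block (after which the log jumps
to the final printout labelled iteration 250). A sketch of such a stopping rule,
assuming a simple relative-change criterion; the actual BFGS tolerances inside
NeuroBayes are not shown in the log.

def converged(loss_history, rel_tol=1e-6, window=3):
    """True once the loss has changed by less than rel_tol over `window` steps."""
    if len(loss_history) <= window:
        return False
    old, new = loss_history[-window - 1], loss_history[-1]
    return abs(new - old) <= rel_tol * max(abs(old), 1.0)
---------------------------------------------------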
--------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.463914067 *** contribution from regularisation: 0.00578362867 *** contribution from error: -0.469697684 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9419.77529 -0.0653123781 -4.22291088 EXIT FROM BFGS code FG_LNSRCH 0. -0.0653123781 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.463851035 *** contribution from regularisation: 0.00584666152 *** contribution from error: -0.469697684 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9418.49541 -0.0653123781 -4.21808815 EXIT FROM BFGS code NEW_X -9418.49541 -0.0653123781 -4.21808815 ENTER BFGS code NEW_X -9418.49541 -0.0653123781 -4.21808815 EXIT FROM BFGS code CONVERGENC -9418.49541 -0.0653123781 -4.21808815 --------------------------------------------------- Iteration : 250 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 45.3198586 sigma out 15 active outputs RANK 2 NODE 3 --> 38.9586487 sigma out 15 active outputs RANK 3 NODE 6 --> 35.2343674 sigma out 15 active outputs RANK 4 NODE 7 --> 33.2541771 sigma out 15 active outputs RANK 5 NODE 2 --> 17.2174931 sigma out 15 active outputs RANK 6 NODE 4 --> 15.2363386 sigma out 15 active outputs RANK 7 NODE 5 --> 6.2844348 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 3 --> 62.8553658 sigma in 7act. ( 61.6056824 sig out 1act.) RANK 2 NODE 7 --> 46.519722 sigma in 7act. ( 42.5767632 sig out 1act.) RANK 3 NODE 13 --> 12.524848 sigma in 7act. ( 13.6064768 sig out 1act.) RANK 4 NODE 11 --> 11.1257906 sigma in 7act. ( 10.908679 sig out 1act.) RANK 5 NODE 15 --> 5.94796228 sigma in 7act. ( 6.13388491 sig out 1act.) RANK 6 NODE 1 --> 5.38276625 sigma in 7act. ( 5.52160454 sig out 1act.) RANK 7 NODE 12 --> 3.05615783 sigma in 7act. ( 2.90125322 sig out 1act.) RANK 8 NODE 6 --> 2.62886333 sigma in 7act. ( 2.33132577 sig out 1act.) RANK 9 NODE 2 --> 1.85719216 sigma in 7act. ( 1.41357374 sig out 1act.) RANK 10 NODE 5 --> 1.6116972 sigma in 7act. ( 1.64186633 sig out 1act.) RANK 11 NODE 9 --> 1.56050003 sigma in 7act. ( 0.950462222 sig out 1act.) RANK 12 NODE 8 --> 0.971912622 sigma in 7act. ( 0.0827737823 sig out 1act.) RANK 13 NODE 4 --> 0.867604613 sigma in 7act. ( 0.404638112 sig out 1act.) RANK 14 NODE 10 --> 0.71331656 sigma in 7act. ( 0.474846423 sig out 1act.) RANK 15 NODE 14 --> 0.470273316 sigma in 7act. ( 0.136786088 sig out 1act.) sorted by output significance RANK 1 NODE 3 --> 61.6056824 sigma out 1act.( 62.8553658 sig in 7act.) RANK 2 NODE 7 --> 42.5767632 sigma out 1act.( 46.519722 sig in 7act.) RANK 3 NODE 13 --> 13.6064768 sigma out 1act.( 12.524848 sig in 7act.) RANK 4 NODE 11 --> 10.908679 sigma out 1act.( 11.1257906 sig in 7act.) RANK 5 NODE 15 --> 6.13388491 sigma out 1act.( 5.94796228 sig in 7act.) RANK 6 NODE 1 --> 5.52160454 sigma out 1act.( 5.38276625 sig in 7act.) RANK 7 NODE 12 --> 2.90125322 sigma out 1act.( 3.05615783 sig in 7act.) RANK 8 NODE 6 --> 2.33132577 sigma out 1act.( 2.62886333 sig in 7act.) RANK 9 NODE 5 --> 1.64186633 sigma out 1act.( 1.6116972 sig in 7act.) RANK 10 NODE 2 --> 1.41357374 sigma out 1act.( 1.85719216 sig in 7act.) RANK 11 NODE 9 --> 0.950462222 sigma out 1act.( 1.56050003 sig in 7act.) 
RANK 12 NODE 10 --> 0.474846423 sigma out 1act.( 0.71331656 sig in 7act.) RANK 13 NODE 4 --> 0.404638112 sigma out 1act.( 0.867604613 sig in 7act.) RANK 14 NODE 14 --> 0.136786088 sigma out 1act.( 0.470273316 sig in 7act.) RANK 15 NODE 8 --> 0.0827737823 sigma out 1act.( 0.971912622 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 77.4606628 sigma in 15 active inputs *********************************************** *** Learn Path 250 *** loss function: -0.463887185 *** contribution from regularisation: 0.00581052154 *** contribution from error: -0.469697714 *********************************************** -----------------> Test sample END OF LEARNING , export EXPERTISE SAVING EXPERTISE TO expert.nb NB_AHISTOUT: storage space 25167 Closing output file done
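---------------------------------------------------
After "SAVING EXPERTISE TO expert.nb" the trained network is applied to new events
with the NeuroBayes expert, which is not part of this log. The mapping below only
illustrates the common convention, assumed here, of reading an output o in (-1, 1)
as a signal probability for the signal fraction used in training.

import numpy as np

def output_to_probability(o):
    """Map a network output in (-1, 1) to a probability-like score in (0, 1)."""
    return 0.5 * (1.0 + np.asarray(o))
---------------------------------------------------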