NNInput NNInputs_175.root
Options for steering
  Constraint : lep1_E<400&&lep2_E<400&&
  HiLoSbString : SB
  SbString : Target
  WeightString : TrainWeight
  EqualizeSB : 0
  EvaluateVariables : 0
  SetNBProcessingDefault : 1
  UseNeuroBayes : 1
  WeightEvents : 1
  NBTreePrepEvPrint : 1
  NBTreePrepReportInterval : 10000
  NB_Iter : 250
  NBZero999Tol : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree
NNAna::CopyTree: entries= 16441 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 3610 nbkg = 12831
Bkg Entries: 12831
Sig Entries: 3610
Chosen entries: 3610
Warning: entries low (below 6000)
Signal fraction: 1
Background fraction: 0.28135
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 12831
Actual Signal Entries: 3610
Entries to split: 3610
Test with : 1805
Train with : 1805
*********************************************
* This product is licensed for educational  *
* and scientific use only. Commercial use   *
* is prohibited !                           *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
NBTreeReportInterval = 10000
NBTreePrepEvPrint = 1
Start Set Individual Variable Preprocessing
Not touching individual preprocessing for Ht ( 0 ) in Neurobayes
Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes
Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes
Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes
Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes
Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes
Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes
Not touching individual preprocessing for addEt ( 7 ) in Neurobayes
Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes
Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes
Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes
Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes
Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes
End Set Individual Variable Preprocessing
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 3610 for Signal
Prepared event 0 for Signal with 3610 events
====Entry 0
 Variable Ht : 241.352
 Variable LepAPt : 26.6709
 Variable LepBPt : 20.757
 Variable MetSigLeptonsJets : 8.7004
 Variable MetSpec : 102.516
 Variable SumEtLeptonsJets : 138.836
 Variable VSumJetLeptonsPt : 95.4385
 Variable addEt : 149.944
 Variable dPhiLepSumMet : 1.91756
 Variable dPhiLeptons : 0.237715
 Variable dRLeptons : 0.435475
 Variable lep1_E : 34.2866
 Variable lep2_E : 22.2255
===Show Start
======> EVENT:0
 DEtaJ1J2 = 0
 DEtaJ1Lep1 = 0
 DEtaJ1Lep2 = 0
 DEtaJ2Lep1 = 0
 DEtaJ2Lep2 = 0
 DPhiJ1J2 = 0
 DPhiJ1Lep1 = 0
 DPhiJ1Lep2 = 0
 DPhiJ2Lep1 = 0
 DPhiJ2Lep2 = 0
 DRJ1J2 = 0
 DRJ1Lep1 = 0
 DRJ1Lep2 = 0
 DRJ2Lep1 = 0
 DRJ2Lep2 = 0
 DeltaRJet12 = 0
 File = 2175
 Ht = 241.352
 IsMEBase = 0
 LRHWW = 0
 LRWW = 0
 LRWg = 0
 LRWj = 0
 LRZZ = 0
 LepAEt = 26.6709
 LepAPt = 26.6709
 LepBEt = 20.7574
 LepBPt = 20.757
 LessCentralJetEta = 0
 MJ1Lep1 = 0
 MJ1Lep2 = 0
 MJ2Lep1 = 0
 MJ2Lep2 = 0
 NN = 0
 Met = 102.516
 MetDelPhi = 1.78381
 MetSig = 7.13205
 MetSigLeptonsJets = 8.7004
 MetSpec = 102.516
 Mjj = 0
 MostCentralJetEta = 1.70124
 MtllMet = 156.27
 Njets = 1
 SB = 0
 SumEt = 206.61
 SumEtJets = 0
 SumEtLeptonsJets = 138.836
 Target = 1
 TrainWeight = 1
 VSum2JetLeptonsPt = 0
 VSum2JetPt = 0
 VSumJetLeptonsPt = 95.4385
 addEt = 149.944
 dPhiLepSumMet = 1.91756
 dPhiLeptons = 0.237715
 dRLeptons = 0.435475
 diltype = 54
 dimass = 10.2815
 event = 116
 jet1_Et = 91.408
 jet1_eta = 0
 jet2_Et = 0
 jet2_eta = 0
 lep1_E = 34.2866
 lep2_E = 22.2255
 rand = 0.999742
 run = 236503
 weight = 2.5946e-06
===Show End
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 12831 for Background
Prepared event 0 for Background with 12831 events
====Entry 0
 Variable Ht : 61.1273
 Variable LepAPt : 22.7899
 Variable LepBPt : 11.7315
 Variable MetSigLeptonsJets : 4.52817
 Variable MetSpec : 26.6052
 Variable SumEtLeptonsJets : 34.5213
 Variable VSumJetLeptonsPt : 31.9684
 Variable addEt : 61.1273
 Variable dPhiLepSumMet : 2.92616
 Variable dPhiLeptons : 0.819534
 Variable dRLeptons : 0.934366
 Variable lep1_E : 23.1166
 Variable lep2_E : 12.194
===Show Start
======> EVENT:0
 DEtaJ1J2 = 0
 DEtaJ1Lep1 = 0
 DEtaJ1Lep2 = 0
 DEtaJ2Lep1 = 0
 DEtaJ2Lep2 = 0
 DPhiJ1J2 = 0
 DPhiJ1Lep1 = 0
 DPhiJ1Lep2 = 0
 DPhiJ2Lep1 = 0
 DPhiJ2Lep2 = 0
 DRJ1J2 = 0
 DRJ1Lep1 = 0
 DRJ1Lep2 = 0
 DRJ2Lep1 = 0
 DRJ2Lep2 = 0
 DeltaRJet12 = 0
 File = 1
 Ht = 61.1273
 IsMEBase = 0
 LRHWW = 0
 LRWW = 0
 LRWg = 0
 LRWj = 0
 LRZZ = 0
 LepAEt = 22.7899
 LepAPt = 22.7899
 LepBEt = 11.7322
 LepBPt = 11.7315
 LessCentralJetEta = 0
 MJ1Lep1 = 0
 MJ1Lep2 = 0
 MJ2Lep1 = 0
 MJ2Lep2 = 0
 NN = 0
 Met = 26.6052
 MetDelPhi = 2.37814
 MetSig = 2.46258
 MetSigLeptonsJets = 4.52817
 MetSpec = 26.6052
 Mjj = 0
 MostCentralJetEta = 0
 MtllMet = 61.9133
 Njets = 0
 SB = 0
 SumEt = 116.722
 SumEtJets = 0
 SumEtLeptonsJets = 34.5213
 Target = 0
 TrainWeight = 1.96569
 VSum2JetLeptonsPt = 0
 VSum2JetPt = 0
 VSumJetLeptonsPt = 31.9684
 addEt = 61.1273
 dPhiLepSumMet = 2.92616
 dPhiLeptons = 0.819534
 dRLeptons = 0.934366
 diltype = 54
 dimass = 14.9851
 event = 1421455
 jet1_Et = 0
 jet1_eta = 0
 jet2_Et = 0
 jet2_eta = 0
 lep1_E = 23.1166
 lep2_E = 12.194
 rand = 0.999742
 run = 271216
 weight = 0.0400324
===Show End
Prepared event 10000 for Background with 12831 events
Warning: found 395 negative weights.
Phi-T(R) NeuroBayes(R) Teacher
Algorithms by Michael Feindt
Implementation by Phi-T
Project 2001-2003 Copyright Phi-T GmbH
Version 20080312
Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100
-----------------------------------
found 16441 samples to learn from
preprocessing flags/parameters:
global preprocessing flag: 812
individual preprocessing:
now perform preprocessing
*** called with option 12
*** This will do for you:
*** input variable equalisation
*** to Gaussian distribution with mean=0 and sigma=1
*** Then variables are decorrelated
************************************
Warning: found 395 negative weights.
Signal fraction: 73.4161072 %
------------------------------
Transdef: Tab for variable 1
-1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1.
-1. -1. -1. -1. -1. -1. -1. -1. -1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1.
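The preprocessing banner above describes NeuroBayes's flag-812 transformation: each input is flattened with its empirical CDF (the percentile boundaries listed in the "Transdef: Tab" blocks) and then mapped onto a Gaussian with mean 0 and sigma 1, before a separate decorrelation step. The sketch below illustrates that idea only; the helper names (`transdef_tab`, `equalise_to_gaussian`) are hypothetical and this is not the actual NeuroBayes implementation:

```python
import random
from statistics import NormalDist, fmean, pstdev

def transdef_tab(xs, n_bins=100):
    """Percentile boundaries of xs, analogous to one 'Transdef: Tab' block."""
    s = sorted(xs)
    return [s[round(i * (len(s) - 1) / n_bins)] for i in range(n_bins + 1)]

def equalise_to_gaussian(xs, tab):
    """Flatten xs with its empirical CDF, then map onto a unit Gaussian."""
    nd = NormalDist()  # mean=0, sigma=1
    out = []
    for x in xs:
        # fraction of percentile boundaries below x -> roughly uniform on (0,1)
        u = sum(1 for b in tab if b < x) / len(tab)
        u = min(max(u, 1e-6), 1.0 - 1e-6)  # keep inv_cdf finite at the edges
        out.append(nd.inv_cdf(u))
    return out

random.seed(7)
ht = [random.expovariate(1.0 / 100.0) for _ in range(5000)]  # a skewed input, like Ht
tab = transdef_tab(ht)
g = equalise_to_gaussian(ht, tab)
print(round(fmean(g), 2), round(pstdev(g), 2))  # close to 0 and 1
```

After this transform every input lives on roughly the same Gaussian scale, which is why the Transdef tabs that follow are just lists of percentile boundaries, one block per input variable.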
------------------------------ Transdef: Tab for variable 2 55.8076553 62.8013802 66.3888397 68.6480637 71.3435211 73.03582 74.3217316 75.681366 76.9746552 77.0398788 77.1511993 78.3612518 78.8119278 80.9730606 81.2132568 81.861496 83.1961441 84.6559296 85.5791168 86.5710754 87.1544724 88.1270752 88.6493073 90.0933456 91.6410828 92.5754471 93.6044998 95.4200592 95.9003754 96.8638153 98.0019379 98.4570084 99.0402527 99.7355347 100.981125 102.647934 103.404907 104.803902 105.931137 106.946762 107.95182 108.535568 109.981705 111.483246 113.193329 113.934219 114.401855 115.202393 117.118134 118.575462 120.069382 120.988647 122.388916 123.729294 125.142395 126.39743 127.952835 129.27887 130.506409 131.662384 133.003143 134.701965 135.913452 137.489258 138.99942 140.782654 142.001038 143.628265 145.26149 146.512878 148.186432 149.345245 151.018066 152.354645 153.377213 155.018829 157.532562 158.811829 159.512146 161.921722 164.353607 166.218201 168.990906 171.198837 173.491745 177.177887 179.07782 183.397522 186.03949 189.877411 194.880783 200.326752 203.538147 208.54599 220.250427 229.368256 239.421066 253.768936 274.318848 303.827515 771.301514 ------------------------------ Transdef: Tab for variable 3 20.0002613 20.5549698 21.0139084 21.1170979 21.2832775 21.5014935 21.7159424 22.0548134 22.2506409 22.5565376 22.7882042 22.9547424 23.2701874 23.5726948 23.7482529 24.2200947 24.5289078 24.9027634 25.0568905 25.2305527 25.3228703 25.5314903 25.6541462 25.8592186 26.2676849 26.5808678 26.8811779 27.0819893 27.1348839 27.2230568 27.5270481 27.7549839 27.868248 28.0930214 28.2068195 28.4019318 28.6201458 28.8493443 29.0485115 29.1049023 29.3113518 29.4295235 29.6611919 29.7840118 30.0681648 30.2701759 30.4991264 30.7924309 31.0609283 31.3350563 31.6531487 31.6734638 31.865036 32.0868835 32.2180519 32.3482513 32.4952698 32.7142944 32.967659 33.127388 33.4076729 33.6951523 33.8812599 34.1444702 34.2671623 34.6101646 34.8222847 35.0724869 35.2191238 35.4884796 35.6376801 
35.9766693 36.2909698 36.6433716 36.9103432 37.3784599 37.8825836 38.2501144 38.891552 39.2749405 39.5445786 39.9866943 40.5228653 40.9936256 41.6112747 41.965126 42.5159683 43.0579109 43.5819016 44.3301353 44.9101334 45.7312851 46.3788986 46.8930473 47.7993164 49.261795 50.4682312 53.2589798 55.315506 58.5311432 91.0271454 ------------------------------ Transdef: Tab for variable 4 10.0000381 10.1420736 10.202527 10.3402405 10.4312077 10.492897 10.5368805 10.6143084 10.6467495 10.7350378 10.8445892 10.9244652 11.0486221 11.2381802 11.4343147 11.5046825 11.7162247 11.8531303 11.9973545 12.2339554 12.3138771 12.3735352 12.6439266 12.8300667 13.0830841 13.3534346 13.5031414 13.6087303 13.7219849 13.8058643 13.8362007 14.107132 14.537818 14.7378006 14.9837942 15.1884594 15.2993193 15.6190729 15.818305 15.9633512 16.1750355 16.5050983 16.7638168 16.8373089 17.0005493 17.3630333 17.6673164 17.9605045 18.0897751 18.4077377 18.6850243 19.0943718 19.2382507 19.4889221 19.6401482 19.8987808 20.0812798 20.2080345 20.2972946 20.4496269 20.5623283 20.6650734 20.8627892 21.0402222 21.1679287 21.4045334 21.5658379 21.7396259 21.9218292 22.0397682 22.3251305 22.4745903 22.7037621 22.8782654 23.1735535 23.368679 23.5618877 23.8276978 24.0476913 24.3291168 24.5079231 24.6692009 24.9519348 25.1968689 25.409153 25.6397018 25.8888359 26.1174297 26.4658298 26.7081413 27.1320343 27.654026 27.8610535 28.3235607 28.7661819 29.4461937 30.4619522 31.2824879 32.1093674 33.4150543 41.6332207 ------------------------------ Transdef: Tab for variable 5 2.10590816 3.17266512 3.67098141 3.77911544 4.08482647 4.23569202 4.33816624 4.52733278 4.63964128 4.79123306 4.90772629 4.96743822 5.04740047 5.16213799 5.19633961 5.29036713 5.33599949 5.37793827 5.47170925 5.5235405 5.53855896 5.61608219 5.711586 5.80334663 5.90947962 5.98126554 6.01634216 6.06466484 6.18658495 6.2782383 6.30885744 6.39533854 6.47573423 6.55999422 6.62628794 6.63857841 6.69303989 6.73646164 6.77554226 6.78105545 6.86024714 
6.9183569 6.95472622 6.98979473 7.026968 7.07777834 7.10952854 7.17511845 7.2110033 7.28258419 7.32803059 7.37597752 7.39281273 7.43712711 7.49992085 7.532969 7.56274605 7.59482145 7.65939617 7.69280815 7.75467396 7.81950474 7.88819313 7.92879868 7.98001766 8.01261139 8.04847145 8.0788765 8.12845898 8.15933609 8.23367119 8.2545948 8.28944588 8.36419296 8.4214592 8.48948002 8.54983234 8.610569 8.66417027 8.72093868 8.79292107 8.89214802 8.95865631 9.06826591 9.14399338 9.25183678 9.33372879 9.40639019 9.49002075 9.60350323 9.70802879 9.83172226 9.99764729 10.1579933 10.2502842 10.4766321 10.680563 11.0746374 11.5056133 12.5228004 16.957983 ------------------------------ Transdef: Tab for variable 6 25.0099258 26.5192642 27.3280792 28.880043 30.0736046 30.7098503 31.1085052 31.6391411 32.2784576 33.4813271 34.0951004 34.6991997 34.7509499 35.7004509 36.0883675 36.9864693 37.7575302 38.4128647 38.8173065 39.6650543 40.2513847 40.606739 41.054554 41.5545464 42.1998062 42.5973053 42.8843651 43.1103401 43.8359299 44.4518547 44.8393517 45.3658905 45.8523026 46.3613739 47.0025902 47.3881683 47.6494675 47.989994 48.7074661 49.0367279 49.7614136 50.2213669 50.4986649 51.1226921 51.8123932 52.183567 53.0728912 53.807251 54.168663 54.8161888 55.4725571 55.9979286 56.5599823 57.0875626 57.6913147 58.2130585 58.8479691 59.320137 59.7718887 60.4218407 60.9811325 61.6286621 62.1962662 62.9240341 63.3705139 64.0356445 64.7622986 65.2919922 65.8976212 66.6144485 67.4972534 68.319397 69.0256958 69.8507538 70.7397614 71.6448898 72.2493134 73.2507782 74.3199005 75.0913239 76.1084137 76.8854828 78.5541382 80.1313629 81.0469589 82.2665787 83.114975 84.3215485 85.8480453 87.2101822 89.6894073 91.9242554 94.581543 97.224823 99.1053619 102.505478 106.55249 113.63504 120.61702 131.110291 228.436234 ------------------------------ Transdef: Tab for variable 7 30.3830414 31.6798935 33.6189156 33.7796249 34.6674194 35.7290649 36.3379784 37.2159729 37.6551437 38.5671349 38.8823318 39.1590424 
39.3913345 39.8977661 40.6177139 41.3234024 41.5645981 41.8729591 42.1641006 42.3231506 42.6303482 42.7331543 43.3777618 44.5998421 44.9043427 45.3432388 45.8179817 46.4393005 46.8464432 47.2049637 48.1696167 48.6661644 49.0493736 49.7367096 50.229351 50.8788834 51.4541016 52.3207397 53.3061943 53.9521561 54.7170105 55.3955536 56.022049 56.3313828 57.1821747 57.9903679 58.613575 59.2192535 60.1437378 61.054184 61.8793144 62.6711578 63.372982 63.897728 64.8992462 65.5214844 66.4304886 67.4353333 68.4474945 68.9349976 69.8961945 70.5782776 71.3379288 72.3704224 73.7083588 74.4432831 75.7350922 76.5879288 76.8364563 77.8181915 79.1906738 80.508667 81.681427 82.8230133 83.6691742 84.6439362 85.9964294 87.5381088 88.6008377 90.104866 91.2867584 93.4603043 94.7422562 97.0661163 99.2168045 101.174759 103.095871 105.355026 107.508591 111.011169 114.110664 116.258171 119.962402 125.844589 130.069977 137.684906 144.015076 150.828552 165.969971 183.514252 422.6651 ------------------------------ Transdef: Tab for variable 8 8.77728748 27.8724823 29.9587402 31.4358482 32.5045166 33.4348679 33.8005104 34.1147461 34.5790405 35.4376106 35.5507469 36.2401199 37.0536652 37.6690521 37.8191223 38.0622864 38.5405083 38.8838539 38.9448166 39.5889854 39.7207069 39.8317871 40.5571213 40.7662735 40.9025421 41.2176361 41.3531952 41.8696175 42.2009888 42.678257 42.9852219 43.4921722 44.1079445 44.4095306 44.8392715 44.9110336 45.7021027 45.9844513 46.5902634 46.7992897 47.2172585 47.7365303 48.225769 48.5852966 48.7885056 49.0557327 49.5815315 50.1643791 50.708519 51.3603134 51.7693863 52.3135529 53.044075 53.5214005 54.1255379 54.7202797 55.085392 55.5795593 56.1859665 56.7517242 57.3486557 58.1343231 58.6627579 59.3006516 60.0234375 60.7424736 61.1680222 61.9257812 62.4749031 62.9980316 63.9641762 64.6048431 65.151413 65.4414444 66.1619644 66.9229431 67.7543869 68.5484009 69.205368 69.7257233 70.4396515 71.2685852 71.8648911 72.7783661 73.9593048 74.5536957 75.8883057 77.1410217 78.3126373 
80.6877518 83.1415253 84.8169403 87.7715683 89.8741302 93.8898849 97.7368774 103.451393 109.571503 118.179077 127.907562 350.142761 ------------------------------ Transdef: Tab for variable 9 55.8076553 62.8013802 66.3933029 67.6024475 69.9046402 70.6817169 73.03582 74.3152924 75.5148697 76.6523438 76.9866409 77.0461578 77.1511993 78.3612518 78.6432495 79.7529984 80.962944 81.2132568 81.9073334 83.174408 84.3543243 84.7601624 85.7657471 86.5688782 86.8801575 88.0353241 88.6493073 89.486618 90.9510345 91.6540146 92.1762848 93.2092133 94.2726135 95.3028641 95.6595001 96.8322296 97.4558411 98.0031509 98.4919357 99.0402527 99.6915588 100.493858 101.45108 102.811775 103.933105 104.803902 105.452019 106.386169 107.388573 108.102386 109.065933 110.040543 110.731873 111.900497 113.339508 114.14772 114.740402 115.357422 116.326851 116.695969 117.330238 118.324188 119.154846 119.921066 120.625992 121.51519 122.452377 123.098251 124.307663 125.477997 126.774414 127.99514 129.108032 129.849442 130.778137 131.973251 132.97583 134.112946 135.036026 136.371094 137.464111 138.838165 140.463501 141.630127 143.057068 144.551239 145.917297 147.100418 148.833496 150.690231 152.06955 153.978638 156.092987 157.947693 160.302368 162.826752 166.092606 170.170258 176.878891 187.19841 390.203552 ------------------------------ Transdef: Tab for variable 10 0.498700708 1.19377804 1.38363457 1.64057076 1.76530981 1.90797746 2.01600003 2.07098389 2.13595963 2.20382261 2.26983452 2.32582617 2.36616611 2.41155148 2.43214917 2.46292305 2.49483132 2.5356884 2.56001425 2.57181644 2.59931135 2.62726307 2.64190173 2.6571455 2.6762917 2.70122814 2.73425341 2.75132012 2.76730728 2.78013372 2.79762888 2.80873227 2.82045436 2.83500004 2.85164189 2.85574961 2.8619554 2.86893892 2.87558985 2.88053465 2.8879292 2.89757538 2.90943289 2.92300081 2.92854857 2.93616915 2.94515467 2.95066309 2.95640421 2.96181393 2.96962261 2.97707677 2.98383522 2.98813152 2.98992538 2.99241185 2.99656439 3.00256968 3.00450087 
3.01022005 3.01337433 3.01868725 3.02450323 3.02650118 3.03240705 3.03534555 3.03723812 3.04301691 3.04616308 3.0524826 3.05445766 3.05968165 3.06396675 3.06600261 3.06953573 3.0730114 3.07433915 3.07655454 3.07872033 3.08332157 3.08805823 3.09035802 3.09335613 3.09701729 3.10100365 3.10331059 3.10589027 3.10941601 3.11207056 3.1139102 3.11682749 3.12005377 3.12156916 3.12352991 3.12639093 3.12892365 3.13169551 3.13252258 3.13538694 3.13791704 3.14158416 ------------------------------ Transdef: Tab for variable 11 0.00017022471 0.0071657747 0.00830614008 0.0187925082 0.0238713026 0.0337132215 0.0470670909 0.0539547801 0.0606813133 0.0684454143 0.0804202631 0.0923011303 0.0967803299 0.1061306 0.112356901 0.12004517 0.128539801 0.138038158 0.148284584 0.157216072 0.163566113 0.170013487 0.178746849 0.188716158 0.201066017 0.209246904 0.218696475 0.229732752 0.234483466 0.24312529 0.254066944 0.263131738 0.269790381 0.277195573 0.285204649 0.292445898 0.300955772 0.306698263 0.3137393 0.32194829 0.328451633 0.334169745 0.339213729 0.344868243 0.34997952 0.356379747 0.360285103 0.36558044 0.370805025 0.373891711 0.379552722 0.384804398 0.389959157 0.391155183 0.395145833 0.399440557 0.40265274 0.407598138 0.410366654 0.413113713 0.418493152 0.423037767 0.425535411 0.433053017 0.437642455 0.43942672 0.443320811 0.446399212 0.448182225 0.452416688 0.459055185 0.461102098 0.467266709 0.471878052 0.475621879 0.4797436 0.487815976 0.497186661 0.504549742 0.512760162 0.517917275 0.527307391 0.539809048 0.550534368 0.559528351 0.573495626 0.577880383 0.589050174 0.594162822 0.602122486 0.610370755 0.627059102 0.639765263 0.650276423 0.655359387 0.667286277 0.697314143 0.730749249 0.774225175 0.837028265 1.11007142 ------------------------------ Transdef: Tab for variable 12 0.322632462 0.400701284 0.404707223 0.407527924 0.410781711 0.412089348 0.413055807 0.41477555 0.417517126 0.420010984 0.422322035 0.426762849 0.427642643 0.430356443 0.432529509 0.435461998 0.437600136 
0.440615237 0.443020225 0.446286768 0.449130177 0.452152878 0.454687655 0.457151949 0.460067153 0.461395979 0.463441193 0.466546953 0.469579369 0.472378969 0.473917365 0.476800859 0.480039239 0.481921047 0.484171987 0.487423003 0.49021256 0.490799695 0.493044853 0.495870173 0.499203563 0.501593888 0.502582967 0.504721165 0.508530617 0.512926042 0.516679525 0.520293474 0.52212429 0.525883317 0.528609872 0.530948937 0.534399748 0.538423181 0.543066144 0.54711628 0.551332057 0.556866169 0.558572531 0.561655164 0.566576481 0.569422662 0.577055573 0.57942152 0.584807754 0.590143323 0.594800711 0.597885966 0.601783335 0.607277513 0.615666687 0.621388435 0.628634691 0.633793712 0.637001276 0.64611721 0.649466991 0.655859947 0.660451651 0.666108429 0.667594612 0.674124002 0.678235948 0.679171026 0.687987387 0.698141575 0.703108668 0.713153362 0.723275125 0.728268981 0.737884521 0.742360175 0.753837228 0.758237541 0.763370872 0.780170023 0.79633975 0.817768931 0.883236706 0.940478206 1.15372479 ------------------------------ Transdef: Tab for variable 13 20.1440792 22.1192017 22.7843533 23.4329071 23.8643627 24.3865623 25.3104362 25.6225815 25.7497349 26.4048615 27.0051346 27.3294964 27.8308983 28.1140785 28.8261948 29.3014641 29.66329 29.9580956 30.1289978 30.3069649 30.5324898 30.9047318 31.0863724 31.3931808 31.7232208 32.239502 32.4414062 32.6876602 33.0597305 33.3518982 33.812397 34.1862068 34.4824066 34.9216461 35.28125 35.4330063 35.630024 35.977478 36.1708374 36.2884216 36.480114 37.1026688 37.6054306 38.1672592 38.5508499 38.8630371 39.2060776 39.5014114 39.8168411 40.2122192 40.53759 41.0951118 41.5580826 42.2055969 42.6445198 43.1600189 43.5029297 43.8798561 44.3050385 44.5029602 45.1987228 45.7548065 46.2294388 46.4478416 46.891983 47.5315628 47.8487091 48.4466934 49.1564178 49.9844818 50.634491 50.8981247 51.7732773 52.2223473 52.9439621 53.9259377 54.7008667 55.7735748 56.5764198 57.3945427 58.4571724 58.9952202 59.7673645 60.268013 61.6040039 62.4289551 
63.542984 64.6101303 65.9947433 66.7296295 67.5240631 69.4291534 71.3799438
74.0571594 76.5444946 77.7402191 81.6507416 85.8297577 89.9189453 102.407043
194.379669
------------------------------
Transdef: Tab for variable 14
10.0082674 10.4330349 10.7233944 11.0432711 11.4079571 11.8663988 12.1933155
12.317812 12.476078 12.8175774 13.3471107 13.6688023 13.8865585 14.4496498
14.7640438 15.0552731 15.1900158 15.8174677 16.0468102 16.2734833 16.4727859
16.7318306 16.9072552 17.1460915 17.5715714 17.7291374 17.9586754 18.1744843
18.3111019 18.5681858 18.8425713 19.1867409 19.4316292 19.7996292 19.9006939
20.2175064 20.5278721 20.6076889 20.7726059 21.1836128 21.5295982 21.6822853
21.891758 22.0566483 22.4140167 22.7433319 23.0563927 23.2040634 23.4643288
23.6997337 23.9410439 24.1877441 24.4911575 24.6698647 25.0143661 25.3688622
25.618866 25.8501282 26.057251 26.5095806 26.6364441 26.889679 27.1713867
27.3959026 27.6764584 27.9185944 28.1512604 28.2778397 28.6041927 28.9006577
29.1966858 29.6282463 30.119091 30.5433502 30.7604942 31.2406883 31.4575005
31.7198715 32.213768 32.7011909 33.2071075 33.814209 34.3495178 34.82864
35.2982979 35.7838364 36.3352051 37.2437439 38.171257 39.2573929 40.2989349
41.3411636 42.6343231 43.8931274 45.3010178 47.2387695 48.6711273 52.1539917
55.6730499 61.2959747 85.8603439
COVARIANCE MATRIX (IN PERCENT)
      1.0   2.0   3.0   4.0   5.0   6.0   7.0   8.0   9.0  10.0  11.0  12.0  13.0  14.0
 0    0.9   1.0   1.0   1.0   1.0   1.0   1.0   1.0   1.0   1.0   1.0   1.0   1.0   1.0
 1  100.0  57.3  31.6  42.7  27.4  45.0  53.6  45.8  55.3 -26.9 -15.4 -30.4  19.6  38.2
 2   57.3 100.0  48.0  46.1  45.0  78.2  93.5  79.7  90.8 -43.6 -21.4 -39.2  25.4  35.7
 3   31.6  48.0 100.0  22.6   8.3  29.1  51.3  44.1  57.3  -6.7 -25.5 -47.8  62.4  14.9
 4   42.7  46.1  22.6 100.0  17.8  34.7  46.4  45.0  53.4  -5.9 -24.7 -56.6  12.9  78.8
 5   27.4  45.0   8.3  17.8 100.0  85.6  13.2  53.3  68.4  18.6  -3.6 -11.6   1.0  13.3
 6   45.0  78.2  29.1  34.7  85.6 100.0  55.3  78.5  89.1 -10.7 -11.8 -26.7  13.8  26.8
 7   53.6  93.5  51.3  46.4  13.2  55.3 100.0  72.4  75.9 -53.6 -23.7 -40.4  28.8  36.2
 8   45.8  79.7  44.1  45.0  53.3  78.5  72.4 100.0  84.1 -14.6 -25.3 -38.7  23.8  36.6
 9   55.3  90.8  57.3  53.4  68.4  89.1  75.9  84.1 100.0 -18.1 -23.1 -46.6  31.4  41.3
10  -26.9 -43.6  -6.7  -5.9  18.6 -10.7 -53.6 -14.6 -18.1 100.0   1.9   4.1  -2.0  -2.5
11  -15.4 -21.4 -25.5 -24.7  -3.6 -11.8 -23.7 -25.3 -23.1   1.9 100.0  47.2 -21.6 -27.7
12  -30.4 -39.2 -47.8 -56.6 -11.6 -26.7 -40.4 -38.7 -46.6   4.1  47.2 100.0 -29.5 -44.7
13   19.6  25.4  62.4  12.9   1.0  13.8  28.8  23.8  31.4  -2.0 -21.6 -29.5 100.0  35.3
14   38.2  35.7  14.9  78.8  13.3  26.8  36.2  36.6  41.3  -2.5 -27.7 -44.7  35.3 100.0
TOTAL CORRELATION TO TARGET (diagonal) 143.661616
TOTAL CORRELATION OF ALL VARIABLES 62.4598687
ROUND  1: MAX CORR ( 62.4516294) AFTER KILLING INPUT VARIABLE 11 CONTR 1.01448387
ROUND  2: MAX CORR ( 62.4399651) AFTER KILLING INPUT VARIABLE  8 CONTR 1.20697268
ROUND  3: MAX CORR ( 62.411926)  AFTER KILLING INPUT VARIABLE  7 CONTR 1.87102335
ROUND  4: MAX CORR ( 62.3785459) AFTER KILLING INPUT VARIABLE 12 CONTR 2.0409597
ROUND  5: MAX CORR ( 62.3335648) AFTER KILLING INPUT VARIABLE  6 CONTR 2.36847773
ROUND  6: MAX CORR ( 62.2705599) AFTER KILLING INPUT VARIABLE 13 CONTR 2.8019064
ROUND  7: MAX CORR ( 61.8253869) AFTER KILLING INPUT VARIABLE  9 CONTR 7.43264131
ROUND  8: MAX CORR ( 61.4099683) AFTER KILLING INPUT VARIABLE  4 CONTR 7.15501623
ROUND  9: MAX CORR ( 60.8971203) AFTER KILLING INPUT VARIABLE  5 CONTR 7.9199085
ROUND 10: MAX CORR ( 60.5776596) AFTER KILLING INPUT VARIABLE  3 CONTR 6.22948001
ROUND 11: MAX CORR ( 60.3575303) AFTER KILLING INPUT VARIABLE 10 CONTR 5.15959037
ROUND 12: MAX CORR ( 57.302119)  AFTER KILLING INPUT VARIABLE 14 CONTR 18.9604491
LAST REMAINING VARIABLE: 2
total correlation to target: 62.4598687 %
total significance: 30.8756849 sigma
correlations of single variables to target:
variable 2: 57.302119 % , in sigma: 28.3260629
variable 3: 31.5666561 % , in sigma: 15.6042936
variable 4: 42.7274203 % , in sigma: 21.1213759
variable 5: 27.3771483 % , in sigma: 13.533301
variable 6: 45.0439449 % , in sigma: 22.2664997
variable 7: 53.5803515 % , in sigma: 26.4862877
variable 8: 45.8255162 % , in sigma: 22.6528526
variable 9: 55.2644049 % , in sigma: 27.3187631
variable 10: -26.9288079 % , in sigma: 13.3116736
variable 11: -15.393764 % , in sigma: 7.60957423
variable 12: -30.4443256 % , in sigma: 15.0494938
variable 13: 19.5872372 % , in sigma: 9.68252699
variable 14: 38.1699308 % , in sigma: 18.8684796
variables sorted by significance:
 1 most relevant variable  2 corr 57.3021202 , in sigma: 28.3260635
 2 most relevant variable 14 corr 18.9604492 , in sigma: 9.37268791
 3 most relevant variable 10 corr 5.15959024 , in sigma: 2.55053182
 4 most relevant variable  3 corr 6.22947979 , in sigma: 3.07940858
 5 most relevant variable  5 corr 7.91990852 , in sigma: 3.91503545
 6 most relevant variable  4 corr 7.15501642 , in sigma: 3.53692759
 7 most relevant variable  9 corr 7.43264151 , in sigma: 3.67416554
 8 most relevant variable 13 corr 2.80190635 , in sigma: 1.38506179
 9 most relevant variable  6 corr 2.36847782 , in sigma: 1.17080578
10 most relevant variable 12 corr 2.0409596 , in sigma: 1.00890423
11 most relevant variable  7 corr 1.8710233 , in sigma: 0.924899892
12 most relevant variable  8 corr 1.20697272 , in sigma: 0.596640854
13 most relevant variable 11 corr 1.01448381 , in sigma: 0.501488126
global correlations between input variables:
variable 2: 99.4151408 %
variable 3: 92.6413181 %
variable 4: 90.2941632 %
variable 5: 98.0090779 %
variable 6: 97.0463559 %
variable 7: 99.2180122 %
variable 8: 90.2522099 %
variable 9: 99.2928783 %
variable 10: 70.3538845 %
variable 11: 51.2011199 %
variable 12: 72.5752236 %
variable 13: 76.7283505 %
variable 14: 86.6153096 %
significance loss when removing single variables:
variable 2: corr = 5.21850019 % , sigma = 2.57965268
variable 3: corr = 10.4680152 % , sigma = 5.17463689
variable 4: corr = 8.0558934 % , sigma = 3.98225663
variable 5: corr = 9.04228349 % , sigma = 4.46985724
variable 6: corr = 2.08615908 % , sigma = 1.03124761
variable 7: corr = 2.22897471 % , sigma = 1.10184543
variable 8: corr = 1.0811844 % , sigma = 0.53446012
variable 9: corr = 6.75068246 % , sigma = 3.33705384
variable 10: corr = 9.28974912 % , sigma = 4.59218652
variable 11: corr = 1.01448387 % , sigma = 0.501488153
variable 12: corr = 1.44931966 % , sigma = 0.716439822
variable 13: corr = 2.84041522 % , sigma = 1.40409782
variable 14: corr = 9.35380297 % , sigma = 4.62385015
Keep only 2 most significant input variables
-------------------------------------
Teacher: actual network topology:
Nodes(1) = 3
Nodes(2) = 15
Nodes(3) = 1
-------------------------------------
---------------------------------------------------
Iteration : 1
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 8.99605465 sigma out 15 active outputs
RANK 2 NODE 3 --> 7.84514952 sigma out 15 active outputs
RANK 3 NODE 1 --> 6.40397024 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 3 --> 7.26260376 sigma in 3act. ( 9.47716427 sig out 1act.)
RANK 2 NODE 2 --> 6.10158539 sigma in 3act. ( 6.25187063 sig out 1act.)
RANK 3 NODE 9 --> 4.69638681 sigma in 3act. ( 4.80900097 sig out 1act.)
RANK 4 NODE 8 --> 4.06726599 sigma in 3act. ( 4.5749402 sig out 1act.)
RANK 5 NODE 11 --> 3.53307891 sigma in 3act. ( 3.48449755 sig out 1act.)
RANK 6 NODE 12 --> 3.29998183 sigma in 3act. ( 3.58890557 sig out 1act.)
RANK 7 NODE 14 --> 2.89347434 sigma in 3act. ( 3.15628624 sig out 1act.)
RANK 8 NODE 7 --> 2.84935522 sigma in 3act. ( 3.11222219 sig out 1act.)
RANK 9 NODE 10 --> 2.27158356 sigma in 3act. ( 2.93592405 sig out 1act.)
RANK 10 NODE 6 --> 1.8147558 sigma in 3act. ( 1.82903087 sig out 1act.)
RANK 11 NODE 1 --> 1.39156449 sigma in 3act. ( 1.47621608 sig out 1act.)
RANK 12 NODE 15 --> 1.35485065 sigma in 3act. ( 1.5323987 sig out 1act.)
RANK 13 NODE 4 --> 1.1239053 sigma in 3act. ( 0.857768893 sig out 1act.)
RANK 14 NODE 13 --> 0.995464146 sigma in 3act. ( 0.11776685 sig out 1act.)
RANK 15 NODE 5 --> 0.754698932 sigma in 3act. ( 0.708161175 sig out 1act.) sorted by output significance RANK 1 NODE 3 --> 9.47716427 sigma out 1act.( 7.26260376 sig in 3act.) RANK 2 NODE 2 --> 6.25187063 sigma out 1act.( 6.10158539 sig in 3act.) RANK 3 NODE 9 --> 4.80900097 sigma out 1act.( 4.69638681 sig in 3act.) RANK 4 NODE 8 --> 4.5749402 sigma out 1act.( 4.06726599 sig in 3act.) RANK 5 NODE 12 --> 3.58890557 sigma out 1act.( 3.29998183 sig in 3act.) RANK 6 NODE 11 --> 3.48449755 sigma out 1act.( 3.53307891 sig in 3act.) RANK 7 NODE 14 --> 3.15628624 sigma out 1act.( 2.89347434 sig in 3act.) RANK 8 NODE 7 --> 3.11222219 sigma out 1act.( 2.84935522 sig in 3act.) RANK 9 NODE 10 --> 2.93592405 sigma out 1act.( 2.27158356 sig in 3act.) RANK 10 NODE 6 --> 1.82903087 sigma out 1act.( 1.8147558 sig in 3act.) RANK 11 NODE 15 --> 1.5323987 sigma out 1act.( 1.35485065 sig in 3act.) RANK 12 NODE 1 --> 1.47621608 sigma out 1act.( 1.39156449 sig in 3act.) RANK 13 NODE 4 --> 0.857768893 sigma out 1act.( 1.1239053 sig in 3act.) RANK 14 NODE 5 --> 0.708161175 sigma out 1act.( 0.754698932 sig in 3act.) RANK 15 NODE 13 --> 0.11776685 sigma out 1act.( 0.995464146 sig in 3act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 15.3418627 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 3 --> 12.1962423 sigma out 15 active outputs RANK 2 NODE 1 --> 9.25865841 sigma out 15 active outputs RANK 3 NODE 2 --> 9.03717327 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 12 --> 6.60637665 sigma in 3act. ( 5.17526865 sig out 1act.) RANK 2 NODE 10 --> 6.45536852 sigma in 3act. ( 4.27129602 sig out 1act.) RANK 3 NODE 9 --> 6.08194685 sigma in 3act. ( 5.65567923 sig out 1act.) RANK 4 NODE 3 --> 6.06862974 sigma in 3act. ( 7.52772808 sig out 1act.) RANK 5 NODE 6 --> 5.74570465 sigma in 3act. ( 4.19842482 sig out 1act.) RANK 6 NODE 7 --> 5.17815208 sigma in 3act. ( 3.67516494 sig out 1act.) 
RANK 7 NODE 11 --> 4.81030321 sigma in 3act. ( 3.35470963 sig out 1act.)
RANK 8 NODE 14 --> 4.45150948 sigma in 3act. ( 4.22689533 sig out 1act.)
RANK 9 NODE 2 --> 4.40587139 sigma in 3act. ( 5.18317842 sig out 1act.)
RANK 10 NODE 13 --> 3.54154921 sigma in 3act. ( 0.798534036 sig out 1act.)
RANK 11 NODE 8 --> 3.33551359 sigma in 3act. ( 3.30714202 sig out 1act.)
RANK 12 NODE 1 --> 2.42764258 sigma in 3act. ( 1.79505777 sig out 1act.)
RANK 13 NODE 4 --> 1.40356243 sigma in 3act. ( 0.176880211 sig out 1act.)
RANK 14 NODE 5 --> 1.3233366 sigma in 3act. ( 0.329195678 sig out 1act.)
RANK 15 NODE 15 --> 1.2374624 sigma in 3act. ( 0.754547298 sig out 1act.)
sorted by output significance
RANK 1 NODE 3 --> 7.52772808 sigma out 1act.( 6.06862974 sig in 3act.)
RANK 2 NODE 9 --> 5.65567923 sigma out 1act.( 6.08194685 sig in 3act.)
RANK 3 NODE 2 --> 5.18317842 sigma out 1act.( 4.40587139 sig in 3act.)
RANK 4 NODE 12 --> 5.17526865 sigma out 1act.( 6.60637665 sig in 3act.)
RANK 5 NODE 10 --> 4.27129602 sigma out 1act.( 6.45536852 sig in 3act.)
RANK 6 NODE 14 --> 4.22689533 sigma out 1act.( 4.45150948 sig in 3act.)
RANK 7 NODE 6 --> 4.19842482 sigma out 1act.( 5.74570465 sig in 3act.)
RANK 8 NODE 7 --> 3.67516494 sigma out 1act.( 5.17815208 sig in 3act.)
RANK 9 NODE 11 --> 3.35470963 sigma out 1act.( 4.81030321 sig in 3act.)
RANK 10 NODE 8 --> 3.30714202 sigma out 1act.( 3.33551359 sig in 3act.)
RANK 11 NODE 1 --> 1.79505777 sigma out 1act.( 2.42764258 sig in 3act.)
RANK 12 NODE 13 --> 0.798534036 sigma out 1act.( 3.54154921 sig in 3act.)
RANK 13 NODE 15 --> 0.754547298 sigma out 1act.( 1.2374624 sig in 3act.)
RANK 14 NODE 5 --> 0.329195678 sigma out 1act.( 1.3233366 sig in 3act.)
RANK 15 NODE 4 --> 0.176880211 sigma out 1act.( 1.40356243 sig in 3act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 15.3722639 sigma in 15 active inputs
***********************************************
*** Learn Path 1
*** loss function: -0.486356795
*** contribution from regularisation: 0.0137731787
*** contribution from error: -0.500129998
***********************************************
-----------------> Test sample
---------------------------------------------------
Iteration : 2
***********************************************
*** Learn Path 2
*** loss function: -0.553080261
*** contribution from regularisation: 0.00696482277
*** contribution from error: -0.560045063
***********************************************
-----------------> Test sample
ENTER BFGS code START -4547.14155 -0.0578247271 -0.0123815378
EXIT FROM BFGS code FG_START 0. -0.0578247271 0.
---------------------------------------------------
Iteration : 3
***********************************************
*** Learn Path 3
*** loss function: -0.566442072
*** contribution from regularisation: 0.00439895084
*** contribution from error: -0.570841014
***********************************************
-----------------> Test sample
ENTER BFGS code FG_START -4656.15404 -0.0578247271 -26.5168781
EXIT FROM BFGS code FG_LNSRCH 0. -0.220180809 0.
---------------------------------------------------
Iteration : 4
***********************************************
*** Learn Path 4
*** loss function: -0.578193188
*** contribution from regularisation: 0.00746562239
*** contribution from error: -0.585658789
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4752.74796 -0.220180809 -2.8714354
EXIT FROM BFGS code NEW_X -4752.74796 -0.220180809 -2.8714354
ENTER BFGS code NEW_X -4752.74796 -0.220180809 -2.8714354
EXIT FROM BFGS code FG_LNSRCH 0. -0.229079038 0.
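The pruning step reported earlier ("Keep only 2 most significant input variables") ranks each candidate input by the significance of its correlation to the target, the sigma column in the variable table, and drops the rest; the two survivors plus a bias node account for the Nodes(1) = 3 input layer in the topology printout. A minimal sketch of that selection under these assumptions (the function name and dict layout are illustrative, not a NeuroBayes API):

```python
def keep_most_significant(significance, k=2):
    """Rank variables by significance (in sigma) and keep the top k."""
    ranked = sorted(significance, key=significance.get, reverse=True)
    return ranked[:k]

# sigma values for variables 8-14 as printed in the log above
significance = {
    8: 0.53446012, 9: 3.33705384, 10: 4.59218652, 11: 0.501488153,
    12: 0.716439822, 13: 1.40409782, 14: 4.62385015,
}
kept = keep_most_significant(significance, k=2)
print(kept)            # [14, 10] -- the two most significant variables shown
print(len(kept) + 1)   # 3 -> input nodes including the bias, cf. Nodes(1) = 3
```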
---------------------------------------------------
Iteration : 5
***********************************************
*** Learn Path 5
*** loss function: -0.579337537
*** contribution from regularisation: 0.00697102724
*** contribution from error: -0.586308539
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4762.15433 -0.229079038 0.495950371
EXIT FROM BFGS code NEW_X -4762.15433 -0.229079038 0.495950371
ENTER BFGS code NEW_X -4762.15433 -0.229079038 0.495950371
EXIT FROM BFGS code FG_LNSRCH 0. -0.225758865 0.
---------------------------------------------------
Iteration : 6
***********************************************
*** Learn Path 6
*** loss function: -0.579352081
*** contribution from regularisation: 0.00662404764
*** contribution from error: -0.585976124
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4762.27404 -0.225758865 -1.37578332
EXIT FROM BFGS code NEW_X -4762.27404 -0.225758865 -1.37578332
ENTER BFGS code NEW_X -4762.27404 -0.225758865 -1.37578332
EXIT FROM BFGS code FG_LNSRCH 0. -0.240738899 0.
---------------------------------------------------
Iteration : 7
***********************************************
*** Learn Path 7
*** loss function: -0.57931298
*** contribution from regularisation: 0.00630317023
*** contribution from error: -0.585616171
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4761.95285 -0.240738899 -1.84084308
EXIT FROM BFGS code FG_LNSRCH 0. -0.228750318 0.
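Each "Learn Path" block above reports the entropy loss as the sum of the two printed contributions, error plus regularisation. The arithmetic can be checked directly against any of the blocks; the printout is single precision, so only about seven digits agree:

```python
# Verify the decomposition printed per iteration:
#   loss function = contribution from error + contribution from regularisation
# using the Learn Path 1 numbers from the log above.
error = -0.500129998
regularisation = 0.0137731787
loss = error + regularisation
# single-precision printout -> compare to ~1e-6, not machine precision
assert abs(loss - (-0.486356795)) < 1e-6
```

The error term is negative because the entropy loss is being minimised toward its (negative) optimum, while the positive regularisation contribution is the weight penalty, which shrinks as the weights settle.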
---------------------------------------------------
Iteration : 8
***********************************************
*** Learn Path 8
*** loss function: -0.578424811
*** contribution from regularisation: 0.00749380048
*** contribution from error: -0.585918605
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4754.65192 -0.228750318 -1.66603553
EXIT FROM BFGS code FG_LNSRCH 0. -0.225798696 0.
---------------------------------------------------
Iteration : 9
***********************************************
*** Learn Path 9
*** loss function: -0.579586983
*** contribution from regularisation: 0.00638798531
*** contribution from error: -0.585974991
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4764.20505 -0.225798696 -1.28169203
EXIT FROM BFGS code FG_LNSRCH 0. -0.225831464 0.
---------------------------------------------------
Iteration : 10
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 3 --> 9.95075893 sigma out 15 active outputs
RANK 2 NODE 1 --> 9.73710823 sigma out 15 active outputs
RANK 3 NODE 2 --> 4.95292568 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 12 --> 6.89275789 sigma in 3act. ( 4.76161671 sig out 1act.)
RANK 2 NODE 6 --> 6.3906517 sigma in 3act. ( 4.23088932 sig out 1act.)
RANK 3 NODE 10 --> 6.07500744 sigma in 3act. ( 4.29907846 sig out 1act.)
RANK 4 NODE 9 --> 4.67378616 sigma in 3act. ( 3.85115862 sig out 1act.)
RANK 5 NODE 14 --> 4.18464327 sigma in 3act. ( 3.19985867 sig out 1act.)
RANK 6 NODE 7 --> 3.63813853 sigma in 3act. ( 2.8198812 sig out 1act.)
RANK 7 NODE 13 --> 3.63724828 sigma in 3act. ( 2.2741003 sig out 1act.)
RANK 8 NODE 11 --> 2.9046669 sigma in 3act. ( 1.79234707 sig out 1act.)
RANK 9 NODE 3 --> 2.55741787 sigma in 3act. ( 2.41516948 sig out 1act.)
RANK 10 NODE 1 --> 2.23088336 sigma in 3act. ( 1.62558162 sig out 1act.)
RANK 11 NODE 2 --> 1.57042003 sigma in 3act. ( 1.89569473 sig out 1act.)
RANK 12 NODE 8 --> 1.46285248 sigma in 3act. ( 0.658231616 sig out 1act.)
RANK 13 NODE 5 --> 1.11857152 sigma in 3act. ( 0.822682559 sig out 1act.)
RANK 14 NODE 4 --> 0.863570452 sigma in 3act. ( 0.128630936 sig out 1act.)
RANK 15 NODE 15 --> 0.851768792 sigma in 3act. ( 0.467261314 sig out 1act.)
sorted by output significance
RANK 1 NODE 12 --> 4.76161671 sigma out 1act.( 6.89275789 sig in 3act.)
RANK 2 NODE 10 --> 4.29907846 sigma out 1act.( 6.07500744 sig in 3act.)
RANK 3 NODE 6 --> 4.23088932 sigma out 1act.( 6.3906517 sig in 3act.)
RANK 4 NODE 9 --> 3.85115862 sigma out 1act.( 4.67378616 sig in 3act.)
RANK 5 NODE 14 --> 3.19985867 sigma out 1act.( 4.18464327 sig in 3act.)
RANK 6 NODE 7 --> 2.8198812 sigma out 1act.( 3.63813853 sig in 3act.)
RANK 7 NODE 3 --> 2.41516948 sigma out 1act.( 2.55741787 sig in 3act.)
RANK 8 NODE 13 --> 2.2741003 sigma out 1act.( 3.63724828 sig in 3act.)
RANK 9 NODE 2 --> 1.89569473 sigma out 1act.( 1.57042003 sig in 3act.)
RANK 10 NODE 11 --> 1.79234707 sigma out 1act.( 2.9046669 sig in 3act.)
RANK 11 NODE 1 --> 1.62558162 sigma out 1act.( 2.23088336 sig in 3act.)
RANK 12 NODE 5 --> 0.822682559 sigma out 1act.( 1.11857152 sig in 3act.)
RANK 13 NODE 8 --> 0.658231616 sigma out 1act.( 1.46285248 sig in 3act.)
RANK 14 NODE 15 --> 0.467261314 sigma out 1act.( 0.851768792 sig in 3act.)
RANK 15 NODE 4 --> 0.128630936 sigma out 1act.( 0.863570452 sig in 3act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 10.6712685 sigma in 15 active inputs
***********************************************
*** Learn Path 10
*** loss function: -0.578849196
*** contribution from regularisation: 0.00712530827
*** contribution from error: -0.585974514
***********************************************
-----------------> Test sample
Iteration No: 10
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -4758.14037 -0.225831464 -1.33519411
EXIT FROM BFGS code FG_LNSRCH 0. -0.225798711 0.
---------------------------------------------------
Iteration : 11
***********************************************
*** Learn Path 11
*** loss function: -0.578979433
*** contribution from regularisation: 0.00699551404
*** contribution from error: -0.585974932
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4759.21116 -0.225798711 -1.42023432
EXIT FROM BFGS code FG_LNSRCH 0. -0.225798696 0.
---------------------------------------------------
Iteration : 12
***********************************************
*** Learn Path 12
*** loss function: -0.579455435
*** contribution from regularisation: 0.00651956815
*** contribution from error: -0.585974991
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4763.12343 -0.225798696 -1.33404577
EXIT FROM BFGS code NEW_X -4763.12343 -0.225798696 -1.33404577
ENTER BFGS code NEW_X -4763.12343 -0.225798696 -1.33404577
EXIT FROM BFGS code FG_LNSRCH 0. -0.240691245 0.
---------------------------------------------------
Iteration : 13
***********************************************
*** Learn Path 13
*** loss function: -0.579165101
*** contribution from regularisation: 0.00644488027
*** contribution from error: -0.585609972
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4760.73709 -0.240691245 -2.02063012
EXIT FROM BFGS code FG_LNSRCH 0. -0.227556586 0.
---------------------------------------------------
Iteration : 14
***********************************************
*** Learn Path 14
*** loss function: -0.579093277
*** contribution from regularisation: 0.00684867566
*** contribution from error: -0.58594197
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4760.14695 -0.227556586 -1.42254925
EXIT FROM BFGS code FG_LNSRCH 0. -0.225833893 0.
---------------------------------------------------
Iteration : 15
***********************************************
*** Learn Path 15
*** loss function: -0.579101861
*** contribution from regularisation: 0.0068725599
*** contribution from error: -0.585974395
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4760.2175 -0.225833893 -1.3793292
EXIT FROM BFGS code FG_LNSRCH 0. -0.225798711 0.
---------------------------------------------------
Iteration : 16
***********************************************
*** Learn Path 16
*** loss function: -0.579226434
*** contribution from regularisation: 0.00674851565
*** contribution from error: -0.585974932
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4761.24149 -0.225798711 -1.33184385
EXIT FROM BFGS code FG_LNSRCH 0. -0.225798696 0.
---------------------------------------------------
Iteration : 17
***********************************************
*** Learn Path 17
*** loss function: -0.579257011
*** contribution from regularisation: 0.00671794917
*** contribution from error: -0.585974932
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4761.49274 -0.225798696 -1.31280518
EXIT FROM BFGS code FG_LNSRCH 0. -0.225798696 0.
---------------------------------------------------
Iteration : 18
***********************************************
*** Learn Path 18
*** loss function: -0.578803539
*** contribution from regularisation: 0.00717140874
*** contribution from error: -0.585974932
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4757.76531 -0.225798696 -1.39963627
EXIT FROM BFGS code FG_LNSRCH 0. -0.225798696 0.
---------------------------------------------------
Iteration : 19
***********************************************
*** Learn Path 19
*** loss function: -0.579481065
*** contribution from regularisation: 0.0064939186
*** contribution from error: -0.585974991
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4763.33427 -0.225798696 -1.3586477
EXIT FROM BFGS code NEW_X -4763.33427 -0.225798696 -1.3586477
ENTER BFGS code NEW_X -4763.33427 -0.225798696 -1.3586477
EXIT FROM BFGS code FG_LNSRCH 0. -0.240636066 0.
---------------------------------------------------
Iteration : 20
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 3 --> 9.73781776 sigma out 15 active outputs
RANK 2 NODE 1 --> 9.1879015 sigma out 15 active outputs
RANK 3 NODE 2 --> 4.67489672 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 12 --> 6.50061083 sigma in 3act. ( 4.21487951 sig out 1act.)
RANK 2 NODE 6 --> 6.05550671 sigma in 3act. ( 3.76682115 sig out 1act.)
RANK 3 NODE 10 --> 6.0188098 sigma in 3act. ( 4.1940589 sig out 1act.)
RANK 4 NODE 9 --> 4.66818237 sigma in 3act. ( 3.9613781 sig out 1act.)
RANK 5 NODE 14 --> 3.92345619 sigma in 3act. ( 2.94532681 sig out 1act.)
RANK 6 NODE 7 --> 3.54972482 sigma in 3act. ( 2.73715973 sig out 1act.)
RANK 7 NODE 13 --> 3.5126307 sigma in 3act. ( 2.16443872 sig out 1act.)
RANK 8 NODE 11 --> 2.74884224 sigma in 3act. ( 1.66093421 sig out 1act.)
RANK 9 NODE 3 --> 2.31332755 sigma in 3act. ( 2.07033157 sig out 1act.)
RANK 10 NODE 1 --> 2.17319274 sigma in 3act. ( 1.59942257 sig out 1act.)
RANK 11 NODE 2 --> 1.40008926 sigma in 3act. ( 1.63783157 sig out 1act.)
RANK 12 NODE 8 --> 1.34555352 sigma in 3act. ( 0.569818437 sig out 1act.)
RANK 13 NODE 5 --> 1.06791914 sigma in 3act. ( 0.836622417 sig out 1act.)
RANK 14 NODE 15 --> 0.804986894 sigma in 3act. ( 0.463398129 sig out 1act.)
RANK 15 NODE 4 --> 0.793691456 sigma in 3act. ( 0.14778018 sig out 1act.)
sorted by output significance
RANK 1 NODE 12 --> 4.21487951 sigma out 1act.( 6.50061083 sig in 3act.)
RANK 2 NODE 10 --> 4.1940589 sigma out 1act.( 6.0188098 sig in 3act.)
RANK 3 NODE 9 --> 3.9613781 sigma out 1act.( 4.66818237 sig in 3act.)
RANK 4 NODE 6 --> 3.76682115 sigma out 1act.( 6.05550671 sig in 3act.)
RANK 5 NODE 14 --> 2.94532681 sigma out 1act.( 3.92345619 sig in 3act.)
RANK 6 NODE 7 --> 2.73715973 sigma out 1act.( 3.54972482 sig in 3act.)
RANK 7 NODE 13 --> 2.16443872 sigma out 1act.( 3.5126307 sig in 3act.)
RANK 8 NODE 3 --> 2.07033157 sigma out 1act.( 2.31332755 sig in 3act.)
RANK 9 NODE 11 --> 1.66093421 sigma out 1act.( 2.74884224 sig in 3act.)
RANK 10 NODE 2 --> 1.63783157 sigma out 1act.( 1.40008926 sig in 3act.)
RANK 11 NODE 1 --> 1.59942257 sigma out 1act.( 2.17319274 sig in 3act.)
RANK 12 NODE 5 --> 0.836622417 sigma out 1act.( 1.06791914 sig in 3act.)
RANK 13 NODE 8 --> 0.569818437 sigma out 1act.( 1.34555352 sig in 3act.)
RANK 14 NODE 15 --> 0.463398129 sigma out 1act.( 0.804986894 sig in 3act.)
RANK 15 NODE 4 --> 0.14778018 sigma out 1act.( 0.793691456 sig in 3act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 9.9817543 sigma in 15 active inputs
***********************************************
*** Learn Path 20
*** loss function: -0.579210997
*** contribution from regularisation: 0.00641178293
*** contribution from error: -0.585622787
***********************************************
-----------------> Test sample
Iteration No: 20
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -4761.11424 -0.240636066 -1.88808775
EXIT FROM BFGS code FG_LNSRCH 0. -0.227522269 0.
---------------------------------------------------
Iteration : 21
***********************************************
*** Learn Path 21
*** loss function: -0.578468084
*** contribution from regularisation: 0.00747550465
*** contribution from error: -0.58594358
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4755.00741 -0.227522269 -1.4745363
EXIT FROM BFGS code FG_LNSRCH 0. -0.225811154 0.
---------------------------------------------------
Iteration : 22
***********************************************
*** Learn Path 22
*** loss function: -0.579485595
*** contribution from regularisation: 0.00648919726
*** contribution from error: -0.585974813
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4763.37174 -0.225811154 -1.27529359
EXIT FROM BFGS code FG_LNSRCH 0. -0.225823671 0.
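The ENTER/EXIT BFGS lines show a reverse-communication optimizer: the task string FG_LNSRCH asks the caller to evaluate the loss and gradient at a trial point, NEW_X signals an accepted step, and the loop ends when the task becomes a convergence message. A simplified sketch of that driver shape, using plain gradient descent on a toy function rather than the actual (L-)BFGS code:

```python
def minimize(f, grad, x, lr=0.1, tol=1e-8, max_iter=1000):
    """Toy reverse-communication-style loop (illustrative, not NeuroBayes/BFGS)."""
    fx = f(x)
    for _ in range(max_iter):
        g = grad(x)             # "FG_LNSRCH": caller supplies f and gradient
        x_new = x - lr * g
        f_new = f(x_new)        # "NEW_X": trial step accepted
        if abs(fx - f_new) < tol:
            return x_new, f_new, "CONVERGENC"   # same truncated tag as the log
        x, fx = x_new, f_new
    return x, fx, "MAX_ITER"

# Minimize (x - 3)^2 from x = 0; converges near x = 3.
x, fx, task = minimize(lambda x: (x - 3.0) ** 2,
                       lambda x: 2.0 * (x - 3.0), 0.0)
print(task, round(x, 2))   # CONVERGENC 3.0
```

In the log this pattern converges at iteration 25 (the CONVERGENC task), after which only the final summary at iteration 250 is printed.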
---------------------------------------------------
Iteration : 23
***********************************************
*** Learn Path 23
*** loss function: -0.578941226
*** contribution from regularisation: 0.00703340955
*** contribution from error: -0.585974634
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4758.89712 -0.225823671 -1.27641666
EXIT FROM BFGS code FG_LNSRCH 0. -0.225811154 0.
---------------------------------------------------
Iteration : 24
***********************************************
*** Learn Path 24
*** loss function: -0.578823864
*** contribution from regularisation: 0.00715097459
*** contribution from error: -0.585974813
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4757.93193 -0.225811154 -1.38099051
EXIT FROM BFGS code FG_LNSRCH 0. -0.225811154 0.
---------------------------------------------------
Iteration : 25
***********************************************
*** Learn Path 25
*** loss function: -0.579244792
*** contribution from regularisation: 0.00672999071
*** contribution from error: -0.585974813
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -4761.39242 -0.225811154 -1.33904147
EXIT FROM BFGS code NEW_X -4761.39242 -0.225811154 -1.33904147
ENTER BFGS code NEW_X -4761.39242 -0.225811154 -1.33904147
EXIT FROM BFGS code CONVERGENC -4761.39242 -0.225811154 -1.33904147
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 3 --> 14.3279572 sigma out 15 active outputs
RANK 2 NODE 1 --> 14.0655155 sigma out 15 active outputs
RANK 3 NODE 2 --> 6.37939882 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 12 --> 10.3815308 sigma in 3act. ( 6.51361609 sig out 1act.)
RANK 2 NODE 6 --> 9.61530113 sigma in 3act. ( 5.64313555 sig out 1act.)
RANK 3 NODE 10 --> 8.74190807 sigma in 3act. ( 6.31367445 sig out 1act.)
RANK 4 NODE 9 --> 6.72274923 sigma in 3act. ( 5.5724082 sig out 1act.)
RANK 5 NODE 14 --> 6.08453846 sigma in 3act. ( 4.03507233 sig out 1act.)
RANK 6 NODE 13 --> 4.98943233 sigma in 3act. ( 2.9649694 sig out 1act.)
RANK 7 NODE 7 --> 4.48799944 sigma in 3act. ( 4.36376619 sig out 1act.)
RANK 8 NODE 11 --> 3.3093214 sigma in 3act. ( 2.68212819 sig out 1act.)
RANK 9 NODE 3 --> 3.28551984 sigma in 3act. ( 3.2072854 sig out 1act.)
RANK 10 NODE 1 --> 2.94092226 sigma in 3act. ( 2.11417937 sig out 1act.)
RANK 11 NODE 2 --> 2.00183606 sigma in 3act. ( 2.46298003 sig out 1act.)
RANK 12 NODE 8 --> 1.53422332 sigma in 3act. ( 0.942869842 sig out 1act.)
RANK 13 NODE 5 --> 1.2719692 sigma in 3act. ( 0.996304631 sig out 1act.)
RANK 14 NODE 15 --> 0.899635553 sigma in 3act. ( 0.521431029 sig out 1act.)
RANK 15 NODE 4 --> 0.858736992 sigma in 3act. ( 0.169702172 sig out 1act.)
sorted by output significance
RANK 1 NODE 12 --> 6.51361609 sigma out 1act.( 10.3815308 sig in 3act.)
RANK 2 NODE 10 --> 6.31367445 sigma out 1act.( 8.74190807 sig in 3act.)
RANK 3 NODE 6 --> 5.64313555 sigma out 1act.( 9.61530113 sig in 3act.)
RANK 4 NODE 9 --> 5.5724082 sigma out 1act.( 6.72274923 sig in 3act.)
RANK 5 NODE 7 --> 4.36376619 sigma out 1act.( 4.48799944 sig in 3act.)
RANK 6 NODE 14 --> 4.03507233 sigma out 1act.( 6.08453846 sig in 3act.)
RANK 7 NODE 3 --> 3.2072854 sigma out 1act.( 3.28551984 sig in 3act.)
RANK 8 NODE 13 --> 2.9649694 sigma out 1act.( 4.98943233 sig in 3act.)
RANK 9 NODE 11 --> 2.68212819 sigma out 1act.( 3.3093214 sig in 3act.)
RANK 10 NODE 2 --> 2.46298003 sigma out 1act.( 2.00183606 sig in 3act.)
RANK 11 NODE 1 --> 2.11417937 sigma out 1act.( 2.94092226 sig in 3act.)
RANK 12 NODE 5 --> 0.996304631 sigma out 1act.( 1.2719692 sig in 3act.)
RANK 13 NODE 8 --> 0.942869842 sigma out 1act.( 1.53422332 sig in 3act.)
RANK 14 NODE 15 --> 0.521431029 sigma out 1act.( 0.899635553 sig in 3act.)
RANK 15 NODE 4 --> 0.169702172 sigma out 1act.( 0.858736992 sig in 3act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 14.8155508 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.578940332
*** contribution from regularisation: 0.00703450106
*** contribution from error: -0.585974813
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 22767
Closing output file
done
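For monitoring a run like this one, the per-iteration loss values are easy to scrape from the teacher log and plot or check for the plateau visible above. A small sketch; the regular expression is an assumption about the exact printout spacing, not an official interface:

```python
import re

# Extract "loss function" values from NeuroBayes teacher log text so the
# training curve can be inspected (e.g. to spot the plateau near -0.579).
pattern = re.compile(r"\*\*\* loss function:\s*(-?\d+\.\d+)")

log_excerpt = """
*** Learn Path 1
*** loss function: -0.486356795
*** Learn Path 2
*** loss function: -0.553080261
"""

losses = [float(m) for m in pattern.findall(log_excerpt)]
print(losses)   # [-0.486356795, -0.553080261]
```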