NNInput NNInputs_160.root
Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree
NNAna::CopyTree: entries= 204464 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB= 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1
                 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0  weight: 1  SBRatio= 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0  weight: 1  SBRatio= 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig= 57529  nbkg= 146935
Bkg Entries: 146935   Sig Entries: 57529   Chosen entries: 57529
Signal fraction: 1   Background fraction: 0.391527
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 146935
Actual Signal Entries: 57529
Entries to split: 57529   Test with: 28764   Train with: 28764
*********************************************
* This product is licensed for educational  *
* and scientific use only. Commercial use   *
* is prohibited!                            *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
NBTreeReportInterval = 10000
NBTreePrepEvPrint = 1
Start Set Individual Variable Preprocessing
  Not touching individual preprocessing for Ht                (  0 ) in Neurobayes
  Not touching individual preprocessing for LepAPt            (  1 ) in Neurobayes
  Not touching individual preprocessing for LepBPt            (  2 ) in Neurobayes
  Not touching individual preprocessing for MetSigLeptonsJets (  3 ) in Neurobayes
  Not touching individual preprocessing for MetSpec           (  4 ) in Neurobayes
  Not touching individual preprocessing for SumEtLeptonsJets  (  5 ) in Neurobayes
  Not touching individual preprocessing for VSumJetLeptonsPt  (  6 ) in Neurobayes
  Not touching individual preprocessing for addEt             (  7 ) in Neurobayes
  Not touching individual preprocessing for dPhiLepSumMet     (  8 ) in Neurobayes
  Not touching individual preprocessing for dPhiLeptons       (  9 ) in Neurobayes
  Not touching individual preprocessing for dRLeptons         ( 10 ) in Neurobayes
  Not touching individual preprocessing for lep1_E            ( 11 ) in Neurobayes
  Not touching individual preprocessing for lep2_E            ( 12 ) in Neurobayes
End Set Individual Variable Preprocessing
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 57529 for Signal
Prepared event 0 for Signal with 57529 events
====Entry 0
  Variable Ht                : 120.598
  Variable LepAPt            : 29.9056
  Variable LepBPt            : 13.5082
  Variable MetSigLeptonsJets : 11.7141
  Variable MetSpec           : 77.1834
  Variable SumEtLeptonsJets  : 43.4138
  Variable VSumJetLeptonsPt  : 43.304
  Variable addEt             : 120.598
  Variable dPhiLepSumMet     : 2.82377
  Variable dPhiLeptons       : 0.153702
  Variable dRLeptons         : 0.732968
  Variable lep1_E            : 40.451
  Variable lep2_E            : 32.7603
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0  DEtaJ1Lep1 = 0  DEtaJ1Lep2 = 0  DEtaJ2Lep1 = 0
  DEtaJ2Lep2 = 0  DPhiJ1J2 = 0  DPhiJ1Lep1 = 0  DPhiJ1Lep2 = 0
  DPhiJ2Lep1 = 0  DPhiJ2Lep2 = 0  DRJ1J2 = 0  DRJ1Lep1 = 0
  DRJ1Lep2 = 0  DRJ2Lep1 = 0  DRJ2Lep2 = 0  DeltaRJet12 = 0
  File = 2160  Ht = 120.598  IsMEBase = 0  LRHWW = 0
  LRWW = 0  LRWg = 0  LRWj = 0  LRZZ = 0
  LepAEt = 29.9058  LepAPt = 29.9056  LepBEt = 13.5084  LepBPt = 13.5082
  LessCentralJetEta = 0  MJ1Lep1 = 0  MJ1Lep2 = 0  MJ2Lep1 = 0
  MJ2Lep2 = 0  NN = 0  Met = 77.1834  MetDelPhi = 2.77599
  MetSig = 6.78472  MetSigLeptonsJets = 11.7141  MetSpec = 77.1834  Mjj = 0
  MostCentralJetEta = 0  MtllMet = 139.14  Njets = 0  SB = 0
  SumEt = 129.415  SumEtJets = 0  SumEtLeptonsJets = 43.4138  Target = 1
  TrainWeight = 1  VSum2JetLeptonsPt = 0  VSum2JetPt = 0  VSumJetLeptonsPt = 43.304
  addEt = 120.598  dPhiLepSumMet = 2.82377  dPhiLeptons = 0.153702  dRLeptons = 0.732968
  diltype = 34  dimass = 15.0375  event = 2600  jet1_Et = 0
  jet1_eta = 0  jet2_Et = 0  jet2_eta = 0  lep1_E = 40.451
  lep2_E = 32.7603  rand = 0.999742  run = 233650  weight = 2.30027e-06
===Show End
Prepared event 10000 for Signal with 57529 events
Prepared event 20000 for Signal with 57529 events
Prepared event 30000 for Signal with 57529 events
Prepared event 40000 for Signal with 57529 events
Prepared event 50000 for Signal with 57529 events
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 146935 for Background
Prepared event 0 for Background with 146935 events
====Entry 0
  Variable Ht                : 85.3408
  Variable LepAPt            : 31.1294
  Variable LepBPt            : 10.0664
  Variable MetSigLeptonsJets : 6.87768
  Variable MetSpec           : 44.1437
  Variable SumEtLeptonsJets  : 41.1959
  Variable VSumJetLeptonsPt  : 41.0132
  Variable addEt             : 85.3408
  Variable dPhiLepSumMet     : 3.10536
  Variable dPhiLeptons       : 0.219342
  Variable dRLeptons         : 0.424454
  Variable lep1_E            : 32.3548
  Variable lep2_E            : 10.1027
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0  DEtaJ1Lep1 = 0  DEtaJ1Lep2 = 0  DEtaJ2Lep1 = 0
  DEtaJ2Lep2 = 0  DPhiJ1J2 = 0  DPhiJ1Lep1 = 0  DPhiJ1Lep2 = 0
  DPhiJ2Lep1 = 0  DPhiJ2Lep2 = 0  DRJ1J2 = 0  DRJ1Lep1 = 0
  DRJ1Lep2 = 0  DRJ2Lep1 = 0  DRJ2Lep2 = 0  DeltaRJet12 = 0
  File = 1  Ht = 85.3408  IsMEBase = 0  LRHWW = 0
  LRWW = 0  LRWg = 0  LRWj = 0  LRZZ = 0
  LepAEt = 31.1297  LepAPt = 31.1294  LepBEt = 10.0674  LepBPt = 10.0664
  LessCentralJetEta = 0  MJ1Lep1 = 0  MJ1Lep2 = 0  MJ2Lep1 = 0
  MJ2Lep2 = 0  NN = 0  Met = 44.1437  MetDelPhi = 3.01191
  MetSig = 3.74558  MetSigLeptonsJets = 6.87768  MetSpec = 44.1437  Mjj = 0
  MostCentralJetEta = 0  MtllMet = 86.2332  Njets = 0  SB = 0
  SumEt = 138.899  SumEtJets = 0  SumEtLeptonsJets = 41.1959  Target = 0
  TrainWeight = 0.307151  VSum2JetLeptonsPt = 0  VSum2JetPt = 0  VSumJetLeptonsPt = 41.0132
  addEt = 85.3408  dPhiLepSumMet = 3.10536  dPhiLeptons = 0.219342  dRLeptons = 0.424454
  diltype = 17  dimass = 7.54723  event = 6717520  jet1_Et = 0
  jet1_eta = 0  jet2_Et = 0  jet2_eta = 0  lep1_E = 32.3548
  lep2_E = 10.1027  rand = 0.999742  run = 271566  weight = 0.00296168
===Show End
Prepared event 10000 for Background with 146935 events
Prepared event 20000 for Background with 146935 events
Prepared event 30000 for Background with 146935 events
Prepared event 40000 for Background with 146935 events
Prepared event 50000 for Background with 146935 events
Prepared event 60000 for Background with 146935 events
Prepared event 70000 for Background with 146935 events
Prepared event 80000 for Background with 146935 events
Prepared event 90000 for Background with 146935 events
Prepared event 100000 for Background with 146935 events
Prepared event 110000 for Background with 146935 events
Prepared event 120000 for Background with 146935 events
Prepared event 130000 for Background with 146935 events
Prepared event 140000 for Background with 146935 events
Warning: found 4618 negative weights.
//////////////////////////////
  Phi-T(R) NeuroBayes(R) Teacher
  Algorithms by Michael Feindt
  Implementation by Phi-T
  Project 2001-2003, Copyright Phi-T GmbH
  Version 20080312
//////////////////////////////
Library compiled with:
  NB_MAXPATTERN = 1500000
  NB_MAXNODE    = 100
-----------------------------------
found 204464 samples to learn from
preprocessing flags/parameters:
  global preprocessing flag: 812
  individual preprocessing:
now perform preprocessing
*** called with option 12
*** This will do for you:
***   input variable equalisation
***   to Gaussian distribution with mean=0 and sigma=1
***   Then variables are decorrelated
************************************
Warning: found 4618 negative weights.
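The "option 12" preprocessing described above (equalise each input to a Gaussian with mean 0 and sigma 1, then linearly decorrelate the inputs) can be illustrated with a short sketch. This is a minimal toy illustration of the transformation the log describes, not NeuroBayes' licensed implementation; the function names and the generated data are our own.

```python
import numpy as np
from statistics import NormalDist

def flatten_to_gaussian(x):
    """Rank-transform x to a uniform distribution, then map it onto a
    standard Gaussian (mean 0, sigma 1) via the inverse normal CDF."""
    ranks = np.argsort(np.argsort(x))
    u = (ranks + 0.5) / len(x)          # uniform quantiles in (0, 1)
    nd = NormalDist()
    return np.array([nd.inv_cdf(p) for p in u])

def decorrelate(X):
    """Rotate and scale the (already Gaussian) inputs so their sample
    covariance matrix becomes the identity (linear decorrelation)."""
    cov = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    return (X - X.mean(axis=0)) @ eigvec / np.sqrt(eigval)

# Toy inputs: skewed, positive, and correlated -- loosely like Ht or MetSpec.
rng = np.random.default_rng(0)
raw = rng.exponential(scale=50.0, size=(5000, 2))
raw[:, 1] += 0.5 * raw[:, 0]            # introduce a correlation
gauss = np.column_stack([flatten_to_gaussian(raw[:, i]) for i in range(2)])
white = decorrelate(gauss)
```

After the two steps, each column of `white` has mean 0, unit variance, and (to numerical precision) zero cross-correlation, which is the state the teacher wants the inputs in before training.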
Signal fraction: 62.4729156 %
------------------------------
Transdef: flattening tables, one per input variable, each listing ~100
quantile bin edges used to equalise that input (full edge lists omitted;
first and last edges shown):
  variable  1: two-valued input, edges -1. and 1.
  variable  2: 47.3116226 .. 1486.89319
  variable  3: 20.0000362 .. 155.865875
  variable  4: 10.0000381 .. 72.3423386
  variable  5: 1.04102874 .. 26.9916515
  variable  6: 15.0050735 .. 738.435913
  variable  7: 30.1199875 .. 748.456726
  variable  8: 0.678276479 .. 744.690247
  variable  9: 47.3116226 .. 801.871094
  variable 10: 0.0050833188 .. 3.14159226
  variable 11: 1.19226625E-05 .. 1.13201857
  variable 12: 0.200000316 .. 1.14129901
  variable 13: 20.0361443 .. 232.717926
  variable 14: 10.0086575 .. 127.567856
------------------------------
COVARIANCE MATRIX (IN PERCENT)
       1      2      3      4      5      6      7      8      9     10     11     12     13     14
  0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0
  1  100.0   68.3   52.0   57.6   33.1   52.0   65.2   57.8   66.6  -25.0  -15.6  -32.5   16.8   34.5
  2   68.3  100.0   66.8   65.1   37.8   70.0   94.4   75.7   91.7  -42.7  -22.7  -43.4   31.0   42.3
  3   52.0   66.8  100.0   51.2   10.1   34.7   70.5   53.8   71.3  -16.3  -23.5  -46.5   62.9   33.4
  4   57.6   65.1   51.2  100.0   22.5   42.9   64.6   57.7   72.0  -11.2  -27.4  -55.1   23.2   75.8
  5   33.1   37.8   10.1   22.5  100.0   85.1   10.4   59.3   59.1   24.8   -8.0  -11.9   -4.3   12.1
  6   52.0   70.0   34.7   42.9   85.1  100.0   48.9   80.2   80.8   -5.1  -14.9  -26.4    9.8   26.5
  7   65.2   94.4   70.5   64.6   10.4   48.9  100.0   66.8   80.2  -51.8  -21.2  -42.8   36.1   43.4
  8   57.8   75.7   53.8   57.7   59.3   80.2   66.8  100.0   84.0   -9.1  -21.7  -38.2   24.0   38.8
  9   66.6   91.7   71.3   72.0   59.1   80.8   80.2   84.0  100.0  -18.0  -25.3  -48.5   35.0   48.9
 10  -25.0  -42.7  -16.3  -11.2   24.8   -5.1  -51.8   -9.1  -18.0  100.0    2.9    6.7   -5.6   -4.8
 11  -15.6  -22.7  -23.5  -27.4   -8.0  -14.9  -21.2  -21.7  -25.3    2.9  100.0   54.1   -5.8  -16.4
 12  -32.5  -43.4  -46.5  -55.1  -11.9  -26.4  -42.8  -38.2  -48.5    6.7   54.1  100.0  -17.1  -32.6
 13   16.8   31.0   62.9   23.2   -4.3    9.8   36.1   24.0   35.0   -5.6   -5.8  -17.1  100.0   50.0
 14   34.5   42.3   33.4   75.8   12.1   26.5   43.4   38.8   48.9   -4.8  -16.4  -32.6   50.0  100.0
TOTAL CORRELATION TO TARGET (diagonal): 172.980486
TOTAL CORRELATION OF ALL VARIABLES: 74.5876583
ROUND  1: MAX CORR ( 74.5864154 ) AFTER KILLING INPUT VARIABLE 14  CONTR 0.43059943
ROUND  2: MAX CORR ( 74.5827891 ) AFTER KILLING INPUT VARIABLE  6  CONTR 0.735476313
ROUND  3: MAX CORR ( 74.5782323 ) AFTER KILLING INPUT VARIABLE  8  CONTR 0.824439626
ROUND  4: MAX CORR ( 74.5652679 ) AFTER KILLING INPUT VARIABLE 11  CONTR 1.39052739
ROUND  5: MAX CORR ( 74.529216 )  AFTER KILLING INPUT VARIABLE  2  CONTR 2.31843316
ROUND  6: MAX CORR ( 74.3565274 ) AFTER KILLING INPUT VARIABLE 12  CONTR 5.07058913
ROUND  7: MAX CORR ( 74.1400131 ) AFTER KILLING INPUT VARIABLE 10  CONTR 5.67024042
ROUND  8: MAX CORR ( 73.4854334 ) AFTER KILLING INPUT VARIABLE 13  CONTR 9.83018896
ROUND  9: MAX CORR ( 72.2914312 ) AFTER KILLING INPUT VARIABLE  3  CONTR 13.1931003
ROUND 10: MAX CORR ( 72.0104011 ) AFTER KILLING INPUT VARIABLE  9  CONTR 6.36813608
ROUND 11: MAX CORR ( 70.4156759 ) AFTER KILLING INPUT VARIABLE  4  CONTR 15.0708477
ROUND 12: MAX CORR ( 65.2379052 ) AFTER KILLING INPUT VARIABLE  5  CONTR 26.502512
LAST REMAINING VARIABLE: 7
total correlation to target: 74.5876583 %
total significance: 160.671076 sigma
correlations of single variables to target:
  variable  2:  68.28421 %    , in sigma: 147.092666
  variable  3:  51.9521989 %  , in sigma: 111.911487
  variable  4:  57.5887966 %  , in sigma: 124.053418
  variable  5:  33.1396209 %  , in sigma: 71.3868578
  variable  6:  52.0482517 %  , in sigma: 112.118396
  variable  7:  65.2379052 %  , in sigma: 140.530547
  variable  8:  57.8408211 %  , in sigma: 124.59631
  variable  9:  66.6013945 %  , in sigma: 143.467673
  variable 10: -24.9953274 %  , in sigma: 53.8430385
  variable 11: -15.6303396 %  , in sigma: 33.669692
  variable 12: -32.4959147 %  , in sigma: 70.0002348
  variable 13:  16.7974717 %  , in sigma: 36.1838395
  variable 14:  34.5290586 %  , in sigma: 74.3798792
variables sorted by significance:
   1 most relevant variable  7  corr 65.2379074   , in sigma: 140.530552
   2 most relevant variable  5  corr 26.502512    , in sigma: 57.0897012
   3 most relevant variable  4  corr 15.0708475   , in sigma: 32.4644766
   4 most relevant variable  9  corr 6.36813593   , in sigma: 13.7177554
   5 most relevant variable  3  corr 13.1931      , in sigma: 28.4195753
   6 most relevant variable 13  corr 9.83018875   , in sigma: 21.175447
   7 most relevant variable 10  corr 5.6702404    , in sigma: 12.2144018
   8 most relevant variable 12  corr 5.07058907   , in sigma: 10.9226784
   9 most relevant variable  2  corr 2.31843305   , in sigma: 4.99419262
  10 most relevant variable 11  corr 1.39052737   , in sigma: 2.99536859
  11 most relevant variable  8  corr 0.824439645  , in sigma: 1.77594535
  12 most relevant variable  6  corr 0.735476315  , in sigma: 1.58430729
  13 most relevant variable 14  corr 0.430599421  , in sigma: 0.927564614
global correlations between input variables:
  variable  2: 99.1754367 %
  variable  3: 92.1685055 %
  variable  4: 92.9425723 %
  variable  5: 95.4722529 %
  variable  6: 95.4133982 %
  variable  7: 98.8070221 %
  variable  8: 89.914785 %
  variable  9: 98.674337 %
  variable 10: 71.7888242 %
  variable 11: 54.8533713 %
  variable 12: 71.4664964 %
  variable 13: 85.0032949 %
  variable 14: 89.510616 %
significance loss when removing single variables:
  variable  2: corr =  2.59316104 %  , sigma =  5.58599084
  variable  3: corr = 15.4515801 %   , sigma = 33.2846219
  variable  4: corr = 14.0567804 %   , sigma = 30.2800502
  variable  5: corr = 15.6566544 %   , sigma = 33.7263774
  variable  6: corr =  0.731392628 % , sigma =  1.57551053
  variable  7: corr =  7.46670931 %  , sigma = 16.0842189
  variable  8: corr =  1.04554564 %  , sigma =  2.25223511
  variable  9: corr = 11.5114846 %   , sigma = 24.7971671
  variable 10: corr =  5.19852016 %  , sigma = 11.1982578
  variable 11: corr =  1.4218528 %   , sigma =  3.06284746
  variable 12: corr =  3.96785472 %  , sigma =  8.5472517
  variable 13: corr =  6.82580244 %  , sigma = 14.7036259
  variable 14: corr =  0.43059943 %  , sigma =  0.927564633
Keep only 9 most significant input variables
-------------------------------------
Teacher: actual network topology:
  Nodes(1) = 10
  Nodes(2) = 15
  Nodes(3) = 1
-------------------------------------
---------------------------------------------------
Iteration : 1
SIGNIFICANCE OF OUTPUTS IN LAYER 1
  RANK  1  NODE  7 --> 24.983799  sigma out  15 active outputs
  RANK  2  NODE  9 --> 23.0182896 sigma out  15 active outputs
  RANK  3  NODE  4 --> 19.607439  sigma out  15 active outputs
  RANK  4  NODE  6 --> 17.7254009 sigma out  15 active outputs
  RANK  5  NODE  2 --> 17.6317444 sigma out  15 active outputs
  RANK  6  NODE  3 --> 16.30233   sigma out  15 active outputs
  RANK  7  NODE  1 --> 16.076416  sigma out  15 active outputs
  RANK  8  NODE  5 --> 11.9881449 sigma out  15 active outputs
  RANK  9  NODE 10 --> 11.7234087 sigma out  15 active outputs
  RANK 10  NODE  8 --> 11.2708807 sigma out  15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
  RANK  1  NODE  9 --> 27.5586262 sigma in 10 act. ( 29.1919441 sig out 1 act.)
  RANK  2  NODE 13 --> 24.6309853 sigma in 10 act. ( 30.7313251 sig out 1 act.)
RANK 3 NODE 3 --> 22.9840145 sigma in 10act. ( 20.8602238 sig out 1act.) RANK 4 NODE 4 --> 17.4308605 sigma in 10act. ( 17.6065426 sig out 1act.) RANK 5 NODE 2 --> 15.5086899 sigma in 10act. ( 13.3377028 sig out 1act.) RANK 6 NODE 10 --> 11.6899786 sigma in 10act. ( 6.31736422 sig out 1act.) RANK 7 NODE 14 --> 10.9685516 sigma in 10act. ( 3.91315556 sig out 1act.) RANK 8 NODE 12 --> 10.2367077 sigma in 10act. ( 6.30842781 sig out 1act.) RANK 9 NODE 5 --> 10.1154976 sigma in 10act. ( 4.56444597 sig out 1act.) RANK 10 NODE 7 --> 9.22131538 sigma in 10act. ( 0.288009524 sig out 1act.) RANK 11 NODE 8 --> 5.89740229 sigma in 10act. ( 1.54344094 sig out 1act.) RANK 12 NODE 15 --> 5.57414627 sigma in 10act. ( 10.9257498 sig out 1act.) RANK 13 NODE 6 --> 4.41824913 sigma in 10act. ( 0.282951295 sig out 1act.) RANK 14 NODE 1 --> 4.24186611 sigma in 10act. ( 4.10597992 sig out 1act.) RANK 15 NODE 11 --> 2.16728973 sigma in 10act. ( 0.00184144569 sig out 1 act.) sorted by output significance RANK 1 NODE 13 --> 30.7313251 sigma out 1act.( 24.6309853 sig in 10act.) RANK 2 NODE 9 --> 29.1919441 sigma out 1act.( 27.5586262 sig in 10act.) RANK 3 NODE 3 --> 20.8602238 sigma out 1act.( 22.9840145 sig in 10act.) RANK 4 NODE 4 --> 17.6065426 sigma out 1act.( 17.4308605 sig in 10act.) RANK 5 NODE 2 --> 13.3377028 sigma out 1act.( 15.5086899 sig in 10act.) RANK 6 NODE 15 --> 10.9257498 sigma out 1act.( 5.57414627 sig in 10act.) RANK 7 NODE 10 --> 6.31736422 sigma out 1act.( 11.6899786 sig in 10act.) RANK 8 NODE 12 --> 6.30842781 sigma out 1act.( 10.2367077 sig in 10act.) RANK 9 NODE 5 --> 4.56444597 sigma out 1act.( 10.1154976 sig in 10act.) RANK 10 NODE 1 --> 4.10597992 sigma out 1act.( 4.24186611 sig in 10act.) RANK 11 NODE 14 --> 3.91315556 sigma out 1act.( 10.9685516 sig in 10act.) RANK 12 NODE 8 --> 1.54344094 sigma out 1act.( 5.89740229 sig in 10act.) RANK 13 NODE 7 --> 0.288009524 sigma out 1act.( 9.22131538 sig in 10act.) 
RANK 14 NODE 6 --> 0.282951295 sigma out 1act.( 4.41824913 sig in 10act.)
RANK 15 NODE 11 --> 0.00184144569 sigma out 1act.( 2.16728973 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 54.5366478 sigma in 15 active inputs
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 7 --> 25.1688919 sigma out 15 active outputs
RANK 2 NODE 9 --> 22.0526085 sigma out 15 active outputs
RANK 3 NODE 2 --> 21.9428768 sigma out 15 active outputs
RANK 4 NODE 4 --> 18.3286152 sigma out 15 active outputs
RANK 5 NODE 6 --> 17.445488 sigma out 15 active outputs
RANK 6 NODE 1 --> 15.8533878 sigma out 15 active outputs
RANK 7 NODE 3 --> 13.9714928 sigma out 15 active outputs
RANK 8 NODE 5 --> 12.8799467 sigma out 15 active outputs
RANK 9 NODE 8 --> 11.0831289 sigma out 15 active outputs
RANK 10 NODE 10 --> 10.3703003 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 13 --> 30.467186 sigma in 10act. ( 29.1969471 sig out 1act.)
RANK 2 NODE 9 --> 26.4583282 sigma in 10act. ( 26.6409645 sig out 1act.)
RANK 3 NODE 4 --> 18.5534801 sigma in 10act. ( 16.54245 sig out 1act.)
RANK 4 NODE 3 --> 16.9934044 sigma in 10act. ( 17.7658577 sig out 1act.)
RANK 5 NODE 15 --> 14.6290092 sigma in 10act. ( 12.9274607 sig out 1act.)
RANK 6 NODE 2 --> 12.2656851 sigma in 10act. ( 11.5700226 sig out 1act.)
RANK 7 NODE 10 --> 9.40818787 sigma in 10act. ( 5.39222002 sig out 1act.)
RANK 8 NODE 12 --> 8.85203552 sigma in 10act. ( 5.42701626 sig out 1act.)
RANK 9 NODE 1 --> 7.48510981 sigma in 10act. ( 4.02736998 sig out 1act.)
RANK 10 NODE 5 --> 7.46709919 sigma in 10act. ( 3.64756989 sig out 1act.)
RANK 11 NODE 14 --> 7.168571 sigma in 10act. ( 2.9021821 sig out 1act.)
RANK 12 NODE 8 --> 7.11469221 sigma in 10act. ( 1.08199024 sig out 1act.)
RANK 13 NODE 7 --> 6.43411207 sigma in 10act. ( 0.927177727 sig out 1act.)
RANK 14 NODE 6 --> 4.44722462 sigma in 10act. ( 1.07964039 sig out 1act.)
RANK 15 NODE 11 --> 4.09552574 sigma in 10act.
( 0.222615525 sig out 1act.)
sorted by output significance
RANK 1 NODE 13 --> 29.1969471 sigma out 1act.( 30.467186 sig in 10act.)
RANK 2 NODE 9 --> 26.6409645 sigma out 1act.( 26.4583282 sig in 10act.)
RANK 3 NODE 3 --> 17.7658577 sigma out 1act.( 16.9934044 sig in 10act.)
RANK 4 NODE 4 --> 16.54245 sigma out 1act.( 18.5534801 sig in 10act.)
RANK 5 NODE 15 --> 12.9274607 sigma out 1act.( 14.6290092 sig in 10act.)
RANK 6 NODE 2 --> 11.5700226 sigma out 1act.( 12.2656851 sig in 10act.)
RANK 7 NODE 12 --> 5.42701626 sigma out 1act.( 8.85203552 sig in 10act.)
RANK 8 NODE 10 --> 5.39222002 sigma out 1act.( 9.40818787 sig in 10act.)
RANK 9 NODE 1 --> 4.02736998 sigma out 1act.( 7.48510981 sig in 10act.)
RANK 10 NODE 5 --> 3.64756989 sigma out 1act.( 7.46709919 sig in 10act.)
RANK 11 NODE 14 --> 2.9021821 sigma out 1act.( 7.168571 sig in 10act.)
RANK 12 NODE 8 --> 1.08199024 sigma out 1act.( 7.11469221 sig in 10act.)
RANK 13 NODE 6 --> 1.07964039 sigma out 1act.( 4.44722462 sig in 10act.)
RANK 14 NODE 7 --> 0.927177727 sigma out 1act.( 6.43411207 sig in 10act.)
RANK 15 NODE 11 --> 0.222615525 sigma out 1act.( 4.09552574 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 50.5191727 sigma in 15 active inputs
***********************************************
*** Learn Path 1
*** loss function: -0.382778645
*** contribution from regularisation: 0.00327460212
*** contribution from error: -0.386053234
***********************************************
-----------------> Test sample
---------------------------------------------------
Iteration : 2
***********************************************
*** Learn Path 2
*** loss function: -0.434577882
*** contribution from regularisation: 0.0016139443
*** contribution from error: -0.436191827
***********************************************
-----------------> Test sample
ENTER BFGS code START -44437.9882 -0.335330248 0.0910517648
EXIT FROM BFGS code FG_START 0. -0.335330248 0.
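Each "Learn Path" block prints the loss function together with its two pieces: the printed loss is simply the error term plus the regularisation contribution. A quick check in plain Python (illustrative only, not NeuroBayes code) against the numbers from Learn Paths 1 and 2 above; the values agree to the precision printed in the log:

```python
def total_loss(error, regularisation):
    """The log's "loss function" is the error term plus the
    regularisation contribution."""
    return error + regularisation

# (error, regularisation, printed loss) copied from Learn Paths 1 and 2 above
learn_paths = [
    (-0.386053234, 0.00327460212, -0.382778645),
    (-0.436191827, 0.0016139443,  -0.434577882),
]
for err, reg, printed in learn_paths:
    assert abs(total_loss(err, reg) - printed) < 1e-6
```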
---------------------------------------------------
Iteration : 3
***********************************************
*** Learn Path 3
*** loss function: -0.465442777
*** contribution from regularisation: 0.00166934473
*** contribution from error: -0.467112124
***********************************************
-----------------> Test sample
ENTER BFGS code FG_START -47583.1448 -0.335330248 -295.062775
EXIT FROM BFGS code FG_LNSRCH 0. -0.364040911 0.
---------------------------------------------------
Iteration : 4
***********************************************
*** Learn Path 4
*** loss function: -0.531616569
*** contribution from regularisation: 0.00254565896
*** contribution from error: -0.534162223
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -54348.2271 -0.364040911 -325.100128
EXIT FROM BFGS code NEW_X -54348.2271 -0.364040911 -325.100128
ENTER BFGS code NEW_X -54348.2271 -0.364040911 -325.100128
EXIT FROM BFGS code FG_LNSRCH 0. -0.42364952 0.
---------------------------------------------------
Iteration : 5
***********************************************
*** Learn Path 5
*** loss function: -0.548274279
*** contribution from regularisation: 0.00310231582
*** contribution from error: -0.551376581
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56051.1774 -0.42364952 -115.81031
EXIT FROM BFGS code NEW_X -56051.1774 -0.42364952 -115.81031
ENTER BFGS code NEW_X -56051.1774 -0.42364952 -115.81031
EXIT FROM BFGS code FG_LNSRCH 0. -0.454618543 0.
---------------------------------------------------
Iteration : 6
***********************************************
*** Learn Path 6
*** loss function: -0.551546097
*** contribution from regularisation: 0.00280374824
*** contribution from error: -0.55434984
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56385.6588 -0.454618543 7.41130352
EXIT FROM BFGS code NEW_X -56385.6588 -0.454618543 7.41130352
ENTER BFGS code NEW_X -56385.6588 -0.454618543 7.41130352
EXIT FROM BFGS code FG_LNSRCH 0. -0.458117962 0.
---------------------------------------------------
Iteration : 7
***********************************************
*** Learn Path 7
*** loss function: -0.552719533
*** contribution from regularisation: 0.00252630026
*** contribution from error: -0.555245817
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56505.6247 -0.458117962 13.985076
EXIT FROM BFGS code NEW_X -56505.6247 -0.458117962 13.985076
ENTER BFGS code NEW_X -56505.6247 -0.458117962 13.985076
EXIT FROM BFGS code FG_LNSRCH 0. -0.45612675 0.
---------------------------------------------------
Iteration : 8
***********************************************
*** Learn Path 8
*** loss function: -0.553182781
*** contribution from regularisation: 0.00244449521
*** contribution from error: -0.555627286
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56552.984 -0.45612675 51.8019485
EXIT FROM BFGS code NEW_X -56552.984 -0.45612675 51.8019485
ENTER BFGS code NEW_X -56552.984 -0.45612675 51.8019485
EXIT FROM BFGS code FG_LNSRCH 0. -0.440274864 0.
---------------------------------------------------
Iteration : 9
***********************************************
*** Learn Path 9
*** loss function: -0.553795218
*** contribution from regularisation: 0.00244973158
*** contribution from error: -0.556244969
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56615.5908 -0.440274864 43.2790985
EXIT FROM BFGS code NEW_X -56615.5908 -0.440274864 43.2790985
ENTER BFGS code NEW_X -56615.5908 -0.440274864 43.2790985
EXIT FROM BFGS code FG_LNSRCH 0. -0.372463048 0.
---------------------------------------------------
Iteration : 10
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 56.3822746 sigma out 15 active outputs
RANK 2 NODE 1 --> 36.6228523 sigma out 15 active outputs
RANK 3 NODE 4 --> 30.3959351 sigma out 15 active outputs
RANK 4 NODE 7 --> 26.3214893 sigma out 15 active outputs
RANK 5 NODE 8 --> 21.8508453 sigma out 15 active outputs
RANK 6 NODE 10 --> 20.8996048 sigma out 15 active outputs
RANK 7 NODE 9 --> 19.0985298 sigma out 15 active outputs
RANK 8 NODE 5 --> 16.0333385 sigma out 15 active outputs
RANK 9 NODE 3 --> 13.6524849 sigma out 15 active outputs
RANK 10 NODE 6 --> 10.8250942 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 15 --> 59.9289169 sigma in 10act. ( 59.6593819 sig out 1act.)
RANK 2 NODE 13 --> 41.4999008 sigma in 10act. ( 30.6892929 sig out 1act.)
RANK 3 NODE 4 --> 35.6207848 sigma in 10act. ( 34.1134605 sig out 1act.)
RANK 4 NODE 6 --> 19.5590038 sigma in 10act. ( 14.9789295 sig out 1act.)
RANK 5 NODE 2 --> 19.1637154 sigma in 10act. ( 19.6635323 sig out 1act.)
RANK 6 NODE 1 --> 12.4335642 sigma in 10act. ( 11.828784 sig out 1act.)
RANK 7 NODE 9 --> 11.4934435 sigma in 10act. ( 10.3793039 sig out 1act.)
RANK 8 NODE 7 --> 10.0312243 sigma in 10act. ( 8.45402718 sig out 1act.)
RANK 9 NODE 5 --> 9.19647408 sigma in 10act. ( 9.02894497 sig out 1act.)
RANK 10 NODE 10 --> 7.62284851 sigma in 10act. ( 6.42130852 sig out 1act.)
RANK 11 NODE 3 --> 6.25478125 sigma in 10act. ( 5.28191233 sig out 1act.)
RANK 12 NODE 8 --> 4.94045401 sigma in 10act. ( 2.05091381 sig out 1act.)
RANK 13 NODE 14 --> 4.63099003 sigma in 10act. ( 2.36037302 sig out 1act.)
RANK 14 NODE 12 --> 4.46440268 sigma in 10act. ( 0.983109653 sig out 1act.)
RANK 15 NODE 11 --> 3.05074358 sigma in 10act. ( 0.835556746 sig out 1act.)
sorted by output significance
RANK 1 NODE 15 --> 59.6593819 sigma out 1act.( 59.9289169 sig in 10act.)
RANK 2 NODE 4 --> 34.1134605 sigma out 1act.( 35.6207848 sig in 10act.)
RANK 3 NODE 13 --> 30.6892929 sigma out 1act.( 41.4999008 sig in 10act.)
RANK 4 NODE 2 --> 19.6635323 sigma out 1act.( 19.1637154 sig in 10act.)
RANK 5 NODE 6 --> 14.9789295 sigma out 1act.( 19.5590038 sig in 10act.)
RANK 6 NODE 1 --> 11.828784 sigma out 1act.( 12.4335642 sig in 10act.)
RANK 7 NODE 9 --> 10.3793039 sigma out 1act.( 11.4934435 sig in 10act.)
RANK 8 NODE 5 --> 9.02894497 sigma out 1act.( 9.19647408 sig in 10act.)
RANK 9 NODE 7 --> 8.45402718 sigma out 1act.( 10.0312243 sig in 10act.)
RANK 10 NODE 10 --> 6.42130852 sigma out 1act.( 7.62284851 sig in 10act.)
RANK 11 NODE 3 --> 5.28191233 sigma out 1act.( 6.25478125 sig in 10act.)
RANK 12 NODE 14 --> 2.36037302 sigma out 1act.( 4.63099003 sig in 10act.)
RANK 13 NODE 8 --> 2.05091381 sigma out 1act.( 4.94045401 sig in 10act.)
RANK 14 NODE 12 --> 0.983109653 sigma out 1act.( 4.46440268 sig in 10act.)
RANK 15 NODE 11 --> 0.835556746 sigma out 1act.( 3.05074358 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 82.2012329 sigma in 15 active inputs
***********************************************
*** Learn Path 10
*** loss function: -0.555145502
*** contribution from regularisation: 0.00253729965
*** contribution from error: -0.557682812
***********************************************
-----------------> Test sample
Iteration No: 10
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -56753.6371 -0.372463048 53.1776733
EXIT FROM BFGS code NEW_X -56753.6371 -0.372463048 53.1776733
ENTER BFGS code NEW_X -56753.6371 -0.372463048 53.1776733
EXIT FROM BFGS code FG_LNSRCH 0. -0.336115658 0.
---------------------------------------------------
Iteration : 11
***********************************************
*** Learn Path 11
*** loss function: -0.556064069
*** contribution from regularisation: 0.00231232448
*** contribution from error: -0.558376372
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56847.5403 -0.336115658 -28.4605541
EXIT FROM BFGS code NEW_X -56847.5403 -0.336115658 -28.4605541
ENTER BFGS code NEW_X -56847.5403 -0.336115658 -28.4605541
EXIT FROM BFGS code FG_LNSRCH 0. -0.342603147 0.
---------------------------------------------------
Iteration : 12
***********************************************
*** Learn Path 12
*** loss function: -0.556773186
*** contribution from regularisation: 0.00216277363
*** contribution from error: -0.55893594
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56920.0364 -0.342603147 -22.2756271
EXIT FROM BFGS code NEW_X -56920.0364 -0.342603147 -22.2756271
ENTER BFGS code NEW_X -56920.0364 -0.342603147 -22.2756271
EXIT FROM BFGS code FG_LNSRCH 0. -0.351301551 0.
---------------------------------------------------
Iteration : 13
***********************************************
*** Learn Path 13
*** loss function: -0.556927443
*** contribution from regularisation: 0.00216957997
*** contribution from error: -0.559097052
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56935.8063 -0.351301551 7.93039799
EXIT FROM BFGS code NEW_X -56935.8063 -0.351301551 7.93039799
ENTER BFGS code NEW_X -56935.8063 -0.351301551 7.93039799
EXIT FROM BFGS code FG_LNSRCH 0. -0.35099715 0.
---------------------------------------------------
Iteration : 14
***********************************************
*** Learn Path 14
*** loss function: -0.557075083
*** contribution from regularisation: 0.00216606469
*** contribution from error: -0.559241176
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56950.8997 -0.35099715 5.97764683
EXIT FROM BFGS code NEW_X -56950.8997 -0.35099715 5.97764683
ENTER BFGS code NEW_X -56950.8997 -0.35099715 5.97764683
EXIT FROM BFGS code FG_LNSRCH 0. -0.351575702 0.
---------------------------------------------------
Iteration : 15
***********************************************
*** Learn Path 15
*** loss function: -0.557305396
*** contribution from regularisation: 0.00215583248
*** contribution from error: -0.559461236
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56974.4466 -0.351575702 40.8060188
EXIT FROM BFGS code NEW_X -56974.4466 -0.351575702 40.8060188
ENTER BFGS code NEW_X -56974.4466 -0.351575702 40.8060188
EXIT FROM BFGS code FG_LNSRCH 0. -0.336733162 0.
---------------------------------------------------
Iteration : 16
***********************************************
*** Learn Path 16
*** loss function: -0.557520688
*** contribution from regularisation: 0.0021711397
*** contribution from error: -0.559691846
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -56996.4565 -0.336733162 -31.0449047
EXIT FROM BFGS code NEW_X -56996.4565 -0.336733162 -31.0449047
ENTER BFGS code NEW_X -56996.4565 -0.336733162 -31.0449047
EXIT FROM BFGS code FG_LNSRCH 0. -0.336556286 0.
---------------------------------------------------
Iteration : 17
***********************************************
*** Learn Path 17
*** loss function: -0.557680905
*** contribution from regularisation: 0.00223424798
*** contribution from error: -0.559915125
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57012.8343 -0.336556286 -0.483931988
EXIT FROM BFGS code NEW_X -57012.8343 -0.336556286 -0.483931988
ENTER BFGS code NEW_X -57012.8343 -0.336556286 -0.483931988
EXIT FROM BFGS code FG_LNSRCH 0. -0.331335753 0.
---------------------------------------------------
Iteration : 18
***********************************************
*** Learn Path 18
*** loss function: -0.557768226
*** contribution from regularisation: 0.00226146192
*** contribution from error: -0.560029685
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57021.7625 -0.331335753 11.5136642
EXIT FROM BFGS code NEW_X -57021.7625 -0.331335753 11.5136642
ENTER BFGS code NEW_X -57021.7625 -0.331335753 11.5136642
EXIT FROM BFGS code FG_LNSRCH 0. -0.322379261 0.
---------------------------------------------------
Iteration : 19
***********************************************
*** Learn Path 19
*** loss function: -0.55787003
*** contribution from regularisation: 0.00229291175
*** contribution from error: -0.560162961
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57032.1673 -0.322379261 21.1497879
EXIT FROM BFGS code NEW_X -57032.1673 -0.322379261 21.1497879
ENTER BFGS code NEW_X -57032.1673 -0.322379261 21.1497879
EXIT FROM BFGS code FG_LNSRCH 0. -0.299551934 0.
---------------------------------------------------
Iteration : 20
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 68.7729187 sigma out 15 active outputs
RANK 2 NODE 1 --> 41.1151352 sigma out 15 active outputs
RANK 3 NODE 4 --> 36.5363083 sigma out 15 active outputs
RANK 4 NODE 10 --> 30.612709 sigma out 15 active outputs
RANK 5 NODE 8 --> 26.9400997 sigma out 15 active outputs
RANK 6 NODE 9 --> 26.5033989 sigma out 15 active outputs
RANK 7 NODE 3 --> 24.119772 sigma out 15 active outputs
RANK 8 NODE 7 --> 21.8701153 sigma out 15 active outputs
RANK 9 NODE 6 --> 15.8613548 sigma out 15 active outputs
RANK 10 NODE 5 --> 14.1673002 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 15 --> 79.8628006 sigma in 10act. ( 74.1287842 sig out 1act.)
RANK 2 NODE 13 --> 45.4662399 sigma in 10act. ( 42.4343491 sig out 1act.)
RANK 3 NODE 4 --> 41.0934563 sigma in 10act. ( 42.8894234 sig out 1act.)
RANK 4 NODE 2 --> 21.2500439 sigma in 10act. ( 22.555233 sig out 1act.)
RANK 5 NODE 6 --> 18.7163734 sigma in 10act. ( 15.7542744 sig out 1act.)
RANK 6 NODE 5 --> 14.9039888 sigma in 10act. ( 17.3905201 sig out 1act.)
RANK 7 NODE 1 --> 13.9041033 sigma in 10act. ( 14.4486208 sig out 1act.)
RANK 8 NODE 7 --> 10.0331726 sigma in 10act. ( 8.34505558 sig out 1act.)
RANK 9 NODE 10 --> 8.00798321 sigma in 10act. ( 7.83372402 sig out 1act.)
RANK 10 NODE 9 --> 6.50728989 sigma in 10act. ( 5.06117105 sig out 1act.)
RANK 11 NODE 8 --> 5.07214642 sigma in 10act. ( 3.97436523 sig out 1act.)
RANK 12 NODE 14 --> 4.5641551 sigma in 10act. ( 3.89290309 sig out 1act.)
RANK 13 NODE 3 --> 3.46840525 sigma in 10act. ( 2.18794727 sig out 1act.)
RANK 14 NODE 12 --> 3.18164396 sigma in 10act. ( 0.541838109 sig out 1act.)
RANK 15 NODE 11 --> 2.38426781 sigma in 10act. ( 1.35735893 sig out 1act.)
sorted by output significance
RANK 1 NODE 15 --> 74.1287842 sigma out 1act.( 79.8628006 sig in 10act.)
RANK 2 NODE 4 --> 42.8894234 sigma out 1act.( 41.0934563 sig in 10act.)
RANK 3 NODE 13 --> 42.4343491 sigma out 1act.( 45.4662399 sig in 10act.)
RANK 4 NODE 2 --> 22.555233 sigma out 1act.( 21.2500439 sig in 10act.)
RANK 5 NODE 5 --> 17.3905201 sigma out 1act.( 14.9039888 sig in 10act.)
RANK 6 NODE 6 --> 15.7542744 sigma out 1act.( 18.7163734 sig in 10act.)
RANK 7 NODE 1 --> 14.4486208 sigma out 1act.( 13.9041033 sig in 10act.)
RANK 8 NODE 7 --> 8.34505558 sigma out 1act.( 10.0331726 sig in 10act.)
RANK 9 NODE 10 --> 7.83372402 sigma out 1act.( 8.00798321 sig in 10act.)
RANK 10 NODE 9 --> 5.06117105 sigma out 1act.( 6.50728989 sig in 10act.)
RANK 11 NODE 8 --> 3.97436523 sigma out 1act.( 5.07214642 sig in 10act.)
RANK 12 NODE 14 --> 3.89290309 sigma out 1act.( 4.5641551 sig in 10act.)
RANK 13 NODE 3 --> 2.18794727 sigma out 1act.( 3.46840525 sig in 10act.)
RANK 14 NODE 11 --> 1.35735893 sigma out 1act.( 2.38426781 sig in 10act.)
RANK 15 NODE 12 --> 0.541838109 sigma out 1act.( 3.18164396 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 102.945992 sigma in 15 active inputs
***********************************************
*** Learn Path 20
*** loss function: -0.558056831
*** contribution from regularisation: 0.00237283274
*** contribution from error: -0.560429692
***********************************************
-----------------> Test sample
Iteration No: 20
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -57051.2679 -0.299551934 24.115654
EXIT FROM BFGS code NEW_X -57051.2679 -0.299551934 24.115654
ENTER BFGS code NEW_X -57051.2679 -0.299551934 24.115654
EXIT FROM BFGS code FG_LNSRCH 0. -0.255626291 0.
---------------------------------------------------
Iteration : 21
***********************************************
*** Learn Path 21
*** loss function: -0.55817461
*** contribution from regularisation: 0.00245899009
*** contribution from error: -0.5606336
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57063.3051 -0.255626291 -38.2414551
EXIT FROM BFGS code NEW_X -57063.3051 -0.255626291 -38.2414551
ENTER BFGS code NEW_X -57063.3051 -0.255626291 -38.2414551
EXIT FROM BFGS code FG_LNSRCH 0. -0.260091662 0.
---------------------------------------------------
Iteration : 22
***********************************************
*** Learn Path 22
*** loss function: -0.558308005
*** contribution from regularisation: 0.00242407131
*** contribution from error: -0.560732067
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57076.9452 -0.260091662 14.6503639
EXIT FROM BFGS code NEW_X -57076.9452 -0.260091662 14.6503639
ENTER BFGS code NEW_X -57076.9452 -0.260091662 14.6503639
EXIT FROM BFGS code FG_LNSRCH 0. -0.266390353 0.
---------------------------------------------------
Iteration : 23
***********************************************
*** Learn Path 23
*** loss function: -0.558347404
*** contribution from regularisation: 0.00239906297
*** contribution from error: -0.560746491
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57080.9701 -0.266390353 12.3735332
EXIT FROM BFGS code NEW_X -57080.9701 -0.266390353 12.3735332
ENTER BFGS code NEW_X -57080.9701 -0.266390353 12.3735332
EXIT FROM BFGS code FG_LNSRCH 0. -0.259129614 0.
---------------------------------------------------
Iteration : 24
***********************************************
*** Learn Path 24
*** loss function: -0.558435559
*** contribution from regularisation: 0.00232782261
*** contribution from error: -0.560763359
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57089.9846 -0.259129614 -40.3237686
EXIT FROM BFGS code NEW_X -57089.9846 -0.259129614 -40.3237686
ENTER BFGS code NEW_X -57089.9846 -0.259129614 -40.3237686
EXIT FROM BFGS code FG_LNSRCH 0. -0.259228051 0.
---------------------------------------------------
Iteration : 25
***********************************************
*** Learn Path 25
*** loss function: -0.558406651
*** contribution from regularisation: 0.00238207332
*** contribution from error: -0.560788751
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57087.0301 -0.259228051 -28.8391094
EXIT FROM BFGS code FG_LNSRCH 0. -0.259150267 0.
---------------------------------------------------
Iteration : 26
***********************************************
*** Learn Path 26
*** loss function: -0.558254242
*** contribution from regularisation: 0.00252251094
*** contribution from error: -0.56077677
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57071.4503 -0.259150267 -37.6932983
EXIT FROM BFGS code FG_LNSRCH 0. -0.25913018 0.
---------------------------------------------------
Iteration : 27
***********************************************
*** Learn Path 27
*** loss function: -0.558317065
*** contribution from regularisation: 0.00244664121
*** contribution from error: -0.560763717
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57077.8678 -0.25913018 -40.4367447
EXIT FROM BFGS code FG_LNSRCH 0. -0.259129614 0.
---------------------------------------------------
Iteration : 28
***********************************************
*** Learn Path 28
*** loss function: -0.558328986
*** contribution from regularisation: 0.00243438594
*** contribution from error: -0.560763359
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57079.0904 -0.259129614 -40.5023003
EXIT FROM BFGS code FG_LNSRCH 0. -0.259129614 0.
---------------------------------------------------
Iteration : 29
***********************************************
*** Learn Path 29
*** loss function: -0.558310151
*** contribution from regularisation: 0.00245321705
*** contribution from error: -0.560763359
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57077.1652 -0.259129614 -40.537838
EXIT FROM BFGS code FG_LNSRCH 0. -0.259129614 0.
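The ENTER/EXIT "BFGS code" lines are the trace of a reverse-communication quasi-Newton driver; the task strings (FG_START, FG_LNSRCH, NEW_X, CONVERGENC) indicate whether the driver is requesting a fresh function/gradient evaluation during a line search, has accepted a new point, or has converged. The NeuroBayes optimiser itself is not public, so as a generic, illustrative sketch of the BFGS method it reports on (all names here are ours, not NeuroBayes API), here is a minimal inverse-Hessian BFGS with an Armijo backtracking line search, applied to a small quadratic:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """Minimal BFGS: maintain an inverse-Hessian approximation H and
    take quasi-Newton steps damped by an Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break                      # converged ("CONVERGENC" in the log)
        p = -H @ g                     # quasi-Newton search direction
        t = 1.0                        # line search ("FG_LNSRCH" in the log)
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s                  # accepted point ("NEW_X" in the log)
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Small convex quadratic: f(x) = 0.5 x^T A x - b^T x, minimiser A^{-1} b = (0.2, 0.4)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_min = bfgs(f, grad, np.zeros(2))
```

The stretches of repeated FG_LNSRCH exits near convergence above correspond to the line search shrinking the step while the objective barely changes.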
---------------------------------------------------
Iteration : 30
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 72.5667877 sigma out 15 active outputs
RANK 2 NODE 1 --> 44.5612602 sigma out 15 active outputs
RANK 3 NODE 4 --> 36.5437088 sigma out 15 active outputs
RANK 4 NODE 10 --> 32.8870926 sigma out 15 active outputs
RANK 5 NODE 9 --> 29.6499386 sigma out 15 active outputs
RANK 6 NODE 3 --> 27.8605194 sigma out 15 active outputs
RANK 7 NODE 8 --> 27.7208042 sigma out 15 active outputs
RANK 8 NODE 7 --> 22.4946117 sigma out 15 active outputs
RANK 9 NODE 6 --> 15.0558023 sigma out 15 active outputs
RANK 10 NODE 5 --> 13.489687 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 15 --> 85.5608521 sigma in 10act. ( 76.8267288 sig out 1act.)
RANK 2 NODE 13 --> 46.7608337 sigma in 10act. ( 43.5012741 sig out 1act.)
RANK 3 NODE 4 --> 41.8854141 sigma in 10act. ( 42.7343903 sig out 1act.)
RANK 4 NODE 2 --> 21.5707359 sigma in 10act. ( 23.294096 sig out 1act.)
RANK 5 NODE 6 --> 18.9453506 sigma in 10act. ( 16.5092564 sig out 1act.)
RANK 6 NODE 5 --> 18.174572 sigma in 10act. ( 21.8743973 sig out 1act.)
RANK 7 NODE 1 --> 16.2251244 sigma in 10act. ( 16.878397 sig out 1act.)
RANK 8 NODE 7 --> 10.731142 sigma in 10act. ( 9.57246494 sig out 1act.)
RANK 9 NODE 10 --> 10.2779064 sigma in 10act. ( 10.2190189 sig out 1act.)
RANK 10 NODE 8 --> 5.72347879 sigma in 10act. ( 4.81700802 sig out 1act.)
RANK 11 NODE 9 --> 5.33769131 sigma in 10act. ( 3.16673994 sig out 1act.)
RANK 12 NODE 14 --> 4.52777481 sigma in 10act. ( 4.02151585 sig out 1act.)
RANK 13 NODE 3 --> 3.08457947 sigma in 10act. ( 1.65486085 sig out 1act.)
RANK 14 NODE 12 --> 2.97965765 sigma in 10act. ( 0.939664602 sig out 1act.)
RANK 15 NODE 11 --> 2.59312749 sigma in 10act. ( 2.09722757 sig out 1act.)
sorted by output significance
RANK 1 NODE 15 --> 76.8267288 sigma out 1act.( 85.5608521 sig in 10act.)
RANK 2 NODE 13 --> 43.5012741 sigma out 1act.( 46.7608337 sig in 10act.)
RANK 3 NODE 4 --> 42.7343903 sigma out 1act.( 41.8854141 sig in 10act.)
RANK 4 NODE 2 --> 23.294096 sigma out 1act.( 21.5707359 sig in 10act.)
RANK 5 NODE 5 --> 21.8743973 sigma out 1act.( 18.174572 sig in 10act.)
RANK 6 NODE 1 --> 16.878397 sigma out 1act.( 16.2251244 sig in 10act.)
RANK 7 NODE 6 --> 16.5092564 sigma out 1act.( 18.9453506 sig in 10act.)
RANK 8 NODE 10 --> 10.2190189 sigma out 1act.( 10.2779064 sig in 10act.)
RANK 9 NODE 7 --> 9.57246494 sigma out 1act.( 10.731142 sig in 10act.)
RANK 10 NODE 8 --> 4.81700802 sigma out 1act.( 5.72347879 sig in 10act.)
RANK 11 NODE 14 --> 4.02151585 sigma out 1act.( 4.52777481 sig in 10act.)
RANK 12 NODE 9 --> 3.16673994 sigma out 1act.( 5.33769131 sig in 10act.)
RANK 13 NODE 11 --> 2.09722757 sigma out 1act.( 2.59312749 sig in 10act.)
RANK 14 NODE 3 --> 1.65486085 sigma out 1act.( 3.08457947 sig in 10act.)
RANK 15 NODE 12 --> 0.939664602 sigma out 1act.( 2.97965765 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 107.018501 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.558318317
*** contribution from regularisation: 0.00244504563
*** contribution from error: -0.560763359
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -57078.0006 -0.259129614 -40.5624199
EXIT FROM BFGS code FG_LNSRCH 0. -0.259129614 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.558317065
*** contribution from regularisation: 0.00244633667
*** contribution from error: -0.560763419
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57077.8686 -0.259129614 -40.6151619
EXIT FROM BFGS code NEW_X -57077.8686 -0.259129614 -40.6151619
ENTER BFGS code NEW_X -57077.8686 -0.259129614 -40.6151619
EXIT FROM BFGS code CONVERGENC -57077.8686 -0.259129614 -40.6151619
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 102.580452 sigma out 15 active outputs
RANK 2 NODE 1 --> 64.1776962 sigma out 15 active outputs
RANK 3 NODE 4 --> 53.4288177 sigma out 15 active outputs
RANK 4 NODE 10 --> 46.3687973 sigma out 15 active outputs
RANK 5 NODE 9 --> 43.0211906 sigma out 15 active outputs
RANK 6 NODE 3 --> 40.6261902 sigma out 15 active outputs
RANK 7 NODE 8 --> 40.5891495 sigma out 15 active outputs
RANK 8 NODE 7 --> 32.3623772 sigma out 15 active outputs
RANK 9 NODE 6 --> 21.3113213 sigma out 15 active outputs
RANK 10 NODE 5 --> 18.7704735 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 15 --> 121.381393 sigma in 10act. ( 114.748421 sig out 1act.)
RANK 2 NODE 13 --> 67.7317276 sigma in 10act. ( 64.2866592 sig out 1act.)
RANK 3 NODE 4 --> 61.4211731 sigma in 10act. ( 63.1677437 sig out 1act.)
RANK 4 NODE 2 --> 31.3380337 sigma in 10act. ( 34.2983284 sig out 1act.)
RANK 5 NODE 6 --> 26.9745865 sigma in 10act. ( 24.504528 sig out 1act.)
RANK 6 NODE 5 --> 26.3912563 sigma in 10act. ( 31.9207001 sig out 1act.)
RANK 7 NODE 1 --> 23.2306023 sigma in 10act. ( 24.6596489 sig out 1act.)
RANK 8 NODE 7 --> 15.1410418 sigma in 10act. ( 13.9099112 sig out 1act.)
RANK 9 NODE 10 --> 14.614109 sigma in 10act. ( 15.0317993 sig out 1act.)
RANK 10 NODE 8 --> 7.82156706 sigma in 10act. ( 7.10630655 sig out 1act.)
RANK 11 NODE 9 --> 6.12527037 sigma in 10act. ( 4.76139069 sig out 1act.)
RANK 12 NODE 14 --> 5.95719433 sigma in 10act. ( 5.93909073 sig out 1act.)
RANK 13 NODE 3 --> 3.46603918 sigma in 10act. ( 2.43256021 sig out 1act.)
RANK 14 NODE 11 --> 3.21760368 sigma in 10act. ( 2.97483301 sig out 1act.)
RANK 15 NODE 12 --> 3.16589069 sigma in 10act. ( 1.38307464 sig out 1act.)
sorted by output significance
RANK 1 NODE 15 --> 114.748421 sigma out 1act.( 121.381393 sig in 10act.)
RANK 2 NODE 13 --> 64.2866592 sigma out 1act.( 67.7317276 sig in 10act.)
RANK 3 NODE 4 --> 63.1677437 sigma out 1act.( 61.4211731 sig in 10act.)
RANK 4 NODE 2 --> 34.2983284 sigma out 1act.( 31.3380337 sig in 10act.)
RANK 5 NODE 5 --> 31.9207001 sigma out 1act.( 26.3912563 sig in 10act.)
RANK 6 NODE 1 --> 24.6596489 sigma out 1act.( 23.2306023 sig in 10act.)
RANK 7 NODE 6 --> 24.504528 sigma out 1act.( 26.9745865 sig in 10act.)
RANK 8 NODE 10 --> 15.0317993 sigma out 1act.( 14.614109 sig in 10act.)
RANK 9 NODE 7 --> 13.9099112 sigma out 1act.( 15.1410418 sig in 10act.)
RANK 10 NODE 8 --> 7.10630655 sigma out 1act.( 7.82156706 sig in 10act.)
RANK 11 NODE 14 --> 5.93909073 sigma out 1act.( 5.95719433 sig in 10act.)
RANK 12 NODE 9 --> 4.76139069 sigma out 1act.( 6.12527037 sig in 10act.)
RANK 13 NODE 11 --> 2.97483301 sigma out 1act.( 3.21760368 sig in 10act.)
RANK 14 NODE 3 --> 2.43256021 sigma out 1act.( 3.46603918 sig in 10act.)
RANK 15 NODE 12 --> 1.38307464 sigma out 1act.( 3.16589069 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 158.866547 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.558325946
*** contribution from regularisation: 0.0024374458
*** contribution from error: -0.560763419
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 28444
Closing output file
done
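The "Keep only 9 most significant input variables" step earlier in the log drops the inputs whose removal costs the least significance. As an illustrative sketch (plain Python; the actual NeuroBayes selection criterion is internal), ranking the sigma values from the "significance loss when removing single variables" table yields a 9-variable selection consistent with the reported topology of Nodes(1) = 10 (nine kept inputs plus one bias node):

```python
# Sigma values from the "significance loss when removing single variables"
# table above, keyed by variable number (variable 1 is not listed there).
sigma = {
    2: 5.58599084, 3: 33.2846219, 4: 30.2800502, 5: 33.7263774,
    6: 1.57551053, 7: 16.0842189, 8: 2.25223511, 9: 24.7971671,
    10: 11.1982578, 11: 3.06284746, 12: 8.5472517, 13: 14.7036259,
    14: 0.927564633,
}

def keep_most_significant(sig, keep=9):
    """Indices of the `keep` variables with the largest significance loss."""
    ranked = sorted(sig, key=sig.get, reverse=True)
    return sorted(ranked[:keep])

kept = keep_most_significant(sigma)
# 9 kept inputs plus a bias node match the reported Nodes(1) = 10.
```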