NNInput NNInputs_120.root
Options for steering
  Constraint : lep1_E<400&&lep2_E<400&&
  HiLoSbString : SB
  SbString : Target
  WeightString : TrainWeight
  EqualizeSB : 0
  EvaluateVariables : 0
  SetNBProcessingDefault : 1
  UseNeuroBayes : 1
  WeightEvents : 1
  NBTreePrepEvPrint : 1
  NBTreePrepReportInterval : 10000
  NB_Iter : 250
  NBZero999Tol : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree
NNAna::CopyTree: entries= 186030 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1  BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 46230 nbkg = 139800
Bkg Entries: 139800   Sig Entries: 46230   Chosen entries: 46230
Signal fraction: 1   Background fraction: 0.330687
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 139800
Actual Signal Entries: 46230
Entries to split: 46230   Test with : 23115   Train with : 23115
 *********************************************
 * This product is licenced for educational  *
 * and scientific use only. Commercial use   *
 * is prohibited !                           *
 *********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
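The block above records the whole event-selection bookkeeping: one steering cut (lep1_E<400&&lep2_E<400&&), signal and background trees copied from the 186030-entry input with Target==1 / Target==0 appended to that cut, and the 46230 chosen signal entries split evenly into 23115 test and 23115 training events. A minimal ROOT sketch of that bookkeeping is given below; it is an illustration, not the NNAna code, and the tree name "nnTree" and the output file name are placeholders that the log does not reveal.

    // Illustrative sketch of the NNAna::CopyTree step reported above.
    // "nnTree" and "CopiedTrees.root" are placeholder names (not in the log).
    #include "TFile.h"
    #include "TString.h"
    #include "TTree.h"
    #include <iostream>

    int main() {
      TFile* in = TFile::Open("NNInputs_120.root");
      TTree* tree = (TTree*)in->Get("nnTree");            // placeholder tree name

      TString cut = "lep1_E<400&&lep2_E<400&&";            // steering Constraint
      TString sigCut = cut + "Target==1";                  // SigChoice in the log
      TString bkgCut = cut + "Target==0";                  // BkgChoice in the log

      TFile out("CopiedTrees.root", "RECREATE");           // CopyTree writes into the current directory
      TTree* sig = tree->CopyTree(sigCut.Data());          // 46230 entries in the log
      TTree* bkg = tree->CopyTree(bkgCut.Data());          // 139800 entries in the log

      // Even split of the chosen signal entries into test and training halves.
      Long64_t nsig = sig->GetEntries();
      Long64_t ntest = nsig / 2;
      Long64_t ntrain = nsig - ntest;                      // 23115 / 23115 in the log
      std::cout << "nsig = " << nsig << "  nbkg = " << bkg->GetEntries()
                << "  test = " << ntest << "  train = " << ntrain << std::endl;

      out.Write();
      out.Close();
      return 0;
    }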
NBTreeReportInterval = 10000
NBTreePrepEvPrint = 1
Start Set Inidividual Variable Preprocessing
  Not touching individual preprocessing for Ht ( 0 ) in Neurobayes
  Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes
  Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes
  Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes
  Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes
  Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes
  Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes
  Not touching individual preprocessing for addEt ( 7 ) in Neurobayes
  Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes
  Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes
  Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes
  Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes
  Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes
End Set Inidividual Variable Preprocessing
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 46230 for Signal
Prepared event 0 for Signal with 46230 events
====Entry 0
  Variable Ht : 156.497
  Variable LepAPt : 24.4462
  Variable LepBPt : 15.2546
  Variable MetSigLeptonsJets : 4.03888
  Variable MetSpec : 26.0087
  Variable SumEtLeptonsJets : 113.473
  Variable VSumJetLeptonsPt : 37.0367
  Variable addEt : 82.7245
  Variable dPhiLepSumMet : 0.85328
  Variable dPhiLeptons : 0.330941
  Variable dRLeptons : 0.538617
  Variable lep1_E : 30.9903
  Variable lep2_E : 15.9065
===Show Start ======> EVENT:0
  DEtaJ1J2 = 0   DEtaJ1Lep1 = 0   DEtaJ1Lep2 = 0   DEtaJ2Lep1 = 0   DEtaJ2Lep2 = 0
  DPhiJ1J2 = 0   DPhiJ1Lep1 = 0   DPhiJ1Lep2 = 0   DPhiJ2Lep1 = 0   DPhiJ2Lep2 = 0
  DRJ1J2 = 0   DRJ1Lep1 = 0   DRJ1Lep2 = 0   DRJ2Lep1 = 0   DRJ2Lep2 = 0
  DeltaRJet12 = 0   File = 2120   Ht = 156.497   IsMEBase = 0
  LRHWW = 0   LRWW = 0   LRWg = 0   LRWj = 0   LRZZ = 0
  LepAEt = 24.4462   LepAPt = 24.4462   LepBEt = 15.2546   LepBPt = 15.2546
  LessCentralJetEta = 0   MJ1Lep1 = 0   MJ1Lep2 = 0   MJ2Lep1 = 0   MJ2Lep2 = 0
  NN = 0   Met = 43.0236   MetDelPhi = 0.649165   MetSig = 3.00203
  MetSigLeptonsJets = 4.03888   MetSpec = 26.0087   Mjj = 0   MostCentralJetEta = -1.64867
  MtllMet = 86.7807   Njets = 1   SB = 0   SumEt = 205.392   SumEtJets = 0
  SumEtLeptonsJets = 113.473   Target = 1   TrainWeight = 1
  VSum2JetLeptonsPt = 0   VSum2JetPt = 0   VSumJetLeptonsPt = 37.0367
  addEt = 82.7245   dPhiLepSumMet = 0.85328   dPhiLeptons = 0.330941   dRLeptons = 0.538617
  diltype = 45   dimass = 10.4324   event = 71
  jet1_Et = 73.7722   jet1_eta = 0   jet2_Et = 0   jet2_eta = 0
  lep1_E = 30.9903   lep2_E = 15.9065   rand = 0.999742   run = 232428   weight = 1.10299e-06
===Show End
Prepared event 10000 for Signal with 46230 events
Prepared event 20000 for Signal with 46230 events
Prepared event 30000 for Signal with 46230 events
Prepared event 40000 for Signal with 46230 events
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 139800 for Background
Prepared event 0 for Background with 139800 events
====Entry 0
  Variable Ht : 85.3408
  Variable LepAPt : 31.1294
  Variable LepBPt : 10.0664
  Variable MetSigLeptonsJets : 6.87768
  Variable MetSpec : 44.1437
  Variable SumEtLeptonsJets : 41.1959
  Variable VSumJetLeptonsPt : 41.0132
  Variable addEt : 85.3408
  Variable dPhiLepSumMet : 3.10536
  Variable dPhiLeptons : 0.219342
  Variable dRLeptons : 0.424454
  Variable lep1_E : 32.3548
  Variable lep2_E : 10.1027
===Show Start ======> EVENT:0
  DEtaJ1J2 = 0   DEtaJ1Lep1 = 0   DEtaJ1Lep2 = 0   DEtaJ2Lep1 = 0   DEtaJ2Lep2 = 0
  DPhiJ1J2 = 0   DPhiJ1Lep1 = 0   DPhiJ1Lep2 = 0   DPhiJ2Lep1 = 0   DPhiJ2Lep2 = 0
  DRJ1J2 = 0   DRJ1Lep1 = 0   DRJ1Lep2 = 0   DRJ2Lep1 = 0   DRJ2Lep2 = 0
  DeltaRJet12 = 0   File = 1   Ht = 85.3408   IsMEBase = 0
  LRHWW = 0   LRWW = 0   LRWg = 0   LRWj = 0   LRZZ = 0
  LepAEt = 31.1297   LepAPt = 31.1294   LepBEt = 10.0674   LepBPt = 10.0664
  LessCentralJetEta = 0   MJ1Lep1 = 0   MJ1Lep2 = 0   MJ2Lep1 = 0   MJ2Lep2 = 0
  NN = 0   Met = 44.1437   MetDelPhi = 3.01191   MetSig = 3.74558
  MetSigLeptonsJets = 6.87768   MetSpec = 44.1437   Mjj = 0   MostCentralJetEta = 0
  MtllMet = 86.2332   Njets = 0   SB = 0   SumEt = 138.899   SumEtJets = 0
  SumEtLeptonsJets = 41.1959   Target = 0   TrainWeight = 0.270596
  VSum2JetLeptonsPt = 0   VSum2JetPt = 0   VSumJetLeptonsPt = 41.0132
  addEt = 85.3408   dPhiLepSumMet = 3.10536   dPhiLeptons = 0.219342   dRLeptons = 0.424454
  diltype = 17   dimass = 7.54723   event = 6717520
  jet1_Et = 0   jet1_eta = 0   jet2_Et = 0   jet2_eta = 0
  lep1_E = 32.3548   lep2_E = 10.1027   rand = 0.999742   run = 271566   weight = 0.00296168
===Show End
Prepared event 10000 for Background with 139800 events
Prepared event 20000 for Background with 139800 events
Prepared event 30000 for Background with 139800 events
Prepared event 40000 for Background with 139800 events
Prepared event 50000 for Background with 139800 events
Prepared event 60000 for Background with 139800 events
Prepared event 70000 for Background with 139800 events
Prepared event 80000 for Background with 139800 events
Prepared event 90000 for Background with 139800 events
Prepared event 100000 for Background with 139800 events
Prepared event 110000 for Background with 139800 events
Prepared event 120000 for Background with 139800 events
Prepared event 130000 for Background with 139800 events
Warning: found 4402 negative weights.

[Phi-T NeuroBayes(R) Teacher ASCII-art banner]
Phi-T(R) NeuroBayes(R) Teacher
Algorithms by Michael Feindt
Implementation by Phi-T
Project 2001-2003
Copyright Phi-T GmbH
Version 20080312
Library compiled with: NB_MAXPATTERN= 1500000  NB_MAXNODE = 100
-----------------------------------
found 186030 samples to learn from
preprocessing flags/parameters:
  global preprocessing flag: 812
  individual preprocessing:
now perform preprocessing
*** called with option 12
*** This will do for you:
***   input variable equalisation
***   to Gaussian distribution with mean=0 and sigma=1
***   Then variables are decorrelated
************************************
Warning: found 4402 negative weights.
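The preprocessing announced above ("called with option 12", inside global flag 812) equalises each input variable to a flat distribution, maps it onto a Gaussian with mean 0 and sigma 1, and then decorrelates the transformed inputs. The plain C++ sketch below illustrates the first two steps for a single variable; it is not NeuroBayes code. The decorrelation step would additionally rotate the Gaussian-transformed inputs using their covariance matrix, and the "Transdef" tables printed next are the kind of quantile bin edges such a rank transform is built from.

    // Sketch (not NeuroBayes code) of "input variable equalisation to a Gaussian
    // distribution with mean=0 and sigma=1": map every value to its quantile in
    // the training sample and push that quantile through the inverse normal CDF.
    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Inverse of the standard normal CDF, solved with Newton steps on std::erf.
    double inverseNormalCdf(double p) {
      const double sqrt2 = std::sqrt(2.0);
      const double sqrt2pi = std::sqrt(2.0 * std::acos(-1.0));
      double x = 0.0;
      for (int i = 0; i < 50; ++i) {
        double cdf = 0.5 * (1.0 + std::erf(x / sqrt2));
        double pdf = std::exp(-0.5 * x * x) / sqrt2pi;
        double step = (cdf - p) / pdf;
        x -= step;
        if (std::fabs(step) < 1e-10) break;
      }
      return x;
    }

    // Equalise one input variable: value -> empirical quantile -> standard Gaussian.
    std::vector<double> equaliseToGaussian(const std::vector<double>& values) {
      std::vector<double> sorted(values);
      std::sort(sorted.begin(), sorted.end());
      std::vector<double> out(values.size());
      for (std::size_t i = 0; i < values.size(); ++i) {
        std::size_t rank =
            std::lower_bound(sorted.begin(), sorted.end(), values[i]) - sorted.begin();
        double quantile = (rank + 0.5) / sorted.size();   // kept strictly inside (0,1)
        out[i] = inverseNormalCdf(quantile);
      }
      return out;
    }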
Signal fraction: 62.3769035 %
------------------------------
Transdef tables: 101 quantile bin edges per variable (the full numeric tables are condensed here to the first and last tabulated values):
  variable  1: -1. ... 1.
  variable  2: 60.0046387 ... 893.967896
  variable  3: 20.0000362 ... 228.385986
  variable  4: 10.0000496 ... 71.4863358
  variable  5: 1.0228225 ... 18.5645885
  variable  6: 15.0018101 ... 250.623764
  variable  7: 30.1229668 ... 521.871643
  variable  8: 2.839571 ... 406.55069
  variable  9: 49.6214828 ... 446.331207
  variable 10: 0.00602381537 ... 3.1415906
  variable 11: 1.13248825E-05 ... 1.1301049
  variable 12: 0.200001657 ... 1.13539624
  variable 13: 20.0025482 ... 232.717926
  variable 14: 10.0084028 ... 128.659836
------------------------------
COVARIANCE MATRIX (IN PERCENT)
          1      2      3      4      5      6      7      8      9     10     11     12     13     14
   0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0
   1  100.0   54.1   39.5   29.8    6.6   31.6   52.4   36.9   49.8  -29.8   -7.6  -21.2    4.0   -2.2
   2   54.1  100.0   62.9   45.3   14.9   57.1   93.9   61.9   89.5  -53.0  -18.6  -40.3   28.7   15.8
   3   39.5   62.9  100.0   21.5   -7.2   23.3   66.3   44.4   68.5  -21.7  -19.5  -44.8   62.5    2.8
   4   29.8   45.3   21.5  100.0   -2.0   20.0   47.3   34.6   49.8  -14.1  -23.5  -48.7    6.3   65.8
   5    6.6   14.9   -7.2   -2.0  100.0   79.6  -12.9   44.1   43.0   29.0    2.3    4.3  -12.1   -6.8
   6   31.6   57.1   23.3   20.0   79.6  100.0   34.4   70.4   73.9   -4.3   -7.3  -16.0    3.7    2.5
   7   52.4   93.9   66.3   47.3  -12.9   34.4  100.0   55.2   76.8  -59.6  -18.4  -41.3   33.5   19.2
   8   36.9   61.9   44.4   34.6   44.1   70.4   55.2  100.0   74.9  -10.9  -16.6  -31.4   20.0   13.5
   9   49.8   89.5   68.5   49.8   43.0   73.9   76.8   74.9  100.0  -27.5  -20.5  -45.5   33.4   19.7
  10  -29.8  -53.0  -21.7  -14.1   29.0   -4.3  -59.6  -10.9  -27.5  100.0    4.0   11.1   -7.4   -2.1
  11   -7.6  -18.6  -19.5  -23.5    2.3   -7.3  -18.4  -16.6  -20.5    4.0  100.0   54.4   -9.2  -14.1
  12  -21.2  -40.3  -44.8  -48.7    4.3  -16.0  -41.3  -31.4  -45.5   11.1   54.4  100.0  -22.2  -24.6
  13    4.0   28.7   62.5    6.3  -12.1    3.7   33.5   20.0   33.4   -7.4   -9.2  -22.2  100.0   38.0
  14   -2.2   15.8    2.8   65.8   -6.8    2.5   19.2   13.5   19.7   -2.1  -14.1  -24.6   38.0  100.0
TOTAL CORRELATION TO TARGET (diagonal) 120.070851
TOTAL CORRELATION OF ALL VARIABLES 59.9741458
ROUND  1: MAX CORR ( 59.9601536) AFTER KILLING INPUT VARIABLE  7  CONTR 1.29543246
ROUND  2: MAX CORR ( 59.9523563) AFTER KILLING INPUT VARIABLE  5  CONTR 0.966954562
ROUND  3: MAX CORR ( 59.9320393) AFTER KILLING INPUT VARIABLE 11  CONTR 1.56066804
ROUND  4: MAX CORR ( 59.8504925) AFTER KILLING INPUT VARIABLE  8  CONTR 3.12536153
ROUND  5: MAX CORR ( 59.7304805) AFTER KILLING INPUT VARIABLE 10  CONTR 3.78829185
ROUND  6: MAX CORR ( 59.4711561) AFTER KILLING INPUT VARIABLE  9  CONTR 5.55984679
ROUND  7: MAX CORR ( 59.4431263) AFTER KILLING INPUT VARIABLE  6  CONTR 1.8256884
ROUND  8: MAX CORR ( 58.9931951) AFTER KILLING INPUT VARIABLE 12  CONTR 7.29987679
ROUND  9: MAX CORR ( 58.4827618) AFTER KILLING INPUT VARIABLE 14  CONTR 7.74361914
ROUND 10: MAX CORR ( 58.1509489) AFTER KILLING INPUT VARIABLE  4  CONTR 6.22097783
ROUND 11: MAX CORR ( 55.3993126) AFTER KILLING INPUT VARIABLE  3  CONTR 17.6762276
ROUND 12: MAX CORR ( 54.0765707) AFTER KILLING INPUT VARIABLE 13  CONTR 12.0336334
LAST REMAINING VARIABLE: 2
[A stand-alone sketch of this backward-elimination ranking is given after the first significance listing below.]
total correlation to target: 59.9741458 %
total significance: 115.735722 sigma
correlations of single variables to target:
  variable  2:  54.0765707 %    in sigma: 104.354816
  variable  3:  39.4695879 %    in sigma: 76.1668412
  variable  4:  29.7909185 %    in sigma: 57.4893299
  variable  5:   6.61678365 %   in sigma: 12.768806
  variable  6:  31.5821635 %    in sigma: 60.9460031
  variable  7:  52.4024419 %    in sigma: 101.124148
  variable  8:  36.9155635 %    in sigma: 71.2381863
  variable  9:  49.8495301 %    in sigma: 96.1976408
  variable 10: -29.7590352 %    in sigma: 57.4278027
  variable 11:  -7.55687674 %   in sigma: 14.5829602
  variable 12: -21.1912075 %    in sigma: 40.8939495
  variable 13:   3.97845721 %   in sigma: 7.67746852
  variable 14:  -2.19528836 %   in sigma: 4.23638018
variables sorted by significance:
   1  most relevant  variable  2   corr 54.0765724    in sigma: 104.354819
   2  most relevant  variable 13   corr 12.0336332    in sigma: 23.2220269
   3  most relevant  variable  3   corr 17.6762276    in sigma: 34.110881
   4  most relevant  variable  4   corr 6.22097778    in sigma: 12.0049955
   5  most relevant  variable 14   corr 7.74361897    in sigma: 14.9433279
   6  most relevant  variable 12   corr 7.29987669    in sigma: 14.0870117
   7  most relevant  variable  6   corr 1.82568836    in sigma: 3.5231408
   8  most relevant  variable  9   corr 5.55984688    in sigma: 10.7291714
   9  most relevant  variable 10   corr 3.78829193    in sigma: 7.31049512
  10  most relevant  variable  8   corr 3.12536144    in sigma: 6.03119823
  11  most relevant  variable 11   corr 1.56066799    in sigma: 3.01171503
  12  most relevant  variable  5   corr 0.966954589   in sigma: 1.86599051
  13  most relevant  variable  7   corr 1.29543245    in sigma: 2.49987402
global correlations between input variables:
  variable  2: 98.9865764 %
  variable  3: 93.4500476 %
  variable  4: 89.478057 %
  variable  5: 94.6012974 %
  variable  6: 93.1603533 %
  variable  7: 98.4687627 %
  variable  8: 85.0156523 %
  variable  9: 98.7208576 %
  variable 10: 72.0420337 %
  variable 11: 55.5113974 %
  variable 12: 72.3474311 %
  variable 13: 83.9237933 %
  variable 14: 85.3563797 %
significance loss when removing single variables:
  variable  2: corr =  4.95344847 %   sigma =  9.55896787
  variable  3: corr = 11.6555649 %    sigma = 22.4924457
  variable  4: corr = 11.8965099 %    sigma = 22.9574117
  variable  5: corr =  1.32729828 %   sigma =  2.56136744
  variable  6: corr =  3.7488392 %    sigma =  7.23436081
  variable  7: corr =  1.29543246 %   sigma =  2.49987404
  variable  8: corr =  2.23586615 %   sigma =  4.31468558
  variable  9: corr =  3.95067526 %   sigma =  7.623856
  variable 10: corr =  3.78064073 %   sigma =  7.29573014
  variable 11: corr =  1.48722268 %   sigma =  2.86998318
  variable 12: corr =  5.15060085 %   sigma =  9.9394247
  variable 13: corr =  8.65707074 %   sigma = 16.7060708
  variable 14: corr =  7.94129478 %   sigma = 15.3247949
Keep only 10 most significant input variables
-------------------------------------
Teacher: actual network topology:
  Nodes(1) = 11
  Nodes(2) = 15
  Nodes(3) = 1
-------------------------------------
---------------------------------------------------
Iteration : 1
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 4 --> 16.3861885 sigma out 15 active outputs
RANK 2 NODE 6 --> 15.1296272 sigma out 15 active outputs
RANK 3 NODE 8 --> 15.085063 sigma out 15 active outputs
RANK 4 NODE 3 --> 14.5211201 sigma
out 15 active outputs RANK 5 NODE 11 --> 13.7478857 sigma out 15 active outputs RANK 6 NODE 2 --> 12.6330967 sigma out 15 active outputs RANK 7 NODE 1 --> 12.6102629 sigma out 15 active outputs RANK 8 NODE 10 --> 11.5609541 sigma out 15 active outputs RANK 9 NODE 5 --> 10.9627161 sigma out 15 active outputs RANK 10 NODE 7 --> 10.7580509 sigma out 15 active outputs RANK 11 NODE 9 --> 8.35726547 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 20.180294 sigma in 11act. ( 22.4813709 sig out 1act.) RANK 2 NODE 9 --> 18.3455715 sigma in 11act. ( 18.8948879 sig out 1act.) RANK 3 NODE 1 --> 16.3846169 sigma in 11act. ( 15.6742611 sig out 1act.) RANK 4 NODE 15 --> 13.9998531 sigma in 11act. ( 15.1517906 sig out 1act.) RANK 5 NODE 12 --> 13.0286484 sigma in 11act. ( 10.3755617 sig out 1act.) RANK 6 NODE 2 --> 10.8459835 sigma in 11act. ( 11.0475912 sig out 1act.) RANK 7 NODE 3 --> 10.5976124 sigma in 11act. ( 10.5249805 sig out 1act.) RANK 8 NODE 11 --> 10.2948008 sigma in 11act. ( 12.2189608 sig out 1act.) RANK 9 NODE 7 --> 10.1905327 sigma in 11act. ( 15.7859879 sig out 1act.) RANK 10 NODE 10 --> 4.64520502 sigma in 11act. ( 5.71150064 sig out 1act.) RANK 11 NODE 8 --> 3.58119369 sigma in 11act. ( 3.30599284 sig out 1act.) RANK 12 NODE 4 --> 3.52060771 sigma in 11act. ( 0.516636074 sig out 1act.) RANK 13 NODE 6 --> 3.26679921 sigma in 11act. ( 6.37424946 sig out 1act.) RANK 14 NODE 14 --> 2.57802391 sigma in 11act. ( 1.18362963 sig out 1act.) RANK 15 NODE 13 --> 0.988695621 sigma in 11act. ( 0.510781288 sig out 1act.) sorted by output significance RANK 1 NODE 5 --> 22.4813709 sigma out 1act.( 20.180294 sig in 11act.) RANK 2 NODE 9 --> 18.8948879 sigma out 1act.( 18.3455715 sig in 11act.) RANK 3 NODE 7 --> 15.7859879 sigma out 1act.( 10.1905327 sig in 11act.) RANK 4 NODE 1 --> 15.6742611 sigma out 1act.( 16.3846169 sig in 11act.) RANK 5 NODE 15 --> 15.1517906 sigma out 1act.( 13.9998531 sig in 11act.) RANK 6 NODE 11 --> 12.2189608 sigma out 1act.( 10.2948008 sig in 11act.) RANK 7 NODE 2 --> 11.0475912 sigma out 1act.( 10.8459835 sig in 11act.) RANK 8 NODE 3 --> 10.5249805 sigma out 1act.( 10.5976124 sig in 11act.) RANK 9 NODE 12 --> 10.3755617 sigma out 1act.( 13.0286484 sig in 11act.) RANK 10 NODE 6 --> 6.37424946 sigma out 1act.( 3.26679921 sig in 11act.) RANK 11 NODE 10 --> 5.71150064 sigma out 1act.( 4.64520502 sig in 11act.) RANK 12 NODE 8 --> 3.30599284 sigma out 1act.( 3.58119369 sig in 11act.) RANK 13 NODE 14 --> 1.18362963 sigma out 1act.( 2.57802391 sig in 11act.) RANK 14 NODE 4 --> 0.516636074 sigma out 1act.( 3.52060771 sig in 11act.) RANK 15 NODE 13 --> 0.510781288 sigma out 1act.( 0.988695621 sig in 11act.) 
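The "ROUND n: MAX CORR ... AFTER KILLING INPUT VARIABLE k CONTR ..." lines and the "significance loss when removing single variables" summary printed before the topology come from a backward elimination: the input whose removal costs the least total correlation to the target is dropped in each round, and the printed CONTR values are consistent with the quadratic difference between successive total correlations. The stand-alone C++ sketch below illustrates that procedure on a correlation matrix like the one shown above; it is an illustration of the idea, not the NeuroBayes implementation (in the log's own numbering the target is variable 1 and the inputs are variables 2-14).

    // Backward-elimination ranking of inputs by multiple correlation to the target,
    // illustrating the ROUND / CONTR printout above (not NeuroBayes code).
    // 'corr' is a correlation matrix with unit diagonal, row/column 0 = target;
    // multiply results by 100 to compare with the percent values in the log.
    #include <algorithm>
    #include <cmath>
    #include <iostream>
    #include <vector>

    // Solve A x = b by Gaussian elimination with partial pivoting (A is small here).
    std::vector<double> solve(std::vector<std::vector<double>> A, std::vector<double> b) {
      const int n = static_cast<int>(b.size());
      for (int i = 0; i < n; ++i) {
        int p = i;
        for (int r = i + 1; r < n; ++r)
          if (std::fabs(A[r][i]) > std::fabs(A[p][i])) p = r;
        std::swap(A[i], A[p]);
        std::swap(b[i], b[p]);
        for (int r = i + 1; r < n; ++r) {
          double f = A[r][i] / A[i][i];
          for (int c = i; c < n; ++c) A[r][c] -= f * A[i][c];
          b[r] -= f * b[i];
        }
      }
      std::vector<double> x(n);
      for (int i = n - 1; i >= 0; --i) {
        double s = b[i];
        for (int c = i + 1; c < n; ++c) s -= A[i][c] * x[c];
        x[i] = s / A[i][i];
      }
      return x;
    }

    // Multiple correlation of the kept inputs to the target: sqrt(rho^T C^-1 rho).
    double totalCorrelation(const std::vector<std::vector<double>>& corr,
                            const std::vector<int>& keep) {
      const int n = static_cast<int>(keep.size());
      std::vector<std::vector<double>> C(n, std::vector<double>(n));
      std::vector<double> rho(n);
      for (int i = 0; i < n; ++i) {
        rho[i] = corr[0][keep[i]];
        for (int j = 0; j < n; ++j) C[i][j] = corr[keep[i]][keep[j]];
      }
      std::vector<double> x = solve(C, rho);
      double s = 0.0;
      for (int i = 0; i < n; ++i) s += rho[i] * x[i];
      return std::sqrt(s);
    }

    // Drop the least useful input each round; report its contribution in quadrature.
    void rankVariables(const std::vector<std::vector<double>>& corr) {
      std::vector<int> keep;
      for (std::size_t i = 1; i < corr.size(); ++i) keep.push_back(static_cast<int>(i));
      for (int round = 1; keep.size() > 1; ++round) {
        double full = totalCorrelation(corr, keep);
        double best = -1.0;
        std::size_t bestIdx = 0;
        for (std::size_t k = 0; k < keep.size(); ++k) {
          std::vector<int> trial(keep);
          trial.erase(trial.begin() + k);
          double c = totalCorrelation(corr, trial);
          if (c > best) { best = c; bestIdx = k; }
        }
        double contr = std::sqrt(std::max(0.0, full * full - best * best));  // "CONTR"
        std::cout << "ROUND " << round << ": MAX CORR " << best
                  << " AFTER KILLING INPUT VARIABLE " << keep[bestIdx]
                  << " CONTR " << contr << std::endl;
        keep.erase(keep.begin() + bestIdx);
      }
      std::cout << "LAST REMAINING VARIABLE: " << keep[0] << std::endl;
    }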
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 46.5056534 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 4 --> 19.9448929 sigma out 15 active outputs RANK 2 NODE 2 --> 19.4309845 sigma out 15 active outputs RANK 3 NODE 8 --> 16.4610825 sigma out 15 active outputs RANK 4 NODE 3 --> 15.919075 sigma out 15 active outputs RANK 5 NODE 6 --> 15.8288441 sigma out 15 active outputs RANK 6 NODE 1 --> 15.306179 sigma out 15 active outputs RANK 7 NODE 11 --> 15.1316042 sigma out 15 active outputs RANK 8 NODE 10 --> 12.9994106 sigma out 15 active outputs RANK 9 NODE 5 --> 12.8444433 sigma out 15 active outputs RANK 10 NODE 7 --> 10.791317 sigma out 15 active outputs RANK 11 NODE 9 --> 10.1945419 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 22.0138359 sigma in 11act. ( 21.7988129 sig out 1act.) RANK 2 NODE 9 --> 19.5969086 sigma in 11act. ( 18.5194359 sig out 1act.) RANK 3 NODE 7 --> 17.6182728 sigma in 11act. ( 15.8317471 sig out 1act.) RANK 4 NODE 15 --> 15.2549009 sigma in 11act. ( 14.7170553 sig out 1act.) RANK 5 NODE 1 --> 14.8765459 sigma in 11act. ( 14.3118629 sig out 1act.) RANK 6 NODE 11 --> 13.4407377 sigma in 11act. ( 12.9323521 sig out 1act.) RANK 7 NODE 6 --> 13.1488428 sigma in 11act. ( 7.86933422 sig out 1act.) RANK 8 NODE 2 --> 10.744175 sigma in 11act. ( 10.5685625 sig out 1act.) RANK 9 NODE 12 --> 10.5645189 sigma in 11act. ( 8.70103836 sig out 1act.) RANK 10 NODE 3 --> 9.6269331 sigma in 11act. ( 9.68894196 sig out 1act.) RANK 11 NODE 10 --> 9.56498051 sigma in 11act. ( 5.78739929 sig out 1act.) RANK 12 NODE 4 --> 8.41093254 sigma in 11act. ( 1.04781103 sig out 1act.) RANK 13 NODE 8 --> 6.17352343 sigma in 11act. ( 3.11839914 sig out 1act.) RANK 14 NODE 13 --> 5.91639376 sigma in 11act. ( 1.2078644 sig out 1act.) RANK 15 NODE 14 --> 5.11811161 sigma in 11act. ( 0.757757425 sig out 1act.) sorted by output significance RANK 1 NODE 5 --> 21.7988129 sigma out 1act.( 22.0138359 sig in 11act.) RANK 2 NODE 9 --> 18.5194359 sigma out 1act.( 19.5969086 sig in 11act.) RANK 3 NODE 7 --> 15.8317471 sigma out 1act.( 17.6182728 sig in 11act.) RANK 4 NODE 15 --> 14.7170553 sigma out 1act.( 15.2549009 sig in 11act.) RANK 5 NODE 1 --> 14.3118629 sigma out 1act.( 14.8765459 sig in 11act.) RANK 6 NODE 11 --> 12.9323521 sigma out 1act.( 13.4407377 sig in 11act.) RANK 7 NODE 2 --> 10.5685625 sigma out 1act.( 10.744175 sig in 11act.) RANK 8 NODE 3 --> 9.68894196 sigma out 1act.( 9.6269331 sig in 11act.) RANK 9 NODE 12 --> 8.70103836 sigma out 1act.( 10.5645189 sig in 11act.) RANK 10 NODE 6 --> 7.86933422 sigma out 1act.( 13.1488428 sig in 11act.) RANK 11 NODE 10 --> 5.78739929 sigma out 1act.( 9.56498051 sig in 11act.) RANK 12 NODE 8 --> 3.11839914 sigma out 1act.( 6.17352343 sig in 11act.) RANK 13 NODE 13 --> 1.2078644 sigma out 1act.( 5.91639376 sig in 11act.) RANK 14 NODE 4 --> 1.04781103 sigma out 1act.( 8.41093254 sig in 11act.) RANK 15 NODE 14 --> 0.757757425 sigma out 1act.( 5.11811161 sig in 11act.) 
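After the ten most significant inputs are kept, the topology is reported as Nodes(1) = 11, Nodes(2) = 15, Nodes(3) = 1, presumably the 10 surviving inputs plus one constant bias node feeding 15 hidden nodes and a single output node (the node whose input significance is listed under "SIGNIFICANCE OF INPUTS TO LAYER 3"). A minimal forward pass through such a network is sketched below; the tanh and logistic activations are assumptions made for illustration, since the log does not print them.

    // Minimal forward pass for the 11-15-1 topology reported above.
    // Activation functions are assumed (tanh hidden, logistic output).
    #include <cmath>
    #include <vector>

    struct Network {
      std::vector<std::vector<double>> hiddenWeights;  // [15][11]: hidden node <- inputs + bias
      std::vector<double> outputWeights;               // [15]:     output node <- hidden nodes
    };

    double forward(const Network& net, std::vector<double> input) {
      input.push_back(1.0);                            // bias node: 10 inputs + 1 = Nodes(1) = 11
      std::vector<double> hidden;
      for (const std::vector<double>& w : net.hiddenWeights) {   // Nodes(2) = 15
        double sum = 0.0;
        for (std::size_t i = 0; i < input.size(); ++i) sum += w[i] * input[i];
        hidden.push_back(std::tanh(sum));
      }
      double sum = 0.0;                                // Nodes(3) = 1 output node
      for (std::size_t j = 0; j < hidden.size(); ++j) sum += net.outputWeights[j] * hidden[j];
      return 1.0 / (1.0 + std::exp(-sum));             // classification output in (0,1)
    }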
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 45.2440376 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.43678844 *** contribution from regularisation: 0.00398979615 *** contribution from error: -0.440778226 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.468564391 *** contribution from regularisation: 0.00232702354 *** contribution from error: -0.470891416 *********************************************** -----------------> Test sample ENTER BFGS code START -43592.8177 -0.311172515 0.274455756 EXIT FROM BFGS code FG_START 0. -0.311172515 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.482810408 *** contribution from regularisation: 0.00213325792 *** contribution from error: -0.484943658 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -44908.6098 -0.311172515 -228.027237 EXIT FROM BFGS code FG_LNSRCH 0. -0.361454129 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.508264005 *** contribution from regularisation: 0.00288657355 *** contribution from error: -0.511150599 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47276.1765 -0.361454129 19.9427299 EXIT FROM BFGS code NEW_X -47276.1765 -0.361454129 19.9427299 ENTER BFGS code NEW_X -47276.1765 -0.361454129 19.9427299 EXIT FROM BFGS code FG_LNSRCH 0. -0.36073935 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.511655211 *** contribution from regularisation: 0.00285471068 *** contribution from error: -0.514509916 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47591.6117 -0.36073935 -17.7670631 EXIT FROM BFGS code NEW_X -47591.6117 -0.36073935 -17.7670631 ENTER BFGS code NEW_X -47591.6117 -0.36073935 -17.7670631 EXIT FROM BFGS code FG_LNSRCH 0. -0.368624121 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.51271522 *** contribution from regularisation: 0.00284176879 *** contribution from error: -0.515556991 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47690.2043 -0.368624121 -12.090085 EXIT FROM BFGS code NEW_X -47690.2043 -0.368624121 -12.090085 ENTER BFGS code NEW_X -47690.2043 -0.368624121 -12.090085 EXIT FROM BFGS code FG_LNSRCH 0. -0.376626551 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.513247013 *** contribution from regularisation: 0.00284434459 *** contribution from error: -0.516091347 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47739.6715 -0.376626551 13.8943634 EXIT FROM BFGS code NEW_X -47739.6715 -0.376626551 13.8943634 ENTER BFGS code NEW_X -47739.6715 -0.376626551 13.8943634 EXIT FROM BFGS code FG_LNSRCH 0. -0.374857068 0. 
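Each "Learn Path" block reports the loss as the sum of a negative error term and a small positive regularisation term, and the printed numbers do add up (for example -0.440778226 + 0.00398979615 = -0.43678844 in Learn Path 1). The run is configured for an entropy loss with standard regularisation; the sketch below shows a weighted cross-entropy error plus a quadratic weight-decay penalty as one plausible realisation of that split. The penalty form is an assumption, since the log only says "standard regularisation".

    // One plausible realisation of the "error + regularisation" split printed per
    // Learn Path: weighted cross-entropy ("entropy" loss) plus weight decay.
    // The weight-decay form is an assumption; the log does not specify it.
    #include <cmath>
    #include <vector>

    struct Sample {
      double prediction;   // network output in (0,1)
      double target;       // 0 for background, 1 for signal
      double weight;       // event weight (TrainWeight in the tree)
    };

    double entropyError(const std::vector<Sample>& samples) {
      double sum = 0.0, wsum = 0.0;
      for (const Sample& s : samples) {
        sum  += s.weight * (s.target * std::log(s.prediction) +
                            (1.0 - s.target) * std::log(1.0 - s.prediction));
        wsum += s.weight;
      }
      return sum / wsum;    // negative, like the "contribution from error" lines
    }

    double regularisation(const std::vector<double>& networkWeights, double lambda) {
      double sum = 0.0;
      for (double w : networkWeights) sum += w * w;
      return 0.5 * lambda * sum;   // positive, like the "contribution from regularisation" lines
    }

    double lossFunction(const std::vector<Sample>& samples,
                        const std::vector<double>& networkWeights, double lambda) {
      return entropyError(samples) + regularisation(networkWeights, lambda);
    }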
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.514026761 *** contribution from regularisation: 0.00278567849 *** contribution from error: -0.516812444 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47812.2015 -0.374857068 25.805027 EXIT FROM BFGS code NEW_X -47812.2015 -0.374857068 25.805027 ENTER BFGS code NEW_X -47812.2015 -0.374857068 25.805027 EXIT FROM BFGS code FG_LNSRCH 0. -0.337400228 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.515310764 *** contribution from regularisation: 0.00350585743 *** contribution from error: -0.51881665 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47931.6304 -0.337400228 51.0803642 EXIT FROM BFGS code FG_LNSRCH 0. -0.357129663 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 40.3613396 sigma out 15 active outputs RANK 2 NODE 1 --> 24.7673035 sigma out 15 active outputs RANK 3 NODE 10 --> 15.7579927 sigma out 15 active outputs RANK 4 NODE 6 --> 14.4157858 sigma out 15 active outputs RANK 5 NODE 4 --> 14.1247444 sigma out 15 active outputs RANK 6 NODE 9 --> 12.9559765 sigma out 15 active outputs RANK 7 NODE 8 --> 9.82239246 sigma out 15 active outputs RANK 8 NODE 3 --> 9.00362492 sigma out 15 active outputs RANK 9 NODE 5 --> 7.38772106 sigma out 15 active outputs RANK 10 NODE 7 --> 7.15135241 sigma out 15 active outputs RANK 11 NODE 11 --> 6.03907299 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 6 --> 41.3762894 sigma in 11act. ( 45.3175087 sig out 1act.) RANK 2 NODE 11 --> 18.7383881 sigma in 11act. ( 17.6891212 sig out 1act.) RANK 3 NODE 9 --> 17.5522327 sigma in 11act. ( 15.9623508 sig out 1act.) RANK 4 NODE 7 --> 17.4351025 sigma in 11act. ( 15.3861847 sig out 1act.) RANK 5 NODE 13 --> 11.554307 sigma in 11act. ( 12.2354879 sig out 1act.) RANK 6 NODE 5 --> 11.5003567 sigma in 11act. ( 10.4725809 sig out 1act.) RANK 7 NODE 2 --> 11.1710491 sigma in 11act. ( 11.2583208 sig out 1act.) RANK 8 NODE 15 --> 10.5183716 sigma in 11act. ( 9.81165981 sig out 1act.) RANK 9 NODE 8 --> 7.10361385 sigma in 11act. ( 6.10834265 sig out 1act.) RANK 10 NODE 10 --> 6.67970991 sigma in 11act. ( 5.21727133 sig out 1act.) RANK 11 NODE 3 --> 5.50225258 sigma in 11act. ( 5.28275776 sig out 1act.) RANK 12 NODE 12 --> 5.17519474 sigma in 11act. ( 4.60076761 sig out 1act.) RANK 13 NODE 1 --> 5.0399127 sigma in 11act. ( 3.85845995 sig out 1act.) RANK 14 NODE 4 --> 4.94094515 sigma in 11act. ( 1.9789263 sig out 1act.) RANK 15 NODE 14 --> 2.71284842 sigma in 11act. ( 0.560362935 sig out 1act.) sorted by output significance RANK 1 NODE 6 --> 45.3175087 sigma out 1act.( 41.3762894 sig in 11act.) RANK 2 NODE 11 --> 17.6891212 sigma out 1act.( 18.7383881 sig in 11act.) RANK 3 NODE 9 --> 15.9623508 sigma out 1act.( 17.5522327 sig in 11act.) RANK 4 NODE 7 --> 15.3861847 sigma out 1act.( 17.4351025 sig in 11act.) RANK 5 NODE 13 --> 12.2354879 sigma out 1act.( 11.554307 sig in 11act.) RANK 6 NODE 2 --> 11.2583208 sigma out 1act.( 11.1710491 sig in 11act.) RANK 7 NODE 5 --> 10.4725809 sigma out 1act.( 11.5003567 sig in 11act.) RANK 8 NODE 15 --> 9.81165981 sigma out 1act.( 10.5183716 sig in 11act.) 
RANK 9 NODE 8 --> 6.10834265 sigma out 1act.( 7.10361385 sig in 11act.) RANK 10 NODE 3 --> 5.28275776 sigma out 1act.( 5.50225258 sig in 11act.) RANK 11 NODE 10 --> 5.21727133 sigma out 1act.( 6.67970991 sig in 11act.) RANK 12 NODE 12 --> 4.60076761 sigma out 1act.( 5.17519474 sig in 11act.) RANK 13 NODE 1 --> 3.85845995 sigma out 1act.( 5.0399127 sig in 11act.) RANK 14 NODE 4 --> 1.9789263 sigma out 1act.( 4.94094515 sig in 11act.) RANK 15 NODE 14 --> 0.560362935 sigma out 1act.( 2.71284842 sig in 11act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 58.9341774 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.517268538 *** contribution from regularisation: 0.00206026295 *** contribution from error: -0.519328773 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -48113.7354 -0.357129663 47.3495445 EXIT FROM BFGS code NEW_X -48113.7354 -0.357129663 47.3495445 ENTER BFGS code NEW_X -48113.7354 -0.357129663 47.3495445 EXIT FROM BFGS code FG_LNSRCH 0. -0.321691841 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.519462883 *** contribution from regularisation: 0.0038266296 *** contribution from error: -0.523289502 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48317.8395 -0.321691841 80.9745102 EXIT FROM BFGS code NEW_X -48317.8395 -0.321691841 80.9745102 ENTER BFGS code NEW_X -48317.8395 -0.321691841 80.9745102 EXIT FROM BFGS code FG_LNSRCH 0. -0.238151565 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.521749496 *** contribution from regularisation: 0.00407789741 *** contribution from error: -0.525827408 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48530.5293 -0.238151565 -168.927292 EXIT FROM BFGS code FG_LNSRCH 0. -0.283173352 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.524033666 *** contribution from regularisation: 0.00224717217 *** contribution from error: -0.52628082 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48742.9918 -0.283173352 13.4098282 EXIT FROM BFGS code NEW_X -48742.9918 -0.283173352 13.4098282 ENTER BFGS code NEW_X -48742.9918 -0.283173352 13.4098282 EXIT FROM BFGS code FG_LNSRCH 0. -0.273741603 0. --------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.524418414 *** contribution from regularisation: 0.0030775445 *** contribution from error: -0.52749598 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48778.7812 -0.273741603 5.11914587 EXIT FROM BFGS code NEW_X -48778.7812 -0.273741603 5.11914587 ENTER BFGS code NEW_X -48778.7812 -0.273741603 5.11914587 EXIT FROM BFGS code FG_LNSRCH 0. -0.267875612 0. 
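The ENTER/EXIT BFGS lines alternate between two states: FG_LNSRCH, where the optimiser requests another function-and-gradient evaluation while searching along the current descent direction, and NEW_X, where a step has been accepted and a new iterate is taken (CONVERGENC appears later when the search stops). The toy loop below mimics only that calling pattern with a backtracking line search on a one-dimensional quadratic; the real code maintains a BFGS approximation of the inverse Hessian, which is omitted here.

    // Toy illustration of the FG_LNSRCH / NEW_X pattern in the log: repeated trial
    // evaluations during a line search, then an accepted step.  Not the BFGS code
    // itself; the inverse-Hessian update is omitted.
    #include <cmath>
    #include <cstdio>

    double f(double x)  { return (x - 3.0) * (x - 3.0); }   // stand-in objective
    double df(double x) { return 2.0 * (x - 3.0); }

    int main() {
      double x = 0.0;
      for (int iteration = 1; iteration <= 20; ++iteration) {
        double g = df(x);
        double step = 1.0;
        // Backtracking line search with an Armijo-style acceptance test.
        while (f(x - step * g) > f(x) - 0.5 * step * g * g) {
          std::printf("FG_LNSRCH  trial x=%.6f  f=%.6f\n", x - step * g, f(x - step * g));
          step *= 0.5;
        }
        x -= step * g;
        std::printf("NEW_X      x=%.6f  f=%.6f\n", x, f(x));
        if (std::fabs(g) < 1e-8) break;                      // crude convergence test
      }
      return 0;
    }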
--------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.525168419 *** contribution from regularisation: 0.00288172299 *** contribution from error: -0.528050125 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48848.5398 -0.267875612 -36.5102577 EXIT FROM BFGS code NEW_X -48848.5398 -0.267875612 -36.5102577 ENTER BFGS code NEW_X -48848.5398 -0.267875612 -36.5102577 EXIT FROM BFGS code FG_LNSRCH 0. -0.316805363 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.526215792 *** contribution from regularisation: 0.00269389316 *** contribution from error: -0.528909683 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48945.9643 -0.316805363 -11.8169413 EXIT FROM BFGS code NEW_X -48945.9643 -0.316805363 -11.8169413 ENTER BFGS code NEW_X -48945.9643 -0.316805363 -11.8169413 EXIT FROM BFGS code FG_LNSRCH 0. -0.325403631 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.525273204 *** contribution from regularisation: 0.00299072755 *** contribution from error: -0.528263927 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48858.2849 -0.325403631 -109.560577 EXIT FROM BFGS code FG_LNSRCH 0. -0.318908721 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.526253641 *** contribution from regularisation: 0.00294229086 *** contribution from error: -0.529195905 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48949.4815 -0.318908721 -33.8341026 EXIT FROM BFGS code NEW_X -48949.4815 -0.318908721 -33.8341026 ENTER BFGS code NEW_X -48949.4815 -0.318908721 -33.8341026 EXIT FROM BFGS code FG_LNSRCH 0. -0.324596852 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.526584923 *** contribution from regularisation: 0.00293046236 *** contribution from error: -0.529515386 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48980.2989 -0.324596852 -29.0174332 EXIT FROM BFGS code NEW_X -48980.2989 -0.324596852 -29.0174332 ENTER BFGS code NEW_X -48980.2989 -0.324596852 -29.0174332 EXIT FROM BFGS code FG_LNSRCH 0. -0.34679088 0. 
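Every iteration above ends with "-----------------> Test sample": besides the training half, the loss is also evaluated on the 23115 held-out test events, which is how overtraining would show up (test loss flattening or rising while the training loss keeps falling). A small monitoring sketch, assuming the hypothetical entropyError helper from the earlier snippet:

    // Per-iteration train/test monitoring (sketch).  Assumes the entropyError()
    // helper and Sample struct from the earlier loss-function snippet.
    #include <cstdio>
    #include <vector>

    struct Sample { double prediction; double target; double weight; };
    double entropyError(const std::vector<Sample>& samples);   // defined in the earlier sketch

    void monitor(int iteration,
                 const std::vector<Sample>& train,
                 const std::vector<Sample>& test) {
      double trainLoss = entropyError(train);
      double testLoss  = entropyError(test);
      std::printf("Iteration %3d  train %.6f  test %.6f\n", iteration, trainLoss, testLoss);
      // If testLoss stops improving while trainLoss keeps falling, the network is
      // starting to overtrain and an earlier iteration should be preferred.
    }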
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 69.6927643 sigma out 15 active outputs RANK 2 NODE 10 --> 26.7350903 sigma out 15 active outputs RANK 3 NODE 1 --> 26.5890408 sigma out 15 active outputs RANK 4 NODE 6 --> 22.8601646 sigma out 15 active outputs RANK 5 NODE 9 --> 22.3359222 sigma out 15 active outputs RANK 6 NODE 4 --> 20.1166668 sigma out 15 active outputs RANK 7 NODE 3 --> 18.2660942 sigma out 15 active outputs RANK 8 NODE 7 --> 9.79707623 sigma out 15 active outputs RANK 9 NODE 8 --> 9.65054703 sigma out 15 active outputs RANK 10 NODE 11 --> 8.72750282 sigma out 15 active outputs RANK 11 NODE 5 --> 6.3622942 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 6 --> 79.391243 sigma in 11act. ( 100.780937 sig out 1act.) RANK 2 NODE 2 --> 25.001358 sigma in 11act. ( 30.60569 sig out 1act.) RANK 3 NODE 11 --> 20.6367397 sigma in 11act. ( 15.5928879 sig out 1act.) RANK 4 NODE 7 --> 18.3205166 sigma in 11act. ( 18.8859978 sig out 1act.) RANK 5 NODE 9 --> 12.876996 sigma in 11act. ( 11.3592176 sig out 1act.) RANK 6 NODE 13 --> 11.4237623 sigma in 11act. ( 9.49316883 sig out 1act.) RANK 7 NODE 8 --> 7.44481277 sigma in 11act. ( 5.35019398 sig out 1act.) RANK 8 NODE 1 --> 7.44073391 sigma in 11act. ( 7.873909 sig out 1act.) RANK 9 NODE 3 --> 7.39877748 sigma in 11act. ( 7.77547741 sig out 1act.) RANK 10 NODE 5 --> 7.27796459 sigma in 11act. ( 5.79659748 sig out 1act.) RANK 11 NODE 4 --> 7.10095024 sigma in 11act. ( 5.74889851 sig out 1act.) RANK 12 NODE 10 --> 6.01142168 sigma in 11act. ( 4.85074759 sig out 1act.) RANK 13 NODE 15 --> 5.23645163 sigma in 11act. ( 2.50838757 sig out 1act.) RANK 14 NODE 12 --> 4.76937819 sigma in 11act. ( 5.25814152 sig out 1act.) RANK 15 NODE 14 --> 1.81547284 sigma in 11act. ( 0.912790358 sig out 1act.) sorted by output significance RANK 1 NODE 6 --> 100.780937 sigma out 1act.( 79.391243 sig in 11act.) RANK 2 NODE 2 --> 30.60569 sigma out 1act.( 25.001358 sig in 11act.) RANK 3 NODE 7 --> 18.8859978 sigma out 1act.( 18.3205166 sig in 11act.) RANK 4 NODE 11 --> 15.5928879 sigma out 1act.( 20.6367397 sig in 11act.) RANK 5 NODE 9 --> 11.3592176 sigma out 1act.( 12.876996 sig in 11act.) RANK 6 NODE 13 --> 9.49316883 sigma out 1act.( 11.4237623 sig in 11act.) RANK 7 NODE 1 --> 7.873909 sigma out 1act.( 7.44073391 sig in 11act.) RANK 8 NODE 3 --> 7.77547741 sigma out 1act.( 7.39877748 sig in 11act.) RANK 9 NODE 5 --> 5.79659748 sigma out 1act.( 7.27796459 sig in 11act.) RANK 10 NODE 4 --> 5.74889851 sigma out 1act.( 7.10095024 sig in 11act.) RANK 11 NODE 8 --> 5.35019398 sigma out 1act.( 7.44481277 sig in 11act.) RANK 12 NODE 12 --> 5.25814152 sigma out 1act.( 4.76937819 sig in 11act.) RANK 13 NODE 10 --> 4.85074759 sigma out 1act.( 6.01142168 sig in 11act.) RANK 14 NODE 15 --> 2.50838757 sigma out 1act.( 5.23645163 sig in 11act.) RANK 15 NODE 14 --> 0.912790358 sigma out 1act.( 1.81547284 sig in 11act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 110.401588 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.526937664 *** contribution from regularisation: 0.0029566905 *** contribution from error: -0.529894352 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -49013.1076 -0.34679088 -14.0387697 EXIT FROM BFGS code NEW_X -49013.1076 -0.34679088 -14.0387697 ENTER BFGS code NEW_X -49013.1076 -0.34679088 -14.0387697 EXIT FROM BFGS code FG_LNSRCH 0. -0.372270137 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.527261376 *** contribution from regularisation: 0.00299724191 *** contribution from error: -0.530258596 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49043.2163 -0.372270137 -11.4717274 EXIT FROM BFGS code NEW_X -49043.2163 -0.372270137 -11.4717274 ENTER BFGS code NEW_X -49043.2163 -0.372270137 -11.4717274 EXIT FROM BFGS code FG_LNSRCH 0. -0.4306795 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.527743816 *** contribution from regularisation: 0.00312109408 *** contribution from error: -0.530864894 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49088.0935 -0.4306795 32.9144096 EXIT FROM BFGS code NEW_X -49088.0935 -0.4306795 32.9144096 ENTER BFGS code NEW_X -49088.0935 -0.4306795 32.9144096 EXIT FROM BFGS code FG_LNSRCH 0. -0.456227928 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.52758038 *** contribution from regularisation: 0.00321885291 *** contribution from error: -0.53079921 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49072.8868 -0.456227928 -185.808624 EXIT FROM BFGS code FG_LNSRCH 0. -0.441444218 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.527874529 *** contribution from regularisation: 0.00328149321 *** contribution from error: -0.531156003 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49100.25 -0.441444218 -46.1749382 EXIT FROM BFGS code NEW_X -49100.25 -0.441444218 -46.1749382 ENTER BFGS code NEW_X -49100.25 -0.441444218 -46.1749382 EXIT FROM BFGS code FG_LNSRCH 0. -0.469625741 0. --------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.528235197 *** contribution from regularisation: 0.0031884769 *** contribution from error: -0.531423688 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49133.7955 -0.469625741 -9.78092861 EXIT FROM BFGS code NEW_X -49133.7955 -0.469625741 -9.78092861 ENTER BFGS code NEW_X -49133.7955 -0.469625741 -9.78092861 EXIT FROM BFGS code FG_LNSRCH 0. -0.477318823 0. 
 ---------------------------------------------------
 Iteration : 26
 ***********************************************
 *** Learn Path 26
 *** loss function: -0.528353691
 *** contribution from regularisation: 0.00320258085
 *** contribution from error: -0.531556249
 ***********************************************
 -----------------> Test sample
 ENTER BFGS code FG_LNSRCH -49144.8211 -0.477318823 22.5501461
 EXIT FROM BFGS code NEW_X -49144.8211 -0.477318823 22.5501461
 ENTER BFGS code NEW_X -49144.8211 -0.477318823 22.5501461
 EXIT FROM BFGS code FG_LNSRCH 0. -0.47325027 0.
 ---------------------------------------------------
 Iteration : 27
 ***********************************************
 *** Learn Path 27
 *** loss function: -0.528655112
 *** contribution from regularisation: 0.00307510141
 *** contribution from error: -0.531730235
 ***********************************************
 -----------------> Test sample
 ENTER BFGS code FG_LNSRCH -49172.8578 -0.47325027 -12.064538
 EXIT FROM BFGS code NEW_X -49172.8578 -0.47325027 -12.064538
 ENTER BFGS code NEW_X -49172.8578 -0.47325027 -12.064538
 EXIT FROM BFGS code FG_LNSRCH 0. -0.478859007 0.
 ---------------------------------------------------
 Iteration : 28
 ***********************************************
 *** Learn Path 28
 *** loss function: -0.528407812
 *** contribution from regularisation: 0.00311647751
 *** contribution from error: -0.531524301
 ***********************************************
 -----------------> Test sample
 ENTER BFGS code FG_LNSRCH -49149.8551 -0.478859007 145.872253
 EXIT FROM BFGS code FG_LNSRCH 0. -0.474658608 0.
 ---------------------------------------------------
 Iteration : 29
 ***********************************************
 *** Learn Path 29
 *** loss function: -0.528425932
 *** contribution from regularisation: 0.00337924436
 *** contribution from error: -0.531805158
 ***********************************************
 -----------------> Test sample
 ENTER BFGS code FG_LNSRCH -49151.5377 -0.474658608 25.5411053
 EXIT FROM BFGS code FG_LNSRCH 0. -0.473350823 0.
 ---------------------------------------------------
 Iteration : 30
 SIGNIFICANCE OF OUTPUTS IN LAYER 1
 RANK 1 NODE 2 --> 94.0359116 sigma out 15 active outputs
 RANK 2 NODE 9 --> 35.630352 sigma out 15 active outputs
 RANK 3 NODE 1 --> 33.424511 sigma out 15 active outputs
 RANK 4 NODE 6 --> 32.5324669 sigma out 15 active outputs
 RANK 5 NODE 10 --> 31.3661995 sigma out 15 active outputs
 RANK 6 NODE 4 --> 22.6067867 sigma out 15 active outputs
 RANK 7 NODE 3 --> 21.1256409 sigma out 15 active outputs
 RANK 8 NODE 7 --> 19.7625217 sigma out 15 active outputs
 RANK 9 NODE 11 --> 12.8448515 sigma out 15 active outputs
 RANK 10 NODE 5 --> 11.5688782 sigma out 15 active outputs
 RANK 11 NODE 8 --> 10.6502457 sigma out 15 active outputs
 SIGNIFICANCE OF INPUTS TO LAYER 2
 sorted by input significance
 RANK 1 NODE 6 --> 103.947372 sigma in 11act. ( 140.038544 sig out 1act.)
 RANK 2 NODE 2 --> 39.4626007 sigma in 11act. ( 51.5784492 sig out 1act.)
 RANK 3 NODE 11 --> 30.0334244 sigma in 11act. ( 26.4899483 sig out 1act.)
 RANK 4 NODE 7 --> 26.5041008 sigma in 11act. ( 30.0237846 sig out 1act.)
 RANK 5 NODE 1 --> 17.3808346 sigma in 11act. ( 20.0953083 sig out 1act.)
 RANK 6 NODE 3 --> 12.1945467 sigma in 11act. ( 13.4588232 sig out 1act.)
 RANK 7 NODE 13 --> 11.8118591 sigma in 11act. ( 10.0034904 sig out 1act.)
 RANK 8 NODE 12 --> 10.8268356 sigma in 11act. ( 12.4157133 sig out 1act.)
 RANK 9 NODE 9 --> 10.7411909 sigma in 11act. ( 9.65306473 sig out 1act.)
 RANK 10 NODE 4 --> 8.47432232 sigma in 11act. ( 8.06073856 sig out 1act.)
 RANK 11 NODE 5 --> 8.35913277 sigma in 11act. ( 8.23942471 sig out 1act.)
 RANK 12 NODE 8 --> 7.44443178 sigma in 11act. ( 6.46632242 sig out 1act.)
 RANK 13 NODE 15 --> 5.28172445 sigma in 11act. ( 4.02838087 sig out 1act.)
 RANK 14 NODE 10 --> 3.24861765 sigma in 11act. ( 2.35597301 sig out 1act.)
 RANK 15 NODE 14 --> 2.55911589 sigma in 11act. ( 2.53945255 sig out 1act.)
 sorted by output significance
 RANK 1 NODE 6 --> 140.038544 sigma out 1act.( 103.947372 sig in 11act.)
 RANK 2 NODE 2 --> 51.5784492 sigma out 1act.( 39.4626007 sig in 11act.)
 RANK 3 NODE 7 --> 30.0237846 sigma out 1act.( 26.5041008 sig in 11act.)
 RANK 4 NODE 11 --> 26.4899483 sigma out 1act.( 30.0334244 sig in 11act.)
 RANK 5 NODE 1 --> 20.0953083 sigma out 1act.( 17.3808346 sig in 11act.)
 RANK 6 NODE 3 --> 13.4588232 sigma out 1act.( 12.1945467 sig in 11act.)
 RANK 7 NODE 12 --> 12.4157133 sigma out 1act.( 10.8268356 sig in 11act.)
 RANK 8 NODE 13 --> 10.0034904 sigma out 1act.( 11.8118591 sig in 11act.)
 RANK 9 NODE 9 --> 9.65306473 sigma out 1act.( 10.7411909 sig in 11act.)
 RANK 10 NODE 5 --> 8.23942471 sigma out 1act.( 8.35913277 sig in 11act.)
 RANK 11 NODE 4 --> 8.06073856 sigma out 1act.( 8.47432232 sig in 11act.)
 RANK 12 NODE 8 --> 6.46632242 sigma out 1act.( 7.44443178 sig in 11act.)
 RANK 13 NODE 15 --> 4.02838087 sigma out 1act.( 5.28172445 sig in 11act.)
 RANK 14 NODE 14 --> 2.53945255 sigma out 1act.( 2.55911589 sig in 11act.)
 RANK 15 NODE 10 --> 2.35597301 sigma out 1act.( 3.24861765 sig in 11act.)
 SIGNIFICANCE OF INPUTS TO LAYER 3
 RANK 1 NODE 1 --> 158.144058 sigma in 15 active inputs
 ***********************************************
 *** Learn Path 30
 *** loss function: -0.528454721
 *** contribution from regularisation: 0.00328330253
 *** contribution from error: -0.531738043
 ***********************************************
 -----------------> Test sample
 Iteration No: 30
 **********************************************
 ***** write out current network ****
 ***** to "rescue.nb" ****
 **********************************************
 SAVING EXPERTISE TO rescue.nb
 ENTER BFGS code FG_LNSRCH -49154.217 -0.473350823 -9.93404675
 EXIT FROM BFGS code FG_LNSRCH 0. -0.473251075 0.
 ---------------------------------------------------
 Iteration : 31
 ***********************************************
 *** Learn Path 31
 *** loss function: -0.528478324
 *** contribution from regularisation: 0.00325203734
 *** contribution from error: -0.531730354
 ***********************************************
 -----------------> Test sample
 ENTER BFGS code FG_LNSRCH -49156.4109 -0.473251075 -12.5287628
 EXIT FROM BFGS code FG_LNSRCH 0. -0.47325027 0.
 ---------------------------------------------------
 Iteration : 32
 ***********************************************
 *** Learn Path 32
 *** loss function: -0.528475881
 *** contribution from regularisation: 0.00325436378
 *** contribution from error: -0.531730235
 ***********************************************
 -----------------> Test sample
 ENTER BFGS code FG_LNSRCH -49156.1836 -0.47325027 -12.5446768
 EXIT FROM BFGS code FG_LNSRCH 0. -0.47325027 0.
 ---------------------------------------------------
 Iteration : 33
 ***********************************************
 *** Learn Path 33
 *** loss function: -0.528483868
 *** contribution from regularisation: 0.00324638421
 *** contribution from error: -0.531730235
 ***********************************************
 -----------------> Test sample
 ENTER BFGS code FG_LNSRCH -49156.9259 -0.47325027 -12.5246506
 EXIT FROM BFGS code FG_LNSRCH 0. -0.47325027 0.
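Note on the "SIGNIFICANCE OF ..." tables printed at iteration 30 above: each node is ranked by its significance in standard deviations, seen from the input side and from the output side of the layer. If you want to compare these rankings between checkpoints (for example iteration 30 against the final iteration 250 below), a small regular-expression scan of a saved copy of this log is enough. The sketch below is illustrative only (the helper name and log file name are my assumptions, not NeuroBayes code):

    import re

    # Pull "RANK n NODE m --> x sigma" records out of a saved teacher log.
    RANK_LINE = re.compile(r"RANK\s+(\d+)\s+NODE\s+(\d+)\s+-->\s+([0-9.]+)\s+sigma")

    def read_rankings(log_path):
        """Return (rank, node, significance_in_sigma) tuples in file order."""
        rankings = []
        with open(log_path) as log:
            for line in log:
                match = RANK_LINE.search(line)
                if match:
                    rank, node, sigma = match.groups()
                    rankings.append((int(rank), int(node), float(sigma)))
        return rankings

    # Example usage, assuming this log was saved as "teacher.log":
    # for rank, node, sigma in read_rankings("teacher.log")[:5]:
    #     print(f"rank {rank}: node {node} at {sigma:.1f} sigma")
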
 ---------------------------------------------------
 Iteration : 34
 ***********************************************
 *** Learn Path 34
 *** loss function: -0.528482497
 *** contribution from regularisation: 0.00324776792
 *** contribution from error: -0.531730294
 ***********************************************
 -----------------> Test sample
 ENTER BFGS code FG_LNSRCH -49156.7972 -0.47325027 -12.5374613
 EXIT FROM BFGS code FG_LNSRCH 0. -0.47325027 0.
 ---------------------------------------------------
 Iteration : 35
 ***********************************************
 *** Learn Path 35
 *** loss function: -0.528473318
 *** contribution from regularisation: 0.0032569014
 *** contribution from error: -0.531730235
 ***********************************************
 -----------------> Test sample
 ENTER BFGS code FG_LNSRCH -49155.9476 -0.47325027 -12.4414577
 EXIT FROM BFGS code NEW_X -49155.9476 -0.47325027 -12.4414577
 ENTER BFGS code NEW_X -49155.9476 -0.47325027 -12.4414577
 EXIT FROM BFGS code CONVERGENC -49155.9476 -0.47325027 -12.4414577
 ---------------------------------------------------
 Iteration : 250
 SIGNIFICANCE OF OUTPUTS IN LAYER 1
 RANK 1 NODE 2 --> 149.659805 sigma out 15 active outputs
 RANK 2 NODE 9 --> 55.0567589 sigma out 15 active outputs
 RANK 3 NODE 1 --> 51.8274765 sigma out 15 active outputs
 RANK 4 NODE 6 --> 49.2665367 sigma out 15 active outputs
 RANK 5 NODE 10 --> 48.7198677 sigma out 15 active outputs
 RANK 6 NODE 4 --> 34.6074295 sigma out 15 active outputs
 RANK 7 NODE 3 --> 33.2801704 sigma out 15 active outputs
 RANK 8 NODE 7 --> 31.354908 sigma out 15 active outputs
 RANK 9 NODE 11 --> 20.1284447 sigma out 15 active outputs
 RANK 10 NODE 5 --> 18.3407593 sigma out 15 active outputs
 RANK 11 NODE 8 --> 16.8630219 sigma out 15 active outputs
 SIGNIFICANCE OF INPUTS TO LAYER 2
 sorted by input significance
 RANK 1 NODE 6 --> 164.407745 sigma in 11act. ( 222.893036 sig out 1act.)
 RANK 2 NODE 2 --> 62.1955185 sigma in 11act. ( 81.7467117 sig out 1act.)
 RANK 3 NODE 11 --> 46.7294579 sigma in 11act. ( 42.0682449 sig out 1act.)
 RANK 4 NODE 7 --> 40.9706306 sigma in 11act. ( 47.8805733 sig out 1act.)
 RANK 5 NODE 1 --> 27.4436913 sigma in 11act. ( 31.7916241 sig out 1act.)
 RANK 6 NODE 3 --> 19.1218395 sigma in 11act. ( 21.2744179 sig out 1act.)
 RANK 7 NODE 13 --> 17.1206322 sigma in 11act. ( 15.8800116 sig out 1act.)
 RANK 8 NODE 12 --> 16.7081966 sigma in 11act. ( 19.6453724 sig out 1act.)
 RANK 9 NODE 9 --> 15.9563179 sigma in 11act. ( 15.2604523 sig out 1act.)
 RANK 10 NODE 4 --> 12.9733391 sigma in 11act. ( 12.9895487 sig out 1act.)
 RANK 11 NODE 5 --> 12.3484116 sigma in 11act. ( 13.0642109 sig out 1act.)
 RANK 12 NODE 8 --> 10.9467106 sigma in 11act. ( 10.2599802 sig out 1act.)
 RANK 13 NODE 15 --> 7.24728966 sigma in 11act. ( 6.3951149 sig out 1act.)
 RANK 14 NODE 10 --> 4.20983076 sigma in 11act. ( 3.81239009 sig out 1act.)
 RANK 15 NODE 14 --> 3.67528439 sigma in 11act. ( 3.99078059 sig out 1act.)
 sorted by output significance
 RANK 1 NODE 6 --> 222.893036 sigma out 1act.( 164.407745 sig in 11act.)
 RANK 2 NODE 2 --> 81.7467117 sigma out 1act.( 62.1955185 sig in 11act.)
 RANK 3 NODE 7 --> 47.8805733 sigma out 1act.( 40.9706306 sig in 11act.)
 RANK 4 NODE 11 --> 42.0682449 sigma out 1act.( 46.7294579 sig in 11act.)
 RANK 5 NODE 1 --> 31.7916241 sigma out 1act.( 27.4436913 sig in 11act.)
 RANK 6 NODE 3 --> 21.2744179 sigma out 1act.( 19.1218395 sig in 11act.)
 RANK 7 NODE 12 --> 19.6453724 sigma out 1act.( 16.7081966 sig in 11act.)
 RANK 8 NODE 13 --> 15.8800116 sigma out 1act.( 17.1206322 sig in 11act.)
 RANK 9 NODE 9 --> 15.2604523 sigma out 1act.( 15.9563179 sig in 11act.)
 RANK 10 NODE 5 --> 13.0642109 sigma out 1act.( 12.3484116 sig in 11act.)
 RANK 11 NODE 4 --> 12.9895487 sigma out 1act.( 12.9733391 sig in 11act.)
 RANK 12 NODE 8 --> 10.2599802 sigma out 1act.( 10.9467106 sig in 11act.)
 RANK 13 NODE 15 --> 6.3951149 sigma out 1act.( 7.24728966 sig in 11act.)
 RANK 14 NODE 14 --> 3.99078059 sigma out 1act.( 3.67528439 sig in 11act.)
 RANK 15 NODE 10 --> 3.81239009 sigma out 1act.( 4.20983076 sig in 11act.)
 SIGNIFICANCE OF INPUTS TO LAYER 3
 RANK 1 NODE 1 --> 251.548126 sigma in 15 active inputs
 ***********************************************
 *** Learn Path 250
 *** loss function: -0.528469563
 *** contribution from regularisation: 0.00326066348
 *** contribution from error: -0.531730235
 ***********************************************
 -----------------> Test sample
 END OF LEARNING , export EXPERTISE
 SAVING EXPERTISE TO expert.nb
 NB_AHISTOUT: storage space 29164
 Closing output file
 done
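Note on the end of the run: BFGS reports CONVERGENC after iteration 35, and the test-sample loss is essentially flat from there on (the final summary at iteration 250 reports -0.528469563, within about 2e-5 of the iteration-30 value); the final network is written to expert.nb, with rescue.nb holding the last intermediate snapshot. If you want to confirm the plateau from a saved copy of this log, a small hedged sketch (plain Python, names and the log file name are my assumptions) is:

    import re

    # Collect the loss reported for each "Learn Path" block in a saved teacher log.
    LEARN_PATH = re.compile(r"\*\*\* Learn Path (\d+)")
    LOSS = re.compile(r"\*\*\* loss function:\s*(-?[0-9.]+)")

    def loss_history(log_path):
        """Map Learn Path number -> reported loss, in the order they appear."""
        history = {}
        current_path = None
        with open(log_path) as log:
            for line in log:
                path_match = LEARN_PATH.search(line)
                if path_match:
                    current_path = int(path_match.group(1))
                    continue
                loss_match = LOSS.search(line)
                if loss_match and current_path is not None:
                    history[current_path] = float(loss_match.group(1))
        return history

    # Example usage, assuming this log was saved as "teacher.log":
    # hist = loss_history("teacher.log")
    # print(hist.get(30), hist.get(250))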