NNInput NNInputs_165.root
Options for steering
Constraint : lep1_E<400&&lep2_E<400&&
HiLoSbString : SB
SbString : Target
WeightString : TrainWeight
EqualizeSB : 0
EvaluateVariables : 0
SetNBProcessingDefault : 1
UseNeuroBayes : 1
WeightEvents : 1
NBTreePrepEvPrint : 1
NBTreePrepReportInterval : 10000
NB_Iter : 250
NBZero999Tol : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters :
Info: No file info in tree
NNAna::CopyTree: entries= 198612 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 56797 nbkg = 141815
Bkg Entries: 141815
Sig Entries: 56797
Chosen entries: 56797
Signal fraction: 1
Background fraction: 0.400501
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 141815
Actual Signal Entries: 56797
Entries to split: 56797
Test with : 28398
Train with : 28398
*********************************************
* This product is licenced for educational *
* and scientific use only. Commercial use  *
* is prohibited !                           *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
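Editor's note: the bookkeeping above follows directly from the steering options. The constraint lep1_E<400&&lep2_E<400&& is combined with Target==1/0 to build the signal and background trees (56797 and 141815 entries), the background fraction 0.400501 is just 56797/141815, and the chosen signal entries are split evenly into test and train samples. A minimal numpy sketch of that selection and split, with placeholder arrays standing in for the ROOT branches (this is not the NNAna code itself):

```python
import numpy as np

def select_and_split(target, lep1_E, lep2_E, rng=np.random.default_rng(0)):
    """Sketch of the CopyTree-style selection and 50/50 split.

    `target`, `lep1_E`, `lep2_E` are placeholder arrays standing in for
    branches of the input tree; the real job reads them from a ROOT file.
    """
    cut = (lep1_E < 400) & (lep2_E < 400)           # steering Constraint
    sig = np.flatnonzero(cut & (target == 1))       # SigChoice
    bkg = np.flatnonzero(cut & (target == 0))       # BkgChoice

    chosen = len(sig)   # matches "Chosen entries" here; the actual rule
                        # (EqualizeSB etc.) lives inside NNAna
    print("nsig =", len(sig), "nbkg =", len(bkg))
    print("Signal fraction:", chosen / len(sig),
          "Background fraction:", chosen / len(bkg))

    perm = rng.permutation(sig)                     # entries to split
    half = chosen // 2                              # 28398 here
    return perm[:half], perm[half:2 * half]         # test, train
```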
NBTreeReportInterval = 10000 NBTreePrepEvPrint = 1 Start Set Inidividual Variable Preprocessing Not touching individual preprocessing for Ht ( 0 ) in Neurobayes Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes Not touching individual preprocessing for addEt ( 7 ) in Neurobayes Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes End Set Inidividual Variable Preprocessing Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 56797 for Signal Prepared event 0 for Signal with 56797 events ====Entry 0 Variable Ht : 267.874 Variable LepAPt : 45.292 Variable LepBPt : 35.604 Variable MetSigLeptonsJets : 7.24556 Variable MetSpec : 95.2083 Variable SumEtLeptonsJets : 172.666 Variable VSumJetLeptonsPt : 107.325 Variable addEt : 176.104 Variable dPhiLepSumMet : 2.28406 Variable dPhiLeptons : 0.0254087 Variable dRLeptons : 0.325152 Variable lep1_E : 46.0978 Variable lep2_E : 40.383 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 2165 Ht = 267.874 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 45.2922 LepAPt = 45.292 LepBEt = 35.604 LepBPt = 35.604 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 95.2083 MetDelPhi = 2.19471 MetSig = 6.53886 MetSigLeptonsJets = 7.24556 MetSpec = 95.2083 Mjj = 0 MostCentralJetEta = -1.47546 MtllMet = 179.575 Njets = 1 SB = 0 SumEt = 212.005 SumEtJets = 0 SumEtLeptonsJets = 172.666 Target = 1 TrainWeight = 1 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 107.325 addEt = 176.104 dPhiLepSumMet = 2.28406 dPhiLeptons = 0.0254087 dRLeptons = 0.325152 diltype = 52 dimass = 13.1154 event = 77 jet1_Et = 91.7697 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 46.0978 lep2_E = 40.383 rand = 0.999742 run = 237144 weight = 3.96224e-06 ===Show End Prepared event 10000 for Signal with 56797 events Prepared event 20000 for Signal with 56797 events Prepared event 30000 for Signal with 56797 events Prepared event 40000 for Signal with 56797 events Prepared event 50000 for Signal with 56797 events Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable 
MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 141815 for Background Prepared event 0 for Background with 141815 events ====Entry 0 Variable Ht : 79.7868 Variable LepAPt : 30.5457 Variable LepBPt : 11.4206 Variable MetSigLeptonsJets : 5.83799 Variable MetSpec : 37.8193 Variable SumEtLeptonsJets : 41.9663 Variable VSumJetLeptonsPt : 41.2152 Variable addEt : 79.7868 Variable dPhiLepSumMet : 2.89654 Variable dPhiLeptons : 0.426439 Variable dRLeptons : 0.500605 Variable lep1_E : 30.5607 Variable lep2_E : 11.7282 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 1 Ht = 79.7868 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 30.546 LepAPt = 30.5457 LepBEt = 11.4214 LepBPt = 11.4206 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 37.8193 MetDelPhi = 2.58497 MetSig = 2.81024 MetSigLeptonsJets = 5.83799 MetSpec = 37.8193 Mjj = 0 MostCentralJetEta = 0 MtllMet = 80.0898 Njets = 0 SB = 0 SumEt = 181.11 SumEtJets = 0 SumEtLeptonsJets = 41.9663 Target = 0 TrainWeight = 3.6439 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 41.2152 addEt = 79.7868 dPhiLepSumMet = 2.89654 dPhiLeptons = 0.426439 dRLeptons = 0.500605 diltype = 41 dimass = 9.31149 event = 6639571 jet1_Et = 0 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 30.5607 lep2_E = 11.7282 rand = 0.999742 run = 271566 weight = 0.0404189 ===Show End Prepared event 10000 for Background with 141815 events Prepared event 20000 for Background with 141815 events Prepared event 30000 for Background with 141815 events Prepared event 40000 for Background with 141815 events Prepared event 50000 for Background with 141815 events Prepared event 60000 for Background with 141815 events Prepared event 70000 for Background with 141815 events Prepared event 80000 for Background with 141815 events Prepared event 90000 for Background with 141815 events Prepared event 100000 for Background with 141815 events Prepared event 110000 for Background with 141815 events Prepared event 120000 for Background with 141815 events Prepared event 130000 for Background with 141815 events Prepared event 140000 for Background with 141815 events Warning: found 4534 negative weights. 
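Editor's note: the background example event above carries TrainWeight = 3.6439, and the preparation step ends by warning about 4534 negative weights; the Teacher banner that follows repeats the warning and quotes a (weighted) signal fraction. A small hypothetical check in the same spirit, with placeholder arrays for the TrainWeight and Target branches; whether NeuroBayes defines its "Signal fraction" exactly this way is an assumption:

```python
import numpy as np

def weight_report(train_weight, target):
    """Summarise the training weights the way the Teacher log does.

    `train_weight` and `target` are placeholder arrays for the
    TrainWeight and Target branches of the prepared trees.
    """
    n_neg = int(np.sum(train_weight < 0))
    if n_neg:
        print(f"Warning: found {n_neg} negative weights.")

    w_sig = train_weight[target == 1].sum()
    w_bkg = train_weight[target == 0].sum()
    # Weighted signal fraction; assumed definition, the NeuroBayes
    # printout may be computed differently.
    print("Signal fraction:", 100.0 * w_sig / (w_sig + w_bkg), "%")
```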
<< hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 198612 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 4534 negative weights. Signal fraction: 63.0434761 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. ------------------------------ Transdef: Tab for variable 2 47.3116226 61.6016922 63.7241135 65.5624695 67.2155304 68.1959534 69.3699646 70.3651581 71.1796417 71.9918137 73.0380554 73.8354874 74.4836273 75.2628174 75.8267517 76.6691437 77.3147736 78.0756531 78.7808914 79.439682 80.0431061 80.9284973 81.7781677 82.6377487 83.5665436 84.5931168 85.7649231 86.9180145 88.386673 89.6746826 91.2997131 93.0244904 94.6157684 96.3232269 97.969635 99.4781036 101.019821 102.66214 104.374435 106.02047 107.594055 109.281807 110.826706 112.522408 114.127365 115.864471 117.733673 119.632706 121.266312 123.025261 124.875961 126.631981 128.475006 130.061401 131.760498 133.388763 134.961777 136.560349 138.107132 139.651779 141.217331 142.850769 144.445053 145.957458 147.38443 148.705185 150.245117 151.769348 153.141357 154.546509 155.897522 157.376007 158.893555 160.263855 161.809906 163.485535 165.214951 167.167145 169.089569 171.215485 173.676331 176.314545 178.999649 181.684448 184.557526 187.565964 190.394318 193.853485 196.965088 200.939011 205.178619 209.868576 215.033478 220.58905 227.667007 235.656342 245.401367 258.478729 277.046753 312.385071 782.079102 ------------------------------ Transdef: Tab for variable 3 20.0000362 20.3337822 20.6684017 20.9820366 21.258337 21.5066528 21.7650948 22.0268822 22.2506218 22.4720802 22.7273064 22.963459 23.1945267 23.4418602 23.6759987 23.9121819 24.1565418 24.4063377 24.6337166 24.879406 25.1335964 25.3582134 25.5954552 25.8317909 26.0640259 26.2835121 26.5284424 26.8038673 27.0352039 27.2828789 27.5540314 27.8245583 28.0477905 28.2905006 28.5582371 28.8143673 29.0753746 29.2996712 29.5947418 29.8264351 30.0792313 30.3448982 30.6425552 30.9154358 31.2132492 31.4902611 31.7761955 32.0771942 32.3877525 32.6815224 32.9678802 33.2520218 33.5530128 33.8428917 34.1475945 34.4483757 34.7775269 35.0918808 35.38097 35.6835899 36.0357742 36.3483315 36.6796036 36.9910889 37.3278885 37.666481 37.99683 38.3174591 38.6744995 39.0130043 39.3303528 39.6737862 40.0165672 40.362854 40.7217941 41.0803528 41.4631805 41.8192978 42.1968842 42.5707855 42.9986343 43.4344292 43.8841438 44.3732681 44.8638725 45.3844109 45.9041519 46.4308853 47.0162735 47.7833633 48.5297203 49.3686829 50.3575821 51.4672852 52.7994843 54.2812729 
56.2153625 58.6546249 62.4217606 68.4642792 228.385986 ------------------------------ Transdef: Tab for variable 4 10.0000381 10.1126232 10.2354441 10.3792696 10.4969358 10.6365232 10.7742996 10.9120998 11.0292683 11.16786 11.3228054 11.4743423 11.5938816 11.7535734 11.9135742 12.0699463 12.2387829 12.4073782 12.5877647 12.7462559 12.9275522 13.1394281 13.3430653 13.5454626 13.7066078 13.8955917 14.0672379 14.289463 14.4776535 14.6870289 14.9162331 15.1231813 15.3369274 15.5421047 15.77563 15.976099 16.2491455 16.5270691 16.796154 17.0503731 17.3658676 17.6299171 17.8873539 18.1862755 18.4398022 18.7254848 19.0166836 19.3180923 19.6570511 19.9719944 20.2745399 20.5566101 20.8275452 21.1340561 21.4218063 21.7137871 22.0496922 22.3618698 22.6891518 23.0030384 23.3554077 23.6752949 24.0157738 24.3623657 24.6612549 25.0099182 25.3561096 25.6982079 26.0345459 26.3778152 26.7555828 27.1411057 27.5212173 27.8895836 28.2683029 28.6332207 29.0020695 29.4101677 29.8192482 30.1952896 30.5798454 30.9833069 31.3597717 31.7453289 32.196701 32.5902328 33.0106277 33.452919 33.8973389 34.4306068 34.9628906 35.565712 36.1347656 36.7987022 37.5879135 38.5336075 39.7368927 41.2921448 43.7005157 47.990593 74.918541 ------------------------------ Transdef: Tab for variable 5 1.13770258 2.52909422 2.8942852 3.09093547 3.31772351 3.51315546 3.68112493 3.83315039 3.95915246 4.06499815 4.16644859 4.26791477 4.36556959 4.45493984 4.52816629 4.62829876 4.70259953 4.78780937 4.86944675 4.96356058 5.04745007 5.12498903 5.19446659 5.26686001 5.33211327 5.389781 5.45311594 5.51860142 5.56367588 5.61928749 5.67491245 5.73381329 5.79310894 5.84814262 5.90031624 5.95330334 6.01414108 6.05692577 6.10095501 6.1496892 6.20172882 6.25387764 6.2967844 6.34943104 6.40001678 6.44795275 6.50029373 6.55157757 6.60540342 6.65230846 6.7053194 6.75008154 6.80481005 6.85973978 6.90886641 6.96564674 7.01979446 7.07840157 7.14468288 7.20507526 7.26078796 7.32218075 7.38043499 7.44798851 7.51299095 7.58004665 7.64482737 7.70737648 7.77239895 7.83755589 7.90201473 7.96452427 8.0298481 8.09654045 8.15711975 8.21819019 8.27893066 8.34454536 8.41245842 8.47914505 8.54661655 8.61490726 8.68406105 8.75300694 8.82159138 8.89277267 8.9694767 9.05104637 9.13359833 9.22449875 9.32121468 9.42057991 9.54056835 9.6661911 9.82202339 10.015461 10.2540359 10.5359859 10.9563484 11.7708302 18.373724 ------------------------------ Transdef: Tab for variable 6 15.0119848 20.3705215 25.0420666 25.4639664 25.9318275 26.5163727 27.0560188 27.5253868 28.0914383 28.7142944 29.3577042 29.9364891 30.6507339 31.2957783 31.87603 32.35783 32.9166565 33.4719162 33.9486504 34.4251785 34.9013824 35.401062 35.8854675 36.2606392 36.6942825 37.1280746 37.5086479 37.9285393 38.3289413 38.7457199 39.1705856 39.5575447 40.0057678 40.4745522 40.9493103 41.4169884 41.8995399 42.4322586 42.9155884 43.4210739 43.9256363 44.5615273 45.2412872 45.9779701 46.7018051 47.4642982 48.2332611 48.9149399 49.7426987 50.4542618 51.3350525 52.2136459 53.0782661 53.9933205 54.8000565 55.7132416 56.5563126 57.3210716 58.1701851 58.9725876 59.8189735 60.703064 61.5923958 62.4421997 63.3150368 64.2213898 65.0025635 65.855011 66.7003021 67.6006317 68.4009247 69.1681976 70.0106659 70.8140259 71.6567612 72.4953461 73.319252 74.1044464 74.8776474 75.6646271 76.5107193 77.373909 78.2275543 79.1061249 80.0309296 80.965332 81.9623642 82.9115143 84.1146088 85.3923721 86.802536 88.4187698 90.3165741 92.5172424 94.808075 98.015686 101.946243 107.027634 114.379089 126.421265 289.142151 
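Editor's note: the Transdef tables are the percentile bin edges that preprocessing option 12 uses: each input is flattened with its table, mapped to a Gaussian with mean 0 and sigma 1, and the resulting inputs are then decorrelated (variable 1, tabulated as ±1, appears to be the target itself, so the physics inputs are variables 2-14). A rough numpy/scipy sketch of that recipe, assuming simple rank-based flattening and covariance whitening rather than the licensed NeuroBayes implementation:

```python
import numpy as np
from scipy.stats import norm

def gaussianise(x, table):
    """Map one input variable to ~N(0, 1) using its percentile table.

    `table` plays the role of a Transdef tab: ascending bin edges that
    make the variable's distribution flat on the training sample.
    """
    u = np.searchsorted(table, x) / len(table)      # flatten to [0, 1]
    u = np.clip(u, 1e-6, 1 - 1e-6)                  # keep ppf finite
    return norm.ppf(u)                              # uniform -> Gaussian

def decorrelate(X):
    """Rotate and scale Gaussianised inputs to roughly unit covariance."""
    cov = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    return (X - X.mean(axis=0)) @ eigvec / np.sqrt(eigval)

# Usage sketch: columns of X are the 13 inputs, tables[i] their Transdef tabs.
# X_pre = decorrelate(np.column_stack(
#     [gaussianise(X[:, i], tables[i]) for i in range(X.shape[1])]))
```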
------------------------------ Transdef: Tab for variable 7 30.1199875 32.5572853 33.6424026 34.3966751 35.0455017 35.6151237 36.1230545 36.5989914 37.0074081 37.4471207 37.8270073 38.2412949 38.5908203 38.9102402 39.2730637 39.5688324 39.8933029 40.2629166 40.6777725 41.0987244 41.4877281 41.8895607 42.3403931 42.8025208 43.3374557 43.9881668 44.6648102 45.295517 46.043457 46.8896103 47.8968697 48.7100067 49.7129135 50.7288628 51.7870178 52.6308784 53.5719147 54.6461868 55.635231 56.5442505 57.4351578 58.2989197 59.1130371 60.1312828 61.1759796 62.1480408 63.0743713 64.1090012 65.0100555 65.9839706 66.8598633 67.8486404 68.7018585 69.625618 70.4222641 71.2810898 72.0344162 72.9279633 73.7149887 74.4705734 75.2629318 76.212822 76.9606934 77.7610931 78.6104889 79.5332794 80.4749832 81.6074677 82.7408905 83.8384247 85.0261383 86.3625336 87.8074951 89.3339996 90.9301834 92.561203 94.149765 95.7249146 97.4919205 99.2913971 101.098831 103.03064 105.071091 107.073936 109.265594 111.779114 114.188965 117.051392 119.985741 123.180923 126.81707 130.693069 135.094452 140.559753 146.027542 152.342026 161.2323 172.376282 188.169846 216.728851 458.64209 ------------------------------ Transdef: Tab for variable 8 0.172416463 23.6275749 28.3652916 30.4635315 31.5695305 32.3904419 33.0785828 33.7173004 34.2338104 34.7141495 35.1413803 35.528923 35.8909836 36.222271 36.5914383 36.9655762 37.3192291 37.6188469 37.9257355 38.2107315 38.5306015 38.8226395 39.083847 39.3991165 39.6825104 40.020874 40.3662834 40.7247543 41.1123962 41.4334335 41.8175507 42.1855202 42.6063728 43.0493088 43.5341949 44.0123558 44.5054016 45.1344604 45.730484 46.348587 46.9631348 47.6484833 48.229248 48.806839 49.4360619 50.1376801 50.814827 51.5098038 52.1428757 52.8340225 53.5652809 54.2825356 54.9502335 55.7129326 56.4154892 57.1041718 57.8356171 58.5503006 59.2624817 59.9790001 60.7233734 61.4485168 62.2412415 62.9975853 63.7042999 64.4367981 65.1397552 65.89888 66.6062164 67.3302002 68.059494 68.7818451 69.5734558 70.3115997 71.017334 71.7606659 72.5322495 73.209259 73.9150467 74.5527725 75.3398819 76.1543961 76.8707428 77.6927948 78.5465546 79.4057922 80.3572311 81.491333 82.605423 83.8885269 85.3382263 87.0886536 89.1905365 91.3532104 93.9744263 97.3258362 100.971245 106.089615 113.885506 127.974396 326.868286 ------------------------------ Transdef: Tab for variable 9 47.3116226 61.1271591 62.9410782 64.6773224 66.0103455 67.3265533 68.227005 69.158371 70.0604401 70.7548981 71.4524231 72.0629883 72.9662399 73.6636505 74.2977448 74.9247208 75.4723816 76.0393829 76.7461014 77.2692184 77.9011841 78.5380402 79.1338577 79.669632 80.2026062 80.9580612 81.5993958 82.3736725 83.1465607 83.8994904 84.7529144 85.6274414 86.6060638 87.7229919 88.8105545 89.9894104 91.1422043 92.4972382 93.716095 95.2078094 96.4855652 97.9936066 99.4274902 101.072449 102.418045 103.934387 105.28299 106.676117 108.082771 109.545227 110.868927 112.37719 113.733414 115.092545 116.535301 117.793167 119.25663 120.666534 121.964165 123.324661 124.736977 126.144218 127.486794 128.870422 130.226074 131.501144 132.867554 134.11908 135.425308 136.806854 138.106247 139.370743 140.592209 141.867416 143.128082 144.406204 145.580582 146.681229 147.817184 148.923615 150.085358 151.279449 152.404144 153.49942 154.638306 155.76239 156.932098 158.123627 159.32489 160.492096 161.791016 163.342865 164.874298 166.716446 168.728058 171.043365 174.147064 178.249725 185.229614 198.426025 424.390259 ------------------------------ Transdef: Tab for variable 10 0.0050833188 
1.14412665 1.42032552 1.61949384 1.76905477 1.88019872 1.97746634 2.05369425 2.12328124 2.18808699 2.24098015 2.2926569 2.33403754 2.37711477 2.41299152 2.44697976 2.483253 2.5118227 2.54000759 2.56736135 2.59376383 2.61725831 2.64038563 2.6619277 2.6822412 2.7021575 2.71843815 2.73640203 2.75377893 2.76918364 2.7828455 2.79599857 2.80827951 2.82057524 2.83258677 2.84443092 2.85400391 2.86423731 2.87359977 2.88150883 2.89062023 2.89974403 2.90800714 2.91636515 2.9243288 2.9313581 2.938591 2.94571018 2.95190048 2.95859051 2.96475673 2.97096586 2.97699022 2.98204422 2.98707366 2.99127722 2.99671888 3.00213909 3.00637937 3.01126337 3.01558471 3.02061534 3.02515578 3.02907324 3.03338242 3.0368042 3.04075599 3.04474211 3.04851913 3.05261469 3.05628967 3.05962086 3.06325817 3.06654286 3.07005978 3.07358837 3.07646823 3.07937288 3.08266354 3.08599186 3.08910751 3.09197021 3.09483624 3.09773302 3.10059452 3.10329628 3.10603094 3.1090889 3.11167502 3.11412764 3.1167233 3.11911392 3.12152815 3.12429762 3.12683344 3.12974334 3.13210726 3.13442254 3.13691831 3.139117 3.1415906 ------------------------------ Transdef: Tab for variable 11 7.15255737E-07 0.006289165 0.0112733804 0.0186879635 0.025153406 0.0316368937 0.0396012515 0.0463454723 0.0541206002 0.060351789 0.066268757 0.0731617212 0.0794761181 0.086163044 0.0923848152 0.0982999802 0.10533452 0.111955047 0.118676186 0.125327826 0.131792068 0.138458222 0.144173622 0.150667906 0.156366095 0.162493706 0.168416873 0.173655197 0.179541588 0.185215384 0.191066578 0.19583112 0.200871944 0.205271482 0.209494829 0.213766932 0.219010592 0.223347425 0.228120446 0.232855901 0.237290919 0.241902828 0.246437818 0.250938177 0.255420655 0.260145545 0.264804721 0.269747049 0.274714768 0.279076338 0.283997536 0.288635969 0.293866754 0.298721641 0.303444326 0.308338404 0.31341511 0.318205476 0.32343024 0.328305125 0.334269762 0.340162396 0.345521241 0.351303101 0.357598424 0.362661302 0.369152129 0.374762952 0.380913109 0.387914896 0.393729419 0.399753213 0.406703711 0.413349688 0.419412017 0.42681253 0.434454203 0.440750718 0.448235154 0.456080794 0.46435231 0.473037183 0.480792999 0.489854395 0.50058341 0.510106802 0.519206285 0.530497789 0.54413271 0.556143641 0.570228875 0.584987521 0.600903511 0.616181374 0.63815856 0.656513453 0.682794154 0.716777325 0.758662224 0.824084282 1.13201857 ------------------------------ Transdef: Tab for variable 12 0.200001657 0.207211956 0.213767275 0.220009416 0.225668326 0.231803462 0.237461939 0.242726386 0.248536453 0.254058868 0.259509027 0.264756203 0.270435154 0.275671244 0.280712903 0.285772383 0.290626943 0.295620024 0.300516367 0.305087268 0.309697866 0.314138055 0.318671793 0.323452473 0.327552974 0.332237661 0.337106109 0.341882408 0.346678883 0.351297081 0.356160134 0.361017168 0.365380764 0.370155603 0.375217497 0.380361855 0.385965049 0.390721709 0.3950032 0.3992607 0.40304333 0.406969041 0.410768926 0.414057732 0.417877257 0.421629965 0.425575793 0.429195583 0.432797462 0.436797142 0.440779269 0.444875091 0.448579311 0.452968895 0.457021922 0.461637914 0.466156483 0.470408171 0.474989355 0.479751527 0.484232575 0.488762528 0.493065 0.497860581 0.502778292 0.507879198 0.513202548 0.519379079 0.524493575 0.530104697 0.536585152 0.542081892 0.548110008 0.553746104 0.559755564 0.566392004 0.57274425 0.580486059 0.587532997 0.595454216 0.603181601 0.610687613 0.619464636 0.626933813 0.635742307 0.645659685 0.655901074 0.664497733 0.674940705 0.685470104 0.698553801 0.710516453 0.724666834 0.739196718 0.754434884 
0.771938503 0.795887113 0.819657087 0.850001574 0.904917002 1.15372479 ------------------------------ Transdef: Tab for variable 13 20.0277634 21.5968018 22.3974857 23.1145725 23.679142 24.243166 24.7256107 25.2480488 25.6917686 26.0717087 26.5294266 26.9627285 27.3679848 27.7729511 28.1920338 28.5590897 28.9152832 29.3010769 29.6290092 30.0197659 30.3920765 30.7035141 31.0545025 31.3656349 31.7031898 32.0583153 32.3651505 32.7061081 33.1080246 33.442543 33.7981567 34.1865387 34.5033188 34.8554688 35.2162628 35.5054779 35.8688278 36.1900826 36.5201302 36.8687134 37.2713585 37.6230011 37.9517746 38.3457642 38.6654434 39.0095062 39.3604012 39.699585 40.0365791 40.3699837 40.7532921 41.1304703 41.5310287 41.9015503 42.3118973 42.6575165 43.0241165 43.4021606 43.7963562 44.1807251 44.5888519 44.9720688 45.3427238 45.721817 46.1498528 46.5294151 46.9445305 47.3672714 47.8634071 48.3196411 48.8163452 49.3182526 49.8228722 50.3512917 50.8576965 51.4628258 52.067215 52.6602707 53.2958221 54.0207748 54.7055664 55.4216156 56.1476135 56.9407997 57.9149666 58.7849197 59.8607407 60.9512787 62.1087265 63.350605 64.8091431 66.4441376 68.063446 70.1930542 72.6410141 75.344101 78.5549316 82.9890442 88.6026001 97.5973434 232.717926 ------------------------------ Transdef: Tab for variable 14 10.0034533 10.6391287 11.0810633 11.4373894 11.7991447 12.1601162 12.425703 12.7715302 13.0559063 13.4369011 13.7254705 13.9785881 14.3364697 14.6333809 14.9605265 15.277916 15.5590363 15.8750057 16.1381721 16.4681015 16.7611923 17.0111465 17.3003044 17.6088047 17.8923035 18.1831779 18.4892197 18.7717533 19.0835228 19.3686752 19.7181473 20.0332222 20.3718643 20.7279243 21.0773315 21.421402 21.7231503 22.0165024 22.4023037 22.7810001 23.1373444 23.46455 23.7825546 24.1186981 24.4145126 24.7485695 25.0926666 25.444397 25.7662582 26.0834103 26.4019012 26.7452812 27.0685501 27.4041672 27.7465858 28.0949059 28.4496803 28.831749 29.1881752 29.5332985 29.8912468 30.2272835 30.5721684 30.9420052 31.2848988 31.6168652 31.9677086 32.3225517 32.6478119 33.0278473 33.4139748 33.8018723 34.1346245 34.5258026 34.9051361 35.3337708 35.7610321 36.1776009 36.6219864 37.0907822 37.5794678 38.0541687 38.5043106 39.0463524 39.6060829 40.1883621 40.7569733 41.4512444 42.2227859 42.9972382 43.8604851 44.7691154 45.8518867 47.0236053 48.3397598 50.0761452 52.1450996 54.6812439 58.4743042 65.2404633 120.256287 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 67.7 51.0 56.9 41.7 56.5 61.8 58.0 68.1 -18.9 -18.0 -32.5 23.3 38.8 2 67.7 100.0 66.5 64.0 40.6 70.6 94.0 75.1 91.3 -40.4 -24.0 -44.4 38.4 45.5 3 51.0 66.5 100.0 52.9 12.2 34.7 70.0 53.6 71.5 -13.6 -26.6 -51.0 67.4 36.0 4 56.9 64.0 52.9 100.0 24.1 42.2 63.5 56.6 71.1 -8.1 -30.2 -57.2 29.1 78.2 5 41.7 40.6 12.2 24.1 100.0 86.7 11.6 61.3 61.9 25.6 -8.2 -11.3 2.4 17.7 6 56.5 70.6 34.7 42.2 86.7 100.0 47.8 79.5 81.5 -2.2 -15.3 -25.4 17.2 30.7 7 61.8 94.0 70.0 63.5 11.6 47.8 100.0 65.2 78.6 -51.0 -23.4 -45.1 41.2 44.4 8 58.0 75.1 53.6 56.6 61.3 79.5 65.2 100.0 83.6 -5.4 -23.2 -39.5 29.1 40.3 9 68.1 91.3 71.5 71.1 61.9 81.5 78.6 83.6 100.0 -14.2 -27.1 -49.6 42.9 52.2 10 -18.9 -40.4 -13.6 -8.1 25.6 -2.2 -51.0 -5.4 -14.2 100.0 2.2 5.7 -5.7 -2.9 11 -18.0 -24.0 -26.6 -30.2 -8.2 -15.3 -23.4 -23.2 -27.1 2.2 100.0 54.8 -11.6 -20.7 12 -32.5 -44.4 -51.0 -57.2 -11.3 -25.4 -45.1 -39.5 -49.6 5.7 54.8 100.0 -24.4 -36.1 13 23.3 38.4 67.4 29.1 2.4 17.2 41.2 29.1 42.9 -5.7 -11.6 -24.4 100.0 51.5 14 
38.8 45.5 36.0 78.2 17.7 30.7 44.4 40.3 52.2 -2.9 -20.7 -36.1 51.5 100.0 TOTAL CORRELATION TO TARGET (diagonal) 175.992996 TOTAL CORRELATION OF ALL VARIABLES 74.9669601 ROUND 1: MAX CORR ( 74.9667171) AFTER KILLING INPUT VARIABLE 6 CONTR 0.190897525 ROUND 2: MAX CORR ( 74.9650154) AFTER KILLING INPUT VARIABLE 11 CONTR 0.505111175 ROUND 3: MAX CORR ( 74.9561025) AFTER KILLING INPUT VARIABLE 14 CONTR 1.15595526 ROUND 4: MAX CORR ( 74.9248499) AFTER KILLING INPUT VARIABLE 8 CONTR 2.16429198 ROUND 5: MAX CORR ( 74.8600662) AFTER KILLING INPUT VARIABLE 2 CONTR 3.11506492 ROUND 6: MAX CORR ( 74.681403) AFTER KILLING INPUT VARIABLE 10 CONTR 5.16890202 ROUND 7: MAX CORR ( 74.4356123) AFTER KILLING INPUT VARIABLE 12 CONTR 6.05405432 ROUND 8: MAX CORR ( 74.155729) AFTER KILLING INPUT VARIABLE 13 CONTR 6.44889428 ROUND 9: MAX CORR ( 72.9654957) AFTER KILLING INPUT VARIABLE 9 CONTR 13.23286 ROUND 10: MAX CORR ( 72.6667913) AFTER KILLING INPUT VARIABLE 3 CONTR 6.59552885 ROUND 11: MAX CORR ( 70.9628511) AFTER KILLING INPUT VARIABLE 4 CONTR 15.6440511 ROUND 12: MAX CORR ( 61.8467371) AFTER KILLING INPUT VARIABLE 5 CONTR 34.7952201 LAST REMAINING VARIABLE: 7 total correlation to target: 74.9669601 % total significance: 159.698618 sigma correlations of single variables to target: variable 2: 67.6626146 % , in sigma: 144.138512 variable 3: 50.9718859 % , in sigma: 108.583031 variable 4: 56.9288502 % , in sigma: 121.272874 variable 5: 41.7408506 % , in sigma: 88.9185871 variable 6: 56.5059231 % , in sigma: 120.371932 variable 7: 61.8467371 % , in sigma: 131.749219 variable 8: 58.0266523 % , in sigma: 123.611471 variable 9: 68.1291562 % , in sigma: 145.132364 variable 10: -18.8665766 % , in sigma: 40.1905881 variable 11: -18.0420635 % , in sigma: 38.4341663 variable 12: -32.4687851 % , in sigma: 69.1667384 variable 13: 23.29871 % , in sigma: 49.6321551 variable 14: 38.8439331 % , in sigma: 82.7474188 variables sorted by significance: 1 most relevant variable 7 corr 61.8467369 , in sigma: 131.749219 2 most relevant variable 5 corr 34.7952194 , in sigma: 74.1226329 3 most relevant variable 4 corr 15.6440506 , in sigma: 33.3257913 4 most relevant variable 3 corr 6.59552908 , in sigma: 14.050148 5 most relevant variable 9 corr 13.2328596 , in sigma: 28.1893436 6 most relevant variable 13 corr 6.4488945 , in sigma: 13.7377791 7 most relevant variable 12 corr 6.05405426 , in sigma: 12.8966694 8 most relevant variable 10 corr 5.16890192 , in sigma: 11.0110707 9 most relevant variable 2 corr 3.11506486 , in sigma: 6.63587737 10 most relevant variable 8 corr 2.1642921 , in sigma: 4.6104905 11 most relevant variable 14 corr 1.15595531 , in sigma: 2.46247769 12 most relevant variable 11 corr 0.505111158 , in sigma: 1.07601474 13 most relevant variable 6 corr 0.190897524 , in sigma: 0.406660092 global correlations between input variables: variable 2: 99.2077215 % variable 3: 92.581799 % variable 4: 93.2848665 % variable 5: 95.8235465 % variable 6: 95.7178992 % variable 7: 98.8421158 % variable 8: 89.5191037 % variable 9: 98.6653533 % variable 10: 72.4160493 % variable 11: 55.2636478 % variable 12: 73.1172064 % variable 13: 86.0743008 % variable 14: 90.2789152 % significance loss when removing single variables: variable 2: corr = 2.13068337 % , sigma = 4.5388954 variable 3: corr = 15.1817392 % , sigma = 32.3409509 variable 4: corr = 12.3175397 % , sigma = 26.2394803 variable 5: corr = 17.8068351 % , sigma = 37.9330703 variable 6: corr = 0.190897525 % , sigma = 0.406660094 variable 7: corr = 7.19568274 % , sigma = 
15.3286273 variable 8: corr = 1.9824172 % , sigma = 4.2230509 variable 9: corr = 10.6582803 % , sigma = 22.7048373 variable 10: corr = 4.22521236 % , sigma = 9.0007728 variable 11: corr = 0.503044852 % , sigma = 1.07161298 variable 12: corr = 5.387894 % , sigma = 11.4775793 variable 13: corr = 5.89789035 % , sigma = 12.5640007 variable 14: corr = 1.12297216 % , sigma = 2.39221521 Keep only 10 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 11 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 21.1807556 sigma out 15 active outputs RANK 2 NODE 5 --> 19.4289265 sigma out 15 active outputs RANK 3 NODE 9 --> 19.2394371 sigma out 15 active outputs RANK 4 NODE 3 --> 16.2301521 sigma out 15 active outputs RANK 5 NODE 1 --> 15.346488 sigma out 15 active outputs RANK 6 NODE 7 --> 14.1504812 sigma out 15 active outputs RANK 7 NODE 4 --> 11.8752203 sigma out 15 active outputs RANK 8 NODE 10 --> 11.3411999 sigma out 15 active outputs RANK 9 NODE 6 --> 10.8311872 sigma out 15 active outputs RANK 10 NODE 8 --> 8.45821095 sigma out 15 active outputs RANK 11 NODE 11 --> 7.1685133 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 2 --> 23.5188999 sigma in 11act. ( 32.2689285 sig out 1act.) RANK 2 NODE 5 --> 22.1673355 sigma in 11act. ( 26.206955 sig out 1act.) RANK 3 NODE 6 --> 20.2213326 sigma in 11act. ( 21.3233852 sig out 1act.) RANK 4 NODE 8 --> 15.9746571 sigma in 11act. ( 15.9530239 sig out 1act.) RANK 5 NODE 10 --> 14.9827423 sigma in 11act. ( 16.4326725 sig out 1act.) RANK 6 NODE 1 --> 11.364131 sigma in 11act. ( 13.7637777 sig out 1act.) RANK 7 NODE 9 --> 8.84295845 sigma in 11act. ( 10.6491175 sig out 1act.) RANK 8 NODE 13 --> 8.75298977 sigma in 11act. ( 7.69222212 sig out 1act.) RANK 9 NODE 12 --> 8.39031219 sigma in 11act. ( 10.2988482 sig out 1act.) RANK 10 NODE 14 --> 7.10761642 sigma in 11act. ( 7.8339572 sig out 1act.) RANK 11 NODE 4 --> 6.52501535 sigma in 11act. ( 7.46774721 sig out 1act.) RANK 12 NODE 11 --> 3.35133958 sigma in 11act. ( 6.45059586 sig out 1act.) RANK 13 NODE 7 --> 2.87557554 sigma in 11act. ( 2.50670815 sig out 1act.) RANK 14 NODE 3 --> 1.72139597 sigma in 11act. ( 1.52386642 sig out 1act.) RANK 15 NODE 15 --> 1.51793635 sigma in 11act. ( 2.31769013 sig out 1act.) sorted by output significance RANK 1 NODE 2 --> 32.2689285 sigma out 1act.( 23.5188999 sig in 11act.) RANK 2 NODE 5 --> 26.206955 sigma out 1act.( 22.1673355 sig in 11act.) RANK 3 NODE 6 --> 21.3233852 sigma out 1act.( 20.2213326 sig in 11act.) RANK 4 NODE 10 --> 16.4326725 sigma out 1act.( 14.9827423 sig in 11act.) RANK 5 NODE 8 --> 15.9530239 sigma out 1act.( 15.9746571 sig in 11act.) RANK 6 NODE 1 --> 13.7637777 sigma out 1act.( 11.364131 sig in 11act.) RANK 7 NODE 9 --> 10.6491175 sigma out 1act.( 8.84295845 sig in 11act.) RANK 8 NODE 12 --> 10.2988482 sigma out 1act.( 8.39031219 sig in 11act.) RANK 9 NODE 14 --> 7.8339572 sigma out 1act.( 7.10761642 sig in 11act.) RANK 10 NODE 13 --> 7.69222212 sigma out 1act.( 8.75298977 sig in 11act.) RANK 11 NODE 4 --> 7.46774721 sigma out 1act.( 6.52501535 sig in 11act.) RANK 12 NODE 11 --> 6.45059586 sigma out 1act.( 3.35133958 sig in 11act.) RANK 13 NODE 7 --> 2.50670815 sigma out 1act.( 2.87557554 sig in 11act.) RANK 14 NODE 15 --> 2.31769013 sigma out 1act.( 1.51793635 sig in 11act.) 
RANK 15 NODE 3 --> 1.52386642 sigma out 1act.( 1.72139597 sig in 11act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 57.8628883 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 32.9723663 sigma out 15 active outputs RANK 2 NODE 5 --> 20.716177 sigma out 15 active outputs RANK 3 NODE 3 --> 18.8035526 sigma out 15 active outputs RANK 4 NODE 9 --> 18.7657032 sigma out 15 active outputs RANK 5 NODE 1 --> 18.0582027 sigma out 15 active outputs RANK 6 NODE 7 --> 14.960041 sigma out 15 active outputs RANK 7 NODE 4 --> 13.8562384 sigma out 15 active outputs RANK 8 NODE 6 --> 12.9393206 sigma out 15 active outputs RANK 9 NODE 10 --> 12.4610043 sigma out 15 active outputs RANK 10 NODE 8 --> 10.1381063 sigma out 15 active outputs RANK 11 NODE 11 --> 9.67391682 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 2 --> 28.263504 sigma in 11act. ( 30.8231773 sig out 1act.) RANK 2 NODE 5 --> 25.1075268 sigma in 11act. ( 24.7262974 sig out 1act.) RANK 3 NODE 6 --> 17.5973778 sigma in 11act. ( 18.3951721 sig out 1act.) RANK 4 NODE 1 --> 16.9347839 sigma in 11act. ( 13.7050228 sig out 1act.) RANK 5 NODE 8 --> 15.6644926 sigma in 11act. ( 14.2520552 sig out 1act.) RANK 6 NODE 10 --> 15.2049599 sigma in 11act. ( 15.2844038 sig out 1act.) RANK 7 NODE 11 --> 14.0668716 sigma in 11act. ( 7.49427414 sig out 1act.) RANK 8 NODE 12 --> 12.0809946 sigma in 11act. ( 10.2023249 sig out 1act.) RANK 9 NODE 9 --> 11.7603407 sigma in 11act. ( 10.4331799 sig out 1act.) RANK 10 NODE 15 --> 10.8767929 sigma in 11act. ( 5.43102455 sig out 1act.) RANK 11 NODE 4 --> 10.5218925 sigma in 11act. ( 7.02314043 sig out 1act.) RANK 12 NODE 14 --> 8.61993313 sigma in 11act. ( 7.56715345 sig out 1act.) RANK 13 NODE 7 --> 8.51850224 sigma in 11act. ( 3.6938343 sig out 1act.) RANK 14 NODE 3 --> 7.99236488 sigma in 11act. ( 2.71559453 sig out 1act.) RANK 15 NODE 13 --> 7.67451525 sigma in 11act. ( 6.38762856 sig out 1act.) sorted by output significance RANK 1 NODE 2 --> 30.8231773 sigma out 1act.( 28.263504 sig in 11act.) RANK 2 NODE 5 --> 24.7262974 sigma out 1act.( 25.1075268 sig in 11act.) RANK 3 NODE 6 --> 18.3951721 sigma out 1act.( 17.5973778 sig in 11act.) RANK 4 NODE 10 --> 15.2844038 sigma out 1act.( 15.2049599 sig in 11act.) RANK 5 NODE 8 --> 14.2520552 sigma out 1act.( 15.6644926 sig in 11act.) RANK 6 NODE 1 --> 13.7050228 sigma out 1act.( 16.9347839 sig in 11act.) RANK 7 NODE 9 --> 10.4331799 sigma out 1act.( 11.7603407 sig in 11act.) RANK 8 NODE 12 --> 10.2023249 sigma out 1act.( 12.0809946 sig in 11act.) RANK 9 NODE 14 --> 7.56715345 sigma out 1act.( 8.61993313 sig in 11act.) RANK 10 NODE 11 --> 7.49427414 sigma out 1act.( 14.0668716 sig in 11act.) RANK 11 NODE 4 --> 7.02314043 sigma out 1act.( 10.5218925 sig in 11act.) RANK 12 NODE 13 --> 6.38762856 sigma out 1act.( 7.67451525 sig in 11act.) RANK 13 NODE 15 --> 5.43102455 sigma out 1act.( 10.8767929 sig in 11act.) RANK 14 NODE 7 --> 3.6938343 sigma out 1act.( 8.51850224 sig in 11act.) RANK 15 NODE 3 --> 2.71559453 sigma out 1act.( 7.99236488 sig in 11act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 54.6937637 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.43791762 *** contribution from regularisation: 0.00462465547 *** contribution from error: -0.442542285 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.496994913 *** contribution from regularisation: 0.00241072406 *** contribution from error: -0.499405622 *********************************************** -----------------> Test sample ENTER BFGS code START -49364.5075 0.186641589 -0.886460245 EXIT FROM BFGS code FG_START 0. 0.186641589 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.51602608 *** contribution from regularisation: 0.00240948866 *** contribution from error: -0.518435597 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -51244.4857 0.186641589 -663.631042 EXIT FROM BFGS code FG_LNSRCH 0. 0.0965795517 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.54974395 *** contribution from regularisation: 0.00308923074 *** contribution from error: -0.5528332 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -54592.8743 0.0965795517 -488.23642 EXIT FROM BFGS code NEW_X -54592.8743 0.0965795517 -488.23642 ENTER BFGS code NEW_X -54592.8743 0.0965795517 -488.23642 EXIT FROM BFGS code FG_LNSRCH 0. 0.0340996347 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.552908957 *** contribution from regularisation: 0.00312172202 *** contribution from error: -0.556030691 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -54907.1759 0.0340996347 -272.364563 EXIT FROM BFGS code NEW_X -54907.1759 0.0340996347 -272.364563 ENTER BFGS code NEW_X -54907.1759 0.0340996347 -272.364563 EXIT FROM BFGS code FG_LNSRCH 0. -0.056268651 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.555197358 *** contribution from regularisation: 0.00312815746 *** contribution from error: -0.558325529 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55134.4314 -0.056268651 -25.4993286 EXIT FROM BFGS code NEW_X -55134.4314 -0.056268651 -25.4993286 ENTER BFGS code NEW_X -55134.4314 -0.056268651 -25.4993286 EXIT FROM BFGS code FG_LNSRCH 0. -0.0776489004 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.555481791 *** contribution from regularisation: 0.00309977098 *** contribution from error: -0.558581591 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55162.6744 -0.0776489004 16.8134918 EXIT FROM BFGS code NEW_X -55162.6744 -0.0776489004 16.8134918 ENTER BFGS code NEW_X -55162.6744 -0.0776489004 16.8134918 EXIT FROM BFGS code FG_LNSRCH 0. -0.0956088826 0. 
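Editor's note: stepping back to the preprocessing summary earlier in the log, the ROUND 1-12 "MAX CORR ... AFTER KILLING INPUT VARIABLE ..." lines followed by "Keep only 10 most significant input variables" and the 11-15-1 topology (presumably the 10 kept inputs plus a bias node) describe a greedy backward elimination ranked by how much total correlation to the target is lost when each variable is removed. A hedged sketch of that style of pruning, taking the total correlation of a set of inputs to be the multiple correlation sqrt(r^T C^-1 r) built from the printed correlation matrix; NeuroBayes' exact definition may differ:

```python
import numpy as np

def greedy_prune(C, r, names, n_keep=10):
    """Greedy backward elimination in the spirit of the ROUND/KILLING log.

    C is the input-input correlation matrix, r the correlation of each
    input with the target, names their labels (the log numbers inputs
    2-14 because variable 1 is the target).
    """
    def total_corr(idx):
        Ci = C[np.ix_(idx, idx)]
        ri = r[idx]
        return float(np.sqrt(ri @ np.linalg.solve(Ci, ri)))

    alive = list(range(len(r)))
    while len(alive) > n_keep:
        # drop the variable whose removal costs the least total correlation
        best = max(alive, key=lambda k: total_corr([i for i in alive if i != k]))
        loss = total_corr(alive) - total_corr([i for i in alive if i != best])
        print(f"KILLING {names[best]}  correlation contribution {loss:.4f}")
        alive.remove(best)
    return alive
```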
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.555778384 *** contribution from regularisation: 0.00306037138 *** contribution from error: -0.558838785 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55192.1253 -0.0956088826 29.8393459 EXIT FROM BFGS code NEW_X -55192.1253 -0.0956088826 29.8393459 ENTER BFGS code NEW_X -55192.1253 -0.0956088826 29.8393459 EXIT FROM BFGS code FG_LNSRCH 0. -0.114708781 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.556131363 *** contribution from regularisation: 0.0030037011 *** contribution from error: -0.559135079 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55227.1795 -0.114708781 -5.90396357 EXIT FROM BFGS code NEW_X -55227.1795 -0.114708781 -5.90396357 ENTER BFGS code NEW_X -55227.1795 -0.114708781 -5.90396357 EXIT FROM BFGS code FG_LNSRCH 0. -0.190224037 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 44.9868889 sigma out 15 active outputs RANK 2 NODE 1 --> 25.7477608 sigma out 15 active outputs RANK 3 NODE 3 --> 22.3915424 sigma out 15 active outputs RANK 4 NODE 5 --> 16.7947502 sigma out 15 active outputs RANK 5 NODE 6 --> 16.0335865 sigma out 15 active outputs RANK 6 NODE 7 --> 15.8979053 sigma out 15 active outputs RANK 7 NODE 10 --> 15.4825201 sigma out 15 active outputs RANK 8 NODE 9 --> 14.8355217 sigma out 15 active outputs RANK 9 NODE 8 --> 14.0886078 sigma out 15 active outputs RANK 10 NODE 4 --> 13.399313 sigma out 15 active outputs RANK 11 NODE 11 --> 12.1888494 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 2 --> 29.0678005 sigma in 11act. ( 23.5643578 sig out 1act.) RANK 2 NODE 11 --> 28.2446194 sigma in 11act. ( 24.0503616 sig out 1act.) RANK 3 NODE 4 --> 26.8889256 sigma in 11act. ( 22.2436924 sig out 1act.) RANK 4 NODE 1 --> 24.2214794 sigma in 11act. ( 17.7055149 sig out 1act.) RANK 5 NODE 5 --> 23.2989349 sigma in 11act. ( 20.890276 sig out 1act.) RANK 6 NODE 12 --> 18.3287296 sigma in 11act. ( 18.018959 sig out 1act.) RANK 7 NODE 3 --> 17.276144 sigma in 11act. ( 11.4890242 sig out 1act.) RANK 8 NODE 9 --> 15.019475 sigma in 11act. ( 12.374176 sig out 1act.) RANK 9 NODE 15 --> 14.2305193 sigma in 11act. ( 12.7757196 sig out 1act.) RANK 10 NODE 10 --> 10.6944447 sigma in 11act. ( 9.89082146 sig out 1act.) RANK 11 NODE 7 --> 10.4972 sigma in 11act. ( 7.96586704 sig out 1act.) RANK 12 NODE 8 --> 9.43438053 sigma in 11act. ( 7.64864731 sig out 1act.) RANK 13 NODE 6 --> 6.73780727 sigma in 11act. ( 5.29699564 sig out 1act.) RANK 14 NODE 14 --> 5.92747498 sigma in 11act. ( 4.49025345 sig out 1act.) RANK 15 NODE 13 --> 3.82069516 sigma in 11act. ( 2.03599453 sig out 1act.) sorted by output significance RANK 1 NODE 11 --> 24.0503616 sigma out 1act.( 28.2446194 sig in 11act.) RANK 2 NODE 2 --> 23.5643578 sigma out 1act.( 29.0678005 sig in 11act.) RANK 3 NODE 4 --> 22.2436924 sigma out 1act.( 26.8889256 sig in 11act.) RANK 4 NODE 5 --> 20.890276 sigma out 1act.( 23.2989349 sig in 11act.) RANK 5 NODE 12 --> 18.018959 sigma out 1act.( 18.3287296 sig in 11act.) RANK 6 NODE 1 --> 17.7055149 sigma out 1act.( 24.2214794 sig in 11act.) 
RANK 7 NODE 15 --> 12.7757196 sigma out 1act.( 14.2305193 sig in 11act.) RANK 8 NODE 9 --> 12.374176 sigma out 1act.( 15.019475 sig in 11act.) RANK 9 NODE 3 --> 11.4890242 sigma out 1act.( 17.276144 sig in 11act.) RANK 10 NODE 10 --> 9.89082146 sigma out 1act.( 10.6944447 sig in 11act.) RANK 11 NODE 7 --> 7.96586704 sigma out 1act.( 10.4972 sig in 11act.) RANK 12 NODE 8 --> 7.64864731 sigma out 1act.( 9.43438053 sig in 11act.) RANK 13 NODE 6 --> 5.29699564 sigma out 1act.( 6.73780727 sig in 11act.) RANK 14 NODE 14 --> 4.49025345 sigma out 1act.( 5.92747498 sig in 11act.) RANK 15 NODE 13 --> 2.03599453 sigma out 1act.( 3.82069516 sig in 11act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 58.5110283 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.556897581 *** contribution from regularisation: 0.0028883724 *** contribution from error: -0.559785962 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -55303.2731 -0.190224037 42.0591049 EXIT FROM BFGS code NEW_X -55303.2731 -0.190224037 42.0591049 ENTER BFGS code NEW_X -55303.2731 -0.190224037 42.0591049 EXIT FROM BFGS code FG_LNSRCH 0. -0.28152886 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.556024134 *** contribution from regularisation: 0.0031716472 *** contribution from error: -0.559195757 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55216.5332 -0.28152886 -233.064377 EXIT FROM BFGS code FG_LNSRCH 0. -0.225320861 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.557871461 *** contribution from regularisation: 0.00247642561 *** contribution from error: -0.560347915 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55399.9829 -0.225320861 -57.8316879 EXIT FROM BFGS code NEW_X -55399.9829 -0.225320861 -57.8316879 ENTER BFGS code NEW_X -55399.9829 -0.225320861 -57.8316879 EXIT FROM BFGS code FG_LNSRCH 0. -0.274030805 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.557829261 *** contribution from regularisation: 0.00290275109 *** contribution from error: -0.560732007 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55395.7939 -0.274030805 19.7700882 EXIT FROM BFGS code FG_LNSRCH 0. -0.241079763 0. --------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.55791384 *** contribution from regularisation: 0.0026474956 *** contribution from error: -0.560561359 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55404.1891 -0.241079763 -32.1305618 EXIT FROM BFGS code NEW_X -55404.1891 -0.241079763 -32.1305618 ENTER BFGS code NEW_X -55404.1891 -0.241079763 -32.1305618 EXIT FROM BFGS code FG_LNSRCH 0. -0.254203022 0. 
--------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.558003068 *** contribution from regularisation: 0.00279115071 *** contribution from error: -0.560794234 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55413.0547 -0.254203022 47.509079 EXIT FROM BFGS code NEW_X -55413.0547 -0.254203022 47.509079 ENTER BFGS code NEW_X -55413.0547 -0.254203022 47.509079 EXIT FROM BFGS code FG_LNSRCH 0. -0.243415311 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.558170021 *** contribution from regularisation: 0.00268044043 *** contribution from error: -0.560850441 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55429.6311 -0.243415311 43.8306465 EXIT FROM BFGS code NEW_X -55429.6311 -0.243415311 43.8306465 ENTER BFGS code NEW_X -55429.6311 -0.243415311 43.8306465 EXIT FROM BFGS code FG_LNSRCH 0. -0.211722001 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.558274627 *** contribution from regularisation: 0.00265543023 *** contribution from error: -0.560930073 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55440.022 -0.211722001 36.2304039 EXIT FROM BFGS code NEW_X -55440.022 -0.211722001 36.2304039 ENTER BFGS code NEW_X -55440.022 -0.211722001 36.2304039 EXIT FROM BFGS code FG_LNSRCH 0. -0.104851276 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.558526754 *** contribution from regularisation: 0.00263028475 *** contribution from error: -0.561157048 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55465.0584 -0.104851276 -9.92196941 EXIT FROM BFGS code NEW_X -55465.0584 -0.104851276 -9.92196941 ENTER BFGS code NEW_X -55465.0584 -0.104851276 -9.92196941 EXIT FROM BFGS code FG_LNSRCH 0. 0.0308445841 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.558630824 *** contribution from regularisation: 0.00270879455 *** contribution from error: -0.561339617 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55475.3939 0.0308445841 43.8544579 EXIT FROM BFGS code NEW_X -55475.3939 0.0308445841 43.8544579 ENTER BFGS code NEW_X -55475.3939 0.0308445841 43.8544579 EXIT FROM BFGS code FG_LNSRCH 0. 0.0714779496 0. 
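Editor's note: each "Learn Path" block splits the loss function into an error (entropy) term and a regularisation term, the BFGS code performs line searches between iterations, and every tenth iteration the current network is written to rescue.nb. The sketch below reproduces that loop shape with scipy's BFGS on a weighted cross-entropy plus L2 penalty; it uses a simple logistic model as a short stand-in for the 10-15-1 network, and all file and variable names are placeholders:

```python
import numpy as np
from scipy.optimize import minimize

def train_bfgs(X, y, sample_weight, lam=1e-3):
    """Sketch: entropy loss + regularisation minimised with BFGS.

    X, y, sample_weight are placeholder arrays (preprocessed inputs,
    0/1 targets, TrainWeight); the real Teacher trains an MLP.
    """
    w0 = np.zeros(X.shape[1] + 1)
    state = {"it": 0}

    def loss(w):
        z = X @ w[:-1] + w[-1]
        p = 1.0 / (1.0 + np.exp(-z))
        err = -np.average(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12),
                          weights=sample_weight)     # "contribution from error"
        reg = lam * np.dot(w, w)                     # "contribution from regularisation"
        return err + reg

    def callback(w):
        state["it"] += 1
        print(f"Iteration : {state['it']}  loss function: {loss(w):.6f}")
        if state["it"] % 10 == 0:
            np.save("rescue.npy", w)                 # analogue of rescue.nb

    return minimize(loss, w0, method="BFGS", callback=callback,
                    options={"maxiter": 250})
```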
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 49.0145798 sigma out 15 active outputs RANK 2 NODE 1 --> 31.9924698 sigma out 15 active outputs RANK 3 NODE 3 --> 25.8610992 sigma out 15 active outputs RANK 4 NODE 10 --> 24.2269211 sigma out 15 active outputs RANK 5 NODE 5 --> 23.2379837 sigma out 15 active outputs RANK 6 NODE 8 --> 22.9823761 sigma out 15 active outputs RANK 7 NODE 4 --> 22.1103706 sigma out 15 active outputs RANK 8 NODE 6 --> 19.1343746 sigma out 15 active outputs RANK 9 NODE 9 --> 18.8505764 sigma out 15 active outputs RANK 10 NODE 7 --> 18.7543468 sigma out 15 active outputs RANK 11 NODE 11 --> 11.2011185 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 4 --> 40.2073364 sigma in 11act. ( 35.8772507 sig out 1act.) RANK 2 NODE 2 --> 36.8481064 sigma in 11act. ( 32.6887856 sig out 1act.) RANK 3 NODE 11 --> 36.2453575 sigma in 11act. ( 33.4693108 sig out 1act.) RANK 4 NODE 5 --> 30.209322 sigma in 11act. ( 29.0362892 sig out 1act.) RANK 5 NODE 12 --> 26.5984631 sigma in 11act. ( 28.9914761 sig out 1act.) RANK 6 NODE 1 --> 22.8934269 sigma in 11act. ( 17.5091171 sig out 1act.) RANK 7 NODE 9 --> 20.3841724 sigma in 11act. ( 21.4468193 sig out 1act.) RANK 8 NODE 3 --> 14.8858452 sigma in 11act. ( 10.3858366 sig out 1act.) RANK 9 NODE 15 --> 11.5723476 sigma in 11act. ( 10.9811535 sig out 1act.) RANK 10 NODE 10 --> 10.8627272 sigma in 11act. ( 10.1035929 sig out 1act.) RANK 11 NODE 8 --> 6.73694372 sigma in 11act. ( 5.36660147 sig out 1act.) RANK 12 NODE 7 --> 6.61259842 sigma in 11act. ( 4.73809624 sig out 1act.) RANK 13 NODE 6 --> 3.73246217 sigma in 11act. ( 2.392663 sig out 1act.) RANK 14 NODE 13 --> 2.77050972 sigma in 11act. ( 1.72750759 sig out 1act.) RANK 15 NODE 14 --> 2.31042194 sigma in 11act. ( 0.517845988 sig out 1act.) sorted by output significance RANK 1 NODE 4 --> 35.8772507 sigma out 1act.( 40.2073364 sig in 11act.) RANK 2 NODE 11 --> 33.4693108 sigma out 1act.( 36.2453575 sig in 11act.) RANK 3 NODE 2 --> 32.6887856 sigma out 1act.( 36.8481064 sig in 11act.) RANK 4 NODE 5 --> 29.0362892 sigma out 1act.( 30.209322 sig in 11act.) RANK 5 NODE 12 --> 28.9914761 sigma out 1act.( 26.5984631 sig in 11act.) RANK 6 NODE 9 --> 21.4468193 sigma out 1act.( 20.3841724 sig in 11act.) RANK 7 NODE 1 --> 17.5091171 sigma out 1act.( 22.8934269 sig in 11act.) RANK 8 NODE 15 --> 10.9811535 sigma out 1act.( 11.5723476 sig in 11act.) RANK 9 NODE 3 --> 10.3858366 sigma out 1act.( 14.8858452 sig in 11act.) RANK 10 NODE 10 --> 10.1035929 sigma out 1act.( 10.8627272 sig in 11act.) RANK 11 NODE 8 --> 5.36660147 sigma out 1act.( 6.73694372 sig in 11act.) RANK 12 NODE 7 --> 4.73809624 sigma out 1act.( 6.61259842 sig in 11act.) RANK 13 NODE 6 --> 2.392663 sigma out 1act.( 3.73246217 sig in 11act.) RANK 14 NODE 13 --> 1.72750759 sigma out 1act.( 2.77050972 sig in 11act.) RANK 15 NODE 14 --> 0.517845988 sigma out 1act.( 2.31042194 sig in 11act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 79.4785309 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.558784366 *** contribution from regularisation: 0.00284619327 *** contribution from error: -0.561630547 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -55490.6414 0.0714779496 -68.1164017 EXIT FROM BFGS code NEW_X -55490.6414 0.0714779496 -68.1164017 ENTER BFGS code NEW_X -55490.6414 0.0714779496 -68.1164017 EXIT FROM BFGS code FG_LNSRCH 0. 0.0445573032 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.558920324 *** contribution from regularisation: 0.00281800027 *** contribution from error: -0.561738312 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55504.1427 0.0445573032 -18.5833149 EXIT FROM BFGS code NEW_X -55504.1427 0.0445573032 -18.5833149 ENTER BFGS code NEW_X -55504.1427 0.0445573032 -18.5833149 EXIT FROM BFGS code FG_LNSRCH 0. 0.0453485213 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.559076071 *** contribution from regularisation: 0.00281504332 *** contribution from error: -0.561891139 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55519.6081 0.0453485213 20.7110138 EXIT FROM BFGS code NEW_X -55519.6081 0.0453485213 20.7110138 ENTER BFGS code NEW_X -55519.6081 0.0453485213 20.7110138 EXIT FROM BFGS code FG_LNSRCH 0. 0.0872588679 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.559518933 *** contribution from regularisation: 0.0028969869 *** contribution from error: -0.562415898 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55563.5893 0.0872588679 126.213524 EXIT FROM BFGS code NEW_X -55563.5893 0.0872588679 126.213524 ENTER BFGS code NEW_X -55563.5893 0.0872588679 126.213524 EXIT FROM BFGS code FG_LNSRCH 0. 0.165018037 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.559940279 *** contribution from regularisation: 0.00316249579 *** contribution from error: -0.563102782 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55605.4322 0.165018037 142.110474 EXIT FROM BFGS code NEW_X -55605.4322 0.165018037 142.110474 ENTER BFGS code NEW_X -55605.4322 0.165018037 142.110474 EXIT FROM BFGS code FG_LNSRCH 0. 0.188897416 0. --------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.559899628 *** contribution from regularisation: 0.00316327205 *** contribution from error: -0.563062906 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55601.3903 0.188897416 -42.3923798 EXIT FROM BFGS code FG_LNSRCH 0. 0.177090004 0. 
---------------------------------------------------
Iteration : 26
***********************************************
*** Learn Path 26
*** loss function: -0.560470462
*** contribution from regularisation: 0.00291326619
*** contribution from error: -0.563383698
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -55658.0813 0.177090004 55.2471008
EXIT FROM BFGS code NEW_X -55658.0813 0.177090004 55.2471008
ENTER BFGS code NEW_X -55658.0813 0.177090004 55.2471008
EXIT FROM BFGS code FG_LNSRCH 0. 0.17538619 0.
---------------------------------------------------
Iteration : 27
***********************************************
*** Learn Path 27
*** loss function: -0.560455978
*** contribution from regularisation: 0.0029462846
*** contribution from error: -0.563402236
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -55656.6388 0.17538619 18.8773136
EXIT FROM BFGS code FG_LNSRCH 0. 0.176716015 0.
---------------------------------------------------
Iteration : 28
***********************************************
*** Learn Path 28
*** loss function: -0.560353577
*** contribution from regularisation: 0.00303958124
*** contribution from error: -0.563393176
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -55646.4698 0.176716015 47.1197472
EXIT FROM BFGS code FG_LNSRCH 0. 0.177078411 0.
---------------------------------------------------
Iteration : 29
***********************************************
*** Learn Path 29
*** loss function: -0.560381353
*** contribution from regularisation: 0.0030026543
*** contribution from error: -0.563383996
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -55649.2315 0.177078411 54.9431763
EXIT FROM BFGS code FG_LNSRCH 0. 0.177089989 0.
---------------------------------------------------
Iteration : 30
SIGNIFICANCE OF OUTPUTS IN LAYER 1
 RANK 1 NODE 2 --> 63.4559174 sigma out 15 active outputs
 RANK 2 NODE 1 --> 48.7127876 sigma out 15 active outputs
 RANK 3 NODE 8 --> 41.3079758 sigma out 15 active outputs
 RANK 4 NODE 10 --> 36.8557472 sigma out 15 active outputs
 RANK 5 NODE 5 --> 34.3613472 sigma out 15 active outputs
 RANK 6 NODE 3 --> 33.8481178 sigma out 15 active outputs
 RANK 7 NODE 4 --> 33.0047951 sigma out 15 active outputs
 RANK 8 NODE 9 --> 32.800972 sigma out 15 active outputs
 RANK 9 NODE 7 --> 32.4310989 sigma out 15 active outputs
 RANK 10 NODE 6 --> 24.6185913 sigma out 15 active outputs
 RANK 11 NODE 11 --> 14.8757267 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
 RANK 1 NODE 4 --> 69.2920609 sigma in 11act. ( 66.8461609 sig out 1act.)
 RANK 2 NODE 11 --> 53.4486961 sigma in 11act. ( 54.7006645 sig out 1act.)
 RANK 3 NODE 2 --> 52.1504326 sigma in 11act. ( 47.7759972 sig out 1act.)
 RANK 4 NODE 12 --> 41.3212013 sigma in 11act. ( 46.264595 sig out 1act.)
 RANK 5 NODE 5 --> 40.3277054 sigma in 11act. ( 38.1461372 sig out 1act.)
 RANK 6 NODE 9 --> 30.1473179 sigma in 11act. ( 37.6660194 sig out 1act.)
 RANK 7 NODE 1 --> 26.1879883 sigma in 11act. ( 22.3573933 sig out 1act.)
 RANK 8 NODE 3 --> 16.2985497 sigma in 11act. ( 13.3424158 sig out 1act.)
 RANK 9 NODE 10 --> 11.8021269 sigma in 11act. ( 11.0137463 sig out 1act.)
 RANK 10 NODE 15 --> 8.03030205 sigma in 11act. ( 7.84819269 sig out 1act.)
 RANK 11 NODE 8 --> 5.96670008 sigma in 11act. ( 5.20548391 sig out 1act.)
 RANK 12 NODE 7 --> 3.37874985 sigma in 11act. ( 2.35603189 sig out 1act.)
 RANK 13 NODE 6 --> 2.74574566 sigma in 11act. ( 1.84617496 sig out 1act.)
 RANK 14 NODE 13 --> 1.71761584 sigma in 11act. ( 1.81857502 sig out 1act.)
 RANK 15 NODE 14 --> 1.3630178 sigma in 11act. ( 0.911514282 sig out 1act.)
sorted by output significance
 RANK 1 NODE 4 --> 66.8461609 sigma out 1act.( 69.2920609 sig in 11act.)
 RANK 2 NODE 11 --> 54.7006645 sigma out 1act.( 53.4486961 sig in 11act.)
 RANK 3 NODE 2 --> 47.7759972 sigma out 1act.( 52.1504326 sig in 11act.)
 RANK 4 NODE 12 --> 46.264595 sigma out 1act.( 41.3212013 sig in 11act.)
 RANK 5 NODE 5 --> 38.1461372 sigma out 1act.( 40.3277054 sig in 11act.)
 RANK 6 NODE 9 --> 37.6660194 sigma out 1act.( 30.1473179 sig in 11act.)
 RANK 7 NODE 1 --> 22.3573933 sigma out 1act.( 26.1879883 sig in 11act.)
 RANK 8 NODE 3 --> 13.3424158 sigma out 1act.( 16.2985497 sig in 11act.)
 RANK 9 NODE 10 --> 11.0137463 sigma out 1act.( 11.8021269 sig in 11act.)
 RANK 10 NODE 15 --> 7.84819269 sigma out 1act.( 8.03030205 sig in 11act.)
 RANK 11 NODE 8 --> 5.20548391 sigma out 1act.( 5.96670008 sig in 11act.)
 RANK 12 NODE 7 --> 2.35603189 sigma out 1act.( 3.37874985 sig in 11act.)
 RANK 13 NODE 6 --> 1.84617496 sigma out 1act.( 2.74574566 sig in 11act.)
 RANK 14 NODE 13 --> 1.81857502 sigma out 1act.( 1.71761584 sig in 11act.)
 RANK 15 NODE 14 --> 0.911514282 sigma out 1act.( 1.3630178 sig in 11act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
 RANK 1 NODE 1 --> 125.133339 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.560379088
*** contribution from regularisation: 0.00300467387
*** contribution from error: -0.563383758
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
***** write out current network          ****
***** to "rescue.nb"                     ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -55649.004 0.177089989 55.1651535
EXIT FROM BFGS code FG_LNSRCH 0. 0.177090004 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.560373008
*** contribution from regularisation: 0.00301071419
*** contribution from error: -0.563383698
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -55648.4041 0.177090004 55.1433411
EXIT FROM BFGS code FG_LNSRCH 0. 0.177090004 0.
---------------------------------------------------
Iteration : 32
***********************************************
*** Learn Path 32
*** loss function: -0.560368121
*** contribution from regularisation: 0.00301562296
*** contribution from error: -0.563383758
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -55647.9167 0.177090004 55.1276398
EXIT FROM BFGS code FG_LNSRCH 0. 0.177090004 0.
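Note: from Iteration 30 onwards the test-sample loss changes only at the 1e-5 level and the line search keeps returning the same step length (0.177090004); immediately below, the BFGS code reports CONVERGENC at Iteration 33 and the teacher skips straight to the closing printout labelled Iteration : 250 instead of running all 250 configured passes. A minimal sketch of pulling the printed loss history out of a captured log and applying a crude flatness check follows; the file name "teacher.log", the tolerance and the window are assumptions, not the convergence criterion the BFGS code applies internally.

import re

# Sketch under stated assumptions: the teacher output shown here has been
# captured to a plain-text file, hypothetically named "teacher.log".
LOSS_RE = re.compile(r"\*\*\* Learn Path (\d+).*?\*\*\* loss function: (-?[\d.]+)", re.S)

def loss_history(log_path="teacher.log"):
    """Collect (learn path number, printed loss) pairs in order of appearance."""
    with open(log_path) as f:
        return [(int(n), float(v)) for n, v in LOSS_RE.findall(f.read())]

def looks_flat(history, rel_tol=1e-4, window=3):
    """True if the loss changed by less than rel_tol (relative) over the last
    `window` consecutive entries -- a crude stand-in for a convergence test."""
    vals = [v for _, v in history]
    if len(vals) <= window:
        return False
    tail = vals[-(window + 1):]
    return all(abs(b - a) <= rel_tol * abs(a) for a, b in zip(tail, tail[1:]))

Applied to this log, the last few printed losses differ only at the 1e-5 relative level, so looks_flat(loss_history()) returns True.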
---------------------------------------------------
Iteration : 33
***********************************************
*** Learn Path 33
*** loss function: -0.560378969
*** contribution from regularisation: 0.00300476886
*** contribution from error: -0.563383758
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -55648.9945 0.177090004 55.1184654
EXIT FROM BFGS code NEW_X -55648.9945 0.177090004 55.1184654
ENTER BFGS code NEW_X -55648.9945 0.177090004 55.1184654
EXIT FROM BFGS code CONVERGENC -55648.9945 0.177090004 55.1184654
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
 RANK 1 NODE 2 --> 91.4402542 sigma out 15 active outputs
 RANK 2 NODE 1 --> 71.587265 sigma out 15 active outputs
 RANK 3 NODE 8 --> 59.5543976 sigma out 15 active outputs
 RANK 4 NODE 10 --> 52.9093437 sigma out 15 active outputs
 RANK 5 NODE 4 --> 49.4860382 sigma out 15 active outputs
 RANK 6 NODE 3 --> 49.0744553 sigma out 15 active outputs
 RANK 7 NODE 5 --> 48.7392769 sigma out 15 active outputs
 RANK 8 NODE 7 --> 48.6782684 sigma out 15 active outputs
 RANK 9 NODE 9 --> 47.692646 sigma out 15 active outputs
 RANK 10 NODE 6 --> 36.1352654 sigma out 15 active outputs
 RANK 11 NODE 11 --> 21.6756401 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
 RANK 1 NODE 4 --> 100.938919 sigma in 11act. ( 100.254539 sig out 1act.)
 RANK 2 NODE 11 --> 78.6472397 sigma in 11act. ( 83.2934036 sig out 1act.)
 RANK 3 NODE 2 --> 74.3546753 sigma in 11act. ( 72.6416626 sig out 1act.)
 RANK 4 NODE 5 --> 59.6402168 sigma in 11act. ( 56.6331062 sig out 1act.)
 RANK 5 NODE 12 --> 59.6249008 sigma in 11act. ( 70.1892853 sig out 1act.)
 RANK 6 NODE 9 --> 43.8912392 sigma in 11act. ( 56.6849594 sig out 1act.)
 RANK 7 NODE 1 --> 38.5550423 sigma in 11act. ( 33.173584 sig out 1act.)
 RANK 8 NODE 3 --> 23.4924088 sigma in 11act. ( 20.4452782 sig out 1act.)
 RANK 9 NODE 10 --> 16.5500584 sigma in 11act. ( 16.8041229 sig out 1act.)
 RANK 10 NODE 15 --> 11.2247534 sigma in 11act. ( 11.6054173 sig out 1act.)
 RANK 11 NODE 8 --> 8.19078064 sigma in 11act. ( 8.08139896 sig out 1act.)
 RANK 12 NODE 7 --> 4.28581905 sigma in 11act. ( 3.47509074 sig out 1act.)
 RANK 13 NODE 6 --> 3.30495524 sigma in 11act. ( 2.747679 sig out 1act.)
 RANK 14 NODE 13 --> 2.42828703 sigma in 11act. ( 2.62382507 sig out 1act.)
 RANK 15 NODE 14 --> 1.69226098 sigma in 11act. ( 1.34175026 sig out 1act.)
sorted by output significance
 RANK 1 NODE 4 --> 100.254539 sigma out 1act.( 100.938919 sig in 11act.)
 RANK 2 NODE 11 --> 83.2934036 sigma out 1act.( 78.6472397 sig in 11act.)
 RANK 3 NODE 2 --> 72.6416626 sigma out 1act.( 74.3546753 sig in 11act.)
 RANK 4 NODE 12 --> 70.1892853 sigma out 1act.( 59.6249008 sig in 11act.)
 RANK 5 NODE 9 --> 56.6849594 sigma out 1act.( 43.8912392 sig in 11act.)
 RANK 6 NODE 5 --> 56.6331062 sigma out 1act.( 59.6402168 sig in 11act.)
 RANK 7 NODE 1 --> 33.173584 sigma out 1act.( 38.5550423 sig in 11act.)
 RANK 8 NODE 3 --> 20.4452782 sigma out 1act.( 23.4924088 sig in 11act.)
 RANK 9 NODE 10 --> 16.8041229 sigma out 1act.( 16.5500584 sig in 11act.)
 RANK 10 NODE 15 --> 11.6054173 sigma out 1act.( 11.2247534 sig in 11act.)
 RANK 11 NODE 8 --> 8.08139896 sigma out 1act.( 8.19078064 sig in 11act.)
 RANK 12 NODE 7 --> 3.47509074 sigma out 1act.( 4.28581905 sig in 11act.)
 RANK 13 NODE 6 --> 2.747679 sigma out 1act.( 3.30495524 sig in 11act.)
 RANK 14 NODE 13 --> 2.62382507 sigma out 1act.( 2.42828703 sig in 11act.)
 RANK 15 NODE 14 --> 1.34175026 sigma out 1act.( 1.69226098 sig in 11act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
 RANK 1 NODE 1 --> 188.790314 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.560384333
*** contribution from regularisation: 0.00299939211
*** contribution from error: -0.563383698
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 29164
Closing output file
done
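Note: the tables printed after the final iteration rank the network nodes by the significance (in sigma) that the teacher assigns to them, and the trained network itself is exported to expert.nb, the "expertise" file that the NeuroBayes expert stage reads back when the network is applied to new events. The sketch below shows one way to pull that final ranking out of a captured log; the file name is an assumption and the sigma values are taken exactly as printed, without re-deriving their statistical definition.

import re

RANK_RE = re.compile(r"RANK\s+(\d+)\s+NODE\s+(\d+)\s+-->\s+(-?[\d.]+)\s+sigma")

def final_node_ranking(log_path="teacher.log"):
    """Return (rank, node, sigma) triples from the tables printed after the
    final 'Iteration : 250' marker.  That block contains several tables
    (layer-1 outputs, layer-2 inputs sorted two ways, layer-3 input); the
    triples are returned in printing order."""
    with open(log_path) as f:
        text = f.read()
    tail = text[text.rfind("Iteration : 250"):]
    return [(int(r), int(n), float(s)) for r, n, s in RANK_RE.findall(tail)]

For this log the first triple returned is (1, 2, 91.4402542), the top entry of the LAYER 1 table.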