NNInput NNInputs_160.root

Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001

**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N

Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree

NNAna::CopyTree: entries= 43669 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1  BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 13225  nbkg = 30444
Bkg Entries: 30444   Sig Entries: 13225   Chosen entries: 13225
Signal fraction: 1   Background fraction: 0.434404
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 30444
Actual Signal Entries: 13225
Entries to split: 13225   Test with : 6612   Train with : 6612

*********************************************
*  This product is licenced for educational *
*  and scientific use only. Commercial use  *
*  is prohibited !                          *
*********************************************

Your number of nodes in the input layer is:  14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
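The tree-copy step above applies the steering cut, splits the sample by the Target branch, and then divides the 13225 chosen entries in half for testing and training. The numpy sketch below reproduces that bookkeeping with toy arrays standing in for the branches of NNInputs_160.root; only the branch names, the cut string and the reported counts come from the log, everything else is illustrative.

import numpy as np

# Toy stand-ins for the branches read from NNInputs_160.root (names from the log,
# values invented here); in the real job these come from the ROOT tree itself.
rng = np.random.default_rng(0)
n = 43669                                    # "NNAna::CopyTree: entries= 43669"
lep1_E = rng.exponential(80.0, n)
lep2_E = rng.exponential(40.0, n)
Target = (rng.random(n) < 0.30).astype(int)  # 1 = signal, 0 = background

cut = (lep1_E < 400) & (lep2_E < 400)        # "Constraint : lep1_E<400&&lep2_E<400&&"
sig = np.flatnonzero(cut & (Target == 1))    # SigChoice: ...&&Target==1
bkg = np.flatnonzero(cut & (Target == 0))    # BkgChoice: ...&&Target==0
print("nsig =", sig.size, " nbkg =", bkg.size)

# "Entries to split: 13225  Test with : 6612  Train with : 6612":
# the chosen entries are split in half (integer division drops the odd event).
half = sig.size // 2
test_idx, train_idx = sig[:half], sig[half:2 * half]
print("Entries to split:", sig.size, " Test with:", test_idx.size, " Train with:", train_idx.size)
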
NBTreeReportInterval = 10000
NBTreePrepEvPrint = 1

Start Set Individual Variable Preprocessing
  Not touching individual preprocessing for Ht ( 0 ) in Neurobayes
  Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes
  Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes
  Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes
  Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes
  Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes
  Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes
  Not touching individual preprocessing for addEt ( 7 ) in Neurobayes
  Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes
  Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes
  Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes
  Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes
  Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes
End Set Individual Variable Preprocessing

Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes

NNAna::PrepareNBTraining_: Nent= 13225 for Signal
Prepared event 0 for Signal with 13225 events
====Entry 0
  Variable Ht : 120.598
  Variable LepAPt : 29.9056
  Variable LepBPt : 13.5082
  Variable MetSigLeptonsJets : 11.7141
  Variable MetSpec : 77.1834
  Variable SumEtLeptonsJets : 43.4138
  Variable VSumJetLeptonsPt : 43.304
  Variable addEt : 120.598
  Variable dPhiLepSumMet : 2.82377
  Variable dPhiLeptons : 0.153702
  Variable dRLeptons : 0.732968
  Variable lep1_E : 40.451
  Variable lep2_E : 32.7603
===Show Start
======> EVENT:0
 DEtaJ1J2 = 0  DEtaJ1Lep1 = 0  DEtaJ1Lep2 = 0  DEtaJ2Lep1 = 0  DEtaJ2Lep2 = 0
 DPhiJ1J2 = 0  DPhiJ1Lep1 = 0  DPhiJ1Lep2 = 0  DPhiJ2Lep1 = 0  DPhiJ2Lep2 = 0
 DRJ1J2 = 0  DRJ1Lep1 = 0  DRJ1Lep2 = 0  DRJ2Lep1 = 0  DRJ2Lep2 = 0
 DeltaRJet12 = 0  File = 2160  Ht = 120.598  IsMEBase = 0  LRHWW = 0
 LRWW = 0  LRWg = 0  LRWj = 0  LRZZ = 0  LepAEt = 29.9058
 LepAPt = 29.9056  LepBEt = 13.5084  LepBPt = 13.5082  LessCentralJetEta = 0  MJ1Lep1 = 0
 MJ1Lep2 = 0  MJ2Lep1 = 0  MJ2Lep2 = 0  NN = 0  Met = 77.1834
 MetDelPhi = 2.77599  MetSig = 6.78472  MetSigLeptonsJets = 11.7141  MetSpec = 77.1834  Mjj = 0
 MostCentralJetEta = 0  MtllMet = 139.14  Njets = 0  SB = 0  SumEt = 129.415
 SumEtJets = 0  SumEtLeptonsJets = 43.4138  Target = 1  TrainWeight = 1  VSum2JetLeptonsPt = 0
 VSum2JetPt = 0  VSumJetLeptonsPt = 43.304  addEt = 120.598  dPhiLepSumMet = 2.82377  dPhiLeptons = 0.153702
 dRLeptons = 0.732968  diltype = 34  dimass = 15.0375  event = 2600  jet1_Et = 0
 jet1_eta = 0  jet2_Et = 0  jet2_eta = 0  lep1_E = 40.451  lep2_E = 32.7603
 rand = 0.999742  run = 233650  weight = 2.30027e-06
===Show End
Prepared event 10000 for Signal with 13225 events

Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes

NNAna::PrepareNBTraining_: Nent= 30444 for Background
Prepared event 0 for Background with 30444 events
====Entry 0
  Variable Ht : 85.3408
  Variable LepAPt : 31.1294
  Variable LepBPt : 10.0664
  Variable MetSigLeptonsJets : 6.87768
  Variable MetSpec : 44.1437
  Variable SumEtLeptonsJets : 41.1959
  Variable VSumJetLeptonsPt : 41.0132
  Variable addEt : 85.3408
  Variable dPhiLepSumMet : 3.10536
  Variable dPhiLeptons : 0.219342
  Variable dRLeptons : 0.424454
  Variable lep1_E : 32.3548
  Variable lep2_E : 10.1027
===Show Start
======> EVENT:0
 DEtaJ1J2 = 0  DEtaJ1Lep1 = 0  DEtaJ1Lep2 = 0  DEtaJ2Lep1 = 0  DEtaJ2Lep2 = 0
 DPhiJ1J2 = 0  DPhiJ1Lep1 = 0  DPhiJ1Lep2 = 0  DPhiJ2Lep1 = 0  DPhiJ2Lep2 = 0
 DRJ1J2 = 0  DRJ1Lep1 = 0  DRJ1Lep2 = 0  DRJ2Lep1 = 0  DRJ2Lep2 = 0
 DeltaRJet12 = 0  File = 1  Ht = 85.3408  IsMEBase = 0  LRHWW = 0
 LRWW = 0  LRWg = 0  LRWj = 0  LRZZ = 0  LepAEt = 31.1297
 LepAPt = 31.1294  LepBEt = 10.0674  LepBPt = 10.0664  LessCentralJetEta = 0  MJ1Lep1 = 0
 MJ1Lep2 = 0  MJ2Lep1 = 0  MJ2Lep2 = 0  NN = 0  Met = 44.1437
 MetDelPhi = 3.01191  MetSig = 3.74558  MetSigLeptonsJets = 6.87768  MetSpec = 44.1437  Mjj = 0
 MostCentralJetEta = 0  MtllMet = 86.2332  Njets = 0  SB = 0  SumEt = 138.899
 SumEtJets = 0  SumEtLeptonsJets = 41.1959  Target = 0  TrainWeight = 1.63538  VSum2JetLeptonsPt = 0
 VSum2JetPt = 0  VSumJetLeptonsPt = 41.0132  addEt = 85.3408  dPhiLepSumMet = 3.10536  dPhiLeptons = 0.219342
 dRLeptons = 0.424454  diltype = 17  dimass = 7.54723  event = 6717520  jet1_Et = 0
 jet1_eta = 0  jet2_Et = 0  jet2_eta = 0  lep1_E = 32.3548  lep2_E = 10.1027
 rand = 0.999742  run = 271566  weight = 0.00296168
===Show End
Prepared event 10000 for Background with 30444 events
Prepared event 20000 for Background with 30444 events
Prepared event 30000 for Background with 30444 events
Warning: found 994 negative weights.

[Phi-T NeuroBayes(R) Teacher ASCII banner]
  Phi-T(R) NeuroBayes(R) Teacher
  Algorithms by Michael Feindt
  Implementation by Phi-T
  Project 2001-2003
  Copyright Phi-T GmbH
  Version 20080312
Library compiled with: NB_MAXPATTERN= 1500000  NB_MAXNODE = 100

-----------------------------------
found 43669 samples to learn from

preprocessing flags/parameters:
global preprocessing flag: 812
individual preprocessing:

now perform preprocessing
*** called with option 12 ***
*** This will do for you:
*** input variable equalisation
*** to Gaussian distribution with mean=0 and sigma=1
*** Then variables are decorrelated
************************************

Warning: found 994 negative weights.
Signal fraction: 61.6932335 %

------------------------------
Transdef: Tab for variable 1
 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1.
 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1.
  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
  1.  1.
------------------------------ Transdef: Tab for variable 2 55.9230614 62.2148323 64.5382919 66.4875107 68.0593872 69.8274536 71.2380676 72.9458542 74.4829102 76.3665619 78.0447769 79.7054138 81.5658264 83.6172943 85.26474 86.6390381 88.0919037 90.0427856 91.4541245 93.0792847 94.5868835 96.020752 97.5218964 98.9010773 100.368317 101.847565 103.433311 104.726669 106.117996 107.5457 108.669106 110.149406 111.450157 112.683228 114.212387 115.499733 116.922821 118.257248 119.753654 121.003555 122.308571 123.612289 124.896355 126.04425 127.43956 128.686554 129.853149 131.168716 132.431946 133.760803 134.859482 135.982025 137.305176 138.552979 139.718903 140.805145 141.848816 142.784607 143.821609 144.84021 146.010529 147.257416 148.246735 149.119553 150.364349 151.5672 152.697983 153.820892 154.978851 156.041595 157.288971 158.704437 160.065674 161.485992 163.287598 165.135742 166.789322 169.052765 171.135681 173.542175 175.774933 177.884369 180.207458 182.709381 185.439713 188.267456 190.90097 194.082764 198.200302 201.835846 205.705338 211.02359 216.28772 222.249344 228.735352 237.869385 248.077347 261.966797 278.254089 311.371857 586.383667 ------------------------------ Transdef: Tab for variable 3 20.0021973 20.4868317 20.9162426 21.3730392 21.7494221 22.1704407 22.5784721 22.9251595 23.2577362 23.6255283 24.0223885 24.3600712 24.7316895 25.0776997 25.3959579 25.6709785 26.0409355 26.342186 26.6735802 26.9915733 27.2765579 27.6301842 27.932127 28.2185478 28.4980049 28.808363 29.064043 29.3355541 29.5720825 29.8311806 30.1014919 30.3734512 30.6691608 30.8692627 31.1596985 31.4167175 31.6897736 31.9479008 32.1834335 32.4465828 32.7475395 33.028862 33.3136787 33.5828552 33.8332748 34.1251755 34.3448944 34.6306496 34.8880386 35.1434479 35.3882904 35.6416626 35.907341 36.1857262 36.4002876 36.6310425 36.8849945 37.1949005 37.4274597 37.6409187 37.9114113 38.1849289 38.4819412 38.7476807 39.0476074 39.2900772 39.5811653 39.7744522 40.0672302 40.3302536 40.6460075 40.9452248 41.2608337 41.5716858 41.9175453 42.2032471 42.5023575 42.8271713 43.1884232 43.5266495 43.8386841 44.2496414 44.6927872 45.1861038 45.6897888 46.2292709 46.8217773 47.3627853 48.0302124 48.6680679 49.4798737 50.3052483 51.3046684 52.5149765 54.0641403 55.6217308 57.3515854 60.0602226 64.5304108 72.2717438 169.996979 ------------------------------ Transdef: Tab for variable 4 10.0010262 10.2687988 10.4781609 10.674304 10.8990059 11.1360264 11.35777 11.6038361 11.8802233 12.1347113 12.3800907 12.6470871 12.9193611 13.1783247 13.4137554 13.6696815 13.891633 14.2332058 14.498085 14.7798767 15.073925 15.32092 15.5936165 15.8823757 16.1984634 16.5247269 16.8372326 17.0958385 17.3968525 17.6896515 17.9848938 18.3269081 18.6049728 18.9145546 19.2873459 19.5458241 19.8550186 20.1037025 20.3573799 20.5610924 20.7708817 21.0134411 21.2452545 21.480545 21.7273788 21.9749031 22.2177143 22.4879818 22.6861954 22.9501991 23.1926594 23.4016399 23.6543427 23.907114 24.1698799 24.4456177 24.7324352 25.0185356 25.3055611 25.5853138 25.8799114 26.1643505 26.4248924 26.7018299 26.9899979 27.3323975 27.6294823 27.8901844 28.1484604 28.4376831 28.7623577 29.06147 29.3724976 29.7304478 30.0902519 30.3672028 30.6631413 31.0002747 31.3931541 31.7550278 32.108429 32.4887695 32.8459206 33.1724663 33.5503464 33.958725 34.3674774 34.7934914 35.2183533 35.6208954 36.1970978 36.7322311 37.3315773 37.9158554 38.6258659 39.5020523 40.5329132 42.0686798 44.0956497 48.040741 71.4863358 ------------------------------ Transdef: Tab for variable 5 1.55223811 
2.62429953 2.96536541 3.129251 3.35676956 3.50944519 3.66020513 3.81489706 3.92408276 4.02268457 4.11195326 4.22023678 4.3147459 4.40145493 4.47581959 4.54689407 4.62834835 4.70836544 4.79436588 4.89065313 4.98661804 5.05885696 5.14883804 5.22862387 5.30923796 5.37338734 5.45473242 5.53953075 5.6016798 5.68039036 5.75675392 5.84823084 5.91137028 5.98206377 6.06414986 6.12434292 6.20004654 6.26338959 6.33652496 6.39719057 6.46688652 6.52578068 6.57075119 6.63067532 6.67415428 6.72830248 6.7840538 6.83435059 6.89908791 6.93610239 6.9922123 7.05025673 7.10524559 7.1623621 7.21937609 7.27123928 7.31818104 7.36844158 7.40902233 7.45343494 7.50137758 7.54958725 7.60214424 7.64389229 7.69088268 7.73809099 7.78741026 7.83214998 7.88395214 7.93331099 7.98609447 8.03527832 8.09545135 8.14453506 8.19104576 8.23370171 8.27957535 8.33869362 8.39776993 8.4479351 8.50304317 8.55408478 8.60985374 8.67256355 8.72881508 8.79085255 8.86746216 8.94366646 9.0226059 9.10213089 9.19783306 9.30733871 9.39823341 9.53914547 9.67848396 9.85617352 10.1031857 10.3659039 10.7661638 11.3987083 16.6717033 ------------------------------ Transdef: Tab for variable 6 25.0019188 25.3386993 25.8060837 26.1638947 26.5781937 27.0536728 27.4545231 27.9244347 28.4197311 29.0896797 29.6264038 30.3652439 31.2015667 31.8229942 32.4770355 33.1853714 33.9750061 34.8071365 35.74823 36.5044479 37.3097687 38.1570969 38.8776588 39.6505394 40.5219269 41.2341957 41.8684616 42.6828766 43.3667526 44.002739 44.6208572 45.3740425 46.1001816 46.7170029 47.3427048 47.9684753 48.6104736 49.1495247 49.7082977 50.3179359 50.9796143 51.5557671 52.3263092 52.9519348 53.6279831 54.1825066 54.8299179 55.3616943 55.9844818 56.6736679 57.3120728 57.9480057 58.5333786 59.1742897 59.7588577 60.4871407 61.0035782 61.5864563 62.2238541 62.8125572 63.4458847 64.1065903 64.788681 65.3697662 66.0079346 66.5717316 67.1846161 67.8390045 68.4219589 69.0666275 69.6360626 70.2009735 70.7673569 71.3527603 72.0035095 72.530365 73.1422348 73.7478333 74.4237366 75.0612946 75.7705231 76.5386658 77.2516174 77.8620148 78.7620544 79.5932465 80.5651093 81.59729 82.7407074 84.0769348 85.7007904 87.4637833 89.0895691 90.9573669 93.5920868 96.5073547 99.8851013 104.673698 112.362144 124.754547 218.598831 ------------------------------ Transdef: Tab for variable 7 30.1229668 33.5295944 35.2657394 36.2638702 37.2274094 37.9440842 38.9346237 39.8154678 40.4690132 41.2396393 42.0286636 42.6954994 43.5996132 44.3979225 45.0920486 45.8189087 46.5984268 47.4015045 48.2412376 48.9342194 49.7141876 50.5698929 51.3027649 52.1380997 52.8191032 53.5595093 54.2282486 55.0301514 55.7145081 56.4787216 57.3239517 58.1624184 58.9129677 59.6158218 60.3772812 61.203064 61.8822784 62.5684242 63.3419762 64.0767517 64.7726898 65.3644409 65.9773102 66.66745 67.3476105 68.0009079 68.7318573 69.4635162 70.1194687 70.7894592 71.3572617 71.9180603 72.6003036 73.3070526 73.9564819 74.6066513 75.2178345 76.0067215 76.642807 77.3494492 77.9984131 78.784935 79.50811 80.3965149 81.3808136 82.2067871 83.3901672 84.6485519 86.0307922 87.3984451 88.7045746 90.1457062 91.6048889 93.1281738 94.5527954 95.7705078 97.1724243 99.0374603 100.646622 102.048386 103.608292 105.444481 107.510284 109.865723 112.154556 114.603043 117.260941 119.829353 123.023758 126.376419 130.541229 133.951691 137.996124 143.888275 150.356445 157.737915 165.584564 175.130722 191.633057 216.184845 484.780029 ------------------------------ Transdef: Tab for variable 8 1.768857 26.2223816 30.0403042 31.7459164 33.019352 34.1123009 34.9543762 
35.7739029 36.4219589 37.0759201 37.6713867 38.2842216 38.9970856 39.5252991 40.1347618 40.6398849 41.1188812 41.6764832 42.2259216 42.8126526 43.4252777 43.929184 44.4806671 45.066288 45.6195145 46.2214699 46.7570953 47.3145218 47.8367996 48.3576813 48.9011002 49.4512215 50.1349144 50.6845856 51.2330399 51.7725983 52.3036575 52.8021927 53.2674332 53.8132744 54.3827438 54.9249039 55.4505005 55.9677658 56.4964294 57.0086441 57.5231857 58.1903305 58.7501678 59.3229828 59.8784828 60.4904785 60.9736557 61.4535217 61.9316788 62.4347382 62.9409294 63.4873238 64.0777283 64.6050491 65.0551453 65.5672684 66.0487366 66.5626755 67.0713501 67.6246796 68.182663 68.7427063 69.2812042 69.8656158 70.4821472 70.995224 71.4853363 72.0419312 72.6235504 73.2432632 73.8527374 74.4469299 75.0199966 75.6186142 76.3321075 76.9604111 77.7584381 78.3574677 79.1735153 80.0510101 81.0380859 81.8525696 83.2243042 84.5192108 86.2313232 87.8755646 89.9121857 92.1630707 94.5774689 97.4018936 101.212219 105.635223 113.354675 126.012085 227.683762 ------------------------------ Transdef: Tab for variable 9 55.9230614 61.3168221 63.5397644 65.4012146 66.846344 68.0682526 69.4369812 70.6529999 71.9649048 73.0145493 74.4760132 75.8506317 77.2263489 78.7345734 80.1281891 81.5498199 83.0990372 84.2897339 85.5020142 86.6390381 87.7789764 89.0783539 90.3713684 91.6236115 93.0156479 94.1513367 95.1675644 96.2561417 97.5805969 98.6787262 99.878334 101.157562 102.255089 103.444824 104.491692 105.747635 106.697289 107.737366 108.725296 109.936356 110.963348 111.840446 112.919037 114.183121 115.328171 116.276047 117.245239 118.289551 119.457748 120.368118 121.404045 122.382065 123.515808 124.538559 125.557114 126.689247 127.68129 128.667542 129.592194 130.593506 131.614532 132.531799 133.577301 134.420288 135.464279 136.412933 137.391876 138.417755 139.477264 140.318451 141.263199 142.017548 142.900177 143.711578 144.489075 145.354553 146.282333 147.221817 148.013428 148.899872 149.733643 150.693253 151.582001 152.470123 153.233978 154.161316 155.040421 155.915741 156.96756 158.150665 159.41571 160.833496 162.357666 163.935837 165.890411 168.394089 171.695984 177.422028 185.21994 199.885132 379.21344 ------------------------------ Transdef: Tab for variable 10 0.0517990664 1.16028786 1.40258694 1.58227348 1.71752691 1.81743968 1.92717171 2.01042366 2.08410835 2.14987373 2.2061336 2.26057053 2.30441236 2.34832621 2.37806273 2.41146636 2.44329762 2.4824791 2.5041256 2.53171849 2.55939221 2.58591604 2.60815954 2.63082743 2.65103626 2.66979122 2.68793631 2.70929337 2.7293098 2.74854803 2.76286101 2.77548075 2.78843784 2.80109262 2.81600094 2.82873058 2.83905458 2.85211086 2.86147332 2.87334347 2.88428545 2.89336777 2.90219927 2.91099644 2.92030263 2.92810678 2.93552232 2.94370365 2.95202613 2.9598608 2.96562195 2.97207665 2.97818899 2.98406029 2.98864841 2.99489713 2.99981737 3.00471306 3.00911665 3.01375651 3.01850557 3.02344656 3.02847672 3.03262448 3.0358181 3.04015589 3.04388046 3.04782081 3.05157852 3.05557632 3.05919051 3.06261873 3.0659852 3.06929421 3.07258534 3.07599521 3.0787921 3.08175683 3.08462715 3.08749485 3.09037018 3.09328556 3.0963397 3.09861779 3.10161614 3.10432625 3.10679245 3.10917497 3.11182356 3.11386204 3.11654687 3.11915779 3.1219449 3.12464881 3.12737608 3.1296308 3.13204145 3.13459301 3.13689113 3.13903093 3.14156747 ------------------------------ Transdef: Tab for variable 11 7.65919685E-05 0.00822154433 0.0176644325 0.025970459 0.0338499546 0.0410877466 0.0477581024 0.0551552773 0.0611350536 0.0675460398 
0.0739426613 0.0803672671 0.0868026018 0.0933808982 0.099609971 0.105338573 0.111678898 0.116960764 0.12348336 0.128786445 0.133853793 0.139335394 0.144365609 0.150252104 0.155329689 0.160115048 0.16524592 0.170938462 0.175781012 0.181304574 0.186687708 0.191260993 0.19542551 0.199316859 0.203997329 0.207564235 0.211368799 0.215066671 0.218754709 0.222086191 0.22629261 0.230566382 0.234704137 0.238200545 0.242095485 0.245524645 0.249672532 0.253647923 0.257547677 0.261690974 0.265968829 0.270184219 0.273990214 0.27802676 0.282540679 0.287022471 0.292042345 0.296404243 0.301017433 0.305241436 0.310337186 0.315125346 0.319756418 0.325025856 0.329240561 0.334976792 0.340228677 0.346462816 0.35138005 0.356163979 0.36201334 0.367612988 0.373630762 0.37914741 0.385035515 0.391144753 0.398153603 0.404469728 0.410737693 0.417635083 0.424387932 0.432087123 0.439186335 0.447561711 0.455230027 0.463045597 0.473546088 0.483386517 0.495632172 0.509400129 0.521270633 0.534671962 0.551583767 0.572200239 0.592968941 0.619161725 0.64902389 0.677661061 0.720534623 0.8090868 1.09418297 ------------------------------ Transdef: Tab for variable 12 0.20001629 0.20540002 0.210701808 0.215790093 0.221378297 0.226597995 0.230924264 0.2353587 0.239943564 0.244263127 0.248004332 0.252751768 0.256694138 0.261286736 0.265536815 0.270235181 0.274634153 0.279589862 0.284237355 0.288226545 0.292117238 0.296127737 0.299869537 0.303453863 0.306852162 0.311048329 0.314724803 0.318578959 0.32172811 0.325638801 0.329351395 0.332909942 0.336942017 0.340537161 0.343838453 0.3470788 0.350670278 0.354381621 0.358109415 0.361995161 0.365501344 0.369032025 0.372445285 0.376340926 0.380043447 0.383745432 0.387890458 0.391491354 0.394947708 0.398503751 0.402322888 0.405772775 0.41013211 0.414089292 0.417794347 0.421746552 0.426024914 0.430052936 0.434657335 0.439579517 0.443434358 0.447105527 0.451559842 0.455861747 0.459801137 0.463947773 0.468259633 0.472767979 0.477916747 0.484232605 0.49021095 0.495804429 0.501186788 0.507135868 0.513463616 0.519223571 0.524922967 0.531819582 0.537868738 0.5440588 0.551168919 0.557521582 0.565703392 0.575390875 0.584749818 0.594216585 0.603398919 0.612875521 0.624780238 0.63409102 0.646173477 0.659639716 0.672452509 0.686942279 0.704260051 0.726186156 0.757511258 0.787025511 0.832248747 0.891932666 1.116027 ------------------------------ Transdef: Tab for variable 13 20.0978699 21.3548183 22.2287636 22.7961388 23.3923321 23.891716 24.5715485 25.0319386 25.5489159 26.0156555 26.4997253 26.8957443 27.3658218 27.793745 28.2334347 28.5814362 28.9783936 29.3581543 29.7103577 30.0979958 30.4305096 30.7655907 31.0693932 31.3672638 31.6826897 32.0425797 32.3519592 32.6526413 32.9732704 33.2792473 33.5762634 33.8657303 34.2873077 34.5629349 34.9033356 35.217514 35.5547333 35.8701057 36.1636124 36.4258118 36.6809959 36.9922676 37.2765312 37.516098 37.7997208 38.0416794 38.3300743 38.6025581 38.9149094 39.2324867 39.5334549 39.7929382 40.0512772 40.3588219 40.6426926 40.9344292 41.2405853 41.5330009 41.8374786 42.1199303 42.4442978 42.7189026 43.0118637 43.4064407 43.7656708 44.0386314 44.4143562 44.8119659 45.1454124 45.5308418 45.9718246 46.3455658 46.7759781 47.1267319 47.5239639 47.972229 48.4344749 48.999485 49.4826202 50.0385818 50.6842842 51.2382393 51.7533417 52.3008499 52.9466705 53.6223221 54.2988129 55.0419846 55.78619 56.7280693 57.6275291 58.8285179 60.1168289 61.5254135 63.244133 65.4050751 67.6546173 71.006073 76.1559753 86.3407135 201.326797 ------------------------------ Transdef: Tab for 
variable 14 10.0034533 10.7799778 11.2659121 11.6714907 12.0749626 12.4240646 12.8337727 13.1773615 13.5403528 13.8909073 14.2049389 14.5563774 14.8480358 15.1878872 15.4845209 15.8379192 16.1204643 16.4971848 16.8382645 17.1640778 17.4519272 17.8239536 18.1332436 18.440979 18.8108463 19.1510944 19.5207005 19.8583717 20.2142868 20.5409203 20.8685532 21.2188301 21.4947853 21.7395592 22.0600548 22.3598003 22.6456738 22.9517975 23.2474022 23.5301189 23.7789726 24.072403 24.3311329 24.6250076 24.9569664 25.2624168 25.4788971 25.7597656 26.0099144 26.2743263 26.5685844 26.8823013 27.1617584 27.4911098 27.8486004 28.0837193 28.376009 28.7431068 29.0555191 29.3448143 29.6619205 29.9837475 30.2814674 30.5572853 30.8860931 31.2629929 31.6313133 31.9931736 32.3402023 32.7321129 33.0229225 33.3299713 33.6306763 33.9894295 34.363472 34.7296219 35.0703506 35.4012794 35.8015213 36.1879616 36.6073303 37.0513077 37.5021896 37.9561958 38.4466629 39.0311699 39.5334396 40.0845795 40.7209702 41.3306122 42.0180511 42.8035622 43.5648346 44.4385834 45.470108 46.8465881 48.1548157 49.9332237 52.6630478 57.473732 103.30143 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 53.3 30.7 47.0 30.6 45.6 47.4 43.6 52.7 -13.6 -18.8 -29.9 26.9 44.0 2 53.3 100.0 60.2 59.4 35.5 70.9 94.4 72.0 88.7 -39.1 -27.5 -44.3 54.7 54.5 3 30.7 60.2 100.0 47.3 6.0 30.0 63.1 45.9 66.9 -6.4 -29.4 -51.4 91.2 42.4 4 47.0 59.4 47.3 100.0 20.2 40.4 57.5 51.2 69.1 -1.5 -33.6 -58.4 43.3 92.1 5 30.6 35.5 6.0 20.2 100.0 85.7 5.7 60.8 60.6 32.4 -8.9 -10.3 5.4 19.1 6 45.6 70.9 30.0 40.4 85.7 100.0 47.4 80.7 83.1 3.1 -18.1 -26.4 27.3 37.6 7 47.4 94.4 63.1 57.5 5.7 47.4 100.0 59.2 74.6 -51.1 -27.1 -45.0 57.5 52.6 8 43.6 72.0 45.9 51.2 60.8 80.7 59.2 100.0 82.6 2.3 -24.0 -37.8 41.6 47.3 9 52.7 88.7 66.9 69.1 60.6 83.1 74.6 82.6 100.0 -7.2 -30.5 -50.4 60.9 63.4 10 -13.6 -39.1 -6.4 -1.5 32.4 3.1 -51.1 2.3 -7.2 100.0 1.9 2.4 -5.5 -1.0 11 -18.8 -27.5 -29.4 -33.6 -8.9 -18.1 -27.1 -24.0 -30.5 1.9 100.0 55.9 -25.9 -32.8 12 -29.9 -44.3 -51.4 -58.4 -10.3 -26.4 -45.0 -37.8 -50.4 2.4 55.9 100.0 -46.9 -52.2 13 26.9 54.7 91.2 43.3 5.4 27.3 57.5 41.6 60.9 -5.5 -25.9 -46.9 100.0 47.3 14 44.0 54.5 42.4 92.1 19.1 37.6 52.6 47.3 63.4 -1.0 -32.8 -52.2 47.3 100.0 TOTAL CORRELATION TO TARGET (diagonal) 141.520457 TOTAL CORRELATION OF ALL VARIABLES 59.5774659 ROUND 1: MAX CORR ( 59.5774493) AFTER KILLING INPUT VARIABLE 12 CONTR 0.0444184733 ROUND 2: MAX CORR ( 59.5766466) AFTER KILLING INPUT VARIABLE 11 CONTR 0.309267603 ROUND 3: MAX CORR ( 59.5745189) AFTER KILLING INPUT VARIABLE 8 CONTR 0.503505317 ROUND 4: MAX CORR ( 59.5550208) AFTER KILLING INPUT VARIABLE 6 CONTR 1.52407186 ROUND 5: MAX CORR ( 59.5318142) AFTER KILLING INPUT VARIABLE 7 CONTR 1.66240949 ROUND 6: MAX CORR ( 59.4176024) AFTER KILLING INPUT VARIABLE 14 CONTR 3.68584104 ROUND 7: MAX CORR ( 59.3567702) AFTER KILLING INPUT VARIABLE 13 CONTR 2.68799269 ROUND 8: MAX CORR ( 59.1751732) AFTER KILLING INPUT VARIABLE 10 CONTR 4.63950928 ROUND 9: MAX CORR ( 58.7882327) AFTER KILLING INPUT VARIABLE 3 CONTR 6.75609542 ROUND 10: MAX CORR ( 58.036813) AFTER KILLING INPUT VARIABLE 9 CONTR 9.36934582 ROUND 11: MAX CORR ( 56.6316844) AFTER KILLING INPUT VARIABLE 5 CONTR 12.6934619 ROUND 12: MAX CORR ( 53.3061589) AFTER KILLING INPUT VARIABLE 4 CONTR 19.1206985 LAST REMAINING VARIABLE: 2 total correlation to target: 59.5774659 % total significance: 61.6825777 sigma correlations of single variables to target: variable 
2: 53.3061589 % , in sigma: 55.1896802 variable 3: 30.6970444 % , in sigma: 31.7816946 variable 4: 47.0358533 % , in sigma: 48.6978195 variable 5: 30.5756271 % , in sigma: 31.6559872 variable 6: 45.594498 % , in sigma: 47.2055353 variable 7: 47.3852964 % , in sigma: 49.0596098 variable 8: 43.6213507 % , in sigma: 45.1626687 variable 9: 52.6721207 % , in sigma: 54.5332389 variable 10: -13.6229772 % , in sigma: 14.1043318 variable 11: -18.8297528 % , in sigma: 19.4950838 variable 12: -29.9175933 % , in sigma: 30.9747024 variable 13: 26.8561821 % , in sigma: 27.8051192 variable 14: 44.0350221 % , in sigma: 45.5909568 variables sorted by significance: 1 most relevant variable 2 corr 53.30616 , in sigma: 55.1896813 2 most relevant variable 4 corr 19.1206989 , in sigma: 19.7963102 3 most relevant variable 5 corr 12.6934624 , in sigma: 13.1419735 4 most relevant variable 9 corr 9.36934566 , in sigma: 9.70040239 5 most relevant variable 3 corr 6.75609541 , in sigma: 6.99481548 6 most relevant variable 10 corr 4.6395092 , in sigma: 4.80344175 7 most relevant variable 13 corr 2.68799281 , in sigma: 2.78297043 8 most relevant variable 14 corr 3.68584108 , in sigma: 3.8160767 9 most relevant variable 7 corr 1.66240954 , in sigma: 1.72114917 10 most relevant variable 6 corr 1.52407181 , in sigma: 1.57792341 11 most relevant variable 8 corr 0.50350529 , in sigma: 0.52129616 12 most relevant variable 11 corr 0.30926761 , in sigma: 0.320195281 13 most relevant variable 12 corr 0.0444184728 , in sigma: 0.0459879564 global correlations between input variables: variable 2: 99.4187452 % variable 3: 95.7130087 % variable 4: 95.7175417 % variable 5: 96.4581965 % variable 6: 96.6781567 % variable 7: 99.143946 % variable 8: 88.4319221 % variable 9: 98.7025728 % variable 10: 75.5557563 % variable 11: 57.0117297 % variable 12: 73.4110485 % variable 13: 93.8518417 % variable 14: 94.5020543 % significance loss when removing single variables: variable 2: corr = 4.84668502 % , sigma = 5.01793792 variable 3: corr = 7.84141437 % , sigma = 8.11848313 variable 4: corr = 8.52698098 % , sigma = 8.82827357 variable 5: corr = 10.7355531 % , sigma = 11.1148834 variable 6: corr = 1.2390199 % , sigma = 1.28279947 variable 7: corr = 1.35870475 % , sigma = 1.40671326 variable 8: corr = 0.511309485 % , sigma = 0.529376109 variable 9: corr = 10.1355073 % , sigma = 10.4936356 variable 10: corr = 4.59438698 % , sigma = 4.75672518 variable 11: corr = 0.295295066 % , sigma = 0.305729031 variable 12: corr = 0.0444184733 % , sigma = 0.0459879569 variable 13: corr = 4.2609301 % , sigma = 4.41148593 variable 14: corr = 3.65972081 % , sigma = 3.78903348 Keep only 6 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 7 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 3 --> 19.1889954 sigma out 15 active outputs RANK 2 NODE 5 --> 17.7616367 sigma out 15 active outputs RANK 3 NODE 7 --> 13.8176775 sigma out 15 active outputs RANK 4 NODE 6 --> 13.4652672 sigma out 15 active outputs RANK 5 NODE 4 --> 13.1057377 sigma out 15 active outputs RANK 6 NODE 1 --> 13.0825338 sigma out 15 active outputs RANK 7 NODE 2 --> 11.928793 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 11 --> 20.4638996 sigma in 7act. ( 20.3707771 sig out 1act.) RANK 2 NODE 3 --> 20.1894417 sigma in 7act. ( 21.9434795 sig out 1act.) 
RANK 3 NODE 5 --> 13.7853794 sigma in 7act. ( 14.1447668 sig out 1act.) RANK 4 NODE 6 --> 12.6740313 sigma in 7act. ( 12.162468 sig out 1act.) RANK 5 NODE 15 --> 10.6209354 sigma in 7act. ( 12.9101171 sig out 1act.) RANK 6 NODE 1 --> 9.50708199 sigma in 7act. ( 10.6372042 sig out 1act.) RANK 7 NODE 14 --> 7.79689121 sigma in 7act. ( 7.67470312 sig out 1act.) RANK 8 NODE 9 --> 5.48202705 sigma in 7act. ( 5.27672052 sig out 1act.) RANK 9 NODE 10 --> 4.42358255 sigma in 7act. ( 4.77875566 sig out 1act.) RANK 10 NODE 4 --> 3.26340246 sigma in 7act. ( 3.22054744 sig out 1act.) RANK 11 NODE 2 --> 3.23725963 sigma in 7act. ( 3.0190258 sig out 1act.) RANK 12 NODE 12 --> 3.19434237 sigma in 7act. ( 3.48782134 sig out 1act.) RANK 13 NODE 8 --> 2.69364238 sigma in 7act. ( 2.99689269 sig out 1act.) RANK 14 NODE 7 --> 2.67251992 sigma in 7act. ( 2.336725 sig out 1act.) RANK 15 NODE 13 --> 2.18150663 sigma in 7act. ( 2.03506517 sig out 1act.) sorted by output significance RANK 1 NODE 3 --> 21.9434795 sigma out 1act.( 20.1894417 sig in 7act.) RANK 2 NODE 11 --> 20.3707771 sigma out 1act.( 20.4638996 sig in 7act.) RANK 3 NODE 5 --> 14.1447668 sigma out 1act.( 13.7853794 sig in 7act.) RANK 4 NODE 15 --> 12.9101171 sigma out 1act.( 10.6209354 sig in 7act.) RANK 5 NODE 6 --> 12.162468 sigma out 1act.( 12.6740313 sig in 7act.) RANK 6 NODE 1 --> 10.6372042 sigma out 1act.( 9.50708199 sig in 7act.) RANK 7 NODE 14 --> 7.67470312 sigma out 1act.( 7.79689121 sig in 7act.) RANK 8 NODE 9 --> 5.27672052 sigma out 1act.( 5.48202705 sig in 7act.) RANK 9 NODE 10 --> 4.77875566 sigma out 1act.( 4.42358255 sig in 7act.) RANK 10 NODE 12 --> 3.48782134 sigma out 1act.( 3.19434237 sig in 7act.) RANK 11 NODE 4 --> 3.22054744 sigma out 1act.( 3.26340246 sig in 7act.) RANK 12 NODE 2 --> 3.0190258 sigma out 1act.( 3.23725963 sig in 7act.) RANK 13 NODE 8 --> 2.99689269 sigma out 1act.( 2.69364238 sig in 7act.) RANK 14 NODE 7 --> 2.336725 sigma out 1act.( 2.67251992 sig in 7act.) RANK 15 NODE 13 --> 2.03506517 sigma out 1act.( 2.18150663 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 41.0380554 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 3 --> 19.0191784 sigma out 15 active outputs RANK 2 NODE 2 --> 16.8050652 sigma out 15 active outputs RANK 3 NODE 5 --> 16.2517624 sigma out 15 active outputs RANK 4 NODE 6 --> 15.2762842 sigma out 15 active outputs RANK 5 NODE 1 --> 14.0580616 sigma out 15 active outputs RANK 6 NODE 4 --> 13.8255386 sigma out 15 active outputs RANK 7 NODE 7 --> 13.6584806 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 11 --> 19.3783455 sigma in 7act. ( 18.7545471 sig out 1act.) RANK 2 NODE 3 --> 18.3543167 sigma in 7act. ( 19.0491371 sig out 1act.) RANK 3 NODE 15 --> 14.970849 sigma in 7act. ( 13.3675308 sig out 1act.) RANK 4 NODE 5 --> 13.1318054 sigma in 7act. ( 13.1491241 sig out 1act.) RANK 5 NODE 6 --> 12.2969131 sigma in 7act. ( 11.2026653 sig out 1act.) RANK 6 NODE 1 --> 9.76631927 sigma in 7act. ( 9.25529861 sig out 1act.) RANK 7 NODE 14 --> 8.62125778 sigma in 7act. ( 7.47621727 sig out 1act.) RANK 8 NODE 9 --> 7.67903948 sigma in 7act. ( 4.93809223 sig out 1act.) RANK 9 NODE 4 --> 7.48212767 sigma in 7act. ( 4.56333685 sig out 1act.) RANK 10 NODE 12 --> 6.64648104 sigma in 7act. ( 4.33777714 sig out 1act.) RANK 11 NODE 2 --> 6.11137962 sigma in 7act. ( 2.89168954 sig out 1act.) RANK 12 NODE 8 --> 5.29901934 sigma in 7act. ( 3.13442278 sig out 1act.) RANK 13 NODE 13 --> 5.21555805 sigma in 7act. 
( 2.61423922 sig out 1act.) RANK 14 NODE 7 --> 4.34219837 sigma in 7act. ( 1.74263573 sig out 1act.) RANK 15 NODE 10 --> 4.05570745 sigma in 7act. ( 3.545784 sig out 1act.) sorted by output significance RANK 1 NODE 3 --> 19.0491371 sigma out 1act.( 18.3543167 sig in 7act.) RANK 2 NODE 11 --> 18.7545471 sigma out 1act.( 19.3783455 sig in 7act.) RANK 3 NODE 15 --> 13.3675308 sigma out 1act.( 14.970849 sig in 7act.) RANK 4 NODE 5 --> 13.1491241 sigma out 1act.( 13.1318054 sig in 7act.) RANK 5 NODE 6 --> 11.2026653 sigma out 1act.( 12.2969131 sig in 7act.) RANK 6 NODE 1 --> 9.25529861 sigma out 1act.( 9.76631927 sig in 7act.) RANK 7 NODE 14 --> 7.47621727 sigma out 1act.( 8.62125778 sig in 7act.) RANK 8 NODE 9 --> 4.93809223 sigma out 1act.( 7.67903948 sig in 7act.) RANK 9 NODE 4 --> 4.56333685 sigma out 1act.( 7.48212767 sig in 7act.) RANK 10 NODE 12 --> 4.33777714 sigma out 1act.( 6.64648104 sig in 7act.) RANK 11 NODE 10 --> 3.545784 sigma out 1act.( 4.05570745 sig in 7act.) RANK 12 NODE 8 --> 3.13442278 sigma out 1act.( 5.29901934 sig in 7act.) RANK 13 NODE 2 --> 2.89168954 sigma out 1act.( 6.11137962 sig in 7act.) RANK 14 NODE 13 --> 2.61423922 sigma out 1act.( 5.21555805 sig in 7act.) RANK 15 NODE 7 --> 1.74263573 sigma out 1act.( 4.34219837 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 37.9196815 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.345283359 *** contribution from regularisation: 0.010948047 *** contribution from error: -0.356231391 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.407448739 *** contribution from regularisation: 0.00672133453 *** contribution from error: -0.414170086 *********************************************** -----------------> Test sample ENTER BFGS code START -8898.41947 -0.571502745 0.0211072061 EXIT FROM BFGS code FG_START 0. -0.571502745 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.427838892 *** contribution from regularisation: 0.00588736637 *** contribution from error: -0.433726251 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -9341.43449 -0.571502745 -27.613554 EXIT FROM BFGS code FG_LNSRCH 0. -0.587317705 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.456026107 *** contribution from regularisation: 0.00752745615 *** contribution from error: -0.463553578 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9956.87417 -0.587317705 2.95183682 EXIT FROM BFGS code NEW_X -9956.87417 -0.587317705 2.95183682 ENTER BFGS code NEW_X -9956.87417 -0.587317705 2.95183682 EXIT FROM BFGS code FG_LNSRCH 0. -0.585954845 0. 
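Each "Learn Path" block above reports the loss after one pass together with its two pieces: the total is the sum of the contribution from error (the entropy loss chosen in the setup) and the contribution from regularisation. NeuroBayes' exact entropy and regularisation formulas, and its sign convention, are not spelled out in the log, so the sketch below substitutes a weighted cross-entropy and a simple L2 weight penalty purely to show how the printed decomposition fits together.

import numpy as np

def learn_path_report(outputs, targets, event_weights, net_weights, lam=1e-3):
    # Stand-in definitions: weighted cross-entropy as the "error" term and an
    # L2 penalty on the network weights as the "regularisation" term.
    eps = 1e-12
    ce = -(targets * np.log(outputs + eps) + (1.0 - targets) * np.log(1.0 - outputs + eps))
    error = np.average(ce, weights=event_weights)
    regularisation = lam * np.sum(net_weights ** 2)
    total = error + regularisation
    print("*** loss function:", round(total, 9))
    print("*** contribution from regularisation:", round(regularisation, 9))
    print("*** contribution from error:", round(error, 9))
    return total

rng = np.random.default_rng(1)
outputs = np.clip(rng.random(1000), 1e-3, 1 - 1e-3)   # toy network outputs in (0, 1)
targets = (rng.random(1000) < 0.6).astype(float)      # toy 0/1 targets
weights = np.ones(1000)                               # per-event training weights
learn_path_report(outputs, targets, weights, rng.normal(size=120))
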
--------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.458016247 *** contribution from regularisation: 0.00744024199 *** contribution from error: -0.465456486 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10000.3269 -0.585954845 3.22825933 EXIT FROM BFGS code NEW_X -10000.3269 -0.585954845 3.22825933 ENTER BFGS code NEW_X -10000.3269 -0.585954845 3.22825933 EXIT FROM BFGS code FG_LNSRCH 0. -0.58364594 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.458286136 *** contribution from regularisation: 0.00752653368 *** contribution from error: -0.465812683 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10006.2198 -0.58364594 3.27225208 EXIT FROM BFGS code NEW_X -10006.2198 -0.58364594 3.27225208 ENTER BFGS code NEW_X -10006.2198 -0.58364594 3.27225208 EXIT FROM BFGS code FG_LNSRCH 0. -0.577094495 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.458763838 *** contribution from regularisation: 0.00741364062 *** contribution from error: -0.466177493 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10016.6499 -0.577094495 4.29640341 EXIT FROM BFGS code NEW_X -10016.6499 -0.577094495 4.29640341 ENTER BFGS code NEW_X -10016.6499 -0.577094495 4.29640341 EXIT FROM BFGS code FG_LNSRCH 0. -0.565096319 0. --------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.459573209 *** contribution from regularisation: 0.00713271927 *** contribution from error: -0.466705918 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10034.3216 -0.565096319 3.2168386 EXIT FROM BFGS code NEW_X -10034.3216 -0.565096319 3.2168386 ENTER BFGS code NEW_X -10034.3216 -0.565096319 3.2168386 EXIT FROM BFGS code FG_LNSRCH 0. -0.52591157 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.461571306 *** contribution from regularisation: 0.00655044708 *** contribution from error: -0.468121767 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10077.9479 -0.52591157 -12.6551886 EXIT FROM BFGS code NEW_X -10077.9479 -0.52591157 -12.6551886 ENTER BFGS code NEW_X -10077.9479 -0.52591157 -12.6551886 EXIT FROM BFGS code FG_LNSRCH 0. -0.551508963 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 6 --> 24.6485691 sigma out 15 active outputs RANK 2 NODE 2 --> 21.432806 sigma out 15 active outputs RANK 3 NODE 1 --> 19.7934895 sigma out 15 active outputs RANK 4 NODE 5 --> 12.5024939 sigma out 15 active outputs RANK 5 NODE 7 --> 10.4979677 sigma out 15 active outputs RANK 6 NODE 4 --> 7.29407597 sigma out 15 active outputs RANK 7 NODE 3 --> 6.10416651 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 11 --> 24.8806133 sigma in 7act. ( 21.7962093 sig out 1act.) RANK 2 NODE 15 --> 22.815918 sigma in 7act. ( 25.1136532 sig out 1act.) 
RANK 3 NODE 4 --> 14.4440737 sigma in 7act. ( 13.6131258 sig out 1act.) RANK 4 NODE 5 --> 12.353734 sigma in 7act. ( 10.8649044 sig out 1act.) RANK 5 NODE 2 --> 9.4837904 sigma in 7act. ( 8.69971371 sig out 1act.) RANK 6 NODE 14 --> 7.158741 sigma in 7act. ( 6.92122984 sig out 1act.) RANK 7 NODE 9 --> 6.77235174 sigma in 7act. ( 6.5194416 sig out 1act.) RANK 8 NODE 12 --> 5.43801022 sigma in 7act. ( 4.9435482 sig out 1act.) RANK 9 NODE 7 --> 5.13126326 sigma in 7act. ( 4.5489912 sig out 1act.) RANK 10 NODE 13 --> 4.73676968 sigma in 7act. ( 4.20048952 sig out 1act.) RANK 11 NODE 8 --> 4.72652292 sigma in 7act. ( 4.19897127 sig out 1act.) RANK 12 NODE 6 --> 2.91130733 sigma in 7act. ( 0.311827421 sig out 1act.) RANK 13 NODE 3 --> 2.84976101 sigma in 7act. ( 0.837587476 sig out 1act.) RANK 14 NODE 1 --> 2.7318449 sigma in 7act. ( 0.303654611 sig out 1act.) RANK 15 NODE 10 --> 1.74162471 sigma in 7act. ( 1.02474129 sig out 1act.) sorted by output significance RANK 1 NODE 15 --> 25.1136532 sigma out 1act.( 22.815918 sig in 7act.) RANK 2 NODE 11 --> 21.7962093 sigma out 1act.( 24.8806133 sig in 7act.) RANK 3 NODE 4 --> 13.6131258 sigma out 1act.( 14.4440737 sig in 7act.) RANK 4 NODE 5 --> 10.8649044 sigma out 1act.( 12.353734 sig in 7act.) RANK 5 NODE 2 --> 8.69971371 sigma out 1act.( 9.4837904 sig in 7act.) RANK 6 NODE 14 --> 6.92122984 sigma out 1act.( 7.158741 sig in 7act.) RANK 7 NODE 9 --> 6.5194416 sigma out 1act.( 6.77235174 sig in 7act.) RANK 8 NODE 12 --> 4.9435482 sigma out 1act.( 5.43801022 sig in 7act.) RANK 9 NODE 7 --> 4.5489912 sigma out 1act.( 5.13126326 sig in 7act.) RANK 10 NODE 13 --> 4.20048952 sigma out 1act.( 4.73676968 sig in 7act.) RANK 11 NODE 8 --> 4.19897127 sigma out 1act.( 4.72652292 sig in 7act.) RANK 12 NODE 10 --> 1.02474129 sigma out 1act.( 1.74162471 sig in 7act.) RANK 13 NODE 3 --> 0.837587476 sigma out 1act.( 2.84976101 sig in 7act.) RANK 14 NODE 6 --> 0.311827421 sigma out 1act.( 2.91130733 sig in 7act.) RANK 15 NODE 1 --> 0.303654611 sigma out 1act.( 2.7318449 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 40.7133102 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.462850332 *** contribution from regularisation: 0.00761706661 *** contribution from error: -0.470467389 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -10105.8739 -0.551508963 43.3703499 EXIT FROM BFGS code NEW_X -10105.8739 -0.551508963 43.3703499 ENTER BFGS code NEW_X -10105.8739 -0.551508963 43.3703499 EXIT FROM BFGS code FG_LNSRCH 0. -0.519364595 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.464044452 *** contribution from regularisation: 0.0076556569 *** contribution from error: -0.471700102 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10131.9465 -0.519364595 -22.0042896 EXIT FROM BFGS code FG_LNSRCH 0. -0.535270631 0. 
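The "ENTER/EXIT FROM BFGS code" lines above look like the reverse-communication flags of a Fortran BFGS routine with line search: FG_LNSRCH asks the teacher for a new loss and gradient during the line search, NEW_X marks an accepted step, and CONVERGENC marks the stopping test. A rough analogue using SciPy's BFGS on a small logistic-regression entropy loss is sketched below; it is not the NeuroBayes optimizer, it only shows the same pattern of line searches followed by accepted steps.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 6))                # 6 inputs, cf. the 6 variables kept above
y = (X @ np.array([1.0, -0.5, 0.3, 0.0, 0.2, -0.1]) + 0.1 * rng.normal(size=500)) > 0

def loss_and_grad(w):
    # Cross-entropy of a sigmoid "network" and its gradient, handed to BFGS
    # just as the teacher hands its loss and gradient to the Fortran code.
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    grad = X.T @ (p - y) / len(y)
    return loss, grad

step = [0]
def new_x(wk):                               # called once per accepted step ("NEW_X")
    step[0] += 1
    print("NEW_X  iteration", step[0], " loss", round(loss_and_grad(wk)[0], 6))

res = minimize(loss_and_grad, np.zeros(6), jac=True, method="BFGS", callback=new_x)
print("CONVERGENC" if res.success else res.message, " final loss", round(res.fun, 6))
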
--------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.464837372 *** contribution from regularisation: 0.00690759299 *** contribution from error: -0.471744955 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10149.259 -0.535270631 16.6236286 EXIT FROM BFGS code NEW_X -10149.259 -0.535270631 16.6236286 ENTER BFGS code NEW_X -10149.259 -0.535270631 16.6236286 EXIT FROM BFGS code FG_LNSRCH 0. -0.523801446 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.465132266 *** contribution from regularisation: 0.00717854314 *** contribution from error: -0.472310811 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10155.6982 -0.523801446 -7.68253946 EXIT FROM BFGS code NEW_X -10155.6982 -0.523801446 -7.68253946 ENTER BFGS code NEW_X -10155.6982 -0.523801446 -7.68253946 EXIT FROM BFGS code FG_LNSRCH 0. -0.523323476 0. --------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.465558708 *** contribution from regularisation: 0.00701909186 *** contribution from error: -0.47257781 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10165.009 -0.523323476 -9.64032936 EXIT FROM BFGS code NEW_X -10165.009 -0.523323476 -9.64032936 ENTER BFGS code NEW_X -10165.009 -0.523323476 -9.64032936 EXIT FROM BFGS code FG_LNSRCH 0. -0.523210227 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.466226161 *** contribution from regularisation: 0.00697615556 *** contribution from error: -0.473202318 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10179.5818 -0.523210227 -2.61716962 EXIT FROM BFGS code NEW_X -10179.5818 -0.523210227 -2.61716962 ENTER BFGS code NEW_X -10179.5818 -0.523210227 -2.61716962 EXIT FROM BFGS code FG_LNSRCH 0. -0.505135477 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.46605432 *** contribution from regularisation: 0.0070206169 *** contribution from error: -0.473074943 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10175.8301 -0.505135477 -27.8251476 EXIT FROM BFGS code FG_LNSRCH 0. -0.515810966 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.466331452 *** contribution from regularisation: 0.00717427582 *** contribution from error: -0.473505735 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10181.8808 -0.515810966 -13.5124111 EXIT FROM BFGS code NEW_X -10181.8808 -0.515810966 -13.5124111 ENTER BFGS code NEW_X -10181.8808 -0.515810966 -13.5124111 EXIT FROM BFGS code FG_LNSRCH 0. -0.518790543 0. 
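Preprocessing option 12, announced before the Transdef tables, equalises every input to a Gaussian with mean 0 and sigma 1 and then decorrelates the transformed inputs; the Transdef tables appear to be the percentile boundaries used in the flattening step, and the covariance matrix printed afterwards is computed on the transformed variables. A minimal sketch of those two operations (empirical rank mapped through the inverse normal CDF, then whitening via the eigen-decomposition of the covariance matrix) follows; the real NeuroBayes preprocessing is more elaborate, so treat this only as an illustration of the idea.

import numpy as np
from scipy.stats import norm, rankdata

def equalise_to_gauss(x):
    """Map a 1-D sample to an approximately standard-normal distribution."""
    u = (rankdata(x) - 0.5) / len(x)        # empirical CDF value in (0, 1), cf. the Transdef bins
    return norm.ppf(u)                      # inverse Gaussian CDF -> mean 0, sigma 1

def decorrelate(X):
    """Rotate and scale the columns of X so their covariance becomes the identity."""
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    return (X - X.mean(axis=0)) @ evecs / np.sqrt(evals)

rng = np.random.default_rng(3)
mix = np.array([[1.0, 0.5, 0.0], [0.0, 1.0, 0.3], [0.0, 0.0, 1.0]])
raw = rng.exponential(size=(5000, 3)) @ mix          # toy, correlated, non-Gaussian inputs
gauss = np.column_stack([equalise_to_gauss(c) for c in raw.T])
white = decorrelate(gauss)
print(np.round(np.corrcoef(white, rowvar=False) * 100, 1))   # ~identity, in percent
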
--------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.466875106 *** contribution from regularisation: 0.00696347794 *** contribution from error: -0.473838598 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10193.7508 -0.518790543 12.4718981 EXIT FROM BFGS code NEW_X -10193.7508 -0.518790543 12.4718981 ENTER BFGS code NEW_X -10193.7508 -0.518790543 12.4718981 EXIT FROM BFGS code FG_LNSRCH 0. -0.50991863 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.466909379 *** contribution from regularisation: 0.00706068054 *** contribution from error: -0.473970056 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10194.4991 -0.50991863 1.49577165 EXIT FROM BFGS code NEW_X -10194.4991 -0.50991863 1.49577165 ENTER BFGS code NEW_X -10194.4991 -0.50991863 1.49577165 EXIT FROM BFGS code FG_LNSRCH 0. -0.503295958 0. --------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 36.981308 sigma out 15 active outputs RANK 2 NODE 6 --> 30.5221996 sigma out 15 active outputs RANK 3 NODE 2 --> 26.650034 sigma out 15 active outputs RANK 4 NODE 7 --> 13.3633499 sigma out 15 active outputs RANK 5 NODE 4 --> 12.2393169 sigma out 15 active outputs RANK 6 NODE 5 --> 9.64457512 sigma out 15 active outputs RANK 7 NODE 3 --> 6.12769461 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 11 --> 41.2193527 sigma in 7act. ( 37.7216415 sig out 1act.) RANK 2 NODE 15 --> 32.5914726 sigma in 7act. ( 37.2115517 sig out 1act.) RANK 3 NODE 4 --> 15.5978594 sigma in 7act. ( 15.1408834 sig out 1act.) RANK 4 NODE 5 --> 14.8405905 sigma in 7act. ( 13.659976 sig out 1act.) RANK 5 NODE 2 --> 10.7717924 sigma in 7act. ( 10.4698915 sig out 1act.) RANK 6 NODE 9 --> 6.7647748 sigma in 7act. ( 6.97398949 sig out 1act.) RANK 7 NODE 7 --> 4.81780577 sigma in 7act. ( 4.45194292 sig out 1act.) RANK 8 NODE 14 --> 4.68665886 sigma in 7act. ( 4.19259071 sig out 1act.) RANK 9 NODE 6 --> 3.15142822 sigma in 7act. ( 2.19173861 sig out 1act.) RANK 10 NODE 12 --> 2.62138987 sigma in 7act. ( 1.89622128 sig out 1act.) RANK 11 NODE 8 --> 2.56254721 sigma in 7act. ( 1.69275999 sig out 1act.) RANK 12 NODE 3 --> 2.44271302 sigma in 7act. ( 1.64089024 sig out 1act.) RANK 13 NODE 13 --> 2.40922523 sigma in 7act. ( 1.86045873 sig out 1act.) RANK 14 NODE 1 --> 0.913748741 sigma in 7act. ( 0.391041219 sig out 1act.) RANK 15 NODE 10 --> 0.819349587 sigma in 7act. ( 0.352825135 sig out 1act.) sorted by output significance RANK 1 NODE 11 --> 37.7216415 sigma out 1act.( 41.2193527 sig in 7act.) RANK 2 NODE 15 --> 37.2115517 sigma out 1act.( 32.5914726 sig in 7act.) RANK 3 NODE 4 --> 15.1408834 sigma out 1act.( 15.5978594 sig in 7act.) RANK 4 NODE 5 --> 13.659976 sigma out 1act.( 14.8405905 sig in 7act.) RANK 5 NODE 2 --> 10.4698915 sigma out 1act.( 10.7717924 sig in 7act.) RANK 6 NODE 9 --> 6.97398949 sigma out 1act.( 6.7647748 sig in 7act.) RANK 7 NODE 7 --> 4.45194292 sigma out 1act.( 4.81780577 sig in 7act.) RANK 8 NODE 14 --> 4.19259071 sigma out 1act.( 4.68665886 sig in 7act.) RANK 9 NODE 6 --> 2.19173861 sigma out 1act.( 3.15142822 sig in 7act.) RANK 10 NODE 12 --> 1.89622128 sigma out 1act.( 2.62138987 sig in 7act.) 
RANK 11 NODE 13 --> 1.86045873 sigma out 1act.( 2.40922523 sig in 7act.) RANK 12 NODE 8 --> 1.69275999 sigma out 1act.( 2.56254721 sig in 7act.) RANK 13 NODE 3 --> 1.64089024 sigma out 1act.( 2.44271302 sig in 7act.) RANK 14 NODE 1 --> 0.391041219 sigma out 1act.( 0.913748741 sig in 7act.) RANK 15 NODE 10 --> 0.352825135 sigma out 1act.( 0.819349587 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 58.6243286 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.467282891 *** contribution from regularisation: 0.00680481689 *** contribution from error: -0.474087715 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -10202.6545 -0.503295958 1.84546709 EXIT FROM BFGS code NEW_X -10202.6545 -0.503295958 1.84546709 ENTER BFGS code NEW_X -10202.6545 -0.503295958 1.84546709 EXIT FROM BFGS code FG_LNSRCH 0. -0.471343905 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.467455208 *** contribution from regularisation: 0.0064225439 *** contribution from error: -0.473877758 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10206.417 -0.471343905 -4.27973223 EXIT FROM BFGS code NEW_X -10206.417 -0.471343905 -4.27973223 ENTER BFGS code NEW_X -10206.417 -0.471343905 -4.27973223 EXIT FROM BFGS code FG_LNSRCH 0. -0.458556235 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.466697752 *** contribution from regularisation: 0.00671505323 *** contribution from error: -0.473412812 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10189.8785 -0.458556235 -46.867569 EXIT FROM BFGS code FG_LNSRCH 0. -0.46781534 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.467239678 *** contribution from regularisation: 0.00688758446 *** contribution from error: -0.474127263 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10201.711 -0.46781534 -15.4725504 EXIT FROM BFGS code FG_LNSRCH 0. -0.470807016 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.467140168 *** contribution from regularisation: 0.00679280842 *** contribution from error: -0.473932981 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10199.5387 -0.470807016 -5.85386944 EXIT FROM BFGS code FG_LNSRCH 0. -0.471327215 0. --------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.467093468 *** contribution from regularisation: 0.00678607356 *** contribution from error: -0.473879546 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10198.5191 -0.471327215 -4.21958447 EXIT FROM BFGS code FG_LNSRCH 0. -0.471343875 0. 
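The ROUND 1-12 lines and the "Keep only 6 most significant input variables" message earlier in the log come from a backward-elimination pass: in each round the input whose removal costs the least total correlation to the target is killed, and only the most significant inputs are kept for the network. The NeuroBayes significance measure is not documented in the log, so the sketch below substitutes the multiple correlation coefficient sqrt(c^T C^-1 c) as the "total correlation" of a variable set; it reproduces the greedy procedure, not the exact numbers.

import numpy as np

def total_correlation(X, y, cols):
    # Multiple correlation of the target with the subset of columns "cols"
    # (stand-in for the "TOTAL CORRELATION TO TARGET" printed above).
    C = np.corrcoef(X[:, cols], rowvar=False)
    c = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in cols])
    return np.sqrt(c @ np.linalg.solve(C, c))

def backward_eliminate(X, y, keep=2):
    cols = list(range(X.shape[1]))
    while len(cols) > keep:
        trials = [(total_correlation(X, y, [c for c in cols if c != j]), j) for j in cols]
        best, kill = max(trials)             # killing "kill" hurts the least
        contr = total_correlation(X, y, cols) - best
        print("ROUND: MAX CORR", round(100 * best, 3), "% after killing variable", kill,
              " CONTR", round(100 * contr, 3), "%")
        cols.remove(kill)
    return cols

rng = np.random.default_rng(4)
X = rng.normal(size=(4000, 6))
y = X @ np.array([1.0, 0.6, 0.3, 0.1, 0.0, 0.0]) + rng.normal(size=4000)
print("kept:", backward_eliminate(X, y, keep=2))
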
--------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.467141837 *** contribution from regularisation: 0.00673590554 *** contribution from error: -0.473877728 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10199.5752 -0.471343875 -4.11763573 EXIT FROM BFGS code FG_LNSRCH 0. -0.471343905 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.467100143 *** contribution from regularisation: 0.0067776111 *** contribution from error: -0.473877758 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10198.6645 -0.471343905 -4.11888313 EXIT FROM BFGS code FG_LNSRCH 0. -0.471343905 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.467102706 *** contribution from regularisation: 0.00677503319 *** contribution from error: -0.473877728 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10198.7208 -0.471343905 -4.08989429 EXIT FROM BFGS code FG_LNSRCH 0. -0.471343905 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.467033684 *** contribution from regularisation: 0.00684406608 *** contribution from error: -0.473877758 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -10197.2135 -0.471343905 -4.12094688 EXIT FROM BFGS code NEW_X -10197.2135 -0.471343905 -4.12094688 ENTER BFGS code NEW_X -10197.2135 -0.471343905 -4.12094688 EXIT FROM BFGS code CONVERGENC -10197.2135 -0.471343905 -4.12094688 --------------------------------------------------- Iteration : 250 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 66.8295212 sigma out 15 active outputs RANK 2 NODE 2 --> 43.0539436 sigma out 15 active outputs RANK 3 NODE 6 --> 42.9685059 sigma out 15 active outputs RANK 4 NODE 4 --> 20.0489712 sigma out 15 active outputs RANK 5 NODE 7 --> 18.2061558 sigma out 15 active outputs RANK 6 NODE 5 --> 10.8451118 sigma out 15 active outputs RANK 7 NODE 3 --> 7.75564051 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 11 --> 70.4715042 sigma in 7act. ( 66.0368423 sig out 1act.) RANK 2 NODE 15 --> 54.1511116 sigma in 7act. ( 64.2605667 sig out 1act.) RANK 3 NODE 5 --> 19.7945099 sigma in 7act. ( 19.6563244 sig out 1act.) RANK 4 NODE 4 --> 19.7878647 sigma in 7act. ( 19.9098225 sig out 1act.) RANK 5 NODE 2 --> 15.5094385 sigma in 7act. ( 15.7189684 sig out 1act.) RANK 6 NODE 9 --> 8.80298615 sigma in 7act. ( 9.16165638 sig out 1act.) RANK 7 NODE 7 --> 4.92111683 sigma in 7act. ( 4.68756628 sig out 1act.) RANK 8 NODE 14 --> 4.74688911 sigma in 7act. ( 4.48853922 sig out 1act.) RANK 9 NODE 6 --> 3.0919919 sigma in 7act. ( 2.63001275 sig out 1act.) RANK 10 NODE 3 --> 2.63183856 sigma in 7act. ( 2.38700652 sig out 1act.) RANK 11 NODE 12 --> 2.26212263 sigma in 7act. ( 1.86305583 sig out 1act.) RANK 12 NODE 8 --> 2.20736122 sigma in 7act. ( 1.5225445 sig out 1act.) RANK 13 NODE 13 --> 1.87851834 sigma in 7act. ( 1.52064979 sig out 1act.) RANK 14 NODE 1 --> 0.569503725 sigma in 7act. ( 0.402039677 sig out 1act.) 
RANK 15 NODE 10 --> 0.48700875 sigma in 7act. ( 0.208750069 sig out 1act.) sorted by output significance RANK 1 NODE 11 --> 66.0368423 sigma out 1act.( 70.4715042 sig in 7act.) RANK 2 NODE 15 --> 64.2605667 sigma out 1act.( 54.1511116 sig in 7act.) RANK 3 NODE 4 --> 19.9098225 sigma out 1act.( 19.7878647 sig in 7act.) RANK 4 NODE 5 --> 19.6563244 sigma out 1act.( 19.7945099 sig in 7act.) RANK 5 NODE 2 --> 15.7189684 sigma out 1act.( 15.5094385 sig in 7act.) RANK 6 NODE 9 --> 9.16165638 sigma out 1act.( 8.80298615 sig in 7act.) RANK 7 NODE 7 --> 4.68756628 sigma out 1act.( 4.92111683 sig in 7act.) RANK 8 NODE 14 --> 4.48853922 sigma out 1act.( 4.74688911 sig in 7act.) RANK 9 NODE 6 --> 2.63001275 sigma out 1act.( 3.0919919 sig in 7act.) RANK 10 NODE 3 --> 2.38700652 sigma out 1act.( 2.63183856 sig in 7act.) RANK 11 NODE 12 --> 1.86305583 sigma out 1act.( 2.26212263 sig in 7act.) RANK 12 NODE 8 --> 1.5225445 sigma out 1act.( 2.20736122 sig in 7act.) RANK 13 NODE 13 --> 1.52064979 sigma out 1act.( 1.87851834 sig in 7act.) RANK 14 NODE 1 --> 0.402039677 sigma out 1act.( 0.569503725 sig in 7act.) RANK 15 NODE 10 --> 0.208750069 sigma out 1act.( 0.48700875 sig in 7act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 98.3215256 sigma in 15 active inputs *********************************************** *** Learn Path 250 *** loss function: -0.467154711 *** contribution from regularisation: 0.00672305282 *** contribution from error: -0.473877758 *********************************************** -----------------> Test sample END OF LEARNING , export EXPERTISE SAVING EXPERTISE TO expert.nb NB_AHISTOUT: storage space 25167 Closing output file done
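The trained network written to expert.nb has the topology reported above, Nodes(1) = 7, Nodes(2) = 15, Nodes(3) = 1, which is consistent with the 6 surviving preprocessed inputs plus a constant bias node feeding one hidden layer of 15 nodes and a single output (the earlier 14-node input layer likewise matches the 13 declared variables plus bias). The forward pass below is only a generic sketch of that shape; the transfer functions, the placement of bias nodes and the expert.nb format are NeuroBayes internals and are assumed here, not read from the expertise.

import numpy as np

rng = np.random.default_rng(5)
W1 = rng.normal(scale=0.5, size=(15, 7))     # hidden layer: 15 nodes, 7 inputs (6 vars + bias)
W2 = rng.normal(scale=0.5, size=(1, 16))     # output layer: 1 node, 15 hidden nodes + bias

def mlp_output(x6):
    """x6: the 6 preprocessed input variables of one event."""
    a0 = np.append(x6, 1.0)                  # append the constant bias node (assumption)
    a1 = np.tanh(W1 @ a0)                    # hidden activations (tanh assumed)
    a2 = np.tanh(W2 @ np.append(a1, 1.0))    # single output in (-1, 1)
    return float(a2[0])

print(mlp_output(rng.normal(size=6)))
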