2018-07-17 Advanced Gradient-Based Optimization in Octave

A raw Octave session demonstrating fminunc, Octave's unconstrained minimizer: build an options struct with optimset, supply a cost function that returns both the cost and its analytic gradient through a function handle, and let the optimizer run to convergence. The setup matches the "advanced optimization" example from Andrew Ng's Machine Learning course.

GNU Octave, version 4.4.0
Copyright (C) 2018 John W. Eaton and others.
This is free software; see the source code for copying conditions.
There is ABSOLUTELY NO WARRANTY; not even for MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  For details, type 'warranty'.

Octave was configured for "x86_64-w64-mingw32".

Additional information about Octave is available at https://www.octave.org.

Please contribute if you find this software useful.
For more information, visit https://www.octave.org/get-involved.html

Read https://www.octave.org/bugs.html to learn how to submit bug reports.
For information about changes from previous versions, type 'news'.

>> options = optimset('GradObj', 'on', 'MaxIter', '100')
options =

  scalar structure containing the fields:

    GradObj = on
    MaxIter = 100
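
Note: 'MaxIter' is passed as the string '100' above (as in the course demo), and Octave's optimset accepts it, but the documented value type is numeric, so a numeric literal is the safer form. A minimal equivalent, assuming Octave's stock optimset/optimget:

% same options, with a numeric iteration cap
options = optimset('GradObj', 'on', 'MaxIter', 100);
optimget(options, 'MaxIter')   % ans = 100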

>> initialTheta = zeros(2,1)
initialTheta =

   0
   0

>> [optTheta, functionVal, exitFlag] = fminunc(@costFunction,initialTheta, options)
error: @costFunction: no function and no method found
>>
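The first call fails because costFunction.m is not yet on Octave's load path. The echoed values below pin down the function being minimized: jVal = 50 at theta = [0; 0] and an optimum of [5; 5] match the quadratic J(theta) = (theta(1) - 5)^2 + (theta(2) - 5)^2. A reconstruction of the file saved before the retry below (the missing semicolons are what make every assignment echo in the subsequent output):

% costFunction.m -- cost and analytic gradient of
% J(theta) = (theta(1) - 5)^2 + (theta(2) - 5)^2
function [jVal, gradient] = costFunction(theta)
  jVal = (theta(1) - 5)^2 + (theta(2) - 5)^2   % echoed: jVal = 50 at [0; 0]
  gradient = zeros(2, 1)                       % echoed: [0; 0]
  gradient(1) = 2 * (theta(1) - 5)             % echoed: [-10; 0]
  gradient(2) = 2 * (theta(2) - 5)             % echoed: [-10; -10]
end

With 'GradObj' set to 'on', fminunc uses this analytic gradient instead of approximating it with finite differences.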
>> [optTheta, functionVal, exitFlag] = fminunc(@costFunction,initialTheta, options)
jVal =  50
gradient =

   0
   0

gradient =

  -10
    0

gradient =

  -10
  -10

[... output of the remaining evaluations elided: each cost value is echoed
twice per step in this run, and jVal decreases monotonically through
50, 48.596, 46.644, 43.952, 40.282, 35.365, 28.957, 20.988, 11.902, 3.4227,
reaching 1.5777e-30 at convergence ...]

jVal =   1.5777e-030
gradient =

   0
   0

gradient =

  -1.7764e-015
  0.0000e+000

gradient =

  -1.7764e-015
  -1.7764e-015

optTheta =

   5.0000
   5.0000

functionVal =   1.5777e-030
exitFlag =  1
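
exitFlag = 1 means fminunc converged (the gradient dropped below tolerance) rather than stopping at the MaxIter cap, and optTheta = [5; 5] is exactly the analytic minimizer of J, where the cost is 0. A quick sanity check against the reconstructed costFunction above:

% evaluate cost and gradient at the optimum reported by fminunc
% (the intermediate assignments will also echo, since costFunction
% omits semicolons)
[jVal, gradient] = costFunction(optTheta);
printf("cost at optimum: %g\n", jVal)            % ~1.6e-30 in the run above
printf("gradient norm:   %g\n", norm(gradient))  % ~2.5e-15 in the run above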
