How to get dimensions right using fmin_cg in scipy.optimize


I have been trying to use fmin_cg to minimize the cost function for logistic regression.

xopt = fmin_cg(costfn, fprime=grad, x0=initial_theta,
               args=(x, y, m), maxiter=400, disp=True, full_output=True)

This is how I call fmin_cg.

Here is costfn:

def costfn(theta, x, y, m):
    h = sigmoid(x.dot(theta))
    # 1.0 / m avoids Python 2 integer division (1 / m would be 0)
    j = 1.0 / m * np.sum((-(y * np.log(h))) - ((1 - y) * np.log(1 - h)))
    return j  # scalar cost

Here is grad:

def grad(theta, x, y, m):
    h = sigmoid(x.dot(theta))
    gg = 1.0 / m * (x.T.dot(h - y))  # x.T, not x.t
    return gg.flatten()
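Both functions call sigmoid, which isn't shown in the question; a minimal sketch of what it presumably looks like:

import numpy as np

def sigmoid(z):
    # standard logistic function, applied elementwise
    return 1.0 / (1.0 + np.exp(-z))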

It seems to be throwing this error:

/users/sugethakch/miniconda2/lib/python2.7/site-packages/scipy/optimize/linesearch.pyc in phi(s)
     85     def phi(s):
     86         fc[0] += 1
---> 87         return f(xk + s*pk, *args)
     88
     89     def derphi(s):

ValueError: operands could not be broadcast together with shapes (3,) (300,)

I know it's something to do with the dimensions, but I can't seem to figure it out. I'm a noob, so I might be making an obvious mistake.

I have read this link:

fmin_cg: Desired error not necessarily achieved due to precision loss

But somehow it doesn't seem to work for me.

Any help?


Update: the sizes of x, y, m and theta:

(100, 3) ----> x
(100, 1) ----> y
100      ----> m
(3, 1)   ----> theta


This is how I initialize x, y and m:

data = pd.read_csv('ex2data1.txt', sep=",", header=None)
data.columns = ['x1', 'x2', 'y']
x1 = data.iloc[:, 0].values[:, None]
x2 = data.iloc[:, 1].values[:, None]
y = data.iloc[:, 2].values[:, None]

# join x1 and x2 to make one array of x
x = np.concatenate((x1, x2), axis=1)
m, n = x.shape
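Note that this concatenation gives x a shape of (100, 2), while the sizes listed above show (100, 3), so presumably a column of ones is prepended somewhere for the intercept term. A minimal sketch of that step, assuming that is what's intended:

import numpy as np

# prepend a column of ones so theta[0] acts as the intercept term
x = np.concatenate((np.ones((m, 1)), x), axis=1)  # (100, 2) -> (100, 3)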

ex2data1.txt:

34.62365962451697,78.0246928153624,0
30.28671076822607,43.89499752400101,0
35.84740876993872,72.90219802708364,0
.....

If it helps, I am trying to re-code one of the homework assignments from Coursera's ML course by Andrew Ng in Python.

Finally, I figured out what the problem in my initial program was.

My 'y' was (100, 1), but fmin_cg expects a flat (100,) array. With y as a column vector, h - y inside grad broadcasts to (100, 100), so the flattened gradient ends up with 300 entries instead of 3, which is exactly the (3,) vs (300,) mismatch in the traceback. Once I flattened 'y', it no longer threw the initial error.
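A minimal sketch of that fix, assuming x, costfn and grad are defined as above:

from scipy.optimize import fmin_cg
import numpy as np

initial_theta = np.zeros(n)  # flat (3,) rather than (3, 1)
y = y.flatten()              # flat (100,) rather than (100, 1)

xopt = fmin_cg(costfn, fprime=grad, x0=initial_theta,
               args=(x, y, m), maxiter=400, disp=True, full_output=True)

But even then, the optimization still wasn't working: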

Warning: Desired error not necessarily achieved due to precision loss.
         Current function value: 0.693147
         Iterations: 0
         Function evaluations: 43
         Gradient evaluations: 41

This is the same value that I got without any optimization (the cost at the initial theta).

I figured out that one way to get the optimization to work is to use the 'Nelder-Mead' method. I followed this answer: scipy is not optimizing and returns "Desired error not necessarily achieved due to precision loss"

result = op.minimize(fun=costfn,
                     x0=initial_theta,
                     args=(x, y, m),
                     method='Nelder-Mead',
                     options={'disp': True})  # jac=grad is not needed here

This method doesn't need the Jacobian (Nelder-Mead is derivative-free). I got the results I was looking for:

Optimization terminated successfully.
         Current function value: 0.203498
         Iterations: 157
         Function evaluations: 287
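For completeness, the fitted parameters come back in result.x; a small usage sketch, assuming x, y and sigmoid as above (the 0.5 decision threshold is my assumption, matching the assignment):

theta_opt = result.x                   # fitted parameters, shape (3,)
p = sigmoid(x.dot(theta_opt)) >= 0.5   # boolean class predictions
print('train accuracy:', np.mean(p == y) * 100)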
