Logistic Regression Using Gradient Descent in R
I'm new here, so please pardon me if I'm breaking any Stack Overflow rules. I'm trying to implement logistic regression using gradient descent in R, but the results from gradient descent do not match the solution from Newton's method, and I cannot figure out what is going wrong. Can anyone help? Thanks in advance.
Source dataset: http://openclassroom.stanford.edu/mainfolder/courses/machinelearning/exercises/ex4materials/ex4data.zip
My code produces theta = [-0.2268167, 0.6366124, -0.4850165], but the correct theta values (from Newton's method) are [-16.37875042, 0.14834094, 0.15890845].
rm(list = ls())

rawdataset <- read.csv("ex4data1.csv")  # columns: x1, x2, y

# prepend a column of 1s for the intercept term
dataset <- cbind(o = rep_len(1, nrow(rawdataset)), rawdataset)
x <- data.matrix(dataset[1:ncol(rawdataset)])
y <- data.matrix(dataset[ncol(rawdataset) + 1])

iterations <- 500
alpha <- 0.01
m <- nrow(y)

theta <- matrix(rep(0, ncol(x)), 1, ncol(x))
costhistory <- matrix(0, iterations, 1)

for (i in 1:iterations) {
  # sigmoid hypothesis
  predictions <- 1 / (1 + exp(-x %*% t(theta)))
  # gradient step on each parameter (predictions are fixed for this step)
  for (j in 1:ncol(x)) {
    errors <- (predictions - y) * x[, j]
    theta[, j] <- theta[, j] - alpha * (1.0 / m) * sum(errors)
  }
  # cross-entropy cost after the update
  predictions <- 1 / (1 + exp(-x %*% t(theta)))
  costhistory[i, ] <- sum(-y * log(predictions) - (1 - y) * log(1 - predictions)) / m
  # stop if the cost ever increases
  if (i > 1) {
    if (costhistory[i] > costhistory[i - 1]) {
      break
    }
  }
}

t(theta)
plot(costhistory)
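As a sanity check (not part of the question's original code), the Newton's-method reference can be reproduced with R's built-in glm(), which fits logistic regression by iteratively reweighted least squares, a Newton-type method. The sketch below uses synthetic data in place of ex4data1.csv (which is not bundled here), so the column names and true coefficients are assumptions; it also standardizes the features before gradient descent, since on raw, unscaled inputs 500 steps at alpha = 0.01 stop far short of convergence.

```r
# Self-contained sketch: synthetic data stands in for ex4data1.csv.
set.seed(1)
n  <- 2000
x1 <- rnorm(n, mean = 50, sd = 10)   # raw-scale feature (hypothetical)
x2 <- rnorm(n, mean = 50, sd = 10)
p  <- 1 / (1 + exp(-(-16 + 0.15 * x1 + 0.16 * x2)))
y  <- rbinom(n, 1, p)

# Reference fit: glm() uses a Newton-type method (IRLS).
fit <- glm(y ~ x1 + x2, family = binomial)

# Gradient descent on standardized inputs converges quickly;
# on raw inputs it would need a tiny alpha and many more iterations.
xs    <- cbind(1, scale(x1), scale(x2))
theta <- rep(0, 3)
alpha <- 0.1
for (i in 1:5000) {
  pred  <- 1 / (1 + exp(-(xs %*% theta)))
  grad  <- t(xs) %*% (pred - y) / n   # gradient of the cross-entropy cost
  theta <- theta - alpha * grad
}

# Map the standardized-scale theta back to the raw scale for comparison.
b1 <- theta[2] / sd(x1)
b2 <- theta[3] / sd(x2)
b0 <- theta[1] - b1 * mean(x1) - b2 * mean(x2)
round(rbind(glm = coef(fit), gd = c(b0, b1, b2)), 3)
```

After rescaling, the two rows should be nearly identical, which suggests the original discrepancy comes from under-convergence on unscaled features rather than from a wrong gradient.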