My feature data is defined in such a way that I believe all weights must be non-negative.

I am looking for a reference discussing how to optimize the weights of a logistic regression classifier under the constraint that the weights must be non-negative, and perhaps also a constraint on the sign of the bias. Will replacing the weights with squared weights and using the regular maximum-likelihood cost function with a local optimization scheme work?

**Answer**

I have done this successfully using projected gradient descent.

The algorithm is very simple: take a gradient step, then set all negative coefficients to zero (i.e., project onto the feasible set of non-negative weights).
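A minimal sketch of this projected gradient descent scheme in NumPy (this is my own illustration, not Bottou's code; the learning rate, iteration count, and the choice to leave the bias unconstrained are assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nonneg_logreg(X, y, lr=0.1, n_iter=2000):
    """Logistic regression with non-negative weights via projected
    gradient descent. The bias b is left unconstrained here; a sign
    constraint on it could be enforced the same way."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(n_iter):
        p = sigmoid(X @ w + b)          # predicted probabilities
        grad_w = X.T @ (p - y) / n      # gradient of the log-loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w                # gradient step
        b -= lr * grad_b
        w = np.maximum(w, 0.0)          # project onto {w >= 0}
    return w, b
```

Because the feasible set is just the non-negative orthant, the projection is a single elementwise clamp, which is what makes this approach so cheap per iteration.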

I started with Leon Bottou’s code here: http://leon.bottou.org/projects/sgd.

**Attribution**
*Source: Link, Question Author: o17t H1H' S'k, Answer Author: user1149913*