Is there a way to force a relationship between coefficients in logistic regression?

I would like to specify a logistic regression model where I have the following relationship:

E[Yi | Xi] = f(βxi1 + β²xi2), where f is the inverse logit function.

Is there a “quick” way to do this with pre-existing R functions, or is there a name for a model like this? I realize I can modify the Newton-Raphson algorithm used for logistic regression, but that is a lot of theoretical and coding work, and I’m looking for a shortcut.

EDIT: Getting point estimates for β is pretty easy using optim() or some other optimizer in R to maximize the likelihood. But I also need standard errors for these estimates.


This is fairly easy to do with the optim function in R. My understanding is that you want to run a logistic regression where y is binary. You simply write the negative log-likelihood function and then stick it into optim. Below is some code I didn’t run (pseudo code).

    # d is your data frame; y is coded 0/1
    negLL <- function(b, d) {
      EXP  <- exp(d$x1*b + d$x2*b^2)
      VALS <- (EXP/(1 + EXP))^(d$y) * (1/(1 + EXP))^(1 - d$y)
      -sum(log(VALS))                 # negative log likelihood
    }

    fit <- optim(par = 0, fn = negLL, d = d, method = "BFGS", hessian = TRUE)
    fit$par                           # estimates
    sqrt(diag(solve(fit$hessian)))    # standard errors
    -fit$value                        # maximum log likelihood

Notice that the function returns the negative of a log likelihood, so optim is in fact maximizing the log likelihood (by default optim minimizes, which is why the function is negated). The standard errors come from inverting the Hessian of the negative log likelihood at the optimum, the usual observed-information estimate. If Y is not binary, look into multinomial and conditional forms of the logit model.
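For concreteness, here is a self-contained sketch of the same idea on simulated data. Everything in the data-generating step (the sample size, the true β = 0.7, the variable names) is made up for illustration; the negative log likelihood is written on the log scale to avoid overflow in the `EXP/(1+EXP)` form:

```r
# Simulated example: n, the true beta = 0.7, and variable names are illustrative.
set.seed(1)
n  <- 2000
b0 <- 0.7                                   # true coefficient
d  <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
p  <- plogis(b0 * d$x1 + b0^2 * d$x2)       # inverse logit of b*x1 + b^2*x2
d$y <- rbinom(n, 1, p)

# Negative log likelihood of the constrained model, on the log scale
negLL <- function(b, d) {
  eta <- b * d$x1 + b^2 * d$x2
  -sum(d$y * eta - log1p(exp(eta)))
}

fit   <- optim(par = 0, fn = negLL, d = d, method = "BFGS", hessian = TRUE)
b_hat <- fit$par                            # point estimate of beta
se    <- sqrt(1 / fit$hessian[1, 1])        # SE from inverse observed information
```

Because the linear predictor is nonlinear in b (it contains b²), the likelihood can have more than one local optimum, so it is worth trying a few starting values for `par`.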

Source: Link, Question Author: TrynnaDoStat, Answer Author: eipi10
