Add regression association to latent variable model
Define a regression association between variables in an lvm-object and define linear constraints between model equations.
Usage

## S3 method for class 'lvm'
regression(object = lvm(), to, from, fn = NA,
  messages = lava.options()$messages, additive = TRUE, y, x, value, ...)

## S3 replacement method for class 'lvm'
regression(object, to = NULL, quick = FALSE, ...) <- value
Arguments
object: lvm-object.
...: Additional arguments to be passed to the low-level functions
value: A formula specifying the linear constraints or if to=NULL a list of parameter values.
to: Character vector of outcome(s) or formula object.
from: Character vector of predictor(s).
fn: Real function defining the functional form of predictors (for simulation only).
messages: Controls which messages are turned on/off (0: all off)
additive: If FALSE and the predictor is categorical, a non-additive effect is assumed
y: Alias for 'to'
x: Alias for 'from'
quick: Faster implementation without parameter constraints
Returns
An lvm-object
Details
The regression function is used to specify linear associations between variables of a latent variable model, and offers formula syntax resembling the model specification of e.g. lm.
For instance, to add the following linear regression model to the lvm-object m:
E(Y|X1,X2) = β1*X1 + β2*X2
we can write
regression(m) <- y ~ x1 + x2
Multivariate models can be specified by successive calls to regression, but multivariate formulas are also supported, e.g.
regression(m) <- c(y1,y2) ~ x1 + x2
defines
E(Yi|X1,X2) = β1i*X1 + β2i*X2
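As a sketch (assuming the lava package is attached), the multivariate formula above is equivalent to two successive univariate calls:

```r
library(lava)

## Multivariate formula: one call adds both regressions
m <- lvm()
regression(m) <- c(y1, y2) ~ x1 + x2

## Equivalent specification via successive calls
m2 <- lvm()
regression(m2) <- y1 ~ x1 + x2
regression(m2) <- y2 ~ x1 + x2
```

Each outcome receives its own slope parameters, so the two models contain the same set of regression associations.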
The special function f can be used in the model specification to specify linear constraints. E.g. to fix β1 = β2, we could write
regression(m) <- y ~ f(x1,beta) + f(x2,beta)
The second argument of f can also be a number (e.g. defining an offset) or be set to NA in order to clear any previously defined linear constraints.
Alternatively, a more straightforward notation can be used:
regression(m) <- y ~ beta*x1 + beta*x2
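Both notations define the same equality constraint; a minimal sketch (lava attached, variable names illustrative):

```r
library(lava)

## Label-multiplication syntax
m <- lvm()
regression(m) <- y ~ beta*x1 + beta*x2

## The same constraint via f()
m2 <- lvm()
regression(m2) <- y ~ f(x1, beta) + f(x2, beta)

regression(m)   # inspect the linear parameter constraints
```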
All the parameter values of the linear constraints can be given as the right-hand side expression of the assignment function regression<- (or regfix<-) if the first (and possibly second) argument is defined as well. E.g.:
regression(m,y1~x1+x2) <- list("a1","b1")
defines E(Y1|X1,X2) = a1*X1 + b1*X2. The right-hand side argument can be a mixture of character and numeric values (and NAs to remove constraints).
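A sketch of the assignment form with mixed right-hand side values, combining a character label, a fixed numeric value, and NA to clear a constraint (names are illustrative, assuming lava is attached):

```r
library(lava)

m <- lvm()
regression(m) <- y1 ~ x1 + x2 + x3

## Label the slope of x1 "a1", fix the slope of x2 to 2,
## and clear any previously defined constraint on x3
regression(m, y1 ~ x1 + x2 + x3) <- list("a1", 2, NA)

regression(m)   # show current parameter constraints
```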
The function regression (called without additional arguments) can be used to inspect the linear constraints of an lvm-object.
Note
Variables will be added to the model if not already present.
Examples
m <- lvm()    ## Initialize empty lvm-object

### E(y1|z,v) = beta1*z + beta2*v
regression(m) <- y1 ~ z + v

### E(y2|x,z,v) = beta*x + beta*z + 2*v + beta3*u
regression(m) <- y2 ~ f(x,beta) + f(z,beta) + f(v,2) + u

### Clear restriction on association between y2 and v,
### fix slope coefficient of u to beta
regression(m, y2 ~ v + u) <- list(NA, "beta")

regression(m)    ## Examine current linear parameter constraints

## A multivariate model, E(yi|x1,x2) = beta[1i]*x1 + beta[2i]*x2:
m2 <- lvm(c(y1,y2) ~ x1 + x2)