Monday, January 12, 2026

Nonlinear multilevel mixed-effects models – The Stata Blog


You have a model that is nonlinear in the parameters. Perhaps it is a model of tree growth that therefore asymptotes to a maximum value. Perhaps it is a model of serum concentrations of a drug that rise rapidly to a peak concentration and then decay exponentially. Easy enough: use nonlinear regression ([R] nl) to fit your model. But ... what if you have repeated measures for each tree or repeated blood serum levels for each patient? You might want to account for the correlation within tree or within patient. You might even believe that each tree has its own asymptotic growth. You need nonlinear mixed-effects models, also called nonlinear hierarchical models or nonlinear multilevel models.

The menl command, introduced in Stata 15, fits NLME models. Classical nonlinear models assume that there is one observation per subject and that subjects are independent. You can think of NLME models as an extension of nonlinear models to the case where multiple measurements may be taken on a subject and these within-subject observations are usually correlated. You may also think of NLME models as a generalization of linear mixed-effects models in which some or all random effects enter the model nonlinearly. However you think of them, NLME models are used to describe a response variable as a (nonlinear) function of covariates, accounting for the correlation among observations on the same subject.

In this blog post, I will walk you through several examples of fitting nonlinear mixed-effects models with menl. I will first fit a nonlinear model to data on the growth of trees, ignoring the correlation among measurements on the same tree. I will then demonstrate different ways of accounting for this correlation and show how to include random effects in the model parameters to give the parameters a tree-specific interpretation.

We’ve knowledge (Draper and Smith 1998) on trunk circumference of 5 completely different orange bushes the place trunk circumference (in mm) was measured on seven completely different events, over roughly a four-year interval of progress. We wish to mannequin the expansion of orange bushes. Let’s plot the information first.

. webuse orange
. twoway scatter circumf age, connect(L) ylabel(#6)

There is some variation in the growth curves, but the same overall shape is observed for all trees. In particular, the growth tends to level off toward the end. Pinheiro and Bates (2000) suggested the following nonlinear growth model for these data:

$$
\mathsf{circumf}_{ij}=\frac{\phi_{1}}{1+\exp\left\{-\left(\mathsf{age}_{ij}-\phi_{2}\right)/\phi_{3}\right\}} + \epsilon_{ij}, \quad j=1,\dots,5;\ i=1,\dots,7
$$

Parameters of NLME models often have scientifically meaningful interpretations and form the basis of research questions. In this model, \(\phi_{1}\) is the average asymptotic trunk circumference of the trees as \(\mathsf{age}_{ij} \to \infty\), \(\phi_{2}\) is the average age at which a tree attains half of the asymptotic trunk circumference \(\phi_{1}\) (half-life), and \(\phi_{3}\) is a scale parameter.

For now, let's ignore the repercussions of not accounting for the correlation and potential heterogeneity among the observations (less accurate estimates of the uncertainty in the parameter estimates, potentially less powerful tests of hypotheses, and less accurate interval estimates). Instead, we will treat all observations as i.i.d. We could use nl to fit the above model, but here we will use menl. (As of Stata 15.1, menl no longer requires that you include random effects in the model specification.)
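For reference, the corresponding nl call would look something like the following (a sketch; the starting values are illustrative guesses and are not taken from the original output):

. nl (circumf = {phi1=190}/(1+exp(-(age-{phi2=700})/{phi3=350})))
(output omitted)

Now let's fit the same model with menl: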

. menl circumf = {phi1}/(1+exp(-(age-{phi2})/{phi3})), stddeviations

Obtaining starting values:

NLS algorithm:

Iteration 0:   residual SS =  17480.234
Iteration 1:   residual SS =  17480.234

Computing standard errors:

Mixed-effects ML nonlinear regression           Number of obs     =         35
Log likelihood = -158.39871
------------------------------------------------------------------------------
     circumf |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       /phi1 |   192.6876   20.24411     9.52   0.000     153.0099    232.3653
       /phi2 |   728.7564   107.2984     6.79   0.000     518.4555    939.0573
       /phi3 |   353.5337   81.47184     4.34   0.000     193.8518    513.2156
------------------------------------------------------------------------------

------------------------------------------------------------------------------
  Random-effects Parameters  |   Estimate   Std. Err.     [95% Conf. Interval]
-----------------------------+------------------------------------------------
                sd(Residual) |   22.34805   2.671102      17.68079    28.24734
------------------------------------------------------------------------------

The stddeviations option instructs menl to report the standard deviation of the error term instead of its variance. Because one growth curve is used for all trees, the individual differences depicted in the graph above are absorbed into the residuals, thus inflating the residual standard deviation.

We’ve a number of observations from the identical tree which might be prone to be correlated. One approach to account for the dependence between observations throughout the similar tree is by together with a random impact, (u_j), shared by all of the observations throughout the (j)th tree. Thus, our mannequin turns into

$$
\mathsf{circumf}_{ij}=\frac{\phi_{1}}{1+\exp\left\{-\left(\mathsf{age}_{ij}-\phi_{2}\right)/\phi_{3}\right\}} + u_j + \epsilon_{ij}, \quad j=1,\dots,5;\ i=1,\dots,7
$$

This model may be fit by typing

. menl circumf = {phi1}/(1+exp(-(age-{phi2})/{phi3}))+{U[tree]}, stddev

Obtaining starting values by EM:

Alternating PNLS/LME algorithm:

Iteration 1:    linearization log likelihood = -147.631786
Iteration 2:    linearization log likelihood = -147.631786

Computing standard errors:

Mixed-effects ML nonlinear regression           Number of obs     =         35
Group variable: tree                            Number of groups  =          5

                                                Obs per group:
                                                              min =          7
                                                              avg =        7.0
                                                              max =          7
Linearization log likelihood = -147.63179
------------------------------------------------------------------------------
     circumf |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       /phi1 |   192.2526   17.06127    11.27   0.000     158.8131    225.6921
       /phi2 |   729.3642   68.05493    10.72   0.000      595.979    862.7494
       /phi3 |    352.405   58.25042     6.05   0.000     238.2363    466.5738
------------------------------------------------------------------------------

------------------------------------------------------------------------------
  Random-effects Parameters  |   Estimate   Std. Err.     [95% Conf. Interval]
-----------------------------+------------------------------------------------
tree: Identity               |
                       sd(U) |   17.65093   6.065958      8.999985    34.61732
-----------------------------+------------------------------------------------
                sd(Residual) |    13.7099    1.76994      10.64497    17.65728
------------------------------------------------------------------------------

The fixed-effects estimates are similar in both models, but their standard errors are smaller in the model above. The residual standard deviation is also smaller because some of the individual differences are now being accounted for by the random effect. Let \(y_{sj}\) and \(y_{tj}\) be two observations on the \(j\)th tree. The inclusion of the random effect \(u_j\) implies that \({\rm Cov}(y_{sj},y_{tj}) > 0\) (in fact, it equals \({\rm Var}(u_j)\)). Therefore, \(u_j\) induces a positive correlation between any two observations within the same tree.
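To make this concrete, plugging the estimates above into the implied within-tree correlation gives

$$
{\rm Corr}(y_{sj},y_{tj})=\frac{{\rm Var}(u_j)}{{\rm Var}(u_j)+{\rm Var}(\epsilon_{ij})}\approx\frac{17.65^2}{17.65^2+13.71^2}\approx 0.62
$$

which, up to rounding, is the correlation reported by the equivalent marginal model below.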

There is another way to account for the correlation among observations within a group. You can model the residual covariance structure explicitly by using menl's rescovariance() option. For example, the above random-intercept nonlinear model is equivalent to a nonlinear marginal model with an exchangeable residual covariance matrix. Technically, the two models are equivalent only when the correlation between the within-group observations is positive.

. menl circumf = {phi1}/(1+exp(-(age-{phi2})/{phi3})),
    rescovariance(exchangeable, group(tree)) stddev

Obtaining starting values:

Alternating GNLS/ML algorithm:

Iteration 1:    log likelihood = -147.632441
Iteration 2:    log likelihood = -147.631786
Iteration 3:    log likelihood = -147.631786
Iteration 4:    log likelihood = -147.631786

Computing standard errors:

Mixed-effects ML nonlinear regression           Number of obs     =         35
Group variable: tree                            Number of groups  =          5

                                                Obs per group:
                                                              min =          7
                                                              avg =        7.0
                                                              max =          7
Log likelihood = -147.63179
------------------------------------------------------------------------------
     circumf |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       /phi1 |   192.2526   17.06127    11.27   0.000     158.8131    225.6921
       /phi2 |   729.3642   68.05493    10.72   0.000      595.979    862.7494
       /phi3 |    352.405   58.25042     6.05   0.000     238.2363    466.5738
------------------------------------------------------------------------------

------------------------------------------------------------------------------
  Random-effects Parameters  |   Estimate   Std. Err.     [95% Conf. Interval]
-----------------------------+------------------------------------------------
Residual: Exchangeable       |
                          sd |   22.34987    4.87771      14.57155    34.28026
                        corr |   .6237137   .1741451      .1707327    .8590478
------------------------------------------------------------------------------

The fixed-effects portion of the output is identical in the two models. These two models also produce the same estimates of the marginal covariance matrix. You can verify this by running estat wcorrelation after each model, as shown below. Although the two models are equivalent, by including a random effect at the tree level, we not only account for the dependence of observations within trees but also estimate the between-tree variability and can predict tree-specific effects after estimation.
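For example, you could refit each model quietly and then compare the fitted within-tree correlation matrices (a quick sketch using the estat wcorrelation command mentioned above):

. quietly menl circumf = {phi1}/(1+exp(-(age-{phi2})/{phi3}))+{U[tree]}
. estat wcorrelation
(output omitted)

. quietly menl circumf = {phi1}/(1+exp(-(age-{phi2})/{phi3})), rescovariance(exchangeable, group(tree))
. estat wcorrelation
(output omitted)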

The previous two models imply that the growth curves of all trees have the same shape and differ only by a vertical shift equal to \(u_j\). This may be too strong an assumption in practice. If we look back at the graph, we will notice that the variability in the trunk circumferences of the trees increases as they approach their limiting growth. Therefore, it may be more reasonable to allow \(\phi_1\) to vary between trees. This will also give our parameters of interest a tree-specific interpretation. That is, we assume

$$
\mathsf{circumf}_{ij}=\frac{\phi_{1}}{1+\exp\left\{-\left(\mathsf{age}_{ij}-\phi_{2}\right)/\phi_{3}\right\}} + \epsilon_{ij}
$$

with

$$
\phi_1 = \phi_{1j} = \beta_1 + u_j
$$

You may interpret \(\beta_1\) as the average asymptotic trunk circumference and \(u_j\) as a random deviation from that average specific to the \(j\)th tree. The above model may be fit as follows:

. menl circumf = ({b1}+{U1[tree]})/(1+exp(-(age-{phi2})/{phi3}))

Obtaining starting values by EM:

Alternating PNLS/LME algorithm:

Iteration 1:    linearization log likelihood = -131.584579

Computing standard errors:

Mixed-effects ML nonlinear regression           Number of obs     =         35
Group variable: tree                            Number of groups  =          5

                                                Obs per group:
                                                              min =          7
                                                              avg =        7.0
                                                              max =          7
Linearization log likelihood = -131.58458
------------------------------------------------------------------------------
     circumf |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         /b1 |    191.049   16.15403    11.83   0.000     159.3877    222.7103
       /phi2 |    722.556   35.15082    20.56   0.000     653.6616    791.4503
       /phi3 |   344.1624   27.14739    12.68   0.000     290.9545    397.3703
------------------------------------------------------------------------------

------------------------------------------------------------------------------
  Random-effects Parameters  |   Estimate   Std. Err.     [95% Conf. Interval]
-----------------------------+------------------------------------------------
tree: Identity               |
                     var(U1) |   991.1514   639.4636      279.8776    3510.038
-----------------------------+------------------------------------------------
               var(Residual) |   61.56371   15.89568      37.11466    102.1184
------------------------------------------------------------------------------

If desired, we can additionally model the within-tree error covariance structure by using, for example, an exchangeable structure:

. menl circumf = ({b1}+{U1[tree]})/(1+exp(-(age-{phi2})/{phi3})),
    rescovariance(exchangeable)

Obtaining starting values by EM:

Alternating PNLS/LME algorithm:

Iteration 1:    linearization log likelihood = -131.468559
Iteration 2:    linearization log likelihood = -131.470388
Iteration 3:    linearization log likelihood = -131.470791
Iteration 4:    linearization log likelihood = -131.470813
Iteration 5:    linearization log likelihood = -131.470813

Computing standard errors:

Mixed-effects ML nonlinear regression           Number of obs     =         35
Group variable: tree                            Number of groups  =          5

                                                Obs per group:
                                                              min =          7
                                                              avg =        7.0
                                                              max =          7
Linearization log likelihood = -131.47081
------------------------------------------------------------------------------
     circumf |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         /b1 |   191.2005   15.59015    12.26   0.000     160.6444    221.7566
       /phi2 |   721.5232   35.66132    20.23   0.000     651.6283    791.4182
       /phi3 |   344.3675   27.20839    12.66   0.000     291.0401     397.695
------------------------------------------------------------------------------

------------------------------------------------------------------------------
  Random-effects Parameters  |   Estimate   Std. Err.     [95% Conf. Interval]
-----------------------------+------------------------------------------------
tree: Identity               |
                     var(U1) |   921.3895    582.735      266.7465    3182.641
-----------------------------+------------------------------------------------
Residual: Exchangeable       |
                         var |   54.85736   14.16704      33.06817    91.00381
                         cov |  -9.142893   2.378124     -13.80393   -4.481856
------------------------------------------------------------------------------

We did not specify the group() option above because, in the presence of random effects, rescovariance() automatically determines the lowest-level grouping based on the specified random effects.

The NLME models we have used so far are all linear in the random effect. In NLME models, random effects can enter the model nonlinearly, just like the fixed effects, and they often do. For instance, in addition to \(\phi_1\), we can let the other parameters vary between trees and have their own random effects:

. menl circumf = {phi1:}/(1+exp(-(age-{phi2:})/{phi3:})),
    define(phi1:{b1}+{U1[tree]})
    define(phi2:{b2}+{U2[tree]})
    define(phi3:{b3}+{U3[tree]})
(output omitted)

The above model is too complicated for our data, which contain only five trees, and is included purely for demonstration purposes. But from this example, you can see that the define() option is useful when you have a complicated nonlinear expression that you want to break down into smaller pieces. Parameters specified within define() may also be predicted for each tree after estimation. For example, after we fit our model, we may want to predict the asymptotic trunk circumference, \(\phi_{1j}\), for each tree \(j\). Below, we request that a variable named phi1 be created containing the predicted values of the expression {phi1:}:

. predict (phi1 = {phi1:})
(output omitted)
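Because {phi1:} is constant within each tree, a quick way to inspect the tree-specific asymptotes is to list one prediction per tree (a sketch; the tag variable pick is ours, not part of the original post):

. egen byte pick = tag(tree)
. list tree phi1 if pick, noobs
(output omitted)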

If we had more trees, we could have also specified a covariance structure for these random effects as follows:

. menl circumf = {phi1:}/(1+exp(-(age-{phi2:})/{phi3:})),
    define(phi1:{b1}+{U1[tree]})
    define(phi2:{b2}+{U2[tree]})
    define(phi3:{b3}+{U3[tree]})
    covariance(U1 U2 U3, unstructured)
(output omitted)

I demonstrated only a few models in this article, but menl can do much more. For example, menl can accommodate higher levels of nesting, such as three-level models (for example, repeated observations nested within trees and trees nested within planting zones). See [ME] menl for more examples.
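For instance, with a hypothetical planting-zone identifier zone (not in the orange dataset), a three-level version of the random-asymptote model might be specified along these lines; see [ME] menl for the exact nested-level syntax:

. menl circumf = ({b1}+{U1[zone]}+{UU1[zone>tree]})/(1+exp(-(age-{phi2})/{phi3}))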

References

Draper, N., and H. Smith. 1998. Applied Regression Analysis. 3rd ed. New York: Wiley.

Pinheiro, J. C., and D. M. Bates. 2000. Mixed-Effects Models in S and S-PLUS. New York: Springer.


