[R-meta] No convergence in meta-analysis of identical multinomial logistic models.

Ricardo de Boer  r.o.s.deboer at tilburguniversity.edu
Thu Nov 19 11:14:04 CET 2020


Hi,

I am fairly new to meta-analysis, and at the moment I have run into a
convergence error. I was wondering what steps need to be taken to
achieve convergence.

Let me first explain my problem. The dataset I am using contains 8.5
million observations of pitch data from Major League Baseball, and I
am trying to investigate how pitchers differ in their pitch selection.
I have 2000 multinomial models with the same explanatory variables; to
name a few: the count (a factor variable for the number of balls and
strikes), the number of outs before the pitch took place, and whether
the pitcher is left- or right-handed. The dependent variable is
categorical with the values 'Fastball', 'Breakingball' and 'Changeup'.

Every model uses the data of one pitcher, and the number of
observations per model varies considerably. The goal is to investigate
whether these individuals behave differently, as well as which
characteristics of a pitcher make them behave differently.
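
To give a concrete picture, the per-pitcher models are fitted roughly
like this (a simplified sketch: the data frame and column names are
made up for illustration, and I use nnet::multinom() here, while my
actual models include more predictors):

library(nnet)   # multinom() for the per-pitcher multinomial logistic models

## 'pitches' is a hypothetical data frame with one row per pitch:
## pitch_type (Fastball/Breakingball/Changeup), count, outs, p_throws, pitcher_id
pitches$pitch_type <- relevel(factor(pitches$pitch_type), ref = "Fastball")

fit_one <- function(d)
  multinom(pitch_type ~ count + outs + p_throws, data = d, trace = FALSE)

## one model per pitcher (in reality 2000 of them)
fits <- lapply(split(pitches, pitches$pitcher_id), fit_one)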

I already received feedback from Wolfgang on how to conduct the
meta-analysis. For every pitcher I fit a multinomial model, extract the
model coefficients, and put them in a list (call it b). The same is
done for the variance-covariance matrices of those coefficients (call
it V). I then unlisted b and V and created a player.id vector as well
as a variable.id vector; a simplified sketch of this preparation
follows the model call below. Afterwards it is possible to run a
meta-analysis on the coefficients in the following way (so far I have
run it on the first 50 models, not on all 2000):

rma.mv(b, V, mods = ~ factor(variable.id) - 1,
       random = ~ factor(variable.id) | player.id,
       struct = "UN", sparse = TRUE, verbose = TRUE)
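
For completeness, here is roughly how I assemble b, V, player.id and
variable.id from those fits (again just a sketch continuing from the
code above; I use metafor::bldiag() here because I think that is the
intended block-diagonal structure for V, whereas in my actual code I
unlisted a list of matrices, which may not amount to the same thing):

library(metafor)

b_list <- lapply(fits, function(f) c(t(coef(f))))  # coefficients of one model as a vector
                                                   # (I believe this matches the ordering of vcov())
V_list <- lapply(fits, vcov)                       # var-cov matrix of those coefficients

b <- unlist(b_list)
V <- bldiag(V_list)                                # block-diagonal sampling var-cov matrix

k           <- length(b_list[[1]])                 # coefficients per model (28 in my case)
player.id   <- rep(names(fits), each = k)
variable.id <- rep(seq_len(k), times = length(fits))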

The above-mentioned meta-analysis works when struct is left at its
default ("CS") or set to "HCS", but not with struct = "UN". However, in
every meta-analysis that I have run so far, I receive a warning message
that says:

Warning message:
In rma.mv(b, V, mods = ~factor(variable.id) - 1, random =
~factor(variable.id) |  :
  Ratio of largest to smallest sampling variance extremely large. May
not be able to obtain stable results.

When I set struct = "UN", I get the following error after many hours of
running. It seems that the optimizer gets stuck in a local maximum:

Error in rma.mv(b, V, mods = ~factor(variable.id) - 1, random =
~factor(variable.id) |  :
  Optimizer (nlminb) did not achieve convergence (convergence = 1).
In addition: Warning message:
In rma.mv(b, V, mods = ~factor(variable.id) - 1, random =
~factor(variable.id) |  :
Ratio of largest to smallest sampling variance extremely large. May
not be able to obtain stable results.

Is there a way to reach convergence? Is there anything I can do about
these warning messages?

I don't know if it helps, but at the bottom of this message I have
included the ll, tau2, and rho values from just before the
non-convergence error occurred.

I believe that I need to manually set starting values for tau2 and rho,
but I'm not sure.
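
In case it matters, what I had in mind was passing the last reported
values as starting values and possibly switching optimizers via the
control argument, along these lines (this is based on my reading of
?rma.mv, so the control options may not be the right ones):

rma.mv(b, V, mods = ~ factor(variable.id) - 1,
       random = ~ factor(variable.id) | player.id,
       struct = "UN", sparse = TRUE, verbose = TRUE,
       control = list(optimizer = "optim", optmethod = "BFGS",
                      tau2.init = tau2, rho.init = rho))

where tau2 and rho are the vectors given at the bottom of this message.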

Lastly, what is the difference in interpretation if I run the
meta-analysis below compared to the one Wolfgang recommended to me? The
difference lies in the random part, which I changed to random = ~ 1 |
player.id, leaving out struct = "UN" because it is disregarded in that
case anyway. My understanding is that the model below is a
mixed-effects model with only a varying intercept per pitcher.

rma.mv(b, V, mods = ~ factor(variable.id) - 1,
       random = ~ 1 | player.id, sparse = TRUE, verbose = TRUE)

To summarize my questions:
- Is there a way to reach convergence?
- Is there anything I can do about these warning messages?
- What is the difference in interpretation between the two model
  specifications above?

Any advice on how to proceed is highly appreciated.


ll = -18603.6426
tau2 = c(0.1508, 0.2016, 0.0593, 0.0729, 0.2317, 0.3542, 0.0452,
0.0604, 0.0690, 0.0669, 0.1402, 0.3604, 0.2772, 0.2115, 0.1817,
0.1524, 0.1301, 0.3421, 0.6428, 0.3078, 0.3403, 0.2608, 0.1659,
0.4243, 0.0119, 0.0363, 0.0862, 0.4163)
rho = c(0.2289,-0.1471, 0.0524, 0.1389,-0.0911, 0.3515, 0.1262,
0.2239, 0.4314, 0.1383,-0.0725,-0.0780, 0.0833, 0.5362,-0.2861,
0.1613,-0.1386,-0.3373, 0.0174,-0.2871, 0.1560, 0.2784, 0.3446,
0.1405, 0.4257, 0.5349, 0.0060,-0.1630, 0.1319, 0.2658, 0.7631,
0.5231, 0.5883, 0.0441,-0.0861, 0.5200, 0.1121,-0.0348, 0.2969,
0.8384, 0.1838, 0.5219,-0.0372, 0.6533, 0.5988,-0.0022, 0.1343,
0.4633, 0.1505, 0.8475,-0.3028,-0.0324, 0.4166, 0.6198, 0.1990,
0.0920, 0.0044, 0.0830, 0.5364,-0.1806, 0.8285, 0.1668, 0.2280,
0.1424, 0.5914,-0.2417, 0.3360,-0.0623,-0.4578,
0.0022,-0.0712,-0.0746, 0.5757, 0.1784,-0.1073, 0.0367,-0.1341,
0.0013, 0.0931, 0.3820, 0.1053, 0.0035, 0.1518, 0.1013,-0.0153,
0.4991, 0.1533, 0.2000, 0.1243, 0.2846, 0.2767,
0.3803,-0.0541,-0.1176, 0.3366, 0.1470, 0.0853, 0.6318, 0.3464,
0.2588, 0.3308, 0.1981, 0.1800, 0.7962, 0.2815, 0.1303, 0.4582,
0.1903, 0.2537, 0.3107, 0.2717,-0.1092, 0.7615, 0.3814, 0.5234,
0.1995, 0.4969, 0.1606, 0.8527, 0.2849, 0.1483, 0.1083, 0.2908,
0.2007, 0.7324,-0.1987, 0.2675, 0.4506, 0.5507, 0.2096,
0.8548,-0.1065, 0.2166, 0.2160, 0.5937, 0.2572, 0.2003, 0.0453,
0.0985, 0.5423, 0.0211, 0.6948, 0.0843, 0.5098, 0.2643,
0.7222,-0.1023, 0.8114, 0.2588, 0.5064, 0.3885, 0.7213, 0.0550,
0.5369,-0.0114,-0.3025,-0.1885,-0.0535,-0.3893,-0.0170,-0.0777,-0.1449,-0.2225,-0.1658,-0.2592,
0.3576,-0.0914, 0.0188,-0.2216,-0.2069,-0.2428, 0.2988, 0.1031,
0.1030, 0.1722, 0.0957, 0.3964,-0.1560, 0.1722, 0.0907, 0.2357,
0.0104, 0.2482,-0.0033, 0.5061, 0.0755, 0.4075,-0.0022, 0.3639,
0.0933, 0.5728, 0.2217,-0.1232, 0.1043, 0.1923, 0.1117, 0.2038,
0.3263, 0.0959, 0.1589, 0.0522, 0.2504, 0.6467, 0.5219, 0.5852,
0.4965, 0.3067, 0.5289, 0.2853, 0.2642,-0.0712, 0.3174,
0.0415,-0.0727, 0.0858, 0.1702,-0.1773, 0.3730,-0.0070, 0.0960,
0.0545, 0.3180,-0.1435, 0.7252,-0.0970, 0.6729,-0.0139,
0.3325,-0.2098, 0.5342, 0.0148, 0.3279, 0.1504,-0.0100, 0.1517,
0.4155, 0.1276, 0.3376, 0.3259, 0.1824, 0.1891, 0.3813, 0.1976,
0.5315, 0.3752, 0.6874, 0.4051, 0.6222, 0.3945, 0.0516, 0.1515,
0.7049,-0.0276,-0.0703, 0.1794, 0.0877, 0.1596, 0.1051,
0.3875,-0.2280, 0.4786, 0.1099, 0.3915,-0.0411, 0.5162,-0.0749,
0.5694,-0.0772, 0.7401,-0.1371, 0.6767,-0.2185, 0.3935, 0.1458,
0.7784, 0.0420,-0.2098,-0.4941, 0.1300, 0.2519, 0.0660,-0.0708,
0.2505, 0.0839, 0.1530, 0.2263, 0.1785,-0.1884, 0.0688,-0.4143,
0.2752,-0.2915, 0.2570,-0.0885,-0.3592,-0.4473,-0.3541,-0.4311,
0.0270,-0.2614,-0.1877,-0.2318, 0.0422, 0.0571,
0.0623,-0.0238,-0.0887, 0.1438,-0.0286, 0.0653,
0.0626,-0.0020,-0.0627,-0.0744,-0.0528,-0.0138, 0.0122,
0.0434,-0.2359,-0.2191,-0.2303, 0.2720,-0.1875, 0.3434,
0.3613,-0.0998, 0.0538, 0.3671, 0.4282, 0.4665, 0.2392,-0.0677,
0.5099, 0.4577, 0.4424, 0.3033, 0.1845, 0.2091, 0.1229, 0.3054,
0.2979, 0.3781, 0.3892,-0.3077,-0.0617, 0.1617,-0.0626, 0.3297,
0.1311, 0.3860, 0.2505,-0.4281,-0.4427, 0.0005,-0.0658,-0.4593,
0.0899,-0.2325,-0.2140,-0.2812, 0.0446,-0.3755, 0.0915,-0.2307,
0.1002,-0.3179, 0.0127,-0.5230,-0.0075,-0.1111, 0.1253,-0.3566,
0.2783,-0.4834, 0.2381,-0.0100, 0.1206,-0.3672)


