r - Can I use dynlm without any lagged variables?

I am trying to use dynamic linear regression via the dynlm command in R, because I need to analyse my panel data but do not want to use panel regression.

However, my model specification does not contain any lagged variables at all. Can I still use a dynamic linear model (dynlm) in this case? The output it produces still seems very good and helpful.

For example, I get the following output:

 Call:
dynlm(formula = y ~ a + b + c + d*g + e*g +
f*g + h + i + j)

Estimate Std. Error t value Pr(>|t|)
(Intercept) 2.12175142 1.87591046 1.131 0.258860
a 0.00019267 0.02859444 0.007 0.994628
b -0.01091167 0.02133546 -0.511 0.609392
c 0.17635258 0.05616125 3.140 0.001842 **
d -0.12717373 0.04706829 -2.702 0.007253 **
g -0.39693637 0.09144441 -4.341 1.894e-05 ***
e -0.15394576 0.05059879 -3.042 0.002536 **
f -0.22525696 0.07412517 -3.039 0.002565 **
h -0.10063528 0.01242704 -8.098 1.108e-14 ***
i 0.00098993 0.00240669 0.411 0.681102
j -0.11337655 0.30151860 -0.376 0.707146
d:g 0.06875835 0.09451161 0.728 0.467431
g:e 0.09787315 0.11409165 0.858 0.391603
g:f 0.19367624 0.14990202 1.292 0.197260
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.1648 on 331 degrees of freedom
Multiple R-squared: 0.5586 , Adjusted R-squared: 0.5452
F-statistic: 31.9 on 10 and 331 DF, p-value: < 1.401e-50

Durbin-Watson statistic
(original): 1.34863 , p-value: 1.861e-10
(transformed): 2.09349 , p-value: 8.068e-01

The first part of the dataset I am using is shown below:
 Date       ID  a   b   c   d   e   f   g   h        i               j          y
01/01/2017 1 1 0 0 1 0 0 1 6.5 -0.287199892 6.26048245 0.380978369
01/01/2017 2 0 0 0 1 0 0 1 6.5 -0.287199892 6.26048245 0.380978369
01/01/2017 3 1 0 0 0 1 0 0 7.8 -0.287199892 6.26048245 0.524437496
01/03/2017 4 1 0 0 0 0 0 0 7.8 -0.260937218 6.258402008 0.63409868
01/04/2017 5 0 0 0 1 0 0 1 6.5 10.51545939 6.263858877 0.392317155
01/04/2017 6 0 0 0 1 0 0 1 6.5 10.51545939 6.263858877 0.392317155
01/04/2017 7 0 1 0 1 0 0 0 6.5 10.51545939 6.263858877 1.049993284
01/04/2017 8 0 0 0 0 1 0 0 7.3 10.51545939 6.263858877 0.461989851
01/05/2017 9 0 0 0 0 1 0 0 6.1 -16.12973095 6.280696169 0.69686996
01/05/2017 10 0 0 0 1 0 0 0 7.7 -16.12973095 6.280696169 0.639270495
01/05/2017 11 0 0 0 0 1 0 0 7.3 -16.12973095 6.280696169 0.369339223
01/06/2017 12 1 0 0 1 0 0 1 6.5 -7.097505117 6.281526986 0.395179169
01/06/2017 13 0 1 0 1 0 0 0 6.3 -7.097505117 6.281526986 0.634524509
01/06/2017 14 0 1 0 1 0 0 0 7.8 -7.097505117 6.281526986 0.605731699
01/06/2017 15 0 0 0 0 0 0 0 3.2 -7.097505117 6.281526986 1.765103139
01/07/2017 16 0 1 0 1 0 0 1 6.5 -7.097505117 6.281526986 0.323052418
01/07/2017 17 0 0 0 1 0 0 1 6.5 -7.097505117 6.281526986 0.323052418
01/08/2017 18 0 0 0 1 0 0 1 6.5 -7.097505117 6.281526986 0.357581409
01/09/2017 19 0 0 0 1 0 0 1 6.5 -0.376295821 6.278540118 0.375177221
01/09/2017 20 0 0 0 1 0 0 1 6.5 -0.376295821 6.278540118 0.375177221
01/10/2017 21 0 0 0 1 0 0 1 6.5 1.07381926 6.275634353 0.323677822
01/10/2017 22 1 0 0 0 0 0 0 6.3 1.07381926 6.275634353 0.529304377
01/11/2017 23 0 0 0 1 0 0 1 6.5 -15.99695552 6.292042205 0.272404556
01/11/2017 24 0 0 0 1 0 0 1 6.5 -15.99695552 6.292042205 0.272404556
01/11/2017 25 0 0 0 1 0 0 0 5.8 -15.99695552 6.292042205 0.485387413
01/11/2017 26 0 0 0 0 1 0 0 6.3 -15.99695552 6.292042205 0.651151817
01/12/2017 27 0 1 0 1 0 0 1 6.5 4.672168917 6.290699191 0.259498815
01/12/2017 28 0 1 0 1 0 0 0 7.3 4.672168917 6.290699191 0.396883681
01/13/2017 29 0 0 0 1 0 0 1 6.5 2.818656098 6.288309121 0.247276795
01/13/2017 30 0 0 0 1 0 0 0 6.1 2.818656098 6.288309121 0.72878018
01/13/2017 31 1 0 0 0 0 0 0 6.3 2.818656098 6.288309121 0.439525331
01/13/2017 32 1 0 0 0 0 0 0 6.3 2.818656098 6.288309121 0.439525331
01/13/2017 33 0 0 0 1 0 0 0 7.8 2.818656098 6.288309121 0.674418975
01/14/2017 34 0 0 0 1 0 0 1 6.5 2.818656098 6.288309121 0.228731465
01/14/2017 35 0 0 0 1 0 0 1 6.5 2.818656098 6.288309121 0.228731465
01/14/2017 36 1 0 0 0 0 0 0 3.2 2.818656098 6.288309121 1.614602435
01/15/2017 37 0 1 0 1 0 0 1 6.5 2.818656098 6.288309121 0.247426893
01/15/2017 38 0 0 1 1 0 0 0 7.3 2.818656098 6.288309121 0.557578826
01/16/2017 39 0 0 0 1 0 0 1 6.5 0.393759618 6.285334223 0.27193064
01/16/2017 40 0 0 0 0 1 0 0 7.3 0.393759618 6.285334223 0.85500972
01/16/2017 41 0 1 0 1 0 0 0 5.7 0.393759618 6.285334223 0.676041272
01/17/2017 42 0 1 0 1 0 0 1 6.5 7.88986693 6.287050627 0.271868581
01/17/2017 43 0 1 0 1 0 0 0 5.7 7.88986693 6.287050627 0.601684565
01/17/2017 44 0 0 0 1 0 0 0 7.3 7.88986693 6.287050627 0.561442547
01/17/2017 45 0 0 0 1 0 0 0 5.7 7.88986693 6.287050627 0.426279123
01/18/2017 46 0 0 0 1 0 0 1 6.5 -2.816167339 6.284668375 0.247257528
01/18/2017 47 0 1 0 1 0 0 1 6.5 -2.816167339 6.284668375 0.247257528
01/18/2017 48 0 1 0 1 0 0 1 6.5 -2.816167339 6.284668375 0.247257528
01/20/2017 49 0 0 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.240110499
01/20/2017 50 0 0 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.240110499
01/20/2017 51 0 1 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.240110499
01/20/2017 52 0 0 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.240110499
01/21/2017 53 0 1 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.255335931
01/21/2017 54 0 0 0 1 0 0 0 7.3 -0.673663504 6.279474132 0.511621616
01/21/2017 55 0 1 0 1 0 0 0 7.8 -0.673663504 6.279474132 0.658433144
01/22/2017 56 0 1 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.295583923
01/23/2017 57 0 0 0 1 0 0 0 6.5 2.407225453 6.276942114 1.536492494
01/23/2017 58 0 0 0 0 0 1 0 5.2 2.407225453 6.276942114 0.484052447
01/23/2017 59 0 0 1 1 0 0 0 7.8 2.407225453 6.276942114 0.781409215
01/24/2017 60 0 1 0 1 0 0 1 6.5 -1.218354632 6.274089502 0.321647225
01/24/2017 61 0 0 0 1 0 0 1 6.5 -1.218354632 6.274089502 0.321647225
01/25/2017 62 0 0 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 63 0 0 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 64 0 1 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 65 0 0 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 66 0 0 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 67 0 1 0 1 0 0 0 4.3 -1.851624795 6.271387207 0.613050528
01/25/2017 68 0 1 0 1 0 0 0 7.3 -1.851624795 6.271387207 0.546333861
01/25/2017 69 0 0 0 0 1 0 0 7.2 -1.851624795 6.271387207 0.537850062
01/25/2017 70 0 1 0 0 1 0 0 7.8 -1.851624795 6.271387207 0.639599631
01/26/2017 71 0 0 0 1 0 0 1 6.5 2.534464661 6.26891399 0.300565872
01/26/2017 72 0 0 0 1 0 0 1 6.5 2.534464661 6.26891399 0.300565872
01/26/2017 73 0 1 0 1 0 0 0 8.3 2.534464661 6.26891399 0.529403228
01/27/2017 74 0 0 0 1 0 0 1 6.5 0.448048274 6.265976913 0.300919416
01/27/2017 75 0 1 0 1 0 0 1 6.5 0.448048274 6.265976913 0.300919416
01/27/2017 76 1 0 0 1 0 0 0 5.7 0.448048274 6.265976913 0.580776132
01/27/2017 77 0 0 0 0 1 0 0 6.1 0.448048274 6.265976913 0.686217699
02/01/2017 78 0 0 1 1 0 0 0 6.5 1.735858826 6.258495118 1.134076442
02/02/2017 79 1 0 0 1 0 0 1 6.5 2.678162264 6.256096701 0.246605523
02/02/2017 80 0 1 0 0 1 0 0 5.7 2.678162264 6.256096701 0.788278101
02/03/2017 81 0 0 0 1 0 0 1 6.5 1.153969121 6.253264176 0.240588472
02/03/2017 82 1 0 0 0 1 0 0 7.2 1.153969121 6.253264176 0.504511317
02/03/2017 83 1 0 0 1 0 0 0 7.8 1.153969121 6.253264176 0.367633571
02/04/2017 84 0 0 0 1 0 0 1 6.5 1.153969121 6.253264176 0.265418597
02/04/2017 85 0 1 0 1 0 0 0 6.1 1.153969121 6.253264176 0.657809188
02/04/2017 86 0 1 0 1 0 0 0 7.8 1.153969121 6.253264176 0.363212341
02/05/2017 87 0 0 0 1 0 0 0 7.3 1.153969121 6.253264176 0.508102373
02/06/2017 88 0 0 0 0 0 0 1 6.5 1.046748087 6.250418001 0.372827574
02/06/2017 89 0 0 0 1 0 0 1 6.5 1.046748087 6.250418001 0.372827574
02/06/2017 90 0 1 0 1 0 0 1 6.5 1.046748087 6.250418001 0.372827574
02/06/2017 91 0 0 0 1 0 0 0 6.1 1.046748087 6.250418001 0.755692532
02/07/2017 92 1 0 0 0 1 0 0 6.1 2.572012207 6.247989068 0.669478955
02/07/2017 93 1 0 0 0 0 0 0 6.3 2.572012207 6.247989068 0.533384869
02/07/2017 94 0 0 0 1 0 0 0 5.7 2.572012207 6.247989068 0.551575392
02/07/2017 95 0 1 0 1 0 0 0 7.7 2.572012207 6.247989068 0.544694026
02/07/2017 96 1 0 0 0 0 0 0 7.3 2.572012207 6.247989068 0.466169871
02/07/2017 97 0 0 0 0 0 0 0 7.8 2.572012207 6.247989068 0.352763774
02/08/2017 98 0 1 0 1 0 0 1 6.5 0.93372207 6.245134 0.317670958
02/08/2017 99 0 1 0 1 0 0 1 6.5 0.93372207 6.245134 0.317670958
02/08/2017 100 0 0 0 1 0 0 0 6.1 0.93372207 6.245134 0.590983019
02/08/2017 101 0 1 0 1 0 0 0 7.8 0.93372207 6.245134 0.288007254
02/08/2017 102 0 0 0 1 0 0 0 7.8 0.93372207 6.245134 0.288007254
02/09/2017 103 0 1 0 1 0 0 1 6.5 -7.364293184 6.246272494 0.322058201
02/09/2017 104 0 1 0 1 0 0 0 7.7 -7.364293184 6.246272494 0.45710099
02/09/2017 105 0 1 0 1 0 0 0 7.3 -7.364293184 6.246272494 0.618448667
02/09/2017 106 1 0 0 0 1 0 0 6.7 -7.364293184 6.246272494 0.75533239
02/10/2017 107 1 0 0 1 0 0 1 6.5 2.054261996 6.243673693 0.322091275
02/10/2017 108 0 1 0 0 1 0 0 7.3 2.054261996 6.243673693 0.619229315
02/10/2017 109 1 0 0 0 1 0 0 6.3 2.054261996 6.243673693 0.574019377
02/11/2017 110 0 0 0 0 1 0 0 5.8 2.054261996 6.243673693 0.576474297
02/12/2017 111 1 0 0 0 0 1 0 7.8 2.054261996 6.243673693 0.292138684
02/13/2017 112 0 0 1 1 0 0 0 6.1 -0.5847667 6.2407891 0.796668232
02/14/2017 113 1 0 0 1 0 0 1 6.5 1.418813574 6.238033264 0.347653035
02/14/2017 114 0 0 0 1 0 0 1 6.5 1.418813574 6.238033264 0.347653035
02/14/2017 115 0 0 0 1 0 0 1 6.5 1.418813574 6.238033264 0.347653035
02/14/2017 116 0 0 0 0 1 0 0 6.1 1.418813574 6.238033264 0.735497343
02/14/2017 117 0 1 0 1 0 0 0 4.3 1.418813574 6.238033264 0.51068871
02/14/2017 118 0 0 0 0 0 1 0 6.5 1.418813574 6.238033264 0.458087049
02/14/2017 119 0 0 0 0 1 0 0 4.5 1.418813574 6.238033264 0.402129607
02/15/2017 120 0 1 0 0 1 0 1 6.5 -0.041897715 6.235131309 0.308459288

………………

The dataset is too large to include in full even as an indented block, so I will stop here.

Best Answer

Using your data sample:

datastr <- "
Date ID a b c d e f g h i j y
01/01/2017 1 1 0 0 1 0 0 1 6.5 -0.287199892 6.26048245 0.380978369
01/01/2017 2 0 0 0 1 0 0 1 6.5 -0.287199892 6.26048245 0.380978369
01/01/2017 3 1 0 0 0 1 0 0 7.8 -0.287199892 6.26048245 0.524437496
01/03/2017 4 1 0 0 0 0 0 0 7.8 -0.260937218 6.258402008 0.63409868
01/04/2017 5 0 0 0 1 0 0 1 6.5 10.51545939 6.263858877 0.392317155
01/04/2017 6 0 0 0 1 0 0 1 6.5 10.51545939 6.263858877 0.392317155
01/04/2017 7 0 1 0 1 0 0 0 6.5 10.51545939 6.263858877 1.049993284
01/04/2017 8 0 0 0 0 1 0 0 7.3 10.51545939 6.263858877 0.461989851
01/05/2017 9 0 0 0 0 1 0 0 6.1 -16.12973095 6.280696169 0.69686996
01/05/2017 10 0 0 0 1 0 0 0 7.7 -16.12973095 6.280696169 0.639270495
01/05/2017 11 0 0 0 0 1 0 0 7.3 -16.12973095 6.280696169 0.369339223
01/06/2017 12 1 0 0 1 0 0 1 6.5 -7.097505117 6.281526986 0.395179169
01/06/2017 13 0 1 0 1 0 0 0 6.3 -7.097505117 6.281526986 0.634524509
01/06/2017 14 0 1 0 1 0 0 0 7.8 -7.097505117 6.281526986 0.605731699
01/06/2017 15 0 0 0 0 0 0 0 3.2 -7.097505117 6.281526986 1.765103139
01/07/2017 16 0 1 0 1 0 0 1 6.5 -7.097505117 6.281526986 0.323052418
01/07/2017 17 0 0 0 1 0 0 1 6.5 -7.097505117 6.281526986 0.323052418
01/08/2017 18 0 0 0 1 0 0 1 6.5 -7.097505117 6.281526986 0.357581409
01/09/2017 19 0 0 0 1 0 0 1 6.5 -0.376295821 6.278540118 0.375177221
01/09/2017 20 0 0 0 1 0 0 1 6.5 -0.376295821 6.278540118 0.375177221
01/10/2017 21 0 0 0 1 0 0 1 6.5 1.07381926 6.275634353 0.323677822
01/10/2017 22 1 0 0 0 0 0 0 6.3 1.07381926 6.275634353 0.529304377
01/11/2017 23 0 0 0 1 0 0 1 6.5 -15.99695552 6.292042205 0.272404556
01/11/2017 24 0 0 0 1 0 0 1 6.5 -15.99695552 6.292042205 0.272404556
01/11/2017 25 0 0 0 1 0 0 0 5.8 -15.99695552 6.292042205 0.485387413
01/11/2017 26 0 0 0 0 1 0 0 6.3 -15.99695552 6.292042205 0.651151817
01/12/2017 27 0 1 0 1 0 0 1 6.5 4.672168917 6.290699191 0.259498815
01/12/2017 28 0 1 0 1 0 0 0 7.3 4.672168917 6.290699191 0.396883681
01/13/2017 29 0 0 0 1 0 0 1 6.5 2.818656098 6.288309121 0.247276795
01/13/2017 30 0 0 0 1 0 0 0 6.1 2.818656098 6.288309121 0.72878018
01/13/2017 31 1 0 0 0 0 0 0 6.3 2.818656098 6.288309121 0.439525331
01/13/2017 32 1 0 0 0 0 0 0 6.3 2.818656098 6.288309121 0.439525331
01/13/2017 33 0 0 0 1 0 0 0 7.8 2.818656098 6.288309121 0.674418975
01/14/2017 34 0 0 0 1 0 0 1 6.5 2.818656098 6.288309121 0.228731465
01/14/2017 35 0 0 0 1 0 0 1 6.5 2.818656098 6.288309121 0.228731465
01/14/2017 36 1 0 0 0 0 0 0 3.2 2.818656098 6.288309121 1.614602435
01/15/2017 37 0 1 0 1 0 0 1 6.5 2.818656098 6.288309121 0.247426893
01/15/2017 38 0 0 1 1 0 0 0 7.3 2.818656098 6.288309121 0.557578826
01/16/2017 39 0 0 0 1 0 0 1 6.5 0.393759618 6.285334223 0.27193064
01/16/2017 40 0 0 0 0 1 0 0 7.3 0.393759618 6.285334223 0.85500972
01/16/2017 41 0 1 0 1 0 0 0 5.7 0.393759618 6.285334223 0.676041272
01/17/2017 42 0 1 0 1 0 0 1 6.5 7.88986693 6.287050627 0.271868581
01/17/2017 43 0 1 0 1 0 0 0 5.7 7.88986693 6.287050627 0.601684565
01/17/2017 44 0 0 0 1 0 0 0 7.3 7.88986693 6.287050627 0.561442547
01/17/2017 45 0 0 0 1 0 0 0 5.7 7.88986693 6.287050627 0.426279123
01/18/2017 46 0 0 0 1 0 0 1 6.5 -2.816167339 6.284668375 0.247257528
01/18/2017 47 0 1 0 1 0 0 1 6.5 -2.816167339 6.284668375 0.247257528
01/18/2017 48 0 1 0 1 0 0 1 6.5 -2.816167339 6.284668375 0.247257528
01/20/2017 49 0 0 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.240110499
01/20/2017 50 0 0 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.240110499
01/20/2017 51 0 1 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.240110499
01/20/2017 52 0 0 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.240110499
01/21/2017 53 0 1 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.255335931
01/21/2017 54 0 0 0 1 0 0 0 7.3 -0.673663504 6.279474132 0.511621616
01/21/2017 55 0 1 0 1 0 0 0 7.8 -0.673663504 6.279474132 0.658433144
01/22/2017 56 0 1 0 1 0 0 1 6.5 -0.673663504 6.279474132 0.295583923
01/23/2017 57 0 0 0 1 0 0 0 6.5 2.407225453 6.276942114 1.536492494
01/23/2017 58 0 0 0 0 0 1 0 5.2 2.407225453 6.276942114 0.484052447
01/23/2017 59 0 0 1 1 0 0 0 7.8 2.407225453 6.276942114 0.781409215
01/24/2017 60 0 1 0 1 0 0 1 6.5 -1.218354632 6.274089502 0.321647225
01/24/2017 61 0 0 0 1 0 0 1 6.5 -1.218354632 6.274089502 0.321647225
01/25/2017 62 0 0 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 63 0 0 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 64 0 1 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 65 0 0 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 66 0 0 0 1 0 0 1 6.5 -1.851624795 6.271387207 0.307303763
01/25/2017 67 0 1 0 1 0 0 0 4.3 -1.851624795 6.271387207 0.613050528
01/25/2017 68 0 1 0 1 0 0 0 7.3 -1.851624795 6.271387207 0.546333861
01/25/2017 69 0 0 0 0 1 0 0 7.2 -1.851624795 6.271387207 0.537850062
01/25/2017 70 0 1 0 0 1 0 0 7.8 -1.851624795 6.271387207 0.639599631
01/26/2017 71 0 0 0 1 0 0 1 6.5 2.534464661 6.26891399 0.300565872
01/26/2017 72 0 0 0 1 0 0 1 6.5 2.534464661 6.26891399 0.300565872
01/26/2017 73 0 1 0 1 0 0 0 8.3 2.534464661 6.26891399 0.529403228
01/27/2017 74 0 0 0 1 0 0 1 6.5 0.448048274 6.265976913 0.300919416
01/27/2017 75 0 1 0 1 0 0 1 6.5 0.448048274 6.265976913 0.300919416
01/27/2017 76 1 0 0 1 0 0 0 5.7 0.448048274 6.265976913 0.580776132
01/27/2017 77 0 0 0 0 1 0 0 6.1 0.448048274 6.265976913 0.686217699
02/01/2017 78 0 0 1 1 0 0 0 6.5 1.735858826 6.258495118 1.134076442
02/02/2017 79 1 0 0 1 0 0 1 6.5 2.678162264 6.256096701 0.246605523
02/02/2017 80 0 1 0 0 1 0 0 5.7 2.678162264 6.256096701 0.788278101
02/03/2017 81 0 0 0 1 0 0 1 6.5 1.153969121 6.253264176 0.240588472
02/03/2017 82 1 0 0 0 1 0 0 7.2 1.153969121 6.253264176 0.504511317
02/03/2017 83 1 0 0 1 0 0 0 7.8 1.153969121 6.253264176 0.367633571
02/04/2017 84 0 0 0 1 0 0 1 6.5 1.153969121 6.253264176 0.265418597
02/04/2017 85 0 1 0 1 0 0 0 6.1 1.153969121 6.253264176 0.657809188
02/04/2017 86 0 1 0 1 0 0 0 7.8 1.153969121 6.253264176 0.363212341
02/05/2017 87 0 0 0 1 0 0 0 7.3 1.153969121 6.253264176 0.508102373
02/06/2017 88 0 0 0 0 0 0 1 6.5 1.046748087 6.250418001 0.372827574
02/06/2017 89 0 0 0 1 0 0 1 6.5 1.046748087 6.250418001 0.372827574
02/06/2017 90 0 1 0 1 0 0 1 6.5 1.046748087 6.250418001 0.372827574
02/06/2017 91 0 0 0 1 0 0 0 6.1 1.046748087 6.250418001 0.755692532
02/07/2017 92 1 0 0 0 1 0 0 6.1 2.572012207 6.247989068 0.669478955
02/07/2017 93 1 0 0 0 0 0 0 6.3 2.572012207 6.247989068 0.533384869
02/07/2017 94 0 0 0 1 0 0 0 5.7 2.572012207 6.247989068 0.551575392
02/07/2017 95 0 1 0 1 0 0 0 7.7 2.572012207 6.247989068 0.544694026
02/07/2017 96 1 0 0 0 0 0 0 7.3 2.572012207 6.247989068 0.466169871
02/07/2017 97 0 0 0 0 0 0 0 7.8 2.572012207 6.247989068 0.352763774
02/08/2017 98 0 1 0 1 0 0 1 6.5 0.93372207 6.245134 0.317670958
02/08/2017 99 0 1 0 1 0 0 1 6.5 0.93372207 6.245134 0.317670958
02/08/2017 100 0 0 0 1 0 0 0 6.1 0.93372207 6.245134 0.590983019
02/08/2017 101 0 1 0 1 0 0 0 7.8 0.93372207 6.245134 0.288007254
02/08/2017 102 0 0 0 1 0 0 0 7.8 0.93372207 6.245134 0.288007254
02/09/2017 103 0 1 0 1 0 0 1 6.5 -7.364293184 6.246272494 0.322058201
02/09/2017 104 0 1 0 1 0 0 0 7.7 -7.364293184 6.246272494 0.45710099
02/09/2017 105 0 1 0 1 0 0 0 7.3 -7.364293184 6.246272494 0.618448667
02/09/2017 106 1 0 0 0 1 0 0 6.7 -7.364293184 6.246272494 0.75533239
02/10/2017 107 1 0 0 1 0 0 1 6.5 2.054261996 6.243673693 0.322091275
02/10/2017 108 0 1 0 0 1 0 0 7.3 2.054261996 6.243673693 0.619229315
02/10/2017 109 1 0 0 0 1 0 0 6.3 2.054261996 6.243673693 0.574019377
02/11/2017 110 0 0 0 0 1 0 0 5.8 2.054261996 6.243673693 0.576474297
02/12/2017 111 1 0 0 0 0 1 0 7.8 2.054261996 6.243673693 0.292138684
02/13/2017 112 0 0 1 1 0 0 0 6.1 -0.5847667 6.2407891 0.796668232
02/14/2017 113 1 0 0 1 0 0 1 6.5 1.418813574 6.238033264 0.347653035
02/14/2017 114 0 0 0 1 0 0 1 6.5 1.418813574 6.238033264 0.347653035
02/14/2017 115 0 0 0 1 0 0 1 6.5 1.418813574 6.238033264 0.347653035
02/14/2017 116 0 0 0 0 1 0 0 6.1 1.418813574 6.238033264 0.735497343
02/14/2017 117 0 1 0 1 0 0 0 4.3 1.418813574 6.238033264 0.51068871
02/14/2017 118 0 0 0 0 0 1 0 6.5 1.418813574 6.238033264 0.458087049
02/14/2017 119 0 0 0 0 1 0 0 4.5 1.418813574 6.238033264 0.402129607
02/15/2017 120 0 1 0 0 1 0 1 6.5 -0.041897715 6.235131309 0.308459288
"

I managed to run the dynlm model:
> data <- read.table(text=datastr,header=TRUE)
>
> library('dynlm')
> dynlm_model <- dynlm(formula = y ~ a + b + c + d*g + e*g + f*g + h + i + j, data)
> dynlm_model

Time series regression with "numeric" data:
Start = 1, End = 120

Call:
dynlm(formula = y ~ a + b + c + d * g + e * g + f * g + h + i +
j, data = data)

Coefficients:
(Intercept) a b c d g e f h i
-3.980619 -0.027872 -0.009254 0.238524 -0.091746 -0.333547 -0.080245 -0.287590 -0.115933 -0.000234
j d:g g:e g:f
0.870471 0.009906 0.038182 NA

It seems your data is in Date order, and most likely also in chronological order, since the IDs are increasing.

However, your time step is varying, because the number of observations per date is not constant. In the data sample, the gap between observations ranges roughly from a few hours to a few days.
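
As a quick check (a minimal sketch, assuming the Date column is in MM/DD/YYYY format and that the data has been read with read.table as above), you can tabulate the gaps between consecutive rows:

# Sketch: how irregular is the spacing between consecutive observations?
# Assumes `data` comes from the read.table() call above and that Date is
# stored as MM/DD/YYYY text.
data$Date <- as.Date(data$Date, format = "%m/%d/%Y")
table(diff(as.numeric(data$Date)))   # 0 = same day, 1 = one day later, ...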

If you use this model, you will get indicative results for some kind of average time step, because the method does not know that your time step varies. If you try to predict the next point in the time series, the prediction will be most accurate when you are closest to the average time step (e.g. 6 hours vs. 6 hours) and least accurate when you are furthest from it (e.g. 6 days vs. 6 hours).

Keeping this in mind, you can try to interpret the results of the fitted dynamic linear model. Even when the time step varies moderately, a dynamic linear model can be used to smooth the data in order to reveal trends in scattered observations.

Edit:

Using the plain lm function, I get exactly the same results:
> lm_model <- lm(formula = y ~ a + b + c + d*g + e*g + f*g + h + i + j, data)
> lm_model

Call:
lm(formula = y ~ a + b + c + d * g + e * g + f * g + h + i +
j, data = data)

Coefficients:
(Intercept) a b c d g e f
-3.980619 -0.027872 -0.009254 0.238524 -0.091746 -0.333547 -0.080245 -0.287590
h i j d:g g:e g:f
-0.115933 -0.000234 0.870471 0.009906 0.038182 NA

So fitting dynlm with formula = y ~ a + b + c + d * g + e * g + f * g + h + i + j gives exactly the same results as fitting the same formula with the ordinary regression function lm.
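
A quick way to confirm this equivalence (a minimal sketch using the two model objects fitted above) is to compare the coefficient vectors directly; this should return TRUE:

# Both fits yield the same named coefficient vector, including the NA for g:f.
all.equal(coef(dynlm_model), coef(lm_model))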

According to the dynlm package manual, you need to specify dynamics in the formula (via d() or L()) or linear/cyclical patterns (via trend(), season() or harmon()) in order to take full advantage of dynlm.
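
Purely for illustration (a hypothetical sketch, not a recommendation for this particular model), a formula that actually exercises those dynamic terms could look like the following. Note that with a plain data.frame, dynlm treats the row order 1..n as the time index, so L() and d() operate on consecutive rows rather than on calendar time:

library(dynlm)
# Hypothetical example: lag the response by one row and first-difference h.
dynlm_dyn <- dynlm(y ~ L(y, 1) + d(h) + a + b + c, data = data)
summary(dynlm_dyn)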

Regarding "r - Can I use dynlm without any lagged variables?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/47045077/
