Commit 85e36c0

Author: Thomas Sargent (committed)
Tom's Aug 4 edits of Calvo_machine_learning lecture
1 parent 85dbae6 commit 85e36c0

File tree

1 file changed (+56, -26 lines)


lectures/calvo_machine_learn.md

Lines changed: 56 additions & 26 deletions
@@ -15,7 +15,7 @@ kernelspec:
 
 ## Introduction
 
-This lecture studies a problem that we also study in another quantecon lecture
+This lecture studies a problem that we shall study from another angle in another quantecon lecture
 {doc}`calvo`.
 
 That lecture used an analytic approach based on ``dynamic programming squared`` to guide computation of a Ramsey plan in a version of a model of Calvo {cite}`Calvo1978`.
@@ -172,7 +172,7 @@ U(m_t - p_t) = u_0 + u_1 (m_t - p_t) - \frac{u_2}{2} (m_t - p_t)^2, \quad u_0 >
 The money demand function {eq}`eq_grad_old1` and the utility function {eq}`eq_grad_old5` imply that
 
 $$
-U(-\alpha \theta_t) = u_1 + u_2 (-\alpha \theta_t) -\frac{u_2}{2}(-\alpha \theta_t)^2 .
+U(-\alpha \theta_t) = u_0 + u_1 (-\alpha \theta_t) -\frac{u_2}{2}(-\alpha \theta_t)^2 .
 $$ (eq_grad_old5a)
 
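The corrected line replaces $u_1 + u_2(\cdot)$ with $u_0 + u_1(\cdot)$, which is just the utility function from the hunk header evaluated at $m_t - p_t = -\alpha \theta_t$. A quick numerical sketch of that consistency (the parameter values below are illustrative, not the lecture's calibration):

```python
# Check that the corrected equation (eq_grad_old5a) is the utility function
# U(x) = u0 + u1*x - (u2/2)*x**2 evaluated at x = -α*θ.
# Parameter values are illustrative, not the lecture's calibration.
u0, u1, u2, α = 1.0, 0.5, 3.0, 1.0

def U(x):
    return u0 + u1 * x - (u2 / 2) * x**2

θ = 0.1
lhs = U(-α * θ)
rhs = u0 + u1 * (-α * θ) - (u2 / 2) * (-α * θ)**2
print(lhs, rhs)  # → 0.935 0.935
```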
@@ -1013,13 +1013,13 @@ Our hope is that these regressions will reveal structure hidden within the $\ve
 
 It is worth pausing here to think about roles played by **human** intelligence and **artificial** intelligence here.
 
-Artificial intelligence (AI a.k.a. ML) is running the regressions.
+Artificial intelligence, in this case meaning a computer, is running the regressions for us.
 
-But you can regress anything on anything else.
+But we are free to regress anything on anything else.
 
-Human intelligence tell us which regressions to run.
+Human intelligence tells us which regressions to run.
 
-Even more human intelligence is required fully to appreciate what they reveal about the structure of the Ramsey plan.
+Additional inputs of human intelligence will be required fully to appreciate what those regressions reveal about the structure of a Ramsey plan.
 
 ```{note}
 At this point, it is worthwhile to read how Chang {cite}`chang1998credible` chose
@@ -1134,21 +1134,22 @@ plt.show()
 Points for succeeding times appear further and further to the lower left and eventually converge to
 $\bar \mu, \bar \mu$.
 
-Now we proceed to the third regression.
+Next, we'll compute a sequence $\{v_t\}_{t=0}^T$ of what we'll call "continuation values" along a Ramsey plan.
 
-First we compute a sequence $\{v_t\}_{t=0}^T$ backward from $v_T$
+To do so, we'll start at date $T$ and compute
 
 $$
 v_T = \frac{1}{1-\beta} s(\bar \mu, \bar \mu).
 $$
 
-Then starting from $t=T-1$, iterate backwards on the recursion
+Then starting from $t=T-1$, we'll iterate backwards on the recursion
 
 $$
 v_t = s(\theta_t, \mu_t) + \beta v_{t+1}
 $$
 
-for $t=0, \ldots, T-1$ to compute the sequence $\{v_t\}_{t=0}^T.$
+for $t= T-1, T-2, \ldots, 0.$
+
 
 ```{code-cell} ipython3
 # Define function for s and U in section 41.3
@@ -1176,9 +1177,25 @@ def compute_vt(θ, μ, β, c, u0=1, u1=0.5, u2=3, α=1):
     return v_t
 
 v_t = compute_vt(θs, μs, β=0.85, c=2)
+print("continuation value sequence = ", v_t)
 ```
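The hunks above elide the body of `compute_vt`, so here is a minimal self-contained sketch of the backward recursion it implements, assuming the one-period payoff takes the form $s(\theta, \mu) = U(-\alpha\theta) - \frac{c}{2}\mu^2$ (the names `payoff` and `backward_values` are ours, not the lecture's):

```python
import numpy as np

def payoff(θ, μ, c=2, u0=1, u1=0.5, u2=3, α=1):
    # Assumed one-period payoff s(θ, μ) = U(-α θ) - (c/2) μ²,
    # with U(x) = u0 + u1 x - (u2/2) x².
    x = -α * θ
    return u0 + u1 * x - (u2 / 2) * x**2 - (c / 2) * μ**2

def backward_values(θs, μs, β=0.85, c=2):
    # v_T = s(μ̄, μ̄) / (1 - β), with μ̄ the terminal money growth rate,
    # then v_t = s(θ_t, μ_t) + β v_{t+1} for t = T-1, ..., 0.
    T = len(μs) - 1
    v = np.empty(T + 1)
    μ_bar = μs[-1]
    v[T] = payoff(μ_bar, μ_bar, c) / (1 - β)
    for t in range(T - 1, -1, -1):
        v[t] = payoff(θs[t], μs[t], c) + β * v[t + 1]
    return v

vs = backward_values(np.array([0.1, 0.05, 0.02]), np.array([0.1, 0.05, 0.02]))
print(vs)
```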
 
-Now we can run regression
+
+The initial continuation value $v_0$ should equal the optimized value of the Ramsey planner's criterion $V$ defined
+in equation {eq}`eq:RamseyV`.
+
+**Note to Humphrey**
+
+Let's add a line of code to check this equality.
+
+I printed out the sequence and it looks good. But I suspect you will want to clean up
+that line where I printed out the sequence.
+
+Also, please add a graph of $v_t$ against $t$ for $t=0, \ldots, T$.
+
+**End of note to Humphrey**
+
+Next we ask Python to regress $v_t$ against a constant, $\theta_t$, and $\theta_t^2$.
 
 $$
 v_t = g_0 + g_1 \theta_t + g_2 \theta_t^2 .
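The note above asks for a plot of $v_t$ against $t$. A sketch of that figure, using stand-in data in place of the `v_t` array computed in the lecture (the values below are made up for illustration):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt

# Stand-in for the continuation-value sequence; illustrative numbers only --
# in the lecture v_t would come from compute_vt.
v_t = np.array([6.81, 6.55, 6.38, 6.27, 6.20, 6.16, 6.14])
T = len(v_t) - 1

fig, ax = plt.subplots()
ax.plot(range(T + 1), v_t, marker="o")
ax.set_xlabel(r"$t$")
ax.set_ylabel(r"$v_t$")
ax.set_title("Continuation values along the Ramsey plan")
fig.savefig("v_path.png")
```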
@@ -1196,21 +1213,18 @@ print("\nRegression of v_t on a constant, θ_t and θ^2_t:")
 print(results3.summary(slim=True))
 ```
 
-**NOTE TO TOM**
+The regression has an $R^2$ equal to $1$ and so fits perfectly.
+
+However, notice the warning about the high condition number.
 
-We find that $\theta_t$ and $\theta_t^2$ are highly "linearly" correlated
+As indicated in the printout, this is a consequence of
+$\theta_t$ and $\theta_t^2$ being highly correlated along the Ramsey plan.
 
 ```{code-cell} ipython3
 np.corrcoef(θs, θs**2)
 ```
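The link between the near-unit correlation that `np.corrcoef` reports and the regression's large condition number can be illustrated on a hypothetical geometrically decaying $\theta$ path (the decay rate and scale below are made up):

```python
import numpy as np

# Hypothetical θ path decaying geometrically, loosely like the Ramsey outcome.
θs = 0.15 * 0.4 ** np.arange(20)

# Regressor matrix for the regression v_t = g0 + g1 θ_t + g2 θ_t².
X = np.column_stack([np.ones_like(θs), θs, θs**2])

corr = np.corrcoef(θs, θs**2)[0, 1]  # near-unit correlation of θ and θ²
cond = np.linalg.cond(X)             # hence a large condition number of X
print(f"corr(θ, θ²) = {corr:.4f}, condition number of X = {cond:.1f}")
```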
 
-So the condition number is large.
-
-**END OF NOTE TO TOM**
-
-+++
-
-Now we plot $v_t$ against $\theta_t$
+Let's plot $v_t$ against $\theta_t$ along with the nonlinear regression line.
 
 ```{code-cell} ipython3
 θ_grid = np.linspace(min(θs), max(θs), 100)
@@ -1226,8 +1240,14 @@ plt.legend()
 plt.tight_layout()
 plt.show()
 ```
+The highest continuation value $v_0$ at $t=0$ appears at the peak of the graph.
 
-### What has machine learning taught us?
+Subsequent values of $v_t$ for $t \geq 1$ appear to the left and converge monotonically from above to $v_T$ at time $T$.
+
+
+
+
+## What has machine learning taught us?
 
 
 Our regressions tell us that along the Ramsey outcome $\vec \mu^R, \vec \theta^R$, the linear function
@@ -1236,10 +1256,14 @@
 \mu_t = .0645 + 1.5995 \theta_t
 $$
 
-fits perfectly and that so does the regression line
+fits perfectly and that so do the regression lines
+
+$$
+\theta_{t+1} = - .0645 + .4005 \theta_t
+$$
 
 $$
-\theta_{t+1} = - .0645 + .4005 \theta_t .
+v_t = 6.8052 - .7580 \theta_t - 4.6991 \theta_t^2.
 $$
 
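The estimated laws of motion quoted above can be iterated directly; a sketch (the initial condition $\theta_0$ below is arbitrary, and the coefficients are the rounded estimates from the text):

```python
# Iterate the regression relationships reported above:
#   μ_t     =  0.0645 + 1.5995 θ_t
#   θ_{t+1} = -0.0645 + 0.4005 θ_t
b0, b1 = 0.0645, 1.5995
d0, d1 = -0.0645, 0.4005

θ = 0.05  # arbitrary illustrative initial condition, not the lecture's θ_0^R
for t in range(5):
    μ = b0 + b1 * θ
    print(f"t={t}: θ={θ: .4f}, μ={μ: .4f}")
    θ = d0 + d1 * θ

# Because |d1| < 1, θ_t converges to θ̄ = d0 / (1 - d1); the same point
# satisfies b0 + b1*θ̄ ≈ θ̄, consistent with convergence to (μ̄, μ̄).
θ_bar = d0 / (1 - d1)
print("fixed point θ̄ =", θ_bar)
```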
@@ -1261,12 +1285,18 @@ that along a Ramsey plan, the following relationships prevail:
 
 where the initial value $\theta_0^R$ was computed along with other components of $\vec \mu^R, \vec \theta^R$ when we computed the Ramsey plan, and where $b_0, b_1, d_0, d_1$ are parameters whose values we estimated with our regressions.
 
+In addition, we learned that continuation values are described by the quadratic function
+
+$$
+v_t = g_0 + g_1 \theta_t + g_2 \theta_t^2
+$$
+
 
-We discovered this representation by running some carefully chosen regressions and staring at the results, noticing that the $R^2$ of unity tell us that the fits are perfect.
+We discovered these relationships by running some carefully chosen regressions and staring at the results, noticing that the $R^2$'s of unity tell us that the fits are perfect.
 
 We have learned something about the structure of the Ramsey problem.
 
-But it is challenging to say more just by using the methods and ideas that we have deployed in this lecture.
+However, it is challenging to say more just by using the methods and ideas that we have deployed in this lecture.
 
 There are many other linear regressions among components of $\vec \mu^R, \theta^R$ that would also have given us perfect fits.
 
@@ -1278,7 +1308,7 @@ After all, the Ramsey planner chooses $\vec \mu$, while $\vec \theta$ is an o
 
 Isn't it more natural then to expect that we'd learn more about the structure of the Ramsey problem from a regression of components of $\vec \theta$ on components of $\vec \mu$?
 
-To answer such questions, we'll have to deploy more economic theory.
+To answer these questions, we'll have to deploy more economic theory.
 
 We do that in this quantecon lecture {doc}`calvo`.
 
