Hi @tri2911, in this case, instead of the forecast method (which only produces out-of-sample forecasts), you can use the predict method (which allows for both in-sample and out-of-sample predictions). For example, if you wanted a prediction for the months of the fourth quarter, you could do:
res.predict(start='2023-10', end='2023-12')

Or if you really want a prediction only for December, you would do:

res.predict(start='2023-12', end='2023-12')
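For anyone reading along, here is a minimal, self-contained sketch of the distinction (SARIMAX and the random data are just illustrative stand-ins; the predict/forecast behavior is shared by statespace results objects generally, including DynamicFactorMQ):

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative monthly series running through December 2023
index = pd.period_range(start='2020-01', end='2023-12', freq='M')
endog = pd.Series(np.random.randn(len(index)).cumsum(), index=index)

res = sm.tsa.SARIMAX(endog, order=(1, 0, 0)).fit(disp=False)

# forecast() only produces out-of-sample values (here: 2024-01 onwards)...
print(res.forecast(steps=3))

# ...while predict() also covers in-sample dates, e.g. the months of Q4 2023
print(res.predict(start='2023-10', end='2023-12'))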
Hello @ChadFulton, I hope you're doing well. I want to express my gratitude for providing this high-quality content; I'm learning a lot from this material.
I'm trying to run the model code and have hit a stumbling block that I'm having difficulty resolving.
It concerns the renaming of columns and the mapping of variables: the names do not seem to be assigned properly, which causes problems later on, with variables not being found.
I've reviewed the data, variables, and names, and everything appears correct, but problems still occur when I run the code.
If there have been any important changes to the data, to how it should be run, or to this part of the code, I may have missed them. If you could take a look at this part and guide me on how to proceed, I would be very grateful.
import pandas as pd

# Definitions from the Appendix for FRED-MD variables
# (raw strings are needed for these Windows paths, since "\U" in a plain
# string literal starts a unicode escape and raises a SyntaxError)
defn_m = pd.read_csv(r'C:\Users\Cliente\Documents\fredmd_definitions.csv', encoding='windows-1252')
defn_m.index = defn_m.fred

# Definitions from the Appendix for FRED-QD variables
defn_q = pd.read_csv(r'C:\Users\Cliente\Documents\fredqd_definitions.csv', encoding='windows-1252')
defn_q.index = defn_q['FRED MNEMONIC']

# Example of the information in these files:
print(defn_m.head())
print(defn_q.head())

# Replace the names of the columns in each monthly and quarterly dataset
# (dta, the dict of vintages, is assumed to be constructed earlier)
map_m = defn_m['description'].to_dict()
map_q = defn_q['DESCRIPTION'].to_dict()
for date, value in dta.items():
    value.orig_m.columns = value.orig_m.columns.map(map_m)
    value.dta_m.columns = value.dta_m.columns.map(map_m)
    value.orig_q.columns = value.orig_q.columns.map(map_q)
    value.dta_q.columns = value.dta_q.columns.map(map_q)

# Get the mapping of variable id to group name, for monthly variables
groups = defn_m[['description', 'group']].copy()

# Re-order the variables according to the definition CSV file
# (which is ordered by group)
columns = [name for name in defn_m['description']
           if name in dta['2024-10'].dta_m.columns]
for date in dta.keys():
    dta[date].dta_m = dta[date].dta_m.reindex(columns, axis=1)

# Add real GDP (our quarterly variable) into the "Output and Income" group
gdp_description = defn_q.loc['GDPC1', 'DESCRIPTION']
new_row = pd.DataFrame([{'description': gdp_description, 'group': 'Output and Income'}])
groups = pd.concat([groups, new_row], ignore_index=True)

# Display the number of variables in each group
(groups.groupby('group', sort=False)
       .count()
       .rename({'description': '# series in group'}, axis=1))
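One check that may help locate the problem: Index.map with a dict silently turns any column that has no entry in the mapping into NaN, and NaN-named columns would explain variables "not being found" later on. A small diagnostic sketch against the objects above (run it before the renaming loop, and note that '2024-10' is just one vintage key):

# Raw column names with no entry in the definitions mapping would be
# renamed to NaN by .map() and become impossible to look up by name
unmapped_m = [c for c in dta['2024-10'].orig_m.columns if c not in map_m]
unmapped_q = [c for c in dta['2024-10'].orig_q.columns if c not in map_q]
print('Monthly columns without a definition:', unmapped_m)
print('Quarterly columns without a definition:', unmapped_q)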
Hello @ChadFulton,
First of all, I want to extend my deep thanks for your work on DynamicFactorMQ. I have been using this model and, overall, it has been a good experience. However, I have encountered an issue with nowcasting.
Here are the details of the problem:
I trained the model using data up to the beginning of Q4 2024 with my vintage dataset. When I apply a new vintage to the trained model, I can still get the nowcast for "Dec 2024".
At the end of December, I have some monthly indicators for December, and I apply the updated data to my results object. However, I am no longer able to generate a nowcast. The error message I receive is:
ValueError: Prediction must have `end` after `start`.

When I run the forecast in this scenario, it only generates forecasts starting from January 2025 instead of December 2024. My quarterly variable for Q4 2024 in endog_quarterly is still NaN, but I am unable to generate a nowcast for this period.
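Roughly, the workflow I am attempting looks like this (the variable names here are illustrative, not my exact code):

# res is the DynamicFactorMQ results object fitted on the earlier vintage;
# new_m / new_q are the updated monthly and quarterly data through Dec 2024
res_updated = res.apply(new_m, endog_quarterly=new_q)

# This is where the error occurs; per the discussion above, predict() rather
# than forecast() should produce the now in-sample December 2024 nowcast:
nowcast = res_updated.predict(start='2024-12', end='2024-12')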
I would appreciate any guidance or suggestions you might have to resolve this issue. Thank you again for your valuable work and support.
Best regards, Tri