Best overall: XGBoost

XGBoost delivers the strongest combination of directional accuracy (78%) and low error across the test set. It trains significantly faster than the LSTM while matching its predictive capability on weekly S&P 500 data, making it the most practical choice for regular retraining.

Dir. Acc. 78% · MAE 33 · RMSE 51
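The regular retraining mentioned above can be sketched as an expanding-window loop: refit on all data up to each week, then forecast the next one. This is a minimal sketch, not the app's actual pipeline; a toy mean-predictor stands in for the real XGBoost regressor, and the names (`MeanModel`, `retrain_weekly`) are illustrative.

```python
class MeanModel:
    """Trivial stand-in model: predicts the mean of the training targets.
    In practice an xgboost.XGBRegressor would take its place (assumption,
    not confirmed by the source)."""

    def fit(self, y_train):
        self.mean_ = sum(y_train) / len(y_train)
        return self

    def predict(self):
        return self.mean_


def retrain_weekly(series, min_train=4):
    """Refit on all observed weeks up to t, then forecast week t."""
    preds = []
    for t in range(min_train, len(series)):
        model = MeanModel().fit(series[:t])
        preds.append(model.predict())
    return preds


# Hypothetical weekly closes, for illustration only.
weekly_close = [4700, 4720, 4680, 4750, 4770, 4760, 4800]
preds = retrain_weekly(weekly_close)
print(len(preds))  # one forecast per week after the warm-up window
```

Expanding the training window each week keeps the model current without leaking future data into any forecast.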
A note on the research

The literature consistently shows that ensemble methods outperform single estimators on financial time series when the feature set is well engineered. XGBoost and Random Forest dominate on tabular tasks, while LSTM models close the gap when longer temporal sequences are available. Linear Regression serves as a critical baseline: if a model cannot significantly beat it, the feature engineering likely needs revisiting. No model should be trusted in isolation; the ensemble of predictions across models gives a more reliable signal than any single forecast.
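The "ensemble of predictions" idea can be sketched by averaging per-model forecasts and reading the direction off the average. A minimal sketch: the model names and forecast values below are hypothetical, and `ensemble_signal` is my name, not the app's.

```python
def ensemble_signal(forecasts, last_close):
    """Average the per-model forecasts and compare to the last close
    to get a single directional signal."""
    avg = sum(forecasts.values()) / len(forecasts)
    direction = "up" if avg > last_close else "down"
    return direction, avg


# Next-week index forecasts per model (illustrative numbers).
forecasts = {
    "xgboost": 4815.0,
    "random_forest": 4790.0,
    "lstm": 4830.0,
    "linear_regression": 4770.0,
}
direction, avg = ensemble_signal(forecasts, last_close=4800.0)
print(direction, round(avg, 2))
```

A simple mean is the crudest combiner; weighting models by historical directional accuracy would be a natural refinement.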

Accuracy Metrics (test set, weekly S&P 500)

MAE: average absolute error in index points; lower is better.
MAPE: mean absolute percentage error, normalised across price levels.
RMSE: penalises large errors more heavily; read alongside MAE.
Dir. Acc.: percentage of weeks the model correctly predicted up or down; above 50% beats a coin flip.
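The four metrics above can be computed directly from paired actual and predicted series; a minimal sketch (function names are mine, and the sample numbers are illustrative):

```python
import math

def mae(actual, pred):
    """Mean absolute error, in index points."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    """Root mean squared error; penalises large misses more than MAE."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

def mape(actual, pred):
    """Mean absolute percentage error, normalised by the price level."""
    return 100 * sum(abs(a - p) / a for a, p in zip(actual, pred)) / len(actual)

def directional_accuracy(actual, pred):
    """Share of weeks where the predicted move (up or down from the
    previous actual close) matched the actual move."""
    hits = sum(
        (actual[i] - actual[i - 1]) * (pred[i] - actual[i - 1]) > 0
        for i in range(1, len(actual))
    )
    return 100 * hits / (len(actual) - 1)


actual = [4700, 4750, 4720, 4780]
pred = [4710, 4740, 4730, 4790]
print(round(mae(actual, pred), 1), round(directional_accuracy(actual, pred), 1))
# → 10.0 100.0
```

Note that directional accuracy compares each forecast against the previous week's actual close, which is why it is defined over one fewer week than the error metrics.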
Training & Complexity (relative to dataset size): each model is compared on training time and model complexity.
