Adapting deep learning models between regional markets
Research output: Contribution to journal › Article › peer-review
Licence: CC BY
This paper extends a series of deep learning models developed on US equity data to the Australian market. The model architectures are retrained, without structural modification, and tested on Australian data comparable with the original US data. Relative to the original US-based results, the retrained models are statistically less accurate at predicting next-day returns. The models were also refitted in the standard train/validate manner on the Australian data, and these models yielded significantly better predictive results on the holdout data. The best-performing models were a CNN and an LSTM, attaining highly significant Z-scores of 6.154 and 8.789, respectively. Given the structural similarity across all models, the improvement is ascribed to regional influences within the respective training data sets.
Such unique regional differences are consistent with views in the literature stating that deep learning models in computational finance that are developed and trained on a single market will always contain market-specific bias. Given this finding, future research into the development of deep learning models trained on global markets is recommended.
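The significance figures above come from testing whether a model's next-day predictive accuracy exceeds chance. As a minimal sketch (not the paper's actual evaluation code), a one-sample Z-test on directional hit rate against a 50% null can be computed as follows; the counts used here are hypothetical, chosen only for illustration:

```python
import math

def accuracy_z_score(correct: int, total: int, p_null: float = 0.5) -> float:
    """One-sample Z-test: is the observed hit rate above the chance rate p_null?

    correct -- number of correct next-day direction predictions
    total   -- number of predictions made on the holdout set
    """
    p_hat = correct / total
    # Standard error of the proportion under the null hypothesis
    se = math.sqrt(p_null * (1.0 - p_null) / total)
    return (p_hat - p_null) / se

# Hypothetical example: 1,320 correct calls out of 2,400 holdout days (55% hit rate)
z = accuracy_z_score(1320, 2400)
print(round(z, 3))  # a hit rate of 55% over 2,400 trials is highly significant
```

A Z-score above roughly 2.58 rejects the null at the 1% level (two-sided), so values like the reported 6.154 and 8.789 indicate accuracy very unlikely to arise by chance.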
Keywords
- Deep learning, Machine learning, Candlesticks, Technical analysis
| Original language | English |
|---|---|
| Pages (from-to) | 1483–1492 |
| Journal | Neural Computing and Applications |
| Volume | 35 |
| Issue number | 2 |
| Early online date | 27 Sept 2022 |
| Publication status | Published - Jan 2023 |