Predicting Stocks with Machine Learning (Univariate Prediction Demo)

Experiment environment: Anaconda + tensorflow + sklearn + VSCode

The results were unexpected: the simplest linear model clearly outperformed the more complex machine learning models. The simplest approach is often the best one.

Linear prediction model

# Linear prediction model example
from my_line import Line
import pandas as pd

dataset = pd.read_csv('dataset.csv')  # load the data
tw = 3    # time window size
l = 0.9   # train/test split ratio
m = 1     # model selection [1: line, 2: lasso, 3: ridge]
a = 0.01  # regularization strength

my_line = Line()  # instantiate the class
df = my_line.time_window(dataset, tw)  # convert the time series into sliding windows
data_input, data_output = my_line.data_select(df)  # feature selection (no periodicity)
# data_input, data_output = my_line.period_data_select(df)  # feature selection with periodicity
data_input, data_output = my_line.data_scaler(data_input, data_output)  # standardization
x_train, y_train, x_test, y_test = my_line.data_split(data_input, data_output, l)  # train/test split
pre, pred = my_line.model_line(x_train, y_train, x_test, y_test, m, a)  # predict
y_train, y_test, pre, pred = my_line.data_inverse(y_train, y_test, pre, pred)  # invert standardization
my_line.metric(y_train, y_test, pre, pred)  # performance metrics
# my_line.data_save(pred, m)  # save predictions
my_line.display(y_test, pred, m)  # plot
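
The my_line module itself is not listed in the post. For readers who want to reproduce the pipeline, here is a minimal sketch of how time_window and model_line could be written with pandas and scikit-learn; the function signatures and internals are assumptions, not the author's actual code.

# Sketch (assumed implementation) of the sliding-window transform and the linear model dispatcher.
import pandas as pd
from sklearn.linear_model import LinearRegression, Lasso, Ridge

def time_window(series, tw):
    # Turn a univariate series into a supervised table: lags t-tw ... t-1 as inputs, t as the target.
    df = pd.concat([series.shift(i) for i in range(tw, -1, -1)], axis=1)
    df.columns = [f't-{i}' for i in range(tw, 0, -1)] + ['t']
    return df.dropna()

def model_line(x_train, y_train, x_test, m=1, a=0.01):
    # m selects the estimator: 1 = plain linear regression, 2 = Lasso, 3 = Ridge; a is the regularization strength.
    model = {1: LinearRegression(), 2: Lasso(alpha=a), 3: Ridge(alpha=a)}[m]
    model.fit(x_train, y_train)
    return model.predict(x_train), model.predict(x_test)  # in-sample and out-of-sample predictions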

[Figure: line (predicted vs. actual test values)]

time: 0.0
MAPE_train: 0.008587837916105429
MAE_train: 10.306710550502197
MSE_train: 212.05155210185387
RMSE_train: 14.561989977398483
MAPE_test: 0.007109396625928449
MAE_test: 14.393204364765207
MSE_test: 380.87929466853564
RMSE_test: 19.5161290902816
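
The metric helper is not shown either. The four reported error measures can be reproduced along these lines with numpy and scikit-learn; the report function and its layout are illustrative assumptions.

# Sketch of the reported metrics (MAPE, MAE, MSE, RMSE).
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

def report(y_true, y_pred, tag):
    mape = np.mean(np.abs((y_true - y_pred) / y_true))  # mean absolute percentage error, as a fraction
    mae = mean_absolute_error(y_true, y_pred)
    mse = mean_squared_error(y_true, y_pred)
    rmse = np.sqrt(mse)
    print(f'MAPE_{tag}: {mape}\nMAE_{tag}: {mae}\nMSE_{tag}: {mse}\nRMSE_{tag}: {rmse}')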

Ensemble learning prediction model example

# Ensemble learning prediction model example
from my_ensemble import Ensemble
import pandas as pd

dataset = pd.read_csv('dataset.csv')  # load the data
tw = 3    # time window size
l = 0.9   # train/test split ratio
m = 1     # model selection [1: GradientBoosting, 2: RandomForest, 3: AdaBoost, 4: Bagging, 5: DecisionTree, 6: knn]
n = 100   # number of trees
k = 5     # k value for knn

my_ensemble = Ensemble()  # instantiate the class
df = my_ensemble.time_window(dataset, tw)  # convert the time series into sliding windows
data_input, data_output = my_ensemble.data_select(df)  # feature selection (no periodicity)
# data_input, data_output = my_ensemble.period_data_select(df)  # feature selection with periodicity
x_train, y_train, x_test, y_test = my_ensemble.data_split(data_input, data_output, l)  # train/test split
pre, pred = my_ensemble.model_ensemble(x_train, y_train, x_test, y_test, m, n, k)  # predict
my_ensemble.metric(y_train, y_test, pre, pred)  # performance metrics
# my_ensemble.data_save(pred, m)  # save predictions
my_ensemble.display(y_test, pred, m)  # plot
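
The Ensemble class is likewise not included. A plausible sketch of how model_ensemble could map m, n and k onto scikit-learn regressors follows; the dispatcher and its defaults are assumptions, not the author's code.

# Sketch (assumed implementation) of the ensemble model dispatcher.
from sklearn.ensemble import (GradientBoostingRegressor, RandomForestRegressor,
                              AdaBoostRegressor, BaggingRegressor)
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor

def model_ensemble(x_train, y_train, x_test, m=1, n=100, k=5):
    models = {
        1: GradientBoostingRegressor(n_estimators=n),
        2: RandomForestRegressor(n_estimators=n),
        3: AdaBoostRegressor(n_estimators=n),
        4: BaggingRegressor(n_estimators=n),
        5: DecisionTreeRegressor(),
        6: KNeighborsRegressor(n_neighbors=k),
    }
    model = models[m]
    model.fit(x_train, y_train)
    return model.predict(x_train), model.predict(x_test)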

time: 0.12558817863464355
MAPE_train: 0.007820997188079638
MAE_train: 9.423206236608412
MSE_train: 168.08037488635307
RMSE_train: 12.964581554618455
MAPE_test: 0.014701287152282102
MAE_test: 30.445878442691964
MSE_test: 1480.0780367784937
RMSE_test: 38.471782344706796

[Figure: GradientBoosting (predicted vs. actual test values)]

Neural network prediction model example

# Neural network prediction model example
from my_nerual import Nerual
import pandas as pd

dataset = pd.read_csv('dataset.csv')  # load the data
tw = 7       # time window size
l = 0.9      # train/test split ratio
n = 128      # number of neurons
dp = 0.2     # dropout rate
e = 50       # epochs
l2 = 0.0001  # l2 regularization
bs = 50      # batch_size
c = 1        # SVM penalty parameter
g = 0.1      # SVM kernel coefficient
m = 2        # model selection [1: ann, 2: dnn, 3: svm]

my_nerual = Nerual()  # instantiate the class
df = my_nerual.time_window(dataset, tw)  # convert the time series into sliding windows
data_input, data_output = my_nerual.data_select(df)  # feature selection (no periodicity)
# data_input, data_output = my_nerual.period_data_select(df)  # feature selection with periodicity
data_input, data_output = my_nerual.data_scaler(data_input, data_output)  # standardization
x_train, y_train, x_test, y_test = my_nerual.data_split(data_input, data_output, l)  # train/test split
# pre, pred = my_nerual.model_ann(x_train, y_train, x_test, y_test, n, dp, e, l2, bs)  # predict (ann)
pre, pred = my_nerual.model_dnn(x_train, y_train, x_test, y_test, n, dp, e, l2, bs)  # predict (dnn)
# pre, pred = my_nerual.model_svm(x_train, y_train, x_test, y_test, c, g)  # predict (svm)
y_train, y_test, pre, pred = my_nerual.data_inverse(y_train, y_test, pre, pred)  # invert standardization
my_nerual.metric(y_train, y_test, pre, pred)  # performance metrics
# my_nerual.data_save(pred, m)  # save predictions
my_nerual.display(y_test, pred, m)  # plot
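
The Nerual class's model_dnn is also not shown in the post. Below is a minimal tf.keras sketch consistent with the hyperparameters above (n neurons, dropout dp, l2 regularization, e epochs, batch size bs); the layer layout and optimizer are assumptions.

# Sketch (assumed implementation) of a DNN regressor with the hyperparameters used above.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.regularizers import l2 as l2_reg

def model_dnn(x_train, y_train, x_test, n=128, dp=0.2, e=50, reg=0.0001, bs=50):
    model = Sequential([
        Dense(n, activation='relu', kernel_regularizer=l2_reg(reg), input_shape=(x_train.shape[1],)),
        Dropout(dp),
        Dense(n, activation='relu', kernel_regularizer=l2_reg(reg)),
        Dropout(dp),
        Dense(1),  # single-value regression output
    ])
    model.compile(optimizer='adam', loss='mse')
    model.fit(x_train, y_train, epochs=e, batch_size=bs, verbose=0)
    return model.predict(x_train), model.predict(x_test)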

time: 12.871931076049805
MAPE_train: 0.013557188647355794
MAE_train: 16.916352231142387
MSE_train: 486.4883706058537
RMSE_train: 22.056481374096226
MAPE_test: 0.017959997571820412
MAE_test: 37.18421403958709
MSE_test: 1721.6050339283984
RMSE_test: 41.49222859679145

[Figure: dnn (predicted vs. actual test values)]

Recurrent neural network prediction model example

# Recurrent neural network prediction model example
from my_rnn import R_nerual
import pandas as pd

dataset = pd.read_csv('dataset.csv')  # load the data
tw = 3      # time window size
l = 0.9     # train/test split ratio
ts = 3      # timestep
dim = 4     # feature dimension
m = 1       # model selection [1: lstm, 2: gru, 3: rnn]
n = 128     # number of neurons
dp = 0.2    # dropout rate
e = 10      # epochs
l2 = 0.001  # l2 regularization
bs = 50     # batch_size

my_rnn = R_nerual()  # instantiate the class
df = my_rnn.time_window(dataset, tw)  # convert the time series into sliding windows
data_input, data_output = my_rnn.data_select(df)  # feature selection (no periodicity)
# data_input, data_output = my_rnn.period_data_select(df)  # feature selection with periodicity
data_input, data_output = my_rnn.data_scaler(data_input, data_output)  # standardization
x_train, y_train, x_test, y_test = my_rnn.data_split(data_input, data_output, l, ts)  # train/test split
# x_train, y_train, x_test, y_test = my_rnn.period_data_split(data_input, data_output, l, ts, dim)  # train/test split with periodicity
pre, pred = my_rnn.model_nerual(x_train, y_train, x_test, y_test, m, n, dp, e, l2, bs)  # predict
y_train, y_test, pre, pred = my_rnn.data_inverse(y_train, y_test, pre, pred)  # invert standardization
my_rnn.metric(y_train, y_test, pre, pred)  # performance metrics
# my_rnn.data_save(pred, m)  # save predictions
my_rnn.display(y_test, pred, m)  # plot
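
The R_nerual class is not shown either. Here is a sketch of what the LSTM branch (m = 1) might look like in tf.keras, including the reshape to (samples, timesteps, features) that Keras recurrent layers expect; the exact architecture, and whether the author reshapes inside or outside model_nerual, are assumptions.

# Sketch (assumed implementation) of the LSTM branch of model_nerual.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout

def model_lstm(x_train, y_train, x_test, ts=3, n=128, dp=0.2, e=10, bs=50):
    # Reshape flat sliding-window rows into sequences of ts steps.
    x_train = x_train.reshape((x_train.shape[0], ts, -1))
    x_test = x_test.reshape((x_test.shape[0], ts, -1))
    model = Sequential([
        LSTM(n, input_shape=(ts, x_train.shape[2])),
        Dropout(dp),
        Dense(1),  # next-step regression output
    ])
    model.compile(optimizer='adam', loss='mse')
    model.fit(x_train, y_train, epochs=e, batch_size=bs, verbose=0)
    return model.predict(x_train), model.predict(x_test)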

time: 24.299691200256348
MAPE_train: 0.012012480014394268
MAE_train: 14.495914799866911
MSE_train: 388.14090401666255
RMSE_train: 19.701291937755315
MAPE_test: 0.009212615826167218
MAE_test: 18.636055992955637
MSE_test: 658.9963136636107
RMSE_test: 25.670923506247505

[Figure: lstm (predicted vs. actual test values)]
