Implementing Linear Regression in PyTorch

I've just started learning PyTorch. It really does feel very Pythonic, very Keras-like, very Gluon-like, and very unlike TensorFlow — but along the way I found that data handling is a huge pitfall.

The function finally learned is y = 4.9990 * x + 0.9614.

```python
import numpy as np
import torch
from torch import nn, optim
from torch.utils import data


@np.vectorize
def f(x):
    return 5 * x + 3


x = np.arange(5000).reshape(5000, 1)
y = f(x)


class MyDataset(data.Dataset):
    def __init__(self, x, y):
        super().__init__()
        # store as float32 tensors: nn.Linear expects float input
        self.x = torch.from_numpy(x).float()
        self.y = torch.from_numpy(y).float()

    def __getitem__(self, index):
        # index the stored tensors, not the module-level x and y
        return self.x[index], self.y[index]

    def __len__(self):
        return len(self.x)


class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.dense = nn.Linear(1, 1)

    def forward(self, inputs):  # override forward, not __call__
        return self.dense(inputs)


model = Net()
# the learning rate has to be tiny because the raw inputs range from 0 to 4999
optimizer = optim.SGD(model.parameters(), lr=7e-8)
criterion = nn.MSELoss()

loader = data.DataLoader(MyDataset(x, y), batch_size=50, shuffle=True,
                         drop_last=False, num_workers=4)

for epoch in range(2):
    # iterate the DataLoader directly; don't shadow the global x and y
    for inputs, target in loader:
        output = model(inputs)  # calls Net.forward
        loss = criterion(output, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

model.eval()
# autograd.Variable is no longer needed in modern PyTorch
test_x = torch.from_numpy(np.arange(5000, 5050).reshape(50, 1)).float()
predict = model(test_x).data.numpy()
print(list(model.parameters()))
```
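A side note on that data-handling pitfall: the learning rate above has to be microscopic only because the raw inputs span 0–4999, which makes the gradients enormous. A minimal sketch (my own variation, not from the original script) of the same linear fit with inputs scaled into [0, 1), where an ordinary learning rate works:

```python
import numpy as np
import torch
from torch import nn, optim

# fit y = 5x + 3, but with inputs scaled down to [0, 1)
x = torch.from_numpy(np.arange(5000).reshape(5000, 1) / 5000).float()
y = 5 * x + 3

model = nn.Linear(1, 1)
# with normalized inputs a normal learning rate is stable
optimizer = optim.SGD(model.parameters(), lr=0.3)
criterion = nn.MSELoss()

for epoch in range(1000):
    optimizer.zero_grad()
    loss = criterion(model(x), y)  # full-batch gradient descent
    loss.backward()
    optimizer.step()

# the weight should approach 5 and the bias 3
w, b = model.weight.item(), model.bias.item()
```

The point is that scaling the inputs conditions the loss surface, so you no longer need to compensate with an lr of 7e-8.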

