Notes compiled from Long Liangqu's PyTorch tutorial videos. Video links:
【计算机-AI】PyTorch学这个就够了!
(好课推荐)深度学习与PyTorch入门实战——主讲人龙良曲
40. Time Series Representation
Sequence representation
- [seq_len, feature_len]
- [word, word_vec]
One-hot encoding gives a sparse, high-dimensional [words, word_vec] representation with no notion of semantic similarity.
Dense embeddings (word2vec vs GloVe) map words to low-dimensional vectors in which semantically similar words lie close together.
import torch
import torch.nn as nn
from torchnlp.word_to_vector import GloVe

# Map words to indices, then look them up in a trainable embedding table
word_to_idx = {'hello': 0, 'world': 1}
lookup_tensor = torch.tensor([word_to_idx['hello']], dtype=torch.long)
embeds = nn.Embedding(2, 5)  # vocab of 2 words, embedding dim 5, randomly initialized
hello_embed = embeds(lookup_tensor)
print(hello_embed)
"""
tensor([[ 0.2565, -0.2827, -0.0259, -1.9533, 0.8330]],
grad_fn=<EmbeddingBackward>)
"""
# Query a pre-trained GloVe vector (the vectors are downloaded on first use)
vectors = GloVe()
print(vectors['hello'])
Install the torchnlp package: pip install pytorch-nlp
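The video does not show how the pre-trained vectors end up inside a network. As a minimal sketch (my own, not from the video), GloVe vectors can be copied into an nn.Embedding via from_pretrained so the lookup table starts from meaningful values; the two-word vocabulary below is an illustrative assumption.

import torch
import torch.nn as nn
from torchnlp.word_to_vector import GloVe

vocab = ['hello', 'world']  # illustrative vocabulary; normally built from the dataset
vectors = GloVe()           # 300-dim vectors by default

# Stack the pre-trained vectors into a [vocab_size, 300] weight matrix
weight = torch.stack([vectors[w] for w in vocab])

# freeze=False keeps the embedding trainable so it can be fine-tuned
embedding = nn.Embedding.from_pretrained(weight, freeze=False)
print(embedding(torch.tensor([0])).shape)  # torch.Size([1, 300])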
41. Recurrent Neural Networks
Key ideas: weight sharing (the same weight matrices are applied at every time step) and a consistent memory, the hidden state h carried along the sequence; a minimal sketch of one such recurrent step follows.
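As an illustration of the recurrence itself (my own sketch, with made-up sizes): h_t = tanh(x_t @ W_xh + h_{t-1} @ W_hh + b), where the same W_xh and W_hh are reused at every step and h accumulates the memory.

import torch

feature_len, hidden_len, batch, seq_len = 100, 20, 3, 10

# One shared set of parameters for every time step (weight sharing)
W_xh = torch.randn(feature_len, hidden_len) * 0.01
W_hh = torch.randn(hidden_len, hidden_len) * 0.01
b = torch.zeros(hidden_len)

x = torch.randn(seq_len, batch, feature_len)
h = torch.zeros(batch, hidden_len)  # the consistent memory, updated step by step

for xt in x:  # xt: [batch, feature_len]
    h = torch.tanh(xt @ W_xh + h @ W_hh + b)

print(h.shape)  # torch.Size([3, 20])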
42. Using the RNN Layer
nn.RNN(input_size, hidden_size): here input dim 100, hidden dim 10.
rnn = nn.RNN(100, 10)
print(rnn._parameters.keys())  # weight_ih_l0, weight_hh_l0, bias_ih_l0, bias_hh_l0
print(rnn.weight_hh_l0.shape, rnn.weight_ih_l0.shape)  # [10, 10], [10, 100]
print(rnn.bias_hh_l0.shape, rnn.bias_ih_l0.shape)      # [10], [10]
42.1 nn.RNN
- __init__(input_size, hidden_size, num_layers)
- out, ht = forward(x, h0)
  - x: [seq_len, b, word_vec]
  - h0/ht: [num_layers, b, h_dim]
  - out: [seq_len, b, h_dim]
Single layer RNN
rnn = nn.RNN(input_size=100, hidden_size=20, num_layers=1)
print(rnn)
x = torch.randn(10, 3, 100)  # [seq_len=10, batch=3, word_vec=100]
out, h = rnn(x, torch.zeros(1, 3, 20))  # h0: [num_layers=1, batch=3, h_dim=20]
print(out.shape, h.shape)  # torch.Size([10, 3, 20]) torch.Size([1, 3, 20])
h holds the memory (hidden) state of every layer at the final time step; out holds the last layer's memory state at every time step.
2 layer RNN
rnn = nn.RNN(100, 10, num_layers=2)
print(rnn._parameters.keys())  # now contains both ..._l0 and ..._l1 parameters
print(rnn.weight_hh_l0.shape, rnn.weight_ih_l0.shape)  # [10, 10], [10, 100]
print(rnn.bias_hh_l0.shape, rnn.bias_ih_l0.shape)      # [10], [10]; weight_ih_l1 is [10, 10] since layer 1 takes layer 0's output
out: [T, b, h_dim], h: [num_layers, b, h_dim]
rnn = nn.RNN(input_size=100, hidden_size=20, num_layers=4)
print(rnn)
x = torch.randn(10, 3, 100)
out, h = rnn(x)  # h0 defaults to zeros when omitted
print(out.shape, h.shape)  # torch.Size([10, 3, 20]) torch.Size([4, 3, 20])
42.2 nn.RNNCell
- __init__(input_size, hidden_size)
- ht = rnncell(xt, ht_1)
  - xt: [b, word_vec]
  - ht_1/ht: [b, h_dim]
  - out = torch.stack([h1, h2, …, ht])
Functional style: stepping two stacked cells manually
x = torch.randn(10, 3, 100)  # [seq_len, batch, word_vec]
cell1 = nn.RNNCell(100, 30)  # layer 1: 100 -> 30
cell2 = nn.RNNCell(30, 20)   # layer 2: 30 -> 20
h1 = torch.zeros(3, 30)
h2 = torch.zeros(3, 20)
for xt in x:
    h1 = cell1(xt, h1)
    h2 = cell2(h1, h2)
print(h1.shape)  # torch.Size([3, 30])
print(h2.shape)  # torch.Size([3, 20])
43. Time Series Prediction
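The notes leave this section's code empty. The video's example predicts the next point of a sine curve with a small RNN; the sketch below is my own reconstruction of that idea, with the hidden size, learning rate, and sampling scheme as illustrative assumptions rather than the video's exact code.

import numpy as np
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, hidden_size=16):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden_size, num_layers=1, batch_first=True)
        self.linear = nn.Linear(hidden_size, 1)

    def forward(self, x, h):
        out, h = self.rnn(x, h)   # out: [batch, seq, hidden]
        out = self.linear(out)    # predict the next value at every step
        return out, h

model = Net()
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

h = torch.zeros(1, 1, 16)  # [num_layers, batch, hidden]
for it in range(300):
    # sample a random window of a sine curve; y is the curve shifted one step ahead of x
    start = np.random.rand()
    t = np.linspace(start, start + 10, 50)
    data = np.sin(t).astype(np.float32)
    x = torch.from_numpy(data[:-1]).view(1, -1, 1)
    y = torch.from_numpy(data[1:]).view(1, -1, 1)

    pred, h = model(x, h.detach())  # detach so gradients do not flow across iterations
    loss = criterion(pred, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print('final loss:', loss.item())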
44. RNN Training Difficulties
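The notes give no detail here. This part of the course concerns exploding and vanishing gradients caused by repeatedly multiplying through W_hh; the usual remedy for the exploding case is to clip the gradient norm before the optimizer step. A minimal sketch of where clipping goes (the model, data, and threshold are illustrative assumptions):

import torch
import torch.nn as nn

# Illustrative model and data, just to show where gradient clipping is applied
rnn = nn.RNN(input_size=100, hidden_size=20)
linear = nn.Linear(20, 1)
params = list(rnn.parameters()) + list(linear.parameters())
optimizer = torch.optim.SGD(params, lr=1e-2)
criterion = nn.MSELoss()

x = torch.randn(10, 3, 100)  # [seq_len, batch, word_vec]
y = torch.randn(3, 1)

out, h = rnn(x)
loss = criterion(linear(out[-1]), y)  # use the last time step's output

optimizer.zero_grad()
loss.backward()
# Clip the global gradient norm to tame exploding gradients before the update
torch.nn.utils.clip_grad_norm_(params, max_norm=10)
optimizer.step()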
45. Using the LSTM Layer
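The notes leave this section's code empty. Below is a minimal sketch of nn.LSTM and nn.LSTMCell usage, analogous to the nn.RNN examples above (all sizes are illustrative); unlike the plain RNN, the LSTM carries two states, the hidden state h and the cell state c.

import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=100, hidden_size=20, num_layers=4)
print(lstm)

x = torch.randn(10, 3, 100)         # [seq_len, batch, word_vec]
out, (h, c) = lstm(x)               # h and c default to zeros when omitted
print(out.shape, h.shape, c.shape)  # [10, 3, 20], [4, 3, 20], [4, 3, 20]

# nn.LSTMCell: step through the sequence manually
cell = nn.LSTMCell(input_size=100, hidden_size=20)
h_t = torch.zeros(3, 20)
c_t = torch.zeros(3, 20)
for xt in x:
    h_t, c_t = cell(xt, (h_t, c_t))
print(h_t.shape, c_t.shape)         # [3, 20], [3, 20]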
46. Sentiment Classification in Practice
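The notes leave this section's code empty. The video builds a binary sentiment classifier (IMDB) from an embedding layer plus an LSTM; the sketch below is only an illustrative model definition under assumed vocabulary size and dimensions, and omits the dataset loading, GloVe initialization, and training loop.

import torch
import torch.nn as nn

class RNNClassifier(nn.Module):
    def __init__(self, vocab_size=10000, embedding_dim=100, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.rnn = nn.LSTM(embedding_dim, hidden_dim, num_layers=2,
                           bidirectional=True, dropout=0.5)
        self.fc = nn.Linear(hidden_dim * 2, 1)  # *2 for the two directions
        self.dropout = nn.Dropout(0.5)

    def forward(self, x):
        # x: [seq_len, batch] of word indices
        embedded = self.dropout(self.embedding(x))   # [seq_len, batch, emb]
        output, (hidden, cell) = self.rnn(embedded)  # hidden: [num_layers*2, batch, hid]
        # concatenate the final forward and backward hidden states of the last layer
        hidden = torch.cat([hidden[-2], hidden[-1]], dim=1)
        return self.fc(self.dropout(hidden))         # [batch, 1] logits

model = RNNClassifier()
x = torch.randint(0, 10000, (50, 3))  # fake batch: seq_len=50, batch=3
print(model(x).shape)                 # torch.Size([3, 1])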