
PyTorch Learning Notes - Convolutional Seq2Seq (Model Training)

toyiye 2024-08-27 21:55

PyTorch Learning Notes - torchtext and PyTorch, Example 5

0. PyTorch Seq2Seq Project Overview

After working through the basics of torchtext, I found this tutorial, "Understanding and Implementing Seq2Seq Models with PyTorch and torchtext". The project consists of six sub-projects:

  1. ~~Sequence to Sequence Learning with Neural Networks~~
  2. ~~Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation~~
  3. ~~Neural Machine Translation by Jointly Learning to Align and Translate~~
  4. ~~Packed Padded Sequences, Masking and Inference~~
  5. Convolutional Seq2Seq
  6. Transformer

5. Convolutional Seq2Seq

5.1 Preparing the Data

5.2 Building the Model

5.3 Training the Model

import math
import time

import torch
import torch.nn as nn
import torch.optim as optim

# SRC/TRG fields and the iterators come from 5.1; Encoder, Decoder and Seq2Seq from 5.2
INPUT_DIM = len(SRC.vocab)
OUTPUT_DIM = len(TRG.vocab)
EMB_DIM = 256
HID_DIM = 512
ENC_LAYERS = 10
DEC_LAYERS = 10
ENC_KERNEL_SIZE = 3
DEC_KERNEL_SIZE = 3
ENC_DROPOUT = 0.25
DEC_DROPOUT = 0.25
PAD_IDX = TRG.vocab.stoi['<pad>']

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

enc = Encoder(INPUT_DIM, EMB_DIM, HID_DIM, ENC_LAYERS, ENC_KERNEL_SIZE, ENC_DROPOUT, device)
dec = Decoder(OUTPUT_DIM, EMB_DIM, HID_DIM, DEC_LAYERS, DEC_KERNEL_SIZE, DEC_DROPOUT, PAD_IDX, device)
model = Seq2Seq(enc, dec, device).to(device)
model

Seq2Seq(
  (encoder): Encoder(
    (embedding): Embedding(7853, 256)
    (rnn): GRU(256, 512, bidirectional=True)
    (fc): Linear(in_features=1024, out_features=512, bias=True)
    (dropout): Dropout(p=0.5)
  )
  (decoder): Decoder(
    (attention): Attention(
      (attn): Linear(in_features=1536, out_features=512, bias=True)
    )
    (embedding): Embedding(5893, 256)
    (rnn): GRU(1280, 512)
    (out): Linear(in_features=1792, out_features=5893, bias=True)
    (dropout): Dropout(p=0.5)
  )
)

def count_parameters(model):
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

print(f'The model has {count_parameters(model):,} trainable parameters')

The model has 37,351,685 trainable parameters

optimizer = optim.Adam(model.parameters())
criterion = nn.CrossEntropyLoss(ignore_index=PAD_IDX)
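Target sentences inside a batch are padded to a common length, so the criterion is told to skip every position whose label is PAD_IDX. A minimal, self-contained illustration of that behaviour (the pad index and logits below are made up for the example):

import torch
import torch.nn as nn

pad_idx = 1  # hypothetical pad index, only for this illustration
loss_fn = nn.CrossEntropyLoss(ignore_index=pad_idx)

logits = torch.randn(4, 5)                        # 4 token positions, vocabulary of 5
targets = torch.tensor([2, 4, pad_idx, pad_idx])  # the last two positions are padding

# the padded positions contribute nothing: the loss is averaged over the 2 real tokens
print(loss_fn(logits, targets))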
def train(model, iterator, optimizer, criterion, clip):
    model.train()
    epoch_loss = 0
    for i, batch in enumerate(iterator):
        src = batch.src
        trg = batch.trg
        optimizer.zero_grad()
        output, _ = model(src, trg[:,:-1])
        # output = [batch size, trg sent len - 1, output dim]
        # trg = [batch size, trg sent len]
        output = output.contiguous().view(-1, output.shape[-1])
        trg = trg[:,1:].contiguous().view(-1)
        # output = [batch size * (trg sent len - 1), output dim]
        # trg = [batch size * (trg sent len - 1)]
        loss = criterion(output, trg)
        loss.backward()
        torch.nn.utils.clip_grad_norm_(model.parameters(), clip)
        optimizer.step()
        epoch_loss += loss.item()
    return epoch_loss / len(iterator)
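The slicing in train (and in evaluate below) implements the usual target shift: the decoder is fed the target sequence without its final token, and the loss is computed against the target without its initial <sos> token, which is why the output has trg sent len - 1 positions. A toy example of that shift, with made-up token indices:

import torch

# one target sentence: <sos> w1 w2 w3 <eos> (indices are made up)
trg = torch.tensor([[2, 10, 11, 12, 3]])

decoder_input = trg[:, :-1]   # tensor([[ 2, 10, 11, 12]]) -> fed to the decoder
loss_target = trg[:, 1:]      # tensor([[10, 11, 12,  3]]) -> compared with the output
print(decoder_input)
print(loss_target)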
def evaluate(model, iterator, criterion):
    model.eval()
    epoch_loss = 0
    with torch.no_grad():
        for i, batch in enumerate(iterator):
            src = batch.src
            trg = batch.trg
            output, _ = model(src, trg[:,:-1])
            # output = [batch size, trg sent len - 1, output dim]
            # trg = [batch size, trg sent len]
            output = output.contiguous().view(-1, output.shape[-1])
            trg = trg[:,1:].contiguous().view(-1)
            # output = [batch size * (trg sent len - 1), output dim]
            # trg = [batch size * (trg sent len - 1)]
            loss = criterion(output, trg)
            epoch_loss += loss.item()
    return epoch_loss / len(iterator)
def epoch_time(start_time, end_time):
    elapsed_time = end_time - start_time
    elapsed_mins = int(elapsed_time / 60)
    elapsed_secs = int(elapsed_time - (elapsed_mins * 60))
    return elapsed_mins, elapsed_secs
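As a quick sanity check, the 1m 6s timings in the logs below correspond to an elapsed time of roughly 66 seconds:

print(epoch_time(0.0, 66.0))  # (1, 6), printed as "1m 6s" by the training loop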
N_EPOCHS = 10
CLIP = 1
best_valid_loss = float('inf')

for epoch in range(N_EPOCHS):
    start_time = time.time()
    train_loss = train(model, train_iterator, optimizer, criterion, CLIP)
    valid_loss = evaluate(model, valid_iterator, criterion)
    end_time = time.time()
    epoch_mins, epoch_secs = epoch_time(start_time, end_time)
    if valid_loss < best_valid_loss:
        best_valid_loss = valid_loss
        torch.save(model.state_dict(), 'tut5-model.pt')
    print(f'Epoch: {epoch+1:02} | Time: {epoch_mins}m {epoch_secs}s')
    print(f'\tTrain Loss: {train_loss:.3f} | Train PPL: {math.exp(train_loss):7.3f}')
    print(f'\t Val. Loss: {valid_loss:.3f} | Val. PPL: {math.exp(valid_loss):7.3f}')
# 10 epochs
Epoch: 01 | Time: 1m 6s
 Train Loss: 4.154 | Train PPL: 63.715
 Val. Loss: 2.897 | Val. PPL: 18.116
Epoch: 02 | Time: 1m 6s
 Train Loss: 2.952 | Train PPL: 19.140
 Val. Loss: 2.368 | Val. PPL: 10.680
Epoch: 03 | Time: 1m 6s
 Train Loss: 2.556 | Train PPL: 12.884
 Val. Loss: 2.125 | Val. PPL: 8.370
Epoch: 04 | Time: 1m 6s
 Train Loss: 2.335 | Train PPL: 10.334
 Val. Loss: 1.987 | Val. PPL: 7.291
Epoch: 05 | Time: 1m 6s
 Train Loss: 2.193 | Train PPL: 8.966
 Val. Loss: 1.926 | Val. PPL: 6.862
Epoch: 06 | Time: 1m 6s
 Train Loss: 2.089 | Train PPL: 8.074
 Val. Loss: 1.878 | Val. PPL: 6.538
Epoch: 07 | Time: 1m 6s
 Train Loss: 2.011 | Train PPL: 7.470
 Val. Loss: 1.835 | Val. PPL: 6.264
Epoch: 08 | Time: 1m 6s
 Train Loss: 1.946 | Train PPL: 7.001
 Val. Loss: 1.818 | Val. PPL: 6.159
Epoch: 09 | Time: 1m 6s
 Train Loss: 1.890 | Train PPL: 6.621
 Val. Loss: 1.802 | Val. PPL: 6.064
Epoch: 10 | Time: 1m 6s
 Train Loss: 1.850 | Train PPL: 6.359
 Val. Loss: 1.790 | Val. PPL: 5.988
# Another 20 epochs, continuing from the run above
Epoch: 01 | Time: 1m 6s
 Train Loss: 1.815 | Train PPL: 6.144
 Val. Loss: 1.771 | Val. PPL: 5.880
Epoch: 02 | Time: 1m 6s
 Train Loss: 1.779 | Train PPL: 5.926
 Val. Loss: 1.753 | Val. PPL: 5.772
Epoch: 03 | Time: 1m 6s
 Train Loss: 1.751 | Train PPL: 5.759
 Val. Loss: 1.732 | Val. PPL: 5.651
Epoch: 04 | Time: 1m 6s
 Train Loss: 1.723 | Train PPL: 5.600
 Val. Loss: 1.735 | Val. PPL: 5.671
Epoch: 05 | Time: 1m 6s
 Train Loss: 1.700 | Train PPL: 5.472
 Val. Loss: 1.736 | Val. PPL: 5.672
Epoch: 06 | Time: 1m 6s
 Train Loss: 1.674 | Train PPL: 5.333
 Val. Loss: 1.721 | Val. PPL: 5.589
Epoch: 07 | Time: 1m 6s
 Train Loss: 1.651 | Train PPL: 5.211
 Val. Loss: 1.720 | Val. PPL: 5.587
Epoch: 08 | Time: 1m 6s
 Train Loss: 1.631 | Train PPL: 5.108
 Val. Loss: 1.720 | Val. PPL: 5.585
Epoch: 09 | Time: 1m 6s
 Train Loss: 1.613 | Train PPL: 5.020
 Val. Loss: 1.722 | Val. PPL: 5.596
Epoch: 10 | Time: 1m 6s
 Train Loss: 1.590 | Train PPL: 4.905
 Val. Loss: 1.708 | Val. PPL: 5.520
Epoch: 11 | Time: 1m 6s
 Train Loss: 1.579 | Train PPL: 4.848
 Val. Loss: 1.719 | Val. PPL: 5.577
Epoch: 12 | Time: 1m 6s
 Train Loss: 1.562 | Train PPL: 4.770
 Val. Loss: 1.728 | Val. PPL: 5.632
Epoch: 13 | Time: 1m 6s
 Train Loss: 1.552 | Train PPL: 4.719
 Val. Loss: 1.703 | Val. PPL: 5.493
Epoch: 14 | Time: 1m 6s
 Train Loss: 1.539 | Train PPL: 4.660
 Val. Loss: 1.723 | Val. PPL: 5.602
Epoch: 15 | Time: 1m 6s
 Train Loss: 1.526 | Train PPL: 4.598
 Val. Loss: 1.710 | Val. PPL: 5.529
Epoch: 16 | Time: 1m 6s
 Train Loss: 1.518 | Train PPL: 4.565
 Val. Loss: 1.704 | Val. PPL: 5.494
Epoch: 17 | Time: 1m 6s
 Train Loss: 1.517 | Train PPL: 4.560
 Val. Loss: 1.726 | Val. PPL: 5.616
Epoch: 18 | Time: 1m 6s
 Train Loss: 2.414 | Train PPL: 11.177
 Val. Loss: 2.562 | Val. PPL: 12.961
Epoch: 19 | Time: 1m 6s
 Train Loss: 2.830 | Train PPL: 16.952
 Val. Loss: 2.583 | Val. PPL: 13.240
Epoch: 20 | Time: 1m 6s
 Train Loss: 12.083 | Train PPL: 176818.618
 Val. Loss: 15.417 | Val. PPL: 4961313.167
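
The divergence in the last few epochs does not affect the saved weights, because a checkpoint is written only when the validation loss improves. A minimal sketch of reloading that best checkpoint and scoring the held-out test set; test_iterator is assumed to have been built in section 5.1 alongside train_iterator and valid_iterator:

# reload the checkpoint with the lowest validation loss and evaluate on the test set
model.load_state_dict(torch.load('tut5-model.pt'))
test_loss = evaluate(model, test_iterator, criterion)
print(f'| Test Loss: {test_loss:.3f} | Test PPL: {math.exp(test_loss):7.3f} |')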

Thanks to Colab; without it, this much computation would have worn out my laptop's GPU.
