

Master PyTorch in One Hour: Neural Network Classification


Overview

The MNIST handwritten digit dataset was already described in detail in the TensorFlow series, so we won't repeat that here. Interested readers can refer to the earlier article: https://www.jb51.net/article/222183.htm

In the previous installment we used PyTorch for a regression task; in this one we use PyTorch to solve a classification task.

Imports

import torchvision
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
import matplotlib.pyplot as plt

Set the Hyperparameters

# Set hyperparameters
n_epochs = 3
batch_size_train = 64
batch_size_test = 1000
learning_rate = 0.01
momentum = 0.5
log_interval = 100  # log every 100 batches, matching the training output below
random_seed = 1
torch.manual_seed(random_seed)
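
torch.manual_seed makes runs repeatable on the CPU. If you later move the model to a GPU, cuDNN can still select nondeterministic kernels, so it is common to pin those down as well. A minimal sketch; these two flags are an addition of mine, not part of the article's code:

# Only relevant on GPU: force cuDNN onto deterministic kernels.
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False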

Load the Data

# Load the data
train_loader = torch.utils.data.DataLoader(
    torchvision.datasets.MNIST('./data/', train=True, download=True,
                               transform=torchvision.transforms.Compose([
                                   torchvision.transforms.ToTensor(),
                                   torchvision.transforms.Normalize(
                                       (0.1307,), (0.3081,))
                               ])),
    batch_size=batch_size_train, shuffle=True)
    
test_loader = torch.utils.data.DataLoader(
    torchvision.datasets.MNIST('./data/', train=False, download=True,
                               transform=torchvision.transforms.Compose([
                                   torchvision.transforms.ToTensor(),
                                   torchvision.transforms.Normalize(
                                       (0.1307,), (0.3081,))
                               ])),
    batch_size=batch_size_test, shuffle=True)
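
The values (0.1307,) and (0.3081,) passed to Normalize are the global mean and standard deviation of the MNIST training pixels. If you want to verify them yourself, a quick sketch (variable names are my own; it loads the training set once without normalization):

# Compute the global mean/std of the MNIST training images.
raw_train = torchvision.datasets.MNIST(
    './data/', train=True, download=True,
    transform=torchvision.transforms.ToTensor())
pixels = torch.stack([img for img, _ in raw_train])  # shape: [60000, 1, 28, 28]
print(pixels.mean().item(), pixels.std().item())     # ~0.1307, ~0.3081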

examples = enumerate(test_loader)
batch_idx, (example_data, example_targets) = next(examples)

# Debug output
print(example_targets)
print(example_data.shape)

Output (example_data has shape [batch, channels, height, width]: 1,000 single-channel 28×28 images):
tensor([7, 6, 7, 5, 6, 7, 8, 1, 1, 2, 4, 1, 0, 8, 4, 4, 4, 9, 8, 1, 3, 3, 8, 6,
2, 7, 5, 1, 6, 5, 6, 2, 9, 2, 8, 4, 9, 4, 8, 6, 7, 7, 9, 8, 4, 9, 5, 3,
1, 0, 9, 1, 7, 3, 7, 0, 9, 2, 5, 1, 8, 9, 3, 7, 8, 4, 1, 9, 0, 3, 1, 2,
3, 6, 2, 9, 9, 0, 3, 8, 3, 0, 8, 8, 5, 3, 8, 2, 8, 5, 5, 7, 1, 5, 5, 1,
0, 9, 7, 5, 2, 0, 7, 6, 1, 2, 2, 7, 5, 4, 7, 3, 0, 6, 7, 5, 1, 7, 6, 7,
2, 1, 9, 1, 9, 2, 7, 6, 8, 8, 8, 4, 6, 0, 0, 2, 3, 0, 1, 7, 8, 7, 4, 1,
3, 8, 3, 5, 5, 9, 6, 0, 5, 3, 3, 9, 4, 0, 1, 9, 9, 1, 5, 6, 2, 0, 4, 7,
3, 5, 8, 8, 2, 5, 9, 5, 0, 7, 8, 9, 3, 8, 5, 3, 2, 4, 4, 6, 3, 0, 8, 2,
7, 0, 5, 2, 0, 6, 2, 6, 3, 6, 6, 7, 9, 3, 4, 1, 6, 2, 8, 4, 7, 7, 2, 7,
4, 2, 4, 9, 7, 7, 5, 9, 1, 3, 0, 4, 4, 8, 9, 6, 6, 5, 3, 3, 2, 3, 9, 1,
1, 4, 4, 8, 1, 5, 1, 8, 8, 0, 7, 5, 8, 4, 0, 0, 0, 6, 3, 0, 9, 0, 6, 6,
9, 8, 1, 2, 3, 7, 6, 1, 5, 9, 3, 9, 3, 2, 5, 9, 9, 5, 4, 9, 3, 9, 6, 0,
3, 3, 8, 3, 1, 4, 1, 4, 7, 3, 1, 6, 8, 4, 7, 7, 3, 3, 6, 1, 3, 2, 3, 5,
9, 9, 9, 2, 9, 0, 2, 7, 0, 7, 5, 0, 2, 6, 7, 3, 7, 1, 4, 6, 4, 0, 0, 3,
2, 1, 9, 3, 5, 5, 1, 6, 4, 7, 4, 6, 4, 4, 9, 7, 4, 1, 5, 4, 8, 7, 5, 9,
2, 9, 4, 0, 8, 7, 3, 4, 2, 7, 9, 4, 4, 0, 1, 4, 1, 2, 5, 2, 8, 5, 3, 9,
1, 3, 5, 1, 9, 5, 3, 6, 8, 1, 7, 9, 9, 9, 9, 9, 2, 3, 5, 1, 4, 2, 3, 1,
1, 3, 8, 2, 8, 1, 9, 2, 9, 0, 7, 3, 5, 8, 3, 7, 8, 5, 6, 4, 1, 9, 7, 1,
7, 1, 1, 8, 6, 7, 5, 6, 7, 4, 9, 5, 8, 6, 5, 6, 8, 4, 1, 0, 9, 1, 4, 3,
5, 1, 8, 7, 5, 4, 6, 6, 0, 2, 4, 2, 9, 5, 9, 8, 1, 4, 8, 1, 1, 6, 7, 5,
9, 1, 1, 7, 8, 7, 5, 5, 2, 6, 5, 8, 1, 0, 7, 2, 2, 4, 3, 9, 7, 3, 5, 7,
6, 9, 5, 9, 6, 5, 7, 2, 3, 7, 2, 9, 7, 4, 8, 4, 9, 3, 8, 7, 5, 0, 0, 3,
4, 3, 3, 6, 0, 1, 7, 7, 4, 6, 3, 0, 8, 0, 9, 8, 2, 4, 2, 9, 4, 9, 9, 9,
7, 7, 6, 8, 2, 4, 9, 3, 0, 4, 4, 1, 5, 7, 7, 6, 9, 7, 0, 2, 4, 2, 1, 4,
7, 4, 5, 1, 4, 7, 3, 1, 7, 6, 9, 0, 0, 7, 3, 6, 3, 3, 6, 5, 8, 1, 7, 1,
6, 1, 2, 3, 1, 6, 8, 8, 7, 4, 3, 7, 7, 1, 8, 9, 2, 6, 6, 6, 2, 8, 8, 1,
6, 0, 3, 0, 5, 1, 3, 2, 4, 1, 5, 5, 7, 3, 5, 6, 2, 1, 8, 0, 2, 0, 8, 4,
4, 5, 0, 0, 1, 5, 0, 7, 4, 0, 9, 2, 5, 7, 4, 0, 3, 7, 0, 3, 5, 1, 0, 6,
4, 7, 6, 4, 7, 0, 0, 5, 8, 2, 0, 6, 2, 4, 2, 3, 2, 7, 7, 6, 9, 8, 5, 9,
7, 1, 3, 4, 3, 1, 8, 0, 3, 0, 7, 4, 9, 0, 8, 1, 5, 7, 3, 2, 2, 0, 7, 3,
1, 8, 8, 2, 2, 6, 2, 7, 6, 6, 9, 4, 9, 3, 7, 0, 4, 6, 1, 9, 7, 4, 4, 5,
8, 2, 3, 2, 4, 9, 1, 9, 6, 7, 1, 2, 1, 1, 2, 6, 9, 7, 1, 0, 1, 4, 2, 7,
7, 8, 3, 2, 8, 2, 7, 6, 1, 1, 9, 1, 0, 9, 1, 3, 9, 3, 7, 6, 5, 6, 2, 0,
0, 3, 9, 4, 7, 3, 2, 9, 0, 9, 5, 2, 2, 4, 1, 6, 3, 4, 0, 1, 6, 9, 1, 7,
0, 8, 0, 0, 9, 8, 5, 9, 4, 4, 7, 1, 9, 0, 0, 2, 4, 3, 5, 0, 4, 0, 1, 0,
5, 8, 1, 8, 3, 3, 2, 1, 2, 6, 8, 2, 5, 3, 7, 9, 3, 6, 2, 2, 6, 2, 7, 7,
6, 1, 8, 0, 3, 5, 7, 5, 0, 8, 6, 7, 2, 4, 1, 4, 3, 7, 7, 2, 9, 3, 5, 5,
9, 4, 8, 7, 6, 7, 4, 9, 2, 7, 7, 1, 0, 7, 2, 8, 0, 3, 5, 4, 5, 1, 5, 7,
6, 7, 3, 5, 3, 4, 5, 3, 4, 3, 2, 3, 1, 7, 4, 4, 8, 5, 5, 3, 2, 2, 9, 5,
8, 2, 0, 6, 0, 7, 9, 9, 6, 1, 6, 6, 2, 3, 7, 4, 7, 5, 2, 9, 4, 2, 9, 0,
8, 1, 7, 5, 5, 7, 0, 5, 2, 9, 5, 2, 3, 4, 6, 0, 0, 2, 9, 2, 0, 5, 4, 8,
9, 0, 9, 1, 3, 4, 1, 8, 0, 0, 4, 0, 8, 5, 9, 8])
torch.Size([1000, 1, 28, 28])

Visualization

# Plot the first 6 test samples
fig = plt.figure()
for i in range(6):
    plt.subplot(2, 3, i + 1)
    plt.tight_layout()
    plt.imshow(example_data[i][0], cmap='gray', interpolation='none')
    plt.title("Ground Truth: {}".format(example_targets[i]))
    plt.xticks([])
    plt.yticks([])
plt.show()

Output: a 2×3 grid of the first six test digits, each titled with its ground-truth label (figure omitted).

Build the Model

# Define the model
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = nn.Dropout2d()
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = F.relu(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)  # dim must be given explicitly in current PyTorch


network = Net()
optimizer = optim.SGD(network.parameters(), lr=learning_rate,
                      momentum=momentum)
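
The in_features=320 of fc1 follows from the convolution arithmetic: a 28×28 input shrinks to 24×24 after conv1 (5×5 kernel, no padding), to 12×12 after 2×2 max pooling, to 8×8 after conv2, and to 4×4 after the second pooling, giving 20 × 4 × 4 = 320 flattened features. You can confirm this by tracing a dummy batch through the layers; a sketch I added for illustration:

# Trace tensor shapes through the conv/pool stack with a dummy batch.
x = torch.zeros(1, 1, 28, 28)
x = F.max_pool2d(network.conv1(x), 2)
print(x.shape)  # torch.Size([1, 10, 12, 12])
x = F.max_pool2d(network.conv2(x), 2)
print(x.shape)  # torch.Size([1, 20, 4, 4]) -> 20 * 4 * 4 = 320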

Train the Model

# Training
train_losses = []
train_counter = []
test_losses = []
test_counter = [i * len(train_loader.dataset) for i in range(n_epochs + 1)]


def train(epoch):
    network.train()
    for batch_idx, (data, target) in enumerate(train_loader):
        optimizer.zero_grad()
        output = network(data)
        loss = F.nll_loss(output, target)
        loss.backward()
        optimizer.step()
        if batch_idx % log_interval == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                       100. * batch_idx / len(train_loader), loss.item()))
            train_losses.append(loss.item())
            train_counter.append(
                (batch_idx * batch_size_train) + ((epoch - 1) * len(train_loader.dataset)))
            torch.save(network.state_dict(), './model.pth')
            torch.save(optimizer.state_dict(), './optimizer.pth')


def test():
    network.eval()
    test_loss = 0
    correct = 0
    with torch.no_grad():
        for data, target in test_loader:
            output = network(data)
            test_loss += F.nll_loss(output, target, reduction='sum').item()  # size_average is deprecated
            pred = output.data.max(1, keepdim=True)[1]
            correct += pred.eq(target.data.view_as(pred)).sum().item()
    test_loss /= len(test_loader.dataset)
    test_losses.append(test_loss)
    print('\nTest set: Avg. loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))


for epoch in range(1, n_epochs + 1):
    train(epoch)
    test()

Output:
Train Epoch: 1 [0/60000 (0%)] Loss: 2.297471
Train Epoch: 1 [6400/60000 (11%)] Loss: 1.934886
Train Epoch: 1 [12800/60000 (21%)] Loss: 1.242982
Train Epoch: 1 [19200/60000 (32%)] Loss: 0.979296
Train Epoch: 1 [25600/60000 (43%)] Loss: 1.277279
Train Epoch: 1 [32000/60000 (53%)] Loss: 0.721533
Train Epoch: 1 [38400/60000 (64%)] Loss: 0.759595
Train Epoch: 1 [44800/60000 (75%)] Loss: 0.469635
Train Epoch: 1 [51200/60000 (85%)] Loss: 0.422614
Train Epoch: 1 [57600/60000 (96%)] Loss: 0.417603

Test set: Avg. loss: 0.1988, Accuracy: 9431/10000 (94%)

Train Epoch: 2 [0/60000 (0%)] Loss: 0.277207
Train Epoch: 2 [6400/60000 (11%)] Loss: 0.328862
Train Epoch: 2 [12800/60000 (21%)] Loss: 0.396312
Train Epoch: 2 [19200/60000 (32%)] Loss: 0.301772
Train Epoch: 2 [25600/60000 (43%)] Loss: 0.253600
Train Epoch: 2 [32000/60000 (53%)] Loss: 0.217821
Train Epoch: 2 [38400/60000 (64%)] Loss: 0.395815
Train Epoch: 2 [44800/60000 (75%)] Loss: 0.265737
Train Epoch: 2 [51200/60000 (85%)] Loss: 0.323627
Train Epoch: 2 [57600/60000 (96%)] Loss: 0.236692

Test set: Avg. loss: 0.1233, Accuracy: 9622/10000 (96%)

Train Epoch: 3 [0/60000 (0%)] Loss: 0.500148
Train Epoch: 3 [6400/60000 (11%)] Loss: 0.338118
Train Epoch: 3 [12800/60000 (21%)] Loss: 0.452308
Train Epoch: 3 [19200/60000 (32%)] Loss: 0.374940
Train Epoch: 3 [25600/60000 (43%)] Loss: 0.323300
Train Epoch: 3 [32000/60000 (53%)] Loss: 0.203830
Train Epoch: 3 [38400/60000 (64%)] Loss: 0.379557
Train Epoch: 3 [44800/60000 (75%)] Loss: 0.334822
Train Epoch: 3 [51200/60000 (85%)] Loss: 0.361676
Train Epoch: 3 [57600/60000 (96%)] Loss: 0.218833

Test set: Avg. loss: 0.0911, Accuracy: 9723/10000 (97%)
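
The loop records train_losses/train_counter and test_losses/test_counter but never plots them. If you want to see the loss curves, something like the sketch below works; note that test_counter has n_epochs + 1 entries while test() only runs once per epoch here, so either call test() once before training or skip the first counter entry as done below:

# Plot per-batch training loss and per-epoch test loss.
fig = plt.figure()
plt.plot(train_counter, train_losses, color='blue')
plt.scatter(test_counter[1:], test_losses, color='red')  # [1:] because test() was not run before epoch 1
plt.legend(['Train Loss', 'Test Loss'], loc='upper right')
plt.xlabel('number of training examples seen')
plt.ylabel('negative log likelihood loss')
plt.show()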

Complete Code

import torchvision
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
import matplotlib.pyplot as plt

# Set hyperparameters
n_epochs = 3
batch_size_train = 64
batch_size_test = 1000
learning_rate = 0.01
momentum = 0.5
log_interval = 100
random_seed = 1
torch.manual_seed(random_seed)

# Load the data
train_loader = torch.utils.data.DataLoader(
    torchvision.datasets.MNIST('./data/', train=True, download=True,
                               transform=torchvision.transforms.Compose([
                                   torchvision.transforms.ToTensor(),
                                   torchvision.transforms.Normalize(
                                       (0.1307,), (0.3081,))
                               ])),
    batch_size=batch_size_train, shuffle=True)

test_loader = torch.utils.data.DataLoader(
    torchvision.datasets.MNIST('./data/', train=False, download=True,
                               transform=torchvision.transforms.Compose([
                                   torchvision.transforms.ToTensor(),
                                   torchvision.transforms.Normalize(
                                       (0.1307,), (0.3081,))
                               ])),
    batch_size=batch_size_test, shuffle=True)

examples = enumerate(test_loader)
batch_idx, (example_data, example_targets) = next(examples)

# Debug output
print(example_targets)
print(example_data.shape)

# Plot the first 6 test samples
fig = plt.figure()
for i in range(6):
    plt.subplot(2, 3, i + 1)
    plt.tight_layout()
    plt.imshow(example_data[i][0], cmap='gray', interpolation='none')
    plt.title("Ground Truth: {}".format(example_targets[i]))
    plt.xticks([])
    plt.yticks([])
plt.show()


# Define the model
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = nn.Dropout2d()
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = F.relu(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)  # dim must be given explicitly in current PyTorch


network = Net()
optimizer = optim.SGD(network.parameters(), lr=learning_rate,
                      momentum=momentum)

# Training
train_losses = []
train_counter = []
test_losses = []
test_counter = [i * len(train_loader.dataset) for i in range(n_epochs + 1)]


def train(epoch):
    network.train()
    for batch_idx, (data, target) in enumerate(train_loader):
        optimizer.zero_grad()
        output = network(data)
        loss = F.nll_loss(output, target)
        loss.backward()
        optimizer.step()
        if batch_idx % log_interval == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                       100. * batch_idx / len(train_loader), loss.item()))
            train_losses.append(loss.item())
            train_counter.append(
                (batch_idx * batch_size_train) + ((epoch - 1) * len(train_loader.dataset)))
            torch.save(network.state_dict(), './model.pth')
            torch.save(optimizer.state_dict(), './optimizer.pth')


def test():
    network.eval()
    test_loss = 0
    correct = 0
    with torch.no_grad():
        for data, target in test_loader:
            output = network(data)
            test_loss += F.nll_loss(output, target, reduction='sum').item()  # size_average is deprecated
            pred = output.data.max(1, keepdim=True)[1]
            correct += pred.eq(target.data.view_as(pred)).sum().item()
    test_loss /= len(test_loader.dataset)
    test_losses.append(test_loss)
    print('\nTest set: Avg. loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))


for epoch in range(1, n_epochs + 1):
    train(epoch)
    test()
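
train() saves network.state_dict() and optimizer.state_dict() to ./model.pth and ./optimizer.pth on every log step, but the article never shows how to load them back. Resuming amounts to rebuilding the objects and restoring the state dicts; a minimal sketch:

# Rebuild the model and optimizer, then restore the saved state.
continued_network = Net()
continued_optimizer = optim.SGD(continued_network.parameters(),
                                lr=learning_rate, momentum=momentum)
continued_network.load_state_dict(torch.load('./model.pth'))
continued_optimizer.load_state_dict(torch.load('./optimizer.pth'))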

That concludes this article on neural network classification with PyTorch. For more on the topic, search 腳本之家 (jb51.net) for earlier articles or browse the related articles below, and thank you for your continued support.

You may also be interested in:
  • Master PyTorch in One Hour: The autograd Mechanism
  • Master PyTorch in One Hour: Neural Network Temperature Prediction
  • Master PyTorch in One Hour: Image Recognition in Practice
  • Master PyTorch in One Hour: Basic Operations
