This article explains how to use sample weights (sample_weight) in PyTorch. The editor finds it quite practical and shares it here for reference; follow along and take a look.
Steps:
1. Convert the labels to one-hot form.
2. In each one-hot label, replace the 1 with the desired sample-weight value.
That is all it takes to use sample weights in PyTorch.
Example:
For a single sample the cross-entropy loss is loss = -Q * log(P), e.g.:
P = [0.1, 0.2, 0.4, 0.3], Q = [0, 0, 1, 0], loss = -Q * np.log(P)
Adding a sample weight turns this into loss = -Q * log(P) * sample_weight:
P = [0.1, 0.2, 0.4, 0.3], Q = [0, 0, sample_weight, 0], loss_sample_weight = -Q * np.log(P)
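The arithmetic above can be checked directly in NumPy; the sample_weight value of 0.5 below is an assumed example value:

```python
import numpy as np

# Values from the example above: P is the predicted distribution,
# Q is the one-hot target.
P = np.array([0.1, 0.2, 0.4, 0.3])
Q = np.array([0, 0, 1, 0])

loss = -np.sum(Q * np.log(P))  # standard cross-entropy: -log(0.4)

# Replacing the 1 in Q with the sample weight scales the loss.
sample_weight = 0.5  # assumed example value
Q_weighted = np.array([0, 0, sample_weight, 0])
loss_weighted = -np.sum(Q_weighted * np.log(P))
```

The weighted loss is exactly sample_weight times the plain loss, which is why editing the one-hot label is enough.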
An example program in PyTorch:
import random
import numpy as np
import torch
import torch.utils.data as dataf
from tensorflow.keras.utils import to_categorical

batch_size = 32  # not specified in the original; adjust as needed

train_data = np.load(open('train_data.npy', 'rb'))

# 8 classes, 100 samples per class
train_labels = []
for i in range(8):
    train_labels += [i] * 100
train_labels = np.array(train_labels)
train_labels = to_categorical(train_labels).astype("float32")

# draw a random weight in [0, 1) for every sample and write it
# into the one-hot slot of that sample's class
sample_1 = [random.random() for i in range(len(train_data))]
for i in range(len(train_data)):
    floor = i // 100  # integer class index (i / 100 would be a float in Python 3)
    train_labels[i][floor] = sample_1[i]

train_data = torch.from_numpy(train_data)
train_labels = torch.from_numpy(train_labels)
dataset = dataf.TensorDataset(train_data, train_labels)
trainloader = dataf.DataLoader(dataset, batch_size=batch_size, shuffle=True)
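The to_categorical call above is Keras's utility; if you would rather avoid a Keras dependency, a minimal NumPy equivalent (assuming integer labels starting at 0) looks like this:

```python
import numpy as np

def to_categorical(labels, num_classes=None):
    # Minimal NumPy stand-in for Keras's to_categorical; assumes
    # labels are non-negative integers starting at 0.
    labels = np.asarray(labels, dtype=int)
    if num_classes is None:
        num_classes = labels.max() + 1
    one_hot = np.zeros((labels.shape[0], num_classes), dtype="float32")
    one_hot[np.arange(labels.shape[0]), labels] = 1.0
    return one_hot
```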
The corresponding multi-class cross-entropy loss function for single-label (one-hot) targets is:
def my_loss(outputs, targets):
    # subtract the per-row max for numerical stability before softmax
    output2 = outputs - torch.max(outputs, 1, keepdim=True)[0]
    # softmax; the 1e-10 keeps log(P) finite if a probability underflows
    P = torch.exp(output2) / torch.sum(torch.exp(output2), 1, keepdim=True) + 1e-10
    loss = -torch.mean(targets * torch.log(P))
    return loss
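A minimal end-to-end sketch of plugging this loss into one training step; the model, optimizer, feature size, and batch below are placeholder assumptions, and my_loss is repeated so the snippet runs on its own:

```python
import torch
import torch.nn as nn

def my_loss(outputs, targets):
    # same loss as above: stabilized softmax + weighted cross-entropy
    output2 = outputs - torch.max(outputs, 1, keepdim=True)[0]
    P = torch.exp(output2) / torch.sum(torch.exp(output2), 1, keepdim=True) + 1e-10
    return -torch.mean(targets * torch.log(P))

torch.manual_seed(0)
model = nn.Linear(16, 8)  # placeholder model: 16 features, 8 classes
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

inputs = torch.randn(4, 16)  # a fake mini-batch of 4 samples
targets = torch.zeros(4, 8)
# weighted "one-hot" labels: the 1 is replaced by each sample's weight
targets[[0, 1, 2, 3], [0, 1, 2, 3]] = torch.tensor([1.0, 0.5, 0.8, 1.0])

optimizer.zero_grad()
loss = my_loss(model(inputs), targets)
loss.backward()
optimizer.step()
```

In a real run you would loop over trainloader and feed each (data, labels) batch through the same three optimizer calls.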
Thanks for reading! That wraps up "how to use sample weights (sample_weight) in PyTorch". Hopefully the content above is of some help; if you found the article useful, feel free to share it so more people can see it.