
Concrete usage of next_batch in TensorFlow

Published: 2020-09-25 11:53:16  Source: 腳本之家  Author: 小妖精Fsky  Category: Development

This article describes the concrete usage of next_batch in TensorFlow and shares it here; the details are as follows:

Several different next_batch implementations are collected below. The article only explains these code snippets, as a reference to look back at later:

def next_batch(self, batch_size, fake_data=False):
  """Return the next `batch_size` examples from this data set."""
  if fake_data:
    fake_image = [1] * 784
    if self.one_hot:
      fake_label = [1] + [0] * 9
    else:
      fake_label = 0
    return [fake_image for _ in xrange(batch_size)], [
        fake_label for _ in xrange(batch_size)
    ]
  start = self._index_in_epoch
  self._index_in_epoch += batch_size
  if self._index_in_epoch > self._num_examples:  # index has run past the number of examples, so this pass over the corpus is finished and a new one begins
    # Finished epoch
    self._epochs_completed += 1
    # Shuffle the data
    perm = numpy.arange(self._num_examples)  # arange builds the index array [0, 1, ..., num_examples - 1]
    numpy.random.shuffle(perm)  # shuffle the indices in place
    self._images = self._images[perm]
    self._labels = self._labels[perm]
    # Start next epoch
    start = 0
    self._index_in_epoch = batch_size
    assert batch_size <= self._num_examples
  end = self._index_in_epoch
  return self._images[start:end], self._labels[start:end]

This snippet is taken from mnist.py. Starting from the line start = self._index_in_epoch: _index_in_epoch - 1 is the index of the last image of the previous batch, so the first image of the current batch has index _index_in_epoch and the last one has index _index_in_epoch + batch_size - 1. If the updated _index_in_epoch (the old value plus batch_size) is greater than the number of images in the corpus, this batch would run past the end of the data, which means one full pass over the corpus has been completed; the images are therefore shuffled and batching starts again from the beginning of the corpus.
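For context, this next_batch lives on the DataSet objects built by the TF 1.x MNIST tutorial helpers and is usually called once per training step. A minimal usage sketch, assuming the tensorflow.examples.tutorials.mnist module from TF 1.x is available:

from tensorflow.examples.tutorials.mnist import input_data

# Load MNIST; one_hot=True turns labels into 10-dimensional one-hot vectors.
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

for step in range(1000):
  # Each call returns the next 100 images and labels; when an epoch is
  # exhausted, the code above reshuffles the data and starts a new pass.
  batch_xs, batch_ys = mnist.train.next_batch(100)
  # batch_xs.shape == (100, 784), batch_ys.shape == (100, 10)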

The second method:

import numpy as np

def ptb_iterator(raw_data, batch_size, num_steps):
  """Iterate on the raw PTB data.

  This generates batch_size pointers into the raw PTB data, and allows
  minibatch iteration along these pointers.

  Args:
    raw_data: one of the raw data outputs from ptb_raw_data.
    batch_size: int, the batch size.
    num_steps: int, the number of unrolls.

  Yields:
    Pairs of the batched data, each a matrix of shape [batch_size, num_steps].
    The second element of the tuple is the same data time-shifted to the
    right by one.

  Raises:
    ValueError: if batch_size or num_steps are too high.
  """
  raw_data = np.array(raw_data, dtype=np.int32)

  data_len = len(raw_data)
  batch_len = data_len // batch_size  # number of word ids that go into each row of the batch matrix
  data = np.zeros([batch_size, batch_len], dtype=np.int32)  # one row per batch entry, batch_len words per row
  for i in range(batch_size):  # fill each of the batch_size rows with a contiguous slice of the raw data
    data[i] = raw_data[batch_len * i:batch_len * (i + 1)]

  epoch_size = (batch_len - 1) // num_steps  # how many windows of num_steps words fit into each row
  # epoch_size = ((len(data) // model.batch_size) - 1) // model.num_steps  # // is integer division
  if epoch_size == 0:
    raise ValueError("epoch_size == 0, decrease batch_size or num_steps")

  for i in range(epoch_size):
    x = data[:, i*num_steps:(i+1)*num_steps]
    y = data[:, i*num_steps+1:(i+1)*num_steps+1]
    yield (x, y)
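To make the shapes concrete, here is a small sketch that feeds ptb_iterator a made-up list of word ids (the toy numbers are only for illustration):

# Toy "corpus" of 20 word ids: 0, 1, ..., 19.
toy_ids = list(range(20))

for x, y in ptb_iterator(toy_ids, batch_size=2, num_steps=3):
  # x has shape [2, 3]; y is x shifted one position to the right,
  # so y[i, t] is the word that follows x[i, t] in the corpus.
  print(x)
  print(y)
  break  # first pair: x = [[0 1 2], [10 11 12]], y = [[1 2 3], [11 12 13]]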

The third method:

def next(self, batch_size):
  """ Return a batch of data. When dataset end is reached, start over.
  """
  if self.batch_id == len(self.data):
    self.batch_id = 0
  batch_data = (self.data[self.batch_id:min(self.batch_id +
                                            batch_size, len(self.data))])
  batch_labels = (self.labels[self.batch_id:min(self.batch_id +
                                                batch_size, len(self.data))])
  batch_seqlen = (self.seqlen[self.batch_id:min(self.batch_id +
                                                batch_size, len(self.data))])
  self.batch_id = min(self.batch_id + batch_size, len(self.data))
  return batch_data, batch_labels, batch_seqlen
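This next method expects to live inside a dataset class that holds self.data, self.labels, self.seqlen and a self.batch_id cursor. A minimal, hypothetical wrapper is sketched below; the class name and the toy values are assumptions for illustration only:

class ToyDataset(object):
  """Hypothetical container with the fields the next() method above expects."""
  def __init__(self, data, labels, seqlen):
    self.data = data        # list of (padded) sequences
    self.labels = labels    # one label vector per sequence
    self.seqlen = seqlen    # true (unpadded) length of each sequence
    self.batch_id = 0       # cursor into the dataset

  # the next(self, batch_size) method shown above goes here

# Five toy sequences padded to length 4.
dataset = ToyDataset(
    data=[[1, 2, 3, 0], [4, 5, 0, 0], [6, 7, 8, 9], [1, 1, 0, 0], [2, 0, 0, 0]],
    labels=[[1, 0], [0, 1], [1, 0], [0, 1], [1, 0]],
    seqlen=[3, 2, 4, 2, 1])
# dataset.next(2) returns the first two sequences, labels and lengths;
# after the end of the data is reached, the next call starts over from the beginning.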

The fourth method:

import numpy as np

def batch_iter(sourceData, batch_size, num_epochs, shuffle=True):
  data = np.array(sourceData)  # store sourceData as a numpy array so it can be fancy-indexed
  data_size = len(data)
  num_batches_per_epoch = (data_size + batch_size - 1) // batch_size  # ceiling division, so the last partial batch is still yielded and no empty batch is produced
  for epoch in range(num_epochs):
    # Shuffle the data at each epoch
    if shuffle:
      shuffle_indices = np.random.permutation(np.arange(data_size))
      shuffled_data = data[shuffle_indices]  # index the array, not the original Python list
    else:
      shuffled_data = data

    for batch_num in range(num_batches_per_epoch):
      start_index = batch_num * batch_size
      end_index = min((batch_num + 1) * batch_size, data_size)

      yield shuffled_data[start_index:end_index]
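A short usage sketch with a made-up array (the toy data is an assumption for illustration):

toy = np.arange(10)  # pretend corpus of 10 examples

# batch_size=4 gives 3 batches per epoch (4 + 4 + 2); with num_epochs=2
# the generator yields 6 batches in total before it is exhausted.
for batch in batch_iter(toy, batch_size=4, num_epochs=2, shuffle=False):
  print(batch)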

Because this last method is a generator, see the usage of Python iterators and generators for the details of how it is consumed.

Another point to note: the first three methods traverse the whole corpus only once, whereas the last method traverses the corpus num_epochs times.

That is all for this article; hopefully it is helpful for your study.
