Exporting Unity Particle Effects as a PNG Frame Sequence

Published: 2020-10-02 21:07:58 · Source: 腳本之家 · Views: 588 · Author: langresser · Category: Programming Languages

This article shares working code for exporting a Unity particle effect as a PNG frame sequence, for your reference. The details are as follows.

這個(gè)功能并不是很實(shí)用,不過(guò)美術(shù)同學(xué)有這樣的需求,那么就花了一點(diǎn)時(shí)間研究了下。

We are not using the Unity engine ourselves, but our effects artists collected a batch of Unity particle effects and wanted them exported as PNG frame sequences for our own game to use. In effect, Unity serves here as an effects editor. This is not as unorthodox as it sounds: workflows built on Particle Illusion or 3ds Max follow much the same idea, except those tools ship a proper export feature and Unity does not.

The code first:

using UnityEngine;
using UnityEditor;
using System;
using System.IO;
using System.Collections;
using System.Collections.Generic;
 
public class ParticleExporter : MonoBehaviour
{
 // Default folder name where you want the animations to be output
 public string folder = "PNG_Animations";
 
 // Framerate at which you want to play the animation
 public int frameRate = 25;     // export frame rate: setting Time.captureFramerate ignores real time and advances at this fixed rate
 public float frameCount = 100;    // number of frames to export; at 25 fps, 100 frames cover 4 seconds of effect time (capturing each frame is slow, so the export takes far longer than the effect plays)
 public int screenWidth = 960;    // not used yet; intended to set the screen (i.e. the effect canvas) size directly
 public int screenHeight = 640;
 public Vector3 cameraPosition = Vector3.zero;
 public Vector3 cameraRotation = Vector3.zero;
 
 private string realFolder = ""; // real folder where the output files will be
 private float originaltimescaleTime; // track the original time scale so we can freeze the animation between frames
 private float currentTime = 0;
 private bool over = false;
 private int currentIndex = 0;
 private Camera exportCamera; // camera used for the export, rendering through a RenderTexture
 
 public void Start()
 {
  // set frame rate
  Time.captureFramerate = frameRate;
 
  // Build the output folder path from the configured folder and this object's name.
  realFolder = Path.Combine(folder, name);
 
  // Create the folder
  if (!Directory.Exists(realFolder)) {
   Directory.CreateDirectory(realFolder);
  }
 
  originaltimescaleTime = Time.timeScale;
 
  GameObject goCamera = Camera.main.gameObject;
  if (cameraPosition != Vector3.zero) {
   goCamera.transform.position = cameraPosition;
  }
 
  if (cameraRotation != Vector3.zero) {
   goCamera.transform.rotation = Quaternion.Euler(cameraRotation);
  }
 
  GameObject go = Instantiate(goCamera) as GameObject;
  exportCamera = go.GetComponent<Camera>();
 
  currentTime = 0;
 
  
 }
 
 void Update()
 {
  currentTime += Time.deltaTime;
  if (!over && currentIndex >= frameCount) {
   over = true;
   Cleanup();
   Debug.Log("Finish");
   return;
  }
 
  // capture a screenshot every frame
  StartCoroutine(CaptureFrame());
 }
 
 void Cleanup()
 {
  DestroyImmediate(exportCamera);
  DestroyImmediate(gameObject);
 }
 
 IEnumerator CaptureFrame()
 {
  // Stop time
  Time.timeScale = 0;
   // Yield to the end of the frame before reading pixels;
   // without this, ReadPixels raises an error
  yield return new WaitForEndOfFrame();
 
  string filename = String.Format("{0}/{1:D04}.png", realFolder, ++currentIndex);
  Debug.Log(filename);
 
  int width = Screen.width;
  int height = Screen.height;
 
  //Initialize and render textures
  RenderTexture blackCamRenderTexture = new RenderTexture(width, height, 24, RenderTextureFormat.ARGB32);
  RenderTexture whiteCamRenderTexture = new RenderTexture(width, height, 24, RenderTextureFormat.ARGB32);
 
  exportCamera.targetTexture = blackCamRenderTexture;
  exportCamera.backgroundColor = Color.black;
  exportCamera.Render();
  RenderTexture.active = blackCamRenderTexture;
  Texture2D texb = GetTex2D();
 
  //Now do it for Alpha Camera
  exportCamera.targetTexture = whiteCamRenderTexture;
  exportCamera.backgroundColor = Color.white;
  exportCamera.Render();
  RenderTexture.active = whiteCamRenderTexture;
  Texture2D texw = GetTex2D();
 
  // If we have both textures then create final output texture
  if (texw && texb) {
   Texture2D outputtex = new Texture2D(width, height, TextureFormat.ARGB32, false);
 
    // We must compute alpha ourselves, because the particles use additive shaders
    // Create alpha from the difference between the black and white camera renders
   for (int y = 0; y < outputtex.height; ++y) { // each row
    for (int x = 0; x < outputtex.width; ++x) { // each column
     float alpha;
     alpha = texw.GetPixel(x, y).r - texb.GetPixel(x, y).r;
     alpha = 1.0f - alpha;
     Color color;
     if (alpha == 0) {
      color = Color.clear;
     } else {
      color = texb.GetPixel(x, y);
     }
     color.a = alpha;
     outputtex.SetPixel(x, y, color);
    }
   }
 
 
   // Encode the resulting output texture to a byte array then write to the file
   byte[] pngShot = outputtex.EncodeToPNG();
   File.WriteAllBytes(filename, pngShot);
 
    // clean up explicitly, otherwise memory leaks
   pngShot = null;
   RenderTexture.active = null;
   DestroyImmediate(outputtex);
   outputtex = null;
   DestroyImmediate(blackCamRenderTexture);
   blackCamRenderTexture = null;
   DestroyImmediate(whiteCamRenderTexture);
   whiteCamRenderTexture = null;
   DestroyImmediate(texb);
   texb = null;
    DestroyImmediate(texw);
    texw = null;
 
   System.GC.Collect();
 
   }

   // Reset the time scale even when a capture failed (otherwise time stays frozen), then move on to the next frame.
   Time.timeScale = originaltimescaleTime;
  }
 
 // Read the current screen contents into a texture
 private Texture2D GetTex2D()
 {
  // Create a texture the size of the screen, ARGB32 format
  int width = Screen.width;
  int height = Screen.height;
  Texture2D tex = new Texture2D(width, height, TextureFormat.ARGB32, false);
  // Read screen contents into the texture
  tex.ReadPixels(new Rect(0, 0, width, height), 0, 0);
  tex.Apply();
  return tex;
 }
}

Notes on the key points:

1. The overall approach: position the camera in Unity, play the effect normally, capture the screen every frame, and save the captures as the PNG sequence we need. This is not limited to effects; it works for models too. For example, when you need hundreds or thousands of characters on screen at once, or unimportant monsters and scenery props, exporting them as 2D frame sequences can greatly improve performance and make otherwise impossible scenes feasible.

2. Timing and frame-rate control. A screen capture takes far longer than one frame interval, so an effect that plays for one second may take over a minute to export. Time.captureFramerate sets a fixed frame rate; once set, real time is ignored and effects and models advance by exactly that per-frame step. This API exists precisely for video recording.
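A quick, Unity-independent sanity check of the timing (plain Python; the values mirror the frameRate and frameCount fields above):

```python
frame_rate = 25    # Time.captureFramerate
frame_count = 100  # frames to export

# With captureFramerate set, every Update advances exactly
# 1 / frame_rate seconds of simulated time, no matter how long
# capturing that frame takes in real (wall-clock) time.
step = 1.0 / frame_rate
effect_duration = frame_count * step

print(step)             # 0.04
print(effect_duration)  # 4.0
```

So the exported sequence always covers frame_count / frame_rate seconds of effect time, regardless of how slow the export itself is.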

3. Controlling the effect canvas. I have not found a good way to do this yet: because the capture grabs the whole screen, the size of the Game window is the size of the effect canvas.

4. Adjust the camera's position and rotation to control how the effect is framed.

5. The capture function is GetTex2D(), and its core is ReadPixels. Note that CaptureFrame must run as a coroutine because of the line yield return new WaitForEndOfFrame(); without it, Unity reports an error to the effect that ReadPixels was not called during frame drawing.

6. Capturing is expensive, so set Time.timeScale = 0 at the start of the capture to freeze time, and restore it afterwards.

7. After each capture, release every resource and run a GC pass; otherwise you will likely run out of memory. Capturing 100 frames can easily consume two or three gigabytes.
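A rough back-of-the-envelope estimate of why leaking the per-frame textures exhausts memory (plain Python; assumes a 960x640 ARGB32 target and ignores the RenderTextures' depth buffers and allocator overhead, which push the real figure higher):

```python
width, height = 960, 640
bytes_per_pixel = 4        # ARGB32: one byte per channel

# Each CaptureFrame call allocates 2 RenderTextures and
# 3 Texture2Ds (texb, texw, outputtex).
textures_per_frame = 5
frames = 100

per_texture = width * height * bytes_per_pixel
leaked = per_texture * textures_per_frame * frames

print(per_texture)       # 2457600 bytes, about 2.3 MiB per texture
print(leaked / 2**30)    # on the order of 1 GiB if nothing is freed
```

This is only pixel data; with GPU-side copies, depth buffers, and fragmentation, the two-to-three-gigabyte figure above is plausible.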

8. The capture uses two RenderTextures, drawing the effect once over a white background and once over a black one, then computing alpha from the two images. For anything other than effects this is unnecessary: just set the alpha of the camera's backgroundColor to 0. But particle effects use special shaders such as Additive, which rely on alpha blending; rendered that way, the exported image would contain nothing. A solid background is therefore required.
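The per-pixel arithmetic in the loop above is independent of Unity. Here is the same difference matting as a standalone sketch in Python with NumPy (the function name and test pixels are my own, not part of the original script):

```python
import numpy as np

def recover_rgba(black, white):
    """Difference matting, mirroring the pixel loop in CaptureFrame.

    black, white: float arrays of shape (H, W, 3) in [0, 1], the scene
    rendered over a black and a white background respectively.
    """
    # A pixel showing pure background differs by 1.0 in the red channel
    # between the two renders; a fully opaque pixel differs by 0.
    alpha = 1.0 - (white[..., 0] - black[..., 0])
    rgba = np.dstack([black, alpha])
    # Fully transparent pixels become Color.clear, as in the C# code.
    rgba[alpha == 0] = 0.0
    return rgba

# Fully transparent pixel: black render reads 0, white render reads 1.
black = np.zeros((1, 1, 3))
white = np.ones((1, 1, 3))
print(recover_rgba(black, white)[0, 0])  # [0. 0. 0. 0.]

# Fully opaque red pixel: identical in both renders.
red = np.array([[[1.0, 0.0, 0.0]]])
print(recover_rgba(red, red)[0, 0])      # [1. 0. 0. 1.]
```

Note that, like the C# loop, this takes the RGB from the black-background render, so additive glow partially over the background keeps its color while gaining a fractional alpha.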

That is all for this article. I hope it helps with your work, and thank you for supporting 億速云.
