Delay webcam in Unity

Please help, I'm a newbie just getting into Unity development. I wanted to show a webcam feed with a delay and got it working: on stage we capture hand movements with a webcam in real time and play them back a few seconds later. The delay itself works ![alt text][1], but an error about the camera not being assigned keeps popping up:

    UnassignedReferenceException: The variable renderCamera of DelayedCamera has not been assigned.
    You probably need to assign the renderCamera variable of the DelayedCamera script in the inspector.
    DelayedCamera.Awake () (at Assets/Scenes/DelayedCamera.cs:107)

The whole code is:

```csharp
using UnityEngine;
using System.Collections;

public class DelayedCamera : MonoBehaviour
{
    public struct Frame
    {
        /// <summary>
        /// The texture representing the frame
        /// </summary>
        private Texture2D texture;

        /// <summary>
        /// The time (in seconds) the frame has been captured at
        /// </summary>
        private float recordedTime;

        /// <summary>
        /// Captures a new frame using the given render texture
        /// </summary>
        /// <param name="renderTexture">The render texture this frame must be captured from</param>
        public void CaptureFrom( RenderTexture renderTexture )
        {
            RenderTexture.active = renderTexture;

            // Create a new texture if none has been created yet in the given array index
            if ( texture == null )
                texture = new Texture2D( renderTexture.width, renderTexture.height );

            // Save what the camera sees into the texture
            texture.ReadPixels( new Rect( 0, 0, renderTexture.width, renderTexture.height ), 0, 0 );
            texture.Apply();

            recordedTime = Time.time;

            RenderTexture.active = null;
        }

        /// <summary>
        /// Indicates whether the frame has been captured before the given time
        /// </summary>
        /// <param name="time">The time</param>
        /// <returns>true if the frame has been captured before the given time, false otherwise</returns>
        public bool CapturedBefore( float time )
        {
            return recordedTime < time;
        }

        /// <summary>
        /// Operator to convert the frame to a texture
        /// </summary>
        public static implicit operator Texture2D( Frame frame )
        {
            return frame.texture;
        }
    }

    /// <summary>
    /// The camera used to capture the frames
    /// </summary>
    [SerializeField]
    private Camera renderCamera;

    /// <summary>
    /// The delay
    /// </summary>
    [SerializeField]
    private float delay = 0.5f;

    /// <summary>
    /// The size of the buffer containing the recorded images.
    /// Try to keep this value as low as possible according to the delay.
    /// </summary>
    private int bufferSize = 256;

    /// <summary>
    /// The render texture used to record what the camera sees
    /// </summary>
    private RenderTexture renderTexture;

    /// <summary>
    /// The recorded frames
    /// </summary>
    private Frame[] frames;

    /// <summary>
    /// The index of the captured texture
    /// </summary>
    private int capturedFrameIndex;

    /// <summary>
    /// The index of the rendered texture
    /// </summary>
    private int renderedFrameIndex;

    /// <summary>
    /// The frame index
    /// </summary>
    private int frameIndex;

    private void Awake()
    {
        frames = new Frame[bufferSize];

        // Changing the depth value from 24 to 16 may improve performance,
        // and specifying an image format with better compression could help too.
        renderTexture = new RenderTexture( Screen.width, Screen.height, 24 );
        renderCamera.targetTexture = renderTexture;

        StartCoroutine( Render() );
    }

    /// <summary>
    /// Makes the camera render with a delay
    /// </summary>
    private IEnumerator Render()
    {
        WaitForEndOfFrame wait = new WaitForEndOfFrame();

        while ( true )
        {
            yield return wait;

            capturedFrameIndex = frameIndex % bufferSize;
            frames[capturedFrameIndex].CaptureFrom( renderTexture );

            // Find the index of the frame to render.
            // The for-loop body is intentionally empty.
            for ( ; frames[renderedFrameIndex].CapturedBefore( Time.time - delay ) ; renderedFrameIndex = ( renderedFrameIndex + 1 ) % bufferSize )
                ;

            Graphics.Blit( frames[renderedFrameIndex], null as RenderTexture );

            frameIndex++;
        }
    }
}
```

  [1]: /storage/temp/175246-data-1.jpg

Best regards
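P.S. For what it's worth, here is a small sketch of the kind of guard I think `Awake()` would need when the `renderCamera` field is left empty in the Inspector. The class name and the fallback to `GetComponent<Camera>()` are only assumptions for the sketch (they assume the script sits on the capturing camera), not part of the script above:

```csharp
using UnityEngine;

// Sketch only: shows a defensive Awake() for an unassigned serialized Camera field.
public class DelayedCameraGuardSketch : MonoBehaviour
{
    [SerializeField]
    private Camera renderCamera;

    private void Awake()
    {
        // If nothing was assigned in the Inspector, try the Camera on this GameObject
        // (assumption: the script is attached to the capturing camera).
        if ( renderCamera == null )
            renderCamera = GetComponent<Camera>();

        // Still nothing? Report it and disable the component instead of throwing later.
        if ( renderCamera == null )
        {
            Debug.LogError( "DelayedCamera: assign a Camera to 'renderCamera' in the Inspector.", this );
            enabled = false;
            return;
        }

        // ... the rest of the original Awake() would go here ...
    }
}
```

Even with a guard like this, I guess the camera still has to be dragged onto the `renderCamera` slot in the Inspector for the delayed rendering to actually work.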
