Hi everyone,
In a recent project on Unity 2021.3.8f1, I'm trying to use `WebCamTexture`, and everything works fine on macOS, iOS, and Android. See the code below:
```csharp
private void SetUpCamera()
{
    WebCamDevice[] devices = WebCamTexture.devices;
    Debug.Log(devices.Length + " camera device(s) found");
    if (devices.Length == 0)
    {
        isCamAvailable = false;
        return;
    }
    for (int i = 0; i < devices.Length; i++)
    {
        Debug.Log(devices[i].name + " - Front facing: " + devices[i].isFrontFacing);
        devicesList.text += "\n" + devices[i].name + " - Front facing: " + devices[i].isFrontFacing;
        if (devices[i].isFrontFacing)
        {
            camTexture = new WebCamTexture(devices[i].name, 500, 500);
            break;
        }
    }
    // Fall back to the first device when no front-facing camera is reported
    // (desktop webcams usually report isFrontFacing == false), so Play()
    // below can't throw a NullReferenceException.
    if (camTexture == null)
    {
        camTexture = new WebCamTexture(devices[0].name, 500, 500);
    }
    camTexture.Play();
    rawImageBackground.texture = camTexture;
    isCamAvailable = true;
}
```
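One thing I considered trying (not sure it's related to the CPU issue): on Linux the driver may be delivering uncompressed frames, so the cost could scale with resolution and frame rate. `WebCamTexture` has a four-argument constructor that also takes a requested FPS, so as a sketch I could replace the constructor call above with something like:

```csharp
// Sketch only: the extra argument is the requested FPS. 640x480@30 is an
// assumed "cheap" mode for the C920, not a value from my working build.
camTexture = new WebCamTexture(devices[i].name, 640, 480, 30);
```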
But on Arch Linux (5.18.16-arch1-1) with a Ryzen 9 5900X, a GTX 960, and 16 GB of DDR4, I get absurd CPU usage while the webcam (a Logitech C920) is running, even reaching 100% on a single core at one point:
![alt text][1]
The webcam is running, but with a huge visual bug that stretches the image and makes it unusable:
![alt text][2]
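I suspect the stretching comes from squeezing the 16:9 C920 feed into a square `RawImage`. A sketch of what I plan to try: read the actual texture size after `Play()` and keep the `RawImage` aspect in sync (assuming an `AspectRatioFitter` sits on the same object as `rawImageBackground`):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class CamAspect : MonoBehaviour
{
    [SerializeField] private RawImage rawImageBackground;    // same RawImage as above
    [SerializeField] private AspectRatioFitter aspectFitter; // assumed component on the RawImage
    private WebCamTexture camTexture;

    private void Update()
    {
        // WebCamTexture reports a 16x16 placeholder size until the first
        // real frame arrives, so update the ratio every frame.
        if (camTexture != null && camTexture.width > 16)
        {
            aspectFitter.aspectMode = AspectRatioFitter.AspectMode.FitInParent;
            aspectFitter.aspectRatio = (float)camTexture.width / camTexture.height;
        }
    }
}
```

No idea yet if that also explains the CPU load, but at least it should rule out the aspect-ratio part.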
Thanks for any help or even lead on that subject :)
[1]: /storage/temp/198735-screenshot-from-2022-08-14-16-11-24.png
[2]: /storage/temp/198736-screenshot-from-2022-08-14-16-22-43.png