I recently started doing interviews over the internet (you'll get news soon) using my brand-new 2020 MacBook Air (yes, the one apparently about to be superseded by the new ARM-based Macs), and I found that the computer's built-in iSight camera is frankly dismal (and I'm not the only one to have noticed). It seems to be a common flaw in the last few generations of Apple computers, and with everyone confined at home it has become painfully obvious.
My first solution was to buy a webcam on Amazon for the modest sum of € 20. The camera is pretty decent and, as expected, worked straight away: plug and play.
Its big problem is that it doesn't have a fixed-focus lens; you have to focus it manually. That sounds like an advantage, but at short distances, like the one separating a computer screen from yours truly, it makes it extremely hard to tell when you're in focus and when you've already gone past it.
Besides, as soon as you move a little forward or backward, you risk being out of focus again. In short, a case where more options only make things worse for the user.
Chatting with the usual crowd (patience, you know...), it came up that you can use the iPhone (or iPad) as a webcam for the Mac, given that the phone's camera, even the front one, is infinitely better than the MacBook Air's and, of course, than the one that came from Amazon.
There was already the option of plugging the iPhone directly into the computer so that QuickTime recognizes it, but sometimes you don't want to show everything the camera can capture (you know what I mean).
NeuralCam Live
Setting up NeuralCam Live is very simple: download the app from the App Store on your iPhone. During setup, the app itself lets you AirDrop the companion plugin to your Mac so that computer and phone can understand each other.
Then you just connect the iPhone (or iPad) to the computer with its USB charging cable, open the program you want on the Mac (QuickTime, Zoom, Skype) and select the iPhone as the camera.
Boom! Suddenly you will appear in Hyperreality!
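A geeky aside: once the plugin is in place, macOS simply sees the phone as one more camera, so if you ever want to check what the system is exposing, a few lines of Swift against Apple's standard AVFoundation discovery API are enough. This is only a sketch of that generic API, nothing specific to NeuralCam Live:

```swift
import AVFoundation

// Ask macOS for every video capture device it currently exposes.
// The built-in iSight/FaceTime camera, USB webcams and (depending on how
// each app plugs into the system) virtual cameras should appear here.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .externalUnknown],
    mediaType: .video,
    position: .unspecified
)

for device in discovery.devices {
    print("\(device.localizedName) [\(device.uniqueID)]")
}
```

If the iPhone shows up in that list, any program that lets you pick a camera (QuickTime, Zoom, Skype...) should offer it too.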
The program offers several options, such as blurring the background, creating a "bubble of light" or a white halo around your head to hide your surroundings, and automatic blurring when you touch your face (say, to sneeze or cough) or when someone walks by behind you.
You can subscribe to remove the watermark and get a few extra utilities (like a low-light mode) and filters for a modest € 32,99 per year, although if you only plan to use it as a webcam, that's not really necessary.
Once the phone is connected to the computer, you can also record the iPhone's screen from the Mac, which is handy if you want to make some kind of tutorial or walkthrough.
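QuickTime takes care of this for you, but in case you're curious, an app that wants to capture the iPhone's screen on its own first has to opt in to "screen capture" devices through Apple's CoreMediaIO framework. A hedged sketch using standard Apple APIs (nothing from NeuralCam) would look like this:

```swift
import CoreMediaIO

// Opt in to CoreMediaIO "screen capture" devices. This is the switch that
// makes a plugged-in iPhone's screen show up as a capture device;
// QuickTime flips it behind the scenes.
var prop = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
    mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
    mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster)
)
var allow: UInt32 = 1
CMIOObjectSetPropertyData(
    CMIOObjectID(kCMIOObjectSystemObject),
    &prop,
    0,
    nil,
    UInt32(MemoryLayout.size(ofValue: allow)),
    &allow
)

// After this call, the connected iPhone appears in the normal
// AVCaptureDevice discovery and can be recorded like any other source.
```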
Camo
Another application I have tried for using the iPhone's camera with the Mac during video calls is Camo. It has some peculiarities that clearly set it apart from NeuralCam.
To get started, instead of keeping the settings in the iPhone app, you do everything from the Mac app, Camo Studio, which sits as an icon in the menu bar. From there you choose the camera you want to use for your call: the iPhone's front camera or any of its rear cameras (some require a paid subscription).
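For the curious, that front/rear choice maps onto the different physical cameras iOS exposes to apps. This is purely an illustrative sketch of Apple's discovery API on the phone side, not Camo's actual code:

```swift
import AVFoundation

// List the phone's physical cameras (front and rear). Roughly the set of
// choices Camo Studio shows on the Mac side.
let phoneCameras = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera,
                  .builtInUltraWideCamera,
                  .builtInTelephotoCamera],
    mediaType: .video,
    position: .unspecified
)

for camera in phoneCameras.devices {
    let side = camera.position == .front ? "front" : "rear"
    print("\(camera.localizedName) [\(side)]")
}
```

On a recent iPhone that prints the wide, ultra-wide and telephoto rear cameras plus the front one, which is more or less the list you get to pick from.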
You also have settings for applying filters (too drastic for my taste).
But the main difference is that Camo puts a watermark on the video it captures unless you buy the paid version. And changing the background is something the video-conferencing program you use has to offer; the app itself doesn't do it.
The paid version costs 41,13 euros per year.
In short, if you need to look as professional as possible in video calls, Camo is definitely an application worth considering: it is powerful and easy to set up.
Use iPhone as a camera with FaceTime
I couldn't get it to work. Does anyone know whether the iPhone can still be used as a FaceTime camera in Big Sur?