I am streaming raw frames from a QCamera on an N950 using the following code:
cam = new QCamera();
QMediaService *m = cam->service();
QVideoRendererControl *vrc = m->requestControl<QVideoRendererControl *>();
vrc->setSurface(myVideoSurface);
where myVideoSurface is a subclass of QAbstractVideoSurface, and it seems to be doing fine, receiving frames one by one in its present() method.
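For reference, the surface looks roughly like the sketch below (a minimal sketch; the class name, the chosen pixel formats, and the comments are mine, not taken from my actual code):

#include <QAbstractVideoSurface>
#include <QVideoFrame>

class MyVideoSurface : public QAbstractVideoSurface
{
    Q_OBJECT
public:
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
            QAbstractVideoBuffer::HandleType type = QAbstractVideoBuffer::NoHandle) const
    {
        if (type != QAbstractVideoBuffer::NoHandle)
            return QList<QVideoFrame::PixelFormat>();
        // Advertise the formats the rest of the pipeline can handle.
        return QList<QVideoFrame::PixelFormat>()
                << QVideoFrame::Format_RGB32
                << QVideoFrame::Format_UYVY;
    }

    bool present(const QVideoFrame &frame)
    {
        QVideoFrame f(frame);  // shallow copy so the frame can be mapped
        if (!f.map(QAbstractVideoBuffer::ReadOnly))
            return false;
        // Each scanline is f.bytesPerLine() bytes, which may be larger than
        // width * bytesPerPixel (row padding), so data should be read row by row.
        const uchar *bits = f.bits();
        // ... process bits using f.width(), f.height(), f.bytesPerLine() ...
        Q_UNUSED(bits);
        f.unmap();
        return true;
    }
};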
I have a couple of questions:
* Is the above method indeed the "proper" way to access raw video frames? I arrived at it through endless googling and trial and error, so I am not sure. Is this approach actually documented explicitly anywhere?
* How can I change the camera's resolution/aspect ratio? I have a hunch this might be related to QVideoEncoderControl, but so far my attempts to use it to change just the resolution have produced strange results. For example, if I do something like
QVideoEncoderControl *enc = m->requestControl<QVideoEncoderControl *>();
QVideoEncoderSettings sets = enc->videoSettings();
sets.setResolution(QSize(500, 500));
enc->setVideoSettings(sets);
my video surface does end up receiving frames sized 500x500, but if I decode and display them I get weird lines instead of proper frames. What is the right way of changing the camera resolution, and again, is there any good place documenting this?