The first parameter of FrameDetector's constructor is bufferSize. The documentation describes it as "Size of buffer to use for processing", but I don't understand what the unit of this parameter is. Is it megabytes, or video frames? And how should I set it for real-time video processing? For example, if I detect an HD 1080p video stream with a processFrameRate of 30 fps, what value should I set?
The buffer size is the number of frames to be allocated in an internal frame buffer which is used to pass incoming frames from the client thread to the background thread, where the actual processing takes place.
In situations where the client feeds frames to the FrameDetector faster than the background thread can process them, a larger buffer allows more frames to accumulate while waiting for processing. If the buffer becomes full, frames at the front will be discarded to make room for newly arriving frames.
In practice, the default value of 30 works well for most situations.
Thanks for your explanation.
Now I have another question. I do some processing in the onImageResults() callback function, and it takes about 10 milliseconds. I would like that processing to execute concurrently with the detector's background thread. If I allocate a larger buffer (50-60 frames), can I achieve that goal?