**Question:**

I am doing a learning experiment: using a worker thread to capture frames from the webcam, then sending the frames to the main thread for processing, which forwards them to a face detection model (yunet). The code is as follows (excerpts):

```python
class Camera(Thread):
    ...

def visualize(image, faces, print_flag=False):
    ...

model = "face_detection_yunet_2022mar.onnx"
frame_w = int(cap.get(cv.CAP_PROP_FRAME_WIDTH))
frame_h = int(cap.get(cv.CAP_PROP_FRAME_HEIGHT))
# processing_thread = Detector(frame_queue, yunet)

_, faces = yunet.detect(frame)  # faces: None, or nx15 np.array
cv.imshow('libfacedetection demo', frame)
```

The problem is that the displayed video lags behind what is happening in real time, even though the fps is around 22 ms. I think the Camera thread is slower than the main thread. Can someone point out what mistake I have made? Thank you.

**Reply:**

I didn't examine the code closely, but a few comments:

Do you mean you get a new frame every 22 ms (about 45 FPS), or that it is 22 FPS?

In either case, is your processing thread able to process frames at the same rate you can generate them? If not, you might be building up a queue of unprocessed images, which could lead to a lot of latency. If you are constantly trying to read and queue frames, and your processing loop takes longer than it takes for the camera to provide an image, you will end up with a queue which continues to grow.

Maybe it has gotten better, but for high-performance multi-threading you might want to consider a different language. When I've used Python (not recently), I have found threading to be lacking. It could be related to the way Python does threading, but I wasn't able to get it to work well for me. I haven't had good performance using Python / cv.imshow in the past.
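The growing-queue problem described in the reply can be avoided by bounding the queue and dropping stale frames, so the consumer always sees the newest frame instead of falling further behind. Below is a minimal, hedged sketch of that pattern: the camera is simulated by an integer counter standing in for real frames (no `cv.VideoCapture` needed to run it), and names like `capture` and `frame_queue` are illustrative, not taken from the original code.

```python
import queue
import threading
import time

# Bounded queue: at most one pending frame, so latency cannot accumulate.
frame_queue = queue.Queue(maxsize=1)
stop = threading.Event()

def capture():
    """Fast producer: grabs 'frames' and keeps only the newest one queued."""
    frame_id = 0
    while not stop.is_set():
        frame_id += 1  # stand-in for cap.read() returning a new frame
        try:
            frame_queue.put_nowait(frame_id)
        except queue.Full:
            # Drop the stale frame and replace it with the newest one.
            try:
                frame_queue.get_nowait()
            except queue.Empty:
                pass
            frame_queue.put_nowait(frame_id)
        time.sleep(0.001)  # fast camera (~1000 frames/s here)

producer = threading.Thread(target=capture, daemon=True)
producer.start()

# Slow consumer: the 'detector', taking ~50 ms per frame.
processed = []
for _ in range(5):
    time.sleep(0.05)
    processed.append(frame_queue.get())

stop.set()
producer.join()

# The gaps between consecutive frame ids show that stale frames were
# dropped rather than queued; display stays close to real time.
print(processed)
```

With an unbounded queue the consumer would process frames 1, 2, 3, ... in order and drift ever further behind the camera; with the bounded drop-stale variant above, each `get()` returns a recent frame id at the cost of skipping intermediate frames, which is usually the right trade-off for live display.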