
Fps error while running inference.py #172

Closed
danial880 opened this issue Mar 21, 2020 · 1 comment

@danial880

I used the following command to run inference.py:

python inference.py --cfg inference-config.yaml --videoFile /videos/video.mp4 --inferenceFps 10 --writeBoxFrames TEST.MODEL_FILE models/pytorch/pose_coco/pose_hrnet_w32_384x288.pth

Here are the logs:


=> loading model from models/pytorch/pose_coco/pose_hrnet_w32_384x288.pth
desired inference fps is 10 but video fps is 0.0


@gachiemchiep
Contributor

@danial880
Your video failed to load; check the path of your video. An fps of 0.0 means OpenCV could not open the file, so the check in inference.py exits:

# from inference.py: bail out if the video's fps is below the requested inference fps
vidcap = cv2.VideoCapture(args.videoFile)
fps = vidcap.get(cv2.CAP_PROP_FPS)  # returns 0.0 when the file could not be opened
if fps < args.inferenceFps:
    print('desired inference fps is ' + str(args.inferenceFps) + ' but video fps is ' + str(fps))
    exit()
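A quick way to rule out a bad path before handing the file to OpenCV is to fail fast with a clear message. This is a sketch, not part of inference.py; the helper name `open_video` is made up for illustration:

```python
from pathlib import Path


def open_video(path):
    """Open a video with OpenCV, failing fast if the path is wrong."""
    if not Path(path).is_file():
        # A missing file is the most common cause of "video fps is 0.0"
        raise FileNotFoundError(f"video file not found: {path}")
    import cv2  # deferred so the path check runs even without OpenCV installed

    vidcap = cv2.VideoCapture(str(path))
    if not vidcap.isOpened():
        # The file exists but OpenCV cannot decode it (bad codec, corrupt file)
        raise RuntimeError(f"OpenCV could not open video: {path}")
    return vidcap
```

With this check, a wrong `--videoFile` path produces an explicit error instead of the misleading fps message.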
