[Doc] Add missing mock import to docs conf.py
#6834
Conversation
👋 Hi! Thank you for contributing to the vLLM project. Once the PR is approved and ready to go, please make sure to run full CI as it is required to merge (or just use auto-merge). To run full CI, you can do one of these:
Thanks for fixing this! Is there a way to make failures like https://readthedocs.org/projects/vllm/builds/25082568/ fail the CI so we won't run into such issues again? (This is far from the first time the API reference has been broken by a missing mock import)
We could use some combination of nitpicky mode and fail-on-warning? This is probably something an RTD admin would have to do, though.
Seems like we can update the config per https://docs.readthedocs.io/en/stable/config-file/v2.html. Can you test it out in this PR? (It should fail without the mock import)
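The ReadTheDocs v2 config change being discussed can be sketched roughly as follows. This is a hedged illustration, not the repo's actual file; the `sphinx.fail_on_warning` key is from the v2 config spec, while the file paths and tool versions here are assumptions:

```yaml
# .readthedocs.yaml (v2 config) -- illustrative sketch, not vLLM's actual file
version: 2

build:
  os: ubuntu-22.04
  tools:
    python: "3.10"

sphinx:
  configuration: docs/source/conf.py
  # Turn Sphinx warnings (e.g. a failed module import during autodoc)
  # into build failures, so a broken API reference fails the build
  # instead of silently publishing incomplete docs.
  fail_on_warning: true
```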
Head branch was pushed to by a user without write access
@DarkLight1337 I can't seem to get it to fail...
I think it's because of
That is what
Oh, I didn't read the details of the CLI flags and thought they meant to skip the warnings. Hmm, in that case the build should have failed...
Oh, it did fail in AWS: https://buildkite.com/vllm/ci-aws/builds/5709#0190ef62-e04f-4fb9-bfb2-bd29b30540e0
Nice! Let's put the fix back in this PR and get it merged then.
Ok, so it seems that we need nitpicky mode to catch missing mock imports, but enabling it would cause a bunch of other errors. Let's fix the docs first and revisit the issue later.
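For reference, the two Sphinx mechanisms discussed above look roughly like this in a `conf.py`. The module names listed are illustrative placeholders, not vLLM's actual mock list:

```python
# docs/source/conf.py -- illustrative sketch, not vLLM's actual configuration.

extensions = [
    "sphinx.ext.autodoc",
]

# Modules that are unavailable at docs-build time (e.g. compiled extensions
# or heavy runtime dependencies) must be listed here so autodoc can import
# the package without them. A name missing from this list is exactly what
# breaks the API reference build.
autodoc_mock_imports = [
    "torch",        # illustrative entry
    "some_c_ext",   # hypothetical compiled extension
]

# Nitpicky mode makes Sphinx warn on every broken cross-reference, which
# (combined with fail-on-warning) would surface missing mocks, but it also
# reports many unrelated warnings -- hence the decision above to defer it.
nitpicky = False
```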
Signed-off-by: Alvant <alvasian@yandex.ru>
Signed-off-by: LeiWang1999 <leiwang1999@outlook.com>
Fixes the issue identified in #6600 (comment)
FIX #6853