To install Gradio from main, run the following command:
pip install https://gradio-builds.s3.amazonaws.com/98c1cbcd0e3b66a8d2463820d3ce157efe0a032d/gradio-4.31.4-py3-none-any.whl
*Note: Setting `share=True` in `launch()` will not work.
gradio.load(name, ···)
Constructs a demo from a Hugging Face repo. Can accept model repos (if src is "models") or Space repos (if src is "spaces"). The input and output components are automatically loaded from the repo. Note that if a Space is loaded, certain high-level attributes of the Blocks (e.g. custom `css`, `js`, and `head` attributes) will not be loaded.
import gradio as gr
demo = gr.load("gradio/question-answering", src="spaces")
demo.launch()
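The same call pattern works for the model source. A minimal sketch, assuming the public "gpt2" text-generation model on the Hub, which is used here purely for illustration:

import gradio as gr
# Build a demo directly from a model repo on the Hugging Face Hub.
demo = gr.load("gpt2", src="models")
demo.launch()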
Parameter | Description |
---|---|
name str required | the name of the model (e.g. "gpt2" or "facebook/bart-base") or Space (e.g. "flax-community/spanish-gpt2"); can include the src as a prefix (e.g. "models/facebook/bart-base") |
src str \| None default: None | the source of the repo: "models" or "spaces" (can be omitted if the source is included as a prefix in name) |
hf_token str \| None default: None | optional access token for loading private Hugging Face Hub models or Spaces. Find your token here: https://huggingface.co/settings/tokens. Warning: only provide a token if you are loading a trusted private Space, as it can be read by the Space you are loading. |
alias str \| None default: None | optional string used as the name of the loaded model instead of the default name (only applies if loading a Space running Gradio 2.x) |
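As a sketch of how the parameters above fit together, the call below uses the prefix form of name and passes a token for a private Space; the repo id and token value are placeholders, not real resources:

import gradio as gr
# "spaces/your-username/private-demo" and "hf_xxx" are placeholders for illustration only.
demo = gr.load(
    "spaces/your-username/private-demo",  # source given as a prefix in name
    hf_token="hf_xxx",                    # only pass a token for a Space you trust
)
demo.launch()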