# Model Retrieval Errors

**Base Class:** `ModelRetrievalError`
Model retrieval errors occur when the system fails to retrieve model metadata from the weights provider (typically the Roboflow API). These errors happen during the model discovery phase and are most often caused by authentication issues, network problems, or inconsistent metadata.
## ModelRetrievalError

Failed to retrieve model metadata from the weights provider.

### Overview
This error is raised when the system cannot retrieve model metadata from the weights provider. It can be raised directly for general retrieval failures, or you may encounter one of its more specific subclasses.
### When It Occurs

**Scenario 1: Unknown weights provider**

- Requesting a provider that is not registered
- Typo in the provider name
- Custom provider not registered before use

**Scenario 2: Empty model packages list**

- Roboflow API returns no model packages
- Model exists but has no compatible packages
- Model processing incomplete on the Roboflow platform

**Scenario 3: API response decode failure**

- Roboflow API returns malformed JSON
- Response structure doesn't match the expected schema
- Missing required fields in the API response

**Scenario 4: HTTP error from provider**

- API returns a 4xx or 5xx error (except 401/403, which raise `UnauthorizedModelAccessError`)
- Network issues or timeouts
- Provider service unavailable
### What To Check

- Verify the provider name
- Check that the model exists and is ready:
    - Visit the Roboflow dashboard
    - Verify model training/deployment is complete
    - Check that the model version exists
- Check network connectivity
- Review the error message:
    - "provider which is not implemented" → unknown provider
    - "empty list of model packages" → no packages available
    - "Could not decode" → API response format issue
    - "invalid response code" → HTTP error from API
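The message checks above can be wrapped in a small triage helper. This is a sketch only: `diagnose_retrieval_error` and the substring matching are not part of the library; only the quoted message fragments come from this page.

```python
# Hypothetical helper: map a ModelRetrievalError message to one of the
# documented scenarios by matching the known message fragments.
KNOWN_FRAGMENTS = {
    "provider which is not implemented": "Unknown provider (Scenario 1)",
    "empty list of model packages": "No packages available (Scenario 2)",
    "Could not decode": "API response format issue (Scenario 3)",
    "invalid response code": "HTTP error from API (Scenario 4)",
}

def diagnose_retrieval_error(message: str) -> str:
    for fragment, diagnosis in KNOWN_FRAGMENTS.items():
        if fragment in message:
            return diagnosis
    return "Unrecognized retrieval error - inspect the full traceback"

print(diagnose_retrieval_error("Requested provider which is not implemented"))
```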
### How To Fix
**Scenario 1: Unknown provider**

```python
from inference_models import AutoModel

# ❌ Wrong - typo in provider name
model = AutoModel.from_pretrained(
    "yolov8n-640",
    weights_provider="roboflo"  # typo
)

# ✅ Correct
model = AutoModel.from_pretrained(
    "yolov8n-640",
    weights_provider="roboflow"
)

# Or use the default (roboflow)
model = AutoModel.from_pretrained("yolov8n-640")
```
Register a custom provider:

```python
from inference_models import AutoModel
from inference_models.weights_providers import register_model_provider

# Register your custom provider
register_model_provider("my_provider", my_provider_function)

# Then use it
model = AutoModel.from_pretrained(
    "model-id",
    weights_provider="my_provider"
)
```
**Scenario 2: Empty packages list**

- Wait for the model to finish processing on Roboflow
- Check the model status in the Roboflow dashboard
- Verify the model version exists
- Contact Roboflow support if the model should have packages
**Scenario 3: API decode failure**

This is typically a Roboflow API issue:

- Report the issue with:
    - Full error message
    - Model ID
    - Timestamp when the error occurred
- Try again later (it may be a temporary API issue)
**Scenario 4: HTTP errors**

- Check network connectivity
- Verify the Roboflow API is accessible
- Try again (it may be temporary)
- If persistent, report the issue
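For transient HTTP failures, a simple retry with exponential backoff often helps. The sketch below is generic and not part of the library: `retry_with_backoff` is a hypothetical wrapper, and the retry counts, delays, and exception types are arbitrary choices you should adapt.

```python
import time

def retry_with_backoff(fn, retries=3, base_delay=1.0,
                       retriable=(ConnectionError, TimeoutError)):
    """Call fn(), retrying transient failures with exponential backoff."""
    for attempt in range(retries):
        try:
            return fn()
        except retriable:
            if attempt == retries - 1:
                raise  # out of retries - surface the error
            time.sleep(base_delay * (2 ** attempt))

# Usage (hypothetical): wrap the model load call
# model = retry_with_backoff(lambda: AutoModel.from_pretrained("yolov8n-640"))
```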
## UnauthorizedModelAccessError

Unauthorized access to a model - invalid or missing API key.

### Overview

This error occurs when you try to access a model without proper authentication or with an invalid API key.

### When It Occurs

**Scenario 1: Missing API key**

- Trying to access a private Roboflow model without an API key
- `ROBOFLOW_API_KEY` environment variable not set
- No `api_key` parameter provided to `AutoModel.from_pretrained()`

**Scenario 2: Invalid API key**

- API key is incorrect or malformed
- API key has been revoked or expired
- API key doesn't have access to the requested workspace/project

**Scenario 3: Wrong workspace**

- API key is valid but for a different workspace
- Model ID references a workspace you don't have access to
### What To Check

- Verify the API key is set
- Check API key validity
- Verify the model ID format
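The model IDs on this page follow a `workspace/model/version` shape (e.g. `my-workspace/my-model/1`). A quick sanity check for that shape might look like the sketch below; `looks_like_model_id` is hypothetical, and the library may accept other ID forms as well.

```python
import re

# Matches "workspace/model/version" where version is numeric,
# e.g. "my-workspace/my-model/1" (an assumption based on the examples here).
MODEL_ID_PATTERN = re.compile(r"^[\w-]+/[\w-]+/\d+$")

def looks_like_model_id(model_id: str) -> bool:
    return MODEL_ID_PATTERN.fullmatch(model_id) is not None

print(looks_like_model_id("my-workspace/my-model/1"))  # True
print(looks_like_model_id("my-model"))                 # False
```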
### How To Fix

Set the API key via the `ROBOFLOW_API_KEY` environment variable, or pass it directly:
```python
from inference_models import AutoModel

model = AutoModel.from_pretrained(
    "my-workspace/my-model/1",
    api_key="your_api_key_here"
)
```
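To confirm the environment-variable route is actually picking up a key, a minimal check (assuming the standard `ROBOFLOW_API_KEY` variable named above; the helper itself is not part of the library):

```python
import os

def get_roboflow_api_key() -> str:
    """Return the API key from the environment, or raise with a hint."""
    key = os.environ.get("ROBOFLOW_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "ROBOFLOW_API_KEY is not set - export it or pass api_key= explicitly"
        )
    return key
```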
Get your API key:

- Visit the Roboflow Dashboard
- Go to Settings → Roboflow API
- Copy your API key
- See the Authentication Guide for details

Verify workspace access:

- Make sure the API key belongs to the correct workspace
- Check that you have access to the project
- Verify the model version exists
## ModelMetadataConsistencyError

Inconsistent or invalid model metadata returned by the weights provider.

### Overview

This error occurs when the Roboflow API returns model metadata that is malformed, incomplete, or contains inconsistent information.

### When It Occurs

**Scenario 1: Unparseable metadata**

- API returns metadata in an unexpected format
- JSON structure doesn't match the expected schema
- Required fields are missing

**Scenario 2: Invalid batch size configuration**

- TRT package declares a dynamic batch size but doesn't specify min/opt/max values
- Static batch size package has an invalid or missing batch size value

**Scenario 3: Invalid version specification**

- Package manifest contains an invalid version string
- Version format doesn't follow semantic versioning

**Scenario 4: Inconsistent package configuration**

- Package declares conflicting settings
- Required package files are missing from the metadata
### What To Check

Since this is a weights provider issue, it is not actionable for users and should be reported to Roboflow - unless a custom weights provider is used, in which case the provider should fix it.
### How To Fix

This is typically a Roboflow API issue:

- Report the issue with:
    - Full error message
    - Model ID
    - Workspace name
    - When the error started occurring

If you maintain a custom weights provider:

- Ensure your metadata follows the expected schema
- Validate that all required fields are present
- Check that version strings follow semantic versioning
- Verify the batch size configuration is consistent
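For the version-string requirement, custom providers can validate manifests up front. Below is a minimal semantic-versioning check, a sketch only: real SemVer also allows pre-release and build-metadata suffixes, which this regex accepts loosely, and the helper name is made up.

```python
import re

# MAJOR.MINOR.PATCH with an optional pre-release/build suffix,
# e.g. "1.2.3" or "1.2.3-rc.1" (a simplified approximation of SemVer).
SEMVER_PATTERN = re.compile(
    r"^\d+\.\d+\.\d+(?:-[0-9A-Za-z.-]+)?(?:\+[0-9A-Za-z.-]+)?$"
)

def is_valid_semver(version: str) -> bool:
    return SEMVER_PATTERN.match(version) is not None

print(is_valid_semver("1.2.3"))  # True
print(is_valid_semver("v1.2"))   # False
```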
## ModelMetadataHandlerNotImplementedError

Model metadata handler is not implemented for this model type.

### Overview

This error occurs when the Roboflow API returns metadata for a model type that is not yet supported by your version of `inference-models`.

### When It Occurs

**Scenario 1: Outdated inference-models package**

- Using an old version of `inference-models`
- New model type added to the Roboflow platform
- Your package doesn't have the handler for this model type

**Scenario 2: New/experimental model type**

- Model uses a new architecture not yet in a stable release
- Beta/experimental model type
- Custom model type not in the standard distribution
### What To Check

- Check your `inference-models` version
- Check for available updates and install if available
- Review the error message:
    - It usually indicates which model type/handler is missing
    - It may suggest upgrading the package
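The installed version can be checked programmatically with the standard library (a sketch; it assumes the distribution name is `inference-models`, and the helper is not part of the package):

```python
from importlib.metadata import version, PackageNotFoundError

def get_installed_version(dist_name: str):
    """Return the installed version string, or None if not installed."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

print(get_installed_version("inference-models"))
```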
### How To Fix

Upgrade `inference-models`:

```shell
# Upgrade to the latest version
uv pip install --upgrade inference-models

# Or install a specific version
uv pip install inference-models==x.y.z
```
If already on the latest version:

- This model type may not be supported yet
- Check GitHub issues for status
- Open a new issue if not already reported