# Reference

`src-tauri/plugins/tauri-plugin-llamacpp/permissions/autogenerated/reference.md`

## Default Permission

Default permissions for the llamacpp plugin

This default permission set includes the following:

- `allow-cleanup-llama-processes`
- `allow-load-llama-model`
- `allow-unload-llama-model`
- `allow-get-devices`
- `allow-generate-api-key`
- `allow-is-process-running`
- `allow-get-random-port`
- `allow-find-session-by-model`
- `allow-get-loaded-models`
- `allow-get-all-sessions`
- `allow-get-session-by-model`
- `allow-read-gguf-metadata`
- `allow-estimate-kv-cache-size`
- `allow-get-model-size`
- `allow-is-model-supported`
- `allow-plan-model-load`
- `allow-map-old-backend-to-new`
- `allow-get-local-installed-backends`
- `allow-list-supported-backends`
- `allow-determine-supported-backends`
- `allow-get-supported-features`
- `allow-is-cuda-installed`
- `allow-find-latest-version-for-backend`
- `allow-prioritize-backends`
- `allow-parse-backend-version`
- `allow-check-backend-for-updates`
- `allow-remove-old-backend-versions`
- `allow-validate-backend-string`
- `allow-should-migrate-backend`
- `allow-handle-setting-update`
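This default set is granted to a window by referencing it from a Tauri capability file. A minimal sketch, assuming a capability file under `src-tauri/capabilities/` (the `identifier`, `description`, and window label below are illustrative, not taken from Jan's actual configuration):

```json
{
  "$schema": "../gen/schemas/desktop-schema.json",
  "identifier": "llamacpp-capability",
  "description": "Grant the llamacpp plugin's default permission set to the main window",
  "windows": ["main"],
  "permissions": ["llamacpp:default"]
}
```

Tauri expands `llamacpp:default` into the individual `allow-*` permissions listed above, so the corresponding commands become invokable from the listed windows.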

## Permission Table

<table> <tr> <th>Identifier</th> <th>Description</th> </tr> <tr> <td>

llamacpp:allow-check-backend-for-updates

</td> <td>

Enables the check_backend_for_updates command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-check-backend-for-updates

</td> <td>

Denies the check_backend_for_updates command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-cleanup-llama-processes

</td> <td>

Enables the cleanup_llama_processes command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-cleanup-llama-processes

</td> <td>

Denies the cleanup_llama_processes command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-determine-supported-backends

</td> <td>

Enables the determine_supported_backends command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-determine-supported-backends

</td> <td>

Denies the determine_supported_backends command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-estimate-kv-cache-size

</td> <td>

Enables the estimate_kv_cache_size command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-estimate-kv-cache-size

</td> <td>

Denies the estimate_kv_cache_size command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-find-latest-version-for-backend

</td> <td>

Enables the find_latest_version_for_backend command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-find-latest-version-for-backend

</td> <td>

Denies the find_latest_version_for_backend command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-find-session-by-model

</td> <td>

Enables the find_session_by_model command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-find-session-by-model

</td> <td>

Denies the find_session_by_model command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-generate-api-key

</td> <td>

Enables the generate_api_key command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-generate-api-key

</td> <td>

Denies the generate_api_key command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-get-all-sessions

</td> <td>

Enables the get_all_sessions command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-get-all-sessions

</td> <td>

Denies the get_all_sessions command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-get-devices

</td> <td>

Enables the get_devices command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-get-devices

</td> <td>

Denies the get_devices command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-get-loaded-models

</td> <td>

Enables the get_loaded_models command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-get-loaded-models

</td> <td>

Denies the get_loaded_models command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-get-local-installed-backends

</td> <td>

Enables the get_local_installed_backends command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-get-local-installed-backends

</td> <td>

Denies the get_local_installed_backends command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-get-model-size

</td> <td>

Enables the get_model_size command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-get-model-size

</td> <td>

Denies the get_model_size command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-get-random-port

</td> <td>

Enables the get_random_port command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-get-random-port

</td> <td>

Denies the get_random_port command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-get-session-by-model

</td> <td>

Enables the get_session_by_model command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-get-session-by-model

</td> <td>

Denies the get_session_by_model command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-get-supported-features

</td> <td>

Enables the get_supported_features command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-get-supported-features

</td> <td>

Denies the get_supported_features command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-handle-setting-update

</td> <td>

Enables the handle_setting_update command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-handle-setting-update

</td> <td>

Denies the handle_setting_update command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-is-cuda-installed

</td> <td>

Enables the is_cuda_installed command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-is-cuda-installed

</td> <td>

Denies the is_cuda_installed command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-is-model-supported

</td> <td>

Enables the is_model_supported command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-is-model-supported

</td> <td>

Denies the is_model_supported command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-is-process-running

</td> <td>

Enables the is_process_running command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-is-process-running

</td> <td>

Denies the is_process_running command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-list-supported-backends

</td> <td>

Enables the list_supported_backends command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-list-supported-backends

</td> <td>

Denies the list_supported_backends command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-load-llama-model

</td> <td>

Enables the load_llama_model command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-load-llama-model

</td> <td>

Denies the load_llama_model command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-map-old-backend-to-new

</td> <td>

Enables the map_old_backend_to_new command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-map-old-backend-to-new

</td> <td>

Denies the map_old_backend_to_new command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-parse-backend-version

</td> <td>

Enables the parse_backend_version command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-parse-backend-version

</td> <td>

Denies the parse_backend_version command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-plan-model-load

</td> <td>

Enables the plan_model_load command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-plan-model-load

</td> <td>

Denies the plan_model_load command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-prioritize-backends

</td> <td>

Enables the prioritize_backends command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-prioritize-backends

</td> <td>

Denies the prioritize_backends command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-read-gguf-metadata

</td> <td>

Enables the read_gguf_metadata command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-read-gguf-metadata

</td> <td>

Denies the read_gguf_metadata command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-remove-old-backend-versions

</td> <td>

Enables the remove_old_backend_versions command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-remove-old-backend-versions

</td> <td>

Denies the remove_old_backend_versions command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-should-migrate-backend

</td> <td>

Enables the should_migrate_backend command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-should-migrate-backend

</td> <td>

Denies the should_migrate_backend command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-unload-llama-model

</td> <td>

Enables the unload_llama_model command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-unload-llama-model

</td> <td>

Denies the unload_llama_model command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:allow-validate-backend-string

</td> <td>

Enables the validate_backend_string command without any pre-configured scope.

</td> </tr> <tr> <td>

llamacpp:deny-validate-backend-string

</td> <td>

Denies the validate_backend_string command without any pre-configured scope.

</td> </tr> </table>
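Commands can also be granted or revoked individually. In Tauri's capability system, `deny-*` permissions take precedence over `allow-*` ones, so a capability like the following sketch (identifier and window label are illustrative) keeps the defaults while blocking model loading for that window:

```json
{
  "identifier": "llamacpp-restricted",
  "windows": ["main"],
  "permissions": [
    "llamacpp:default",
    "llamacpp:deny-load-llama-model"
  ]
}
```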