Label Studio Enterprise 2.32.0

<div class="onprem-highlight">Support reports, granular permission options, new command palette</div>

Jan 14, 2026

Helm Chart version: 1.11.7

New features

Support reports

There is a new option under Organization > Settings to create Support Reports.

A Support Report is a downloadable file with diagnostic and usage statistics generated from your Label Studio Enterprise deployment.

These reports can help our support engineers diagnose issues, and they are fully transparent: you can inspect their contents before sharing.

For more information, see Support reports.

Refine permissions

There is a new page available from Organization > Settings > Permissions that allows users in the Owner role to refine permissions across the organization.

This page is only visible to users in the Owner role.

For more information, see Customize organization permissions.

Command palette for enhanced searching

The command palette is an enhanced search tool that you can use to navigate to resources both inside and outside the app, find workspaces and projects, and (if used within a project) navigate to project settings.

For more information, see Command palette.

Feature updates

Annotator evaluation enhancements

The Annotator Evaluation feature has been improved with more robust ongoing evaluation functionality and a clearer UI.

Enhancements include:

  • An improved UI to make it clearer what will happen based on your selections.
  • Previously, if you were using "ongoing" evaluation mode, ground truth tasks were restricted by the overlap configured for your project. This meant that typically only the first annotators to reach the ground truth tasks were evaluated. Now, if annotator evaluation is enabled, all annotators are evaluated against ground truth tasks regardless of whether the overlap threshold has been reached. This ensures that even if the project has a large number of annotators and a comparatively small annotations-per-task requirement, all annotators are still evaluated.
  • You can now set up your project so that annotator evaluation happens both at the beginning of the project as well as on an ongoing basis. Previously, you could only choose one or the other.
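The ongoing-evaluation change above can be modeled as a small predicate. This is an illustrative sketch of the described behavior, not Label Studio internals; the function and field names are assumptions:

```python
def serves_ground_truth(task, evaluation_enabled, overlap_reached):
    """Decide whether a task is still served to a new annotator.

    Illustrative model: ground truth tasks keep being served for
    evaluation even after the overlap threshold is reached, as long
    as annotator evaluation is enabled for the project.
    """
    if task.get("is_ground_truth") and evaluation_enabled:
        return True
    # Non-ground-truth tasks (or evaluation disabled) follow overlap as before.
    return not overlap_reached
```

Under this model, enabling evaluation is the only thing that changes behavior for ground truth tasks; everything else still respects the configured overlap.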

Before:

After:

!!! info Tip
    You can disallow skipping in all project tasks. But if you want to allow skipping while ensuring that annotators cannot skip ground truth tasks, you can use the new unskippable task feature described below.

Task summary improvements

We have made a number of improvements to task summaries.

Before:

After:

Improvements include:

  • **Label distribution** - A new Distribution row provides aggregate information. Depending on the tag type, this could be an average, a count, or another distribution type.
  • **Updated styling** - Multiple UI elements have been updated and improved, including banded rows and sticky columns. You can also now see full usernames.
  • **Autoselect the comparison view** - If you are looking at the comparison view and move to the next task, the comparison view is automatically selected again.

Improved annotation tabs

Annotation tabs have the following improvements:

  • To improve readability, the annotation ID has been removed from the tab, and long model or Prompts project names are truncated.
  • Added three new options to the overflow menu for the annotation tab:
    • Copy Annotation ID - Copy the annotation ID that previously appeared in the tab.
    • Open Performance Dashboard - Open the Member Performance Dashboard with the user and project pre-selected.
    • Show Other Annotations - Open the task summary view.

Before:

After:

Unskippable tasks

While you can hide the Skip action in the project settings, this enhancement allows you to configure individual tasks so that users in the Annotator or Reviewer role cannot skip them.

To make a task unskippable, you must specify "allow_skip": false as part of the JSON task definition that you import to your project.

For example, the following JSON snippet would result in one skippable task and one unskippable task:

```json
[
  {
    "data": {
      "text": "Demo text 1"
    },
    "allow_skip": false
  },
  {
    "data": {
      "text": "Demo text 2"
    }
  }
]
```

For more information, see Skipping tasks.
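When building a task import programmatically, `allow_skip` is just another top-level key on the task object. A minimal sketch of assembling such a payload (the helper name is illustrative; only the JSON shape comes from the example above):

```python
import json

def make_task(text, skippable=True):
    """Build a task dict; "allow_skip": false marks the task unskippable."""
    task = {"data": {"text": text}}
    if not skippable:
        task["allow_skip"] = False
    return task

tasks = [
    make_task("Demo text 1", skippable=False),  # unskippable
    make_task("Demo text 2"),                   # skippable (default)
]
payload = json.dumps(tasks, indent=2)  # ready to import into a project
```

Omitting the key entirely leaves the task skippable, matching the JSON snippet above.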

Apply overlap only to distinct users

When configuring Annotations per task for a project, only annotations from distinct users will count towards task overlap.

Previously, if a project had Annotations per task set to 2, and User A created and then submitted two annotations on a single task (which can be done in Quick View), then the task would be considered completed.

Now, the task would not be completed until a different user submitted an annotation.
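The new rule amounts to counting distinct annotators rather than raw annotations. A minimal sketch of that logic (field names are illustrative, not Label Studio's schema):

```python
def is_overlap_met(annotations, annotations_per_task):
    """Overlap is met only when enough *distinct* users have submitted
    annotations, not merely when enough annotations exist."""
    distinct_users = {a["completed_by"] for a in annotations}
    return len(distinct_users) >= annotations_per_task

# Two annotations by the same user no longer satisfy an overlap of 2:
same_user = [{"completed_by": "user_a"}, {"completed_by": "user_a"}]
two_users = [{"completed_by": "user_a"}, {"completed_by": "user_b"}]
```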

Improved Local Storage setup process

The Local Storage setup process has a number of improvements.

Changes include:

  • When entering the absolute local path in the storage modal, trailing slashes will be automatically removed to prevent an error.
  • Normalized slashes when saving the absolute local path to prevent confusion between Unix and Windows environments.
  • If you create folders named "my-data" or "label-studio-data" and run the label-studio command in the parent directory, the LOCAL_FILES_DOCUMENT_ROOT and LOCAL_FILES_SERVING_ENABLED variables will be automatically set to point to those folders.
  • Improved error messages when the absolute local path does not include a subdirectory but matches the local files document root.
  • When the local files document root is set, the absolute local path is automatically preloaded with the appropriate path.
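The path-normalization behavior described above (stripping trailing slashes, normalizing Windows separators) can be sketched as follows; this is an illustrative reimplementation of the described behavior, not Label Studio's actual code:

```python
def normalize_local_path(path: str) -> str:
    # Convert Windows backslashes to forward slashes, then strip any
    # trailing slashes so "/data/my-data/" and "/data/my-data" compare equal.
    normalized = path.replace("\\", "/")
    return normalized.rstrip("/") or "/"
```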

Support for GPT-5.1 and GPT-5.2

When you add OpenAI models to Prompts or to the organization model provider list, GPT-5.1 and GPT-5.2 will now be included.

Additional functional and usability enhancements

  • When configuring SAML, you can now click on a selection of common IdPs to pre-fill values with presets.

  • The Recent Projects list on the Home page will now include the most recently visited projects at the top of the list instead of pinned projects.

  • Removed the Publish button from the project Members page. It is now only on the Dashboard page.

  • When using the API to bulk assign and unassign users, you can now filter by last activity and role.

  • To better utilize space, the annotation ID and the navigation controls for the labeling stream have been moved to below the labeling interface.

Performance improvements

This release includes multiple performance improvements and optimizations.

Security

Improved permission checks when retrieving the workspaces list.

Bug fixes

  • Fixed an issue where PDFs were not filling the full height of the canvas.

  • Fixed a number of issues causing out-of-memory errors.

  • Fixed a layout issue with the overflow menu on the project Dashboard page.

  • Fixed a number of issues with the Member Performance Dashboard.

  • Fixed an issue that would cause API validation to fail when setting a 99% low agreement threshold.

  • Fixed an issue where the AI-assisted project setup would return markdown, causing an error.

  • Fixed an issue with agreement scores for Rating.

  • Fixed an issue where Managers were able to move projects into workspaces where they were not members.

  • Fixed an issue with prediction validation for the Ranker tag.

  • Fixed an issue with syncing from Databricks.

  • Fixed an issue where users could not display two PDFs in the same labeling interface.

  • Fixed an issue where the scores reflected under Agreement (Selected) were sometimes lower than expected.

  • Fixed an issue where the Agreement (Selected) dropdown would not open.

  • Fixed an issue where relations between VideoRectangles regions were not visible.

  • Fixed an issue that caused the Data Manager to throw a Runtime Error when sorting by Review Time.

  • Fixed an issue when PDF regions could not be drawn when moving the mouse in certain directions.

  • Fixed an issue where users were not shown a clear error message when attempting to access a page they do not have permission to view.

  • Fixed an issue with prompts not allowing negative Number tag results.

  • Fixed an issue that prevented scrolling the filter column drop-down after clearing a previous search.

  • Fixed an issue where region labels were not appearing on PDFs even if the Show region labels setting was enabled.

  • Fixed an issue that could cause the sidebar menu to be blank.

  • Fixed a minor visual issue when auto-labeling tasks in Safari.

  • Fixed an issue where clicking on an annotator's name in the task summary did not lead to the associated annotation.

  • Fixed an issue where the required parameter was not always working in Chat labeling interfaces.

  • Fixed an issue where conversion jobs were failing for YOLO exports.

  • Fixed an issue with redirection after deleting a workspace.

  • Fixed an issue where accepted reviews were not reflected in the Members dashboard.

  • Fixed an issue where Submit and Skip buttons were hidden if opening the labeling stream when previously viewing the Task Summary.

  • Fixed an issue where PDFs could sometimes appear flipped.

  • Fixed an issue with Databricks storage when upgrading releases.

  • Fixed a styling issue when navigating back from the Activity Log page into the Members dashboard.

  • Fixed an issue where embedded YouTube videos were not working in <HyperText> tags.

  • Fixed an issue with <DateTime> tags when using consensus-based agreement.

  • Fixed an issue where import jobs through the API could fail silently.

  • Fixed an issue where the Copy region link action from the overflow menu for a region disappeared on hover.