docs/source/guide/custom_metric.md
Write a custom agreement metric to assess the quality of the predictions and annotations in your Label Studio Enterprise project. Label Studio Enterprise includes a variety of agreement metrics for your project, but if you want to evaluate annotations using a custom metric, or a standard metric that is not available in Label Studio, you can write your own.
!!! note This functionality is available out-of-the-box for Label Studio Enterprise Cloud users.
For Label Studio Enterprise on-prem environments, you must configure Amazon Web Services Elastic Compute Cloud [(AWS EC2)](https://aws.amazon.com/ec2/) or Amazon Elastic Kubernetes Service [(EKS)](https://aws.amazon.com/eks/). For more information, see [the section below on setting up permissions](#Set-up-permissions-for-a-private-cloud-custom-agreement-metric).
Label Studio Enterprise Edition includes various annotation and labeling statistics and the ability to add your own. The open source Community Edition of Label Studio does not contain these calculations. If you're using Label Studio Community Edition, see <a href="https://labelstud.io/guide/label_studio_compare.html">Label Studio Features</a> to learn more.
If you're adding your custom agreement metric to Label Studio Enterprise hosted in a private (self-managed) AWS EC2 or AWS EKS instance, set up permissions.
Before writing your custom agreement metric, complete that permissions setup and configure the labeling interface for your project.
Based on the type of labeling that you're performing, write a custom agreement metric.
You can use the agreement metric to compare two annotations, or to compare one annotation with one prediction. In both cases, the two items to compare are passed as the input parameters `annotation_1` and `annotation_2`.
Add your code to the following function defined in Label Studio:
```python
def agreement(annotation_1, annotation_2, per_label=False) -> float:
```
This function takes the following arguments:
| argument | format | description |
|---|---|---|
| `annotation_1` | JSON object | The first annotation or prediction to compare when calculating agreement. Retrieved in Label Studio JSON format. |
| `annotation_2` | JSON object | The second annotation or prediction to compare when calculating agreement. Retrieved in Label Studio JSON format. |
| `per_label` | boolean | Whether to perform an agreement calculation for each label in the annotation, or across the entire annotation result. |
| return | float | The agreement score to assign, as a floating point number between 0 and 1. |
For example, given the following labeling config:
```xml
<View>
  <Image name="image" value="$image"/>
  <Choices name="choice" toName="image" showInLine="true">
    <Choice value="Positive" />
    <Choice value="Negative" />
    <Choice value="Neutral" />
  </Choices>
</View>
```
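The agreement examples below read the selected choice from `annotation["result"][0]["value"]["choices"][0]`. For reference, an annotation produced by this config has roughly the following shape. This is a trimmed sketch; real annotations include additional fields such as IDs and timestamps:

```python
# Trimmed sketch of an annotation for the Choices config above.
# Real annotations contain additional fields (id, completed_by, timestamps, and so on);
# only the parts that the agreement examples below rely on are shown here.
annotation_example = {
    "result": [
        {
            "from_name": "choice",
            "to_name": "image",
            "type": "choices",
            "value": {"choices": ["Positive"]},
        }
    ]
}
```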
The following agreement metric compares two annotations for this classification task, assigning different agreement scores depending on whether the matching choice is "Positive" or "Negative":
```python
def agreement(annotation_1, annotation_2, per_label=False) -> float:
    # Retrieve the selected choice from each annotation in the Label Studio JSON format
    r1 = annotation_1["result"][0]["value"]["choices"][0]
    r2 = annotation_2["result"][0]["value"]["choices"][0]
    # Determine annotation agreement based on specific choice values
    if r1 == r2:
        # If annotations match and include the choice "Positive", return an agreement score of 0.99
        if r1 == "Positive":
            return 0.99
        # If annotations match and include the choice "Negative", return an agreement score of 0.7
        if r1 == "Negative":
            return 0.7
        # If annotations match on any other choice (for example, "Neutral"), count it as full agreement
        return 1.0
    # If annotations do not match, return an agreement score of 0
    else:
        return 0
```
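As a quick local sanity check, you can call the function with hand-built annotation dictionaries before adding it to your project. This is a sketch; the dictionaries are trimmed to only the fields the function reads:

```python
# Quick local check of the metric above (illustrative values only).
ann_positive = {"result": [{"type": "choices", "value": {"choices": ["Positive"]}}]}
ann_negative = {"result": [{"type": "choices", "value": {"choices": ["Negative"]}}]}

print(agreement(ann_positive, ann_positive))  # 0.99 -- both annotators chose "Positive"
print(agreement(ann_negative, ann_negative))  # 0.7  -- both annotators chose "Negative"
print(agreement(ann_positive, ann_negative))  # 0    -- the annotators disagree
```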
If you set per_label=True, you can define a separate method or agreement score for each label. If you do this, you must return a separate score for each label. For example, for a classification task, you could use the following function to assign a weight and return a specific agreement score for each label used in an annotation:
```python
def agreement(annotation_1, annotation_2, per_label=False) -> float:
    # Retrieve the selected choice from each annotation
    label_1 = annotation_1["result"][0]["value"]["choices"][0]
    label_2 = annotation_2["result"][0]["value"]["choices"][0]
    # Per-label weights; choices without an explicit weight score 0
    weight = {"Positive": 0.99, "Negative": 0.01}
    if label_1 == label_2:
        if per_label:
            # Return a separate score for each label
            return {label_1: weight.get(label_1, 0)}
        else:
            return weight.get(label_1, 0)
    else:
        if per_label:
            return {label_1: 0, label_2: 0}
        else:
            return 0
```
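When called with `per_label=True`, this metric returns a dictionary keyed by label rather than a single float, as a quick local check illustrates. Again, this is a sketch with trimmed annotation dictionaries:

```python
# Per-label mode returns a score per label rather than a single float.
ann_a = {"result": [{"type": "choices", "value": {"choices": ["Positive"]}}]}
ann_b = {"result": [{"type": "choices", "value": {"choices": ["Negative"]}}]}

print(agreement(ann_a, ann_a, per_label=True))  # {'Positive': 0.99}
print(agreement(ann_a, ann_b, per_label=True))  # {'Positive': 0, 'Negative': 0}
print(agreement(ann_a, ann_b))                  # 0
```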
Set up a custom agreement metric for a specific project in Label Studio Enterprise.
!!! note You must configure the labeling interface before you can add your custom agreement metric.
!!! attention "important" Using tags on Lambda functions is an on-premises only feature. Specify each tag as a `tag_name` and `tag_value` pair.
For information on troubleshooting custom metrics, see Troubleshooting Agreements & Quality Control in the HumanSignal support center.
If you have Label Studio Enterprise deployed in a private cloud (self-managed) Amazon Web Services (AWS) Elastic Compute Cloud (EC2) instance or Amazon Elastic Kubernetes Service (EKS), you must grant additional permissions so that Label Studio Enterprise can run custom agreement metrics in AWS Lambda.
To set up the permissions, do the following:
To perform these steps, you must know the AWS account ID for the AWS account that you use to manage Label Studio Enterprise.
Using your preferred method, create an AWS IAM role named `LSE_CustomMetricsExecuteRole`. Follow the steps to create a role to delegate permissions to an AWS service in the AWS Identity and Access Management documentation for Creating a role for an AWS service (console).

When creating the role, use the following permissions policy. Replace `YOUR_AWS_ACCOUNT` with your AWS account ID that has access to Label Studio Enterprise:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "logs:CreateLogGroup",
            "Resource": "arn:aws:logs:*:YOUR_AWS_ACCOUNT:*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": [
                "arn:aws:logs:*:YOUR_AWS_ACCOUNT:log-group:/aws/lambda/custom-metric-*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:PutRetentionPolicy"
            ],
            "Resource": [
                "arn:aws:logs:*:YOUR_AWS_ACCOUNT:log-group:/aws/lambda/custom-metric-*"
            ]
        }
    ]
}
```
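If you prefer to script this step rather than use the console, a minimal sketch with boto3 might look like the following. The role name matches the one above; the Lambda trust policy, the inline policy name, and the local policy filename are assumptions for illustration:

```python
# Sketch: create the execution role and attach the logging policy with boto3.
# Assumes AWS credentials with IAM permissions are already configured.
import json
import boto3

iam = boto3.client("iam")

# Trust policy so AWS Lambda can assume the role (assumption for illustration)
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName="LSE_CustomMetricsExecuteRole",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Attach the permissions policy shown above as an inline policy
with open("lse_custom_metrics_logs_policy.json") as f:  # hypothetical local copy of the policy
    iam.put_role_policy(
        RoleName="LSE_CustomMetricsExecuteRole",
        PolicyName="LSE_CustomMetricsExecutePolicy",  # illustrative policy name
        PolicyDocument=f.read(),
    )
```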
After creating an IAM role to manage logs for the custom agreement metric, set up permissions to allow Label Studio Enterprise to interact with AWS Lambda.
How you set up permissions depends on how you deployed Label Studio Enterprise in your self-managed cloud infrastructure:
If you deployed Label Studio Enterprise using Docker Compose in an AWS EC2 instance, do the following to finish setting up permissions for the custom agreement metric functionality:
1. Create an IAM user with access keys for Label Studio Enterprise and attach the `LSE_AllowInteractLambda` policy to it.
2. In the `docker-compose.yaml` file that you use to deploy Label Studio Enterprise, add the following environment variables in the `app` and `rqworkers` sections.

!!! attention "important"
    Update:
    - `YOUR_AWS_ACCESS_KEY_ID`, `YOUR_AWS_SECRET_ACCESS_KEY`, and `YOUR_AWS_ACCOUNT` with the credentials for the account created in step 1.
    - `YOUR_AWS_REGION` with the AWS region that your EC2 instance exists in.

```
AWS_ACCESS_KEY_ID=YOUR_AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY=YOUR_AWS_SECRET_ACCESS_KEY
LS_LAMBDA_REGION_CUSTOM_METRICS=YOUR_AWS_REGION
LS_LAMBDA_ROLE_CUSTOM_METRICS=arn:aws:iam::YOUR_AWS_ACCOUNT:role/LSE_CustomMetricsExecuteRole
```
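To confirm that the credentials can reach AWS Lambda in the configured region, you can run a quick check from any environment where the same AWS credentials and region variables are set. This is a sketch using boto3; the `lambda:ListFunctions` call is permitted by the `LSE_AllowInteractLambda` policy described later on this page:

```python
# Sketch: verify that the configured credentials and region can reach AWS Lambda.
import os
import boto3

client = boto3.client(
    "lambda",
    region_name=os.environ["LS_LAMBDA_REGION_CUSTOM_METRICS"],
)

# lambda:ListFunctions is allowed by the LSE_AllowInteractLambda policy
response = client.list_functions(MaxItems=10)
print([fn["FunctionName"] for fn in response["Functions"]])
```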
After you set up these permissions in your environment, you're ready to write your custom agreement metric and add it to Label Studio Enterprise.
If you deployed Label Studio Enterprise in Amazon Elastic Kubernetes Service (EKS) with OpenID Connect (OIDC) for identity and access management (IAM), do the following to finish setting up permissions for the custom agreement metric functionality:
1. Create an IAM role named `LSE_ServiceAccountApp`, following the steps to create a role to delegate permissions to an AWS service in the AWS Identity and Access Management documentation for Creating a role for an AWS service (console).
2. Attach the `LSE_AllowInteractLambda` policy to the `LSE_ServiceAccountApp` role.
3. Update the `values.yaml` file to include the following map. Replace `YOUR_AWS_ACCOUNT` with your AWS account ID:

```yaml
app:
  serviceAccount:
    annotations:
      eks.amazonaws.com/role-arn: arn:aws:iam::YOUR_AWS_ACCOUNT:role/LSE_ServiceAccountApp
```
After you set up these permissions in your environment, you're ready to write your custom agreement metric and add it to Label Studio Enterprise.
If you deployed Label Studio Enterprise in Amazon Elastic Kubernetes Service (EKS) and are not using OpenID Connect (OIDC) for identity and access management (IAM), do the following to finish setting up permissions for the custom agreement metric functionality:
Attach the `LSE_AllowInteractLambda` policy to the IAM role used by the EKS node group that runs Label Studio Enterprise. After you set up these permissions in your environment, you're ready to write your custom agreement metric and add it to Label Studio Enterprise.
To grant a specific user, role, or EKS node group used to manage Label Studio Enterprise permission to interact with AWS Lambda, use the following IAM policy. Create an IAM policy named `LSE_AllowInteractLambda` and replace `YOUR_AWS_ACCOUNT` with your AWS account ID:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "iam:PassRole",
            "Resource": "arn:aws:iam::YOUR_AWS_ACCOUNT:role/LSE_CustomMetricsExecuteRole"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "lambda:CreateFunction",
                "lambda:UpdateFunctionCode",
                "lambda:InvokeFunction",
                "lambda:GetFunction",
                "lambda:DeleteFunction",
                "lambda:TagResource",
                "lambda:ListTags"
            ],
            "Resource": [
                "arn:aws:lambda:*:YOUR_AWS_ACCOUNT:function:custom-metric-*"
            ]
        },
        {
            "Sid": "VisualEditor2",
            "Effect": "Allow",
            "Action": "lambda:ListFunctions",
            "Resource": "*"
        },
        {
            "Action": [
                "logs:CreateLogGroup",
                "logs:PutRetentionPolicy",
                "logs:TagResource",
                "logs:StartQuery",
                "logs:GetQueryResults"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:logs:*:YOUR_AWS_ACCOUNT:log-group:/aws/lambda/custom-metric-*"
            ]
        }
    ]
}
```
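As with the execution role, you can create and attach this policy from a script rather than the console. The following is a minimal sketch with boto3, assuming the policy JSON above is saved locally; the filename and the target IAM user name are illustrative:

```python
# Sketch: create the LSE_AllowInteractLambda policy and attach it to an IAM user.
import boto3

iam = boto3.client("iam")

with open("lse_allow_interact_lambda.json") as f:  # hypothetical local copy of the policy above
    policy = iam.create_policy(
        PolicyName="LSE_AllowInteractLambda",
        PolicyDocument=f.read(),
    )

# Attach the policy to the IAM user that Label Studio Enterprise uses
iam.attach_user_policy(
    UserName="label-studio-enterprise",  # illustrative user name
    PolicyArn=policy["Policy"]["Arn"],
)
```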