Label Answers
Use labeling projects to create gold labels for your search system. Gold labels indicate the answers you expect as results to your queries. You can later export the labels and use them as evaluation datasets in experiments.
About This Task
To evaluate your pipeline through experiments, you need an evaluation dataset. The evaluation dataset contains gold labels: the answers you expect in response to your queries.
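Conceptually, you can think of a gold label as a (query, document, relevance) record. The sketch below is purely illustrative; the field names are assumptions for clarity, not deepset Cloud's actual label schema.

```python
from dataclasses import dataclass


@dataclass
class GoldLabel:
    """One gold label: the relevance you expect for a document returned for a query.

    Field names are illustrative only, not deepset Cloud's actual schema.
    """
    query: str        # the query a labeler asked
    document_id: str  # the document returned for that query
    relevant: bool    # True if the document answers the query well


# Example: two labels collected for the same query
labels = [
    GoldLabel("How do I reset my password?", "doc_123", relevant=True),
    GoldLabel("How do I reset my password?", "doc_456", relevant=False),
]
```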
Currently, deepset Cloud supports labeling projects for document search only.
Prerequisites
- A labeling project is created with all the mandatory steps configured.
- You have access to the labeling project.
Labeling
- Log in to deepset Cloud.
- Click Labeling and open the labeling project you're participating in.
- On the project page, click Start Labeling. You're redirected to the Labeling Query page. If the project creator added guidelines and a query target for you, you can see them here.
- Ask a query. You'll see a set of documents returned as results. For each document, indicate whether it's:
- Relevant (answers the query well)
- Not relevant (doesn't match the query)
- Flagged for review (you're not sure whether it's relevant, and you'd like another labeler to check)
- Continue until you've run the number of queries the project creator set as the target. To change the label for a previous query, navigate back to it using the Prev. Query option at the bottom of the page.
Coming soon - query history
We're working on adding a query history where you can check all the labels you've added so far. We'll let you know when it's ready.
When you're finished, let the project creator know. When the project reaches the target number of queries, the project creator can export the labels and use them in pipeline experiments.
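If you later work with the exported labels outside deepset Cloud, a quick sanity check can show whether each query received at least one relevant document before you run an experiment. The sketch below assumes the export is a CSV file with query, document_id, and relevant columns; the filename and column names are hypothetical, so adjust them to your actual export format.

```python
import csv
from collections import defaultdict

# Hypothetical file and column names; adjust them to match your actual export.
relevant_per_query = defaultdict(int)
total_per_query = defaultdict(int)

with open("exported_labels.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        total_per_query[row["query"]] += 1
        if row["relevant"].strip().lower() == "true":
            relevant_per_query[row["query"]] += 1

# A query with no document labeled relevant may need another labeling pass.
for query, total in total_per_query.items():
    print(f"{query!r}: {relevant_per_query[query]}/{total} documents labeled relevant")
```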