90-Second Data Annotation Glossary: Essential Terms Explained Fast

Date updated: September 29, 2025

If you’re new to data annotation outsourcing or leading your first AI project, the jargon can feel overwhelming. Here’s your plain-English, no-fluff glossary—read it in 90 seconds, and you’ll sound like a pro on your next call with your annotation partner.

What Is Data Annotation?

Data annotation (or labelling) is the process where humans tag raw data—images, text, video, or audio—so that AI models can learn what to look for. Without labelled data, machine learning models are basically guessing.

⚙️ Key Annotation Terms You’ll Hear

Bounding Box

A simple rectangle drawn around an object in an image. It’s the fastest way to help computer vision models “see” an object. Example: draw a box around a car.
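In practice, a bounding box is just four numbers per object. A minimal sketch of one common convention (top-left corner plus width and height, as in the COCO format; the file name and field names here are illustrative, not a specific tool's schema):

```python
# One labelled image in a COCO-style convention: each box is
# [x, y, width, height] in pixels, measured from the top-left corner.
annotation = {
    "image": "street_scene_001.jpg",   # hypothetical file name
    "labels": [
        {"class": "car", "bbox": [120, 240, 200, 90]},
        {"class": "pedestrian", "bbox": [400, 210, 45, 130]},
    ],
}

def box_area(bbox):
    """Area of an [x, y, w, h] box in square pixels."""
    _, _, w, h = bbox
    return w * h

print(box_area(annotation["labels"][0]["bbox"]))  # 200 * 90 = 18000
```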

Polygon Annotation

Instead of just a rectangle, this precisely outlines an object’s shape. Essential when bounding boxes aren’t enough—think of the jagged shape of a tree or a road sign.

Point Cloud Labelling

Used in 3D AI like self-driving cars. Labels are applied to LiDAR data (those cloud-like dots representing depth and distance).

NER (Named Entity Recognition)

In Natural Language Processing (NLP), this means tagging words like people’s names, companies, or places. Example: tagging “Tesla” as an organisation in a news article.
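Under the hood, NER labels are usually stored as character spans into the text. A library-free sketch (the sentence and label set are invented for illustration):

```python
sentence = "Tesla opened a new factory in Berlin."

# Each entity is (start, end, label), using Python slice indices.
entities = [
    (0, 5, "ORG"),    # "Tesla"
    (30, 36, "GPE"),  # "Berlin" (geo-political entity)
]

for start, end, label in entities:
    print(f"{sentence[start:end]!r} -> {label}")
```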

Intent Labelling

Tagging user chat queries with the goal behind them. Example: “Book a flight” gets labelled as a travel intent.
Tagging user chat queries with the goal behind them. Example: “Book a flight” gets labelled as a travel intent.

🔍 Quality and Accuracy Metrics

QA Loop

Every annotation project should include a quality assurance (QA) loop. Typically, this means reviewing a random 10% of the labelled data before final delivery.
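Pulling that review sample is a one-liner in most languages. A sketch in Python (the batch size and item IDs are made up):

```python
import random

labelled_items = [f"item_{i:04d}" for i in range(1000)]  # hypothetical batch

# Review a random 10% of the batch before final delivery.
sample_size = max(1, len(labelled_items) // 10)
qa_sample = random.sample(labelled_items, sample_size)

print(len(qa_sample))  # 100 items out of 1000
```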

IoU (Intersection over Union)

A common accuracy score in computer vision: the area where two boxes overlap, divided by the area they cover together. If the human label and the model’s prediction match perfectly, that’s an IoU of 1.0 (100%); no overlap scores 0. Industry benchmarks aim for 95%+.
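The formula is short enough to show in full. A minimal sketch, assuming boxes are given as `[x1, y1, x2, y2]` corner coordinates:

```python
def iou(box_a, box_b):
    """Intersection over Union of two [x1, y1, x2, y2] boxes."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlapping rectangle (width/height clamp to zero if no overlap).
    inter_w = max(0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

print(iou([0, 0, 10, 10], [0, 0, 10, 10]))  # identical boxes -> 1.0
print(iou([0, 0, 10, 10], [5, 0, 15, 10]))  # half-overlapping -> ~0.333
```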

BLEU Score

Measures how closely machine-generated text matches a human-labelled reference. Common in NLP chat and translation tasks.
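The full BLEU formula combines several n-gram precisions with a brevity penalty; a deliberately simplified unigram-only sketch gives the flavour (this is not the complete metric):

```python
from collections import Counter

def unigram_precision(candidate, reference):
    """Fraction of candidate words that also appear in the reference,
    with counts clipped as in BLEU's modified precision."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum(min(count, ref[word]) for word, count in cand.items())
    return overlap / max(1, sum(cand.values()))

# 5 of the 6 candidate words are covered by the reference -> ~0.83
print(unigram_precision("the cat sat on the mat", "the cat is on the mat"))
```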

Inter-Annotator Agreement

How often two human labellers tag the same thing the same way. If your guidelines are clear, agreement should be 95%+.
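The simplest version is raw percent agreement (more robust measures such as Cohen’s kappa also correct for chance agreement). A sketch with invented labels:

```python
def percent_agreement(labels_a, labels_b):
    """Share of items that two annotators labelled identically."""
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return matches / len(labels_a)

annotator_1 = ["car", "tree", "car", "sign", "car"]
annotator_2 = ["car", "tree", "bus", "sign", "car"]

print(percent_agreement(annotator_1, annotator_2))  # 4 of 5 match -> 0.8
```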

🛠️ Tools of the Trade

Labelbox, SuperAnnotate, Prodigy

The most common annotation tools. They speed up labelling with automation, hotkeys, and QA checks.


Want the Full Glossary?

✅ Request the full Annotation Glossary PDF with 50+ terms, formulas, and quality metrics.

📞 Or ask our team to walk you through it in plain English—contact us.


About the Author

Martin English is the Founder of Smart Outsourcing Solution (SOS) and Co-Founder of AiDisco. With over 20 years of outsourcing experience across Southeast Asia, he helps global businesses scale remote teams and Employer of Record (EOR) operations. As an advocate for AIO (AI Outsourcing) and GEO (Global Employment Outsourcing), Martin helps organisations bridge onshore ↔ offshore talent with trust and results.

👉 Connect on LinkedIn
