The Art of Annotation: Lessons Learned From a Picsellia Annotation Campaign

Recently, a part of Picsellia’s team gathered in Spain for our winter off-site (see picture below), and I decided to ruin their day by organizing a 10-hour annotation marathon.

Amazing team about to ascend to the skies in a hot air balloon

If you have been keeping up with our updates, you’re probably aware that we recently released a new feature on the Picsellia platform: Image & Video Annotation Campaigns. In this article, I’ll cover why we ran this annotation campaign, what we learned, and how you can leverage those lessons in your next project.

Why organize an annotation campaign at Picsellia? 

Be the main user of your product 

If there’s one thing we believe in at Picsellia, it’s no-bullshit marketing. And is there a better way to ensure we are building the best tools in the industry than by using them on real-world use cases, identifying improvements, and shipping those improvements to our customers? We don’t think there is, so that’s exactly what we did.

Assessing the benefits

The way companies run annotation projects today is suboptimal: the annotation process is tedious, time-consuming, and hectic. Despite every tooling company claiming that annotation will soon die, we strongly believe otherwise. Much of the pain large enterprises feel around annotation comes from a lack of efficient tooling to handle the process properly, which results in a lot of frustration and drives them toward auto-annotation solutions. We are convinced that Picsellia can improve the efficiency of your company’s annotation process by 50%. By efficiency, we mean the time it takes to reach a 100%-quality dataset.

Setting the stage for our annotation marathon

So where do we start? Fortunately, we had a prospect who wanted a custom demo on a real-world use case. And what is the most common obstacle in real-world computer vision projects? No open datasets.

Rest assured, we don’t annotate datasets for every demo we do. This demo, however, had great potential, so we found 170 images representing the scenes we were looking to analyze.

The instructions for the annotation process were based on a document our prospect sent us, which specified the need to detect two classes and count specific objects at power stations. That was the only direction we gave the team (keep that in mind for later).

To perform the annotation, we were a team of 10 people split into 3 groups: 7 annotators, 2 reviewers, and 1 project manager (of course they assigned me an annotator role…). A big team of annotators can, as I’ll come back to later, create some efficiency challenges.

We started by creating a simple Annotation → Review → Done workflow in Picsellia and uploaded a PDF with simple instructions for the team.
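As an illustration (not Picsellia’s actual API), a three-step workflow like ours can be modeled as a simple state progression, where a reviewer can either approve a task or send it back to annotation:

```python
from enum import Enum

class Status(Enum):
    ANNOTATION = "annotation"
    REVIEW = "review"
    DONE = "done"

# Allowed transitions: a task moves forward on approval,
# or back to annotation when the reviewer raises an issue.
TRANSITIONS = {
    Status.ANNOTATION: {Status.REVIEW},
    Status.REVIEW: {Status.DONE, Status.ANNOTATION},
    Status.DONE: set(),
}

def advance(current: Status, target: Status) -> Status:
    """Move a task to the next status, rejecting invalid jumps."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"Cannot move from {current.name} to {target.name}")
    return target
```

The backward edge from Review to Annotation is what makes iteration possible: rejected tasks re-enter the annotator’s queue instead of silently landing in Done.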

And then the annotation started …! 

Identifying annotation challenges and major bottlenecks

Funnily enough, even the Picsellia team stumbled into the same pitfalls as our customers. Remember the 7-person annotation team I mentioned earlier, and its potential for inefficiency? Well, we blitzed through the first round of annotation and finished pretty quickly – or did we? Nope, not really.

Sure, we managed to slap annotations on all the images, but hey, building a top-notch dataset isn't just about ticking off boxes. It’s about having tight, precise, well-classified objects annotated consistently.  

The first iteration of the annotation took 2 hours, but the entire annotation process took 10. Why was it so inefficient? Let’s break it down and see what went wrong.

Factor 1: Unclear instructions

No one knew what to do – that’s the most straightforward way to say it.

The primary bottleneck was the lack of clear instructions. Effective communication of guidelines is crucial for consistent and accurate annotations. Our experience underscored the importance of detailed, accessible instructions for all team members.

Setting instructions seems pretty simple; however, it’s nearly impossible to anticipate all the edge cases you’ll encounter while annotating an entire dataset.

Instructions have to evolve and be accessible at all times to ensure consistency. Additionally, real-time communication with annotators is vital to minimize friction.

Factor 2: Information management

Managing information and ensuring data quality emerged as significant challenges. The first iteration revealed inconsistencies in annotations, prompting a reassessment of our approach to achieving the desired dataset precision.

Our reviewing team was raising the same issues with every single annotator, which emphasized the need for a central repository of instructions. 

Displaying an instruction PDF that is accessible at any time is a great way to ensure quality.

Factor 3: Project Management

We saw a lot of variability among team members. Some were less efficient than others, some were more precise, and some were better at annotating small objects, so there was optimizing to do.

Sonia was killing it in terms of speed, though with varying quality, and soon had no tasks left, while Henrik struggled to keep the pace yet consistently produced high-quality annotations. Managing tasks became quite a challenge, resulting in inactive team members and wasted time.

Annotation campaigns are a matter of budget and time optimization. You are more than likely paying for the time spent, so obviously you want this time to be efficient!

Thankfully, Picsellia's new annotation campaign feature allowed us to streamline the process! Let's discover how it works.

How Picsellia streamlined the annotation process

Issue tracking

Our reviewing team could tell the annotators what was wrong in real time, which made iterating on annotations both frictionless and efficient. Issue tracking helps pinpoint where a problem is located, saving precious time during review.

Task assignment

When setting up an annotation workflow, you have different ways to hand labeling tasks to team members, and Picsellia manages the split for you in a simple way: just add all your annotators and the platform creates an equal distribution of tasks.
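Picsellia handles the split internally; as a rough sketch of what an equal distribution can look like (function and variable names here are illustrative, not the platform’s API), a round-robin pass over the image list guarantees task counts differ by at most one:

```python
def split_tasks(images: list[str], annotators: list[str]) -> dict[str, list[str]]:
    """Distribute images across annotators round-robin, so each
    annotator receives an equal share (give or take one task)."""
    assignment: dict[str, list[str]] = {name: [] for name in annotators}
    for i, image in enumerate(images):
        assignment[annotators[i % len(annotators)]].append(image)
    return assignment
```

With our numbers, 170 images over 7 annotators, each annotator would get 24 or 25 tasks.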

Project management

An annotation campaign is a team sport, so you need to be sure every stakeholder is playing on the same field. Centralization is key to a frictionless process, and that’s exactly what Picsellia provides when it comes to labeling.

Having a real-time overview of the time spent on tasks, who is generating the most issues, and who is over-performing helps create a culture of success and a great annotation motion.

Key figures

The analytics panel in Picsellia gives you a business overview of your annotation performance, aggregating throughput, efficiency, and quality metrics in one place.
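The platform computes these figures for you; as a hedged illustration of the kind of aggregation involved (the per-task schema below is hypothetical, not Picsellia’s data model), throughput and quality can be derived from simple per-task logs:

```python
def campaign_stats(tasks: list[dict]) -> dict:
    """Aggregate per-task logs into campaign-level metrics.
    Each task dict is assumed to carry the seconds spent, the
    number of objects drawn, and the review issues raised."""
    total_time = sum(t["seconds"] for t in tasks)
    total_objects = sum(t["objects"] for t in tasks)
    total_issues = sum(t["issues"] for t in tasks)
    return {
        "avg_seconds_per_task": total_time / len(tasks),   # efficiency
        "objects_per_hour": total_objects / (total_time / 3600),  # throughput
        "issue_rate": total_issues / len(tasks),           # quality proxy
    }
```

The same three totals are all you need to compare annotators or whole campaigns against each other.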

In conclusion, the Picsellia team spent over 9 cumulative hours annotating 136 images and 2,919 objects, bringing the average time per annotation task to 1m34s.

Start managing your AI data the right way.

Request a demo
