[Week-5 Clean/Messy Rooms]

Didem Yanıktepe
Published in bbm406f18
3 min read · Dec 30, 2018

Team Members: Atakan Erdoğdu, Zekeriya Onur Yakışkan, Damla Ünal, Didem Yanıktepe

We wrote and submitted the progress report last week. This week we have been looking for ways to improve our results: we needed a way to predict both the type of a room and whether it is messy or clean. We realized that we could use multi-label classification with Keras, researched projects that have used this method before, found one similar to ours, and decided to advance our project with this approach.

Multi-label Image Classification

Each instance may be associated with multiple labels. Given a set of instances X = {x1, · · · , xm} and a set of predefined labels L = {l1, · · · , ln}, a multi-label dataset consists of pairs (x1, S1), (x2, S2), · · · where each Si ⊆ L. For example, a film can be labeled {romance, comedy}; in our task, a room can be both a bedroom and dirty.
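As a toy illustration of this setup (the file names are made up), a multi-label dataset can be stored as instances paired with label sets Si ⊆ L, then binarized into 0/1 indicator vectors for training:

```python
# Toy multi-label dataset: each instance x_i carries a label set S_i ⊆ L.
L = ["bedroom", "kitchen", "living_room", "clean", "messy"]

dataset = [
    ("img_001.jpg", {"bedroom", "messy"}),  # a messy bedroom
    ("img_002.jpg", {"kitchen", "clean"}),  # a clean kitchen
]

def binarize(labels, label_space):
    """Turn a label set S_i into a 0/1 indicator vector over L."""
    return [1 if l in labels else 0 for l in label_space]

y = [binarize(s, L) for _, s in dataset]
print(y)  # [[1, 0, 0, 0, 1], [0, 1, 0, 1, 0]]
```

These indicator vectors are what the multi-label classifiers below are trained against.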

Most traditional learning algorithms have been developed for single-label classification problems. Therefore, many approaches in the literature convert the multi-label problem into one or more single-label problems, so that a single-label algorithm can be used.

Methods for Multi-label Classification

Problem Transformation Methods: transform the multi-label problem into single-label problems, so that any off-the-shelf single-label classifier can be used as required.

Algorithm Adaptation Methods: adapt a single-label algorithm to produce multi-label outputs directly, benefiting from the advantages of a specific classifier (e.g., efficiency).

In both cases, we adapt our data to the algorithm.

Binary Relevance trains one binary classifier per label and then merges their outputs. The dataset is trained once for each class: once for the bedroom, then for the dirty room, and so on for all classes. At test time, each classifier decides whether the image belongs to its own label, and combining these independent decisions produces the multi-label output. This method is often preferred because it is easy to implement.
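A minimal sketch of Binary Relevance in plain Python, assuming any base classifier with fit/predict. The MajorityClassifier here is a trivial stand-in we made up for illustration; in practice each slot would hold a real single-label model such as a CNN:

```python
class MajorityClassifier:
    """Toy binary classifier: always predicts the most frequent class it saw."""
    def fit(self, X, y):
        self.most_common = max(set(y), key=y.count)
        return self
    def predict(self, X):
        return [self.most_common for _ in X]

class BinaryRelevance:
    """Train one independent binary classifier per label, merge at predict time."""
    def __init__(self, base_factory):
        self.base_factory = base_factory
    def fit(self, X, Y):
        # Y is a list of 0/1 indicator vectors; fit one model per label column.
        self.models = []
        for j in range(len(Y[0])):
            y_j = [row[j] for row in Y]
            self.models.append(self.base_factory().fit(X, y_j))
        return self
    def predict(self, X):
        # Stack the per-label predictions back into indicator vectors.
        cols = [m.predict(X) for m in self.models]
        return [list(row) for row in zip(*cols)]

# Tiny usage example with made-up feature vectors and two labels:
X = [[0.1], [0.9], [0.5]]
Y = [[1, 0], [1, 1], [1, 0]]
preds = BinaryRelevance(MajorityClassifier).fit(X, Y).predict(X)
```

Note that each label is predicted in isolation; the correlations between labels (bedroom and messy, say) are ignored, which is exactly what Classifier Chains try to fix.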

Chain Rule

Classifier Chains, like Binary Relevance, build L binary problems (one per label), but each classifier also receives the previous classifiers' predictions as extra feature attributes. The idea is inspired by the chain rule of probability. In our case, we want to capture that if a room is a bedroom, it tends to be messy, which is what the Bayes-optimal Probabilistic Classifier Chains model.

Classifier Chains
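The chain idea can be sketched in plain Python. The CopyLastFeature base classifier is a toy stand-in we invented purely to show predictions flowing down the chain, not a real model:

```python
class CopyLastFeature:
    """Toy base classifier: predicts whatever the last feature column says.
    After the first link, that column holds the previous label's prediction,
    so it directly illustrates 'bedroom -> likely messy'."""
    def fit(self, X, y):
        return self
    def predict(self, X):
        return [x[-1] for x in X]

class ClassifierChain:
    """One binary classifier per label; each later link sees earlier labels."""
    def __init__(self, base_factory):
        self.base_factory = base_factory
    def fit(self, X, Y):
        self.models = []
        X_aug = [list(x) for x in X]
        for j in range(len(Y[0])):
            y_j = [row[j] for row in Y]
            self.models.append(self.base_factory().fit(X_aug, y_j))
            # Append the true label as a feature for the next link in the chain.
            X_aug = [x + [row[j]] for x, row in zip(X_aug, Y)]
        return self
    def predict(self, X):
        X_aug = [list(x) for x in X]
        preds = [[] for _ in X]
        for model in self.models:
            p = model.predict(X_aug)
            # Feed this link's predictions forward as features for the next link.
            X_aug = [x + [p_i] for x, p_i in zip(X_aug, p)]
            for row, p_i in zip(preds, p):
                row.append(p_i)
        return preds

# Made-up example: one feature ("looks like a bedroom"), labels [bedroom, messy].
X = [[1], [0]]
Y = [[1, 1], [0, 0]]
chain_preds = ClassifierChain(CopyLastFeature).fit(X, Y).predict(X)
```

Here the second classifier literally reads the first one's bedroom prediction, so "bedroom" pulls "messy" along with it.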

In the last few weeks, we classified the rooms with Keras and shared the results. This week, we conducted research and a number of studies on multi-label classification with Keras, which is the other part of our project.

Multi-label classification with Keras
The problem is that we need to train a classifier to categorize the items into various classes:
1. Room: bedroom, kitchen, living room
2. Clean/Messy
We have trained two separate CNNs, one for each of the two categories, and they work really well.

Multi-label classification with Keras involves two basic changes:
1. Replace the softmax activation at the end of the network with a sigmoid activation
2. Replace the categorical cross-entropy loss function with binary cross-entropy
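These two changes can be illustrated with a small NumPy sketch (the logits and labels below are made up for the example). With sigmoid, each label is scored independently in (0, 1), so unlike softmax, several labels can be active at once:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Average over labels of the per-label binary cross-entropy.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Final-layer logits for one image over
# ["bedroom", "kitchen", "living_room", "clean", "messy"]:
logits = np.array([3.2, -2.1, -1.5, -2.8, 2.4])

probs = sigmoid(logits)   # each label scored independently
present = probs > 0.5     # both "bedroom" and "messy" pass the threshold

y_true = np.array([1, 0, 0, 0, 1])  # ground truth: a messy bedroom
loss = binary_crossentropy(y_true, probs)
```

With softmax the five scores would compete and sum to 1, which would force a single winner; sigmoid plus binary cross-entropy treats the five labels as five independent yes/no questions.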

Then we trained our network as we normally did.

Happy New Year!

This week we focused on more theoretical research; next week we will share the results and stages of our work with Keras.
