Blip debug robot

In the last post I created a screencast of a debug session, showing which annotations are created.

Now I have published the robot so that others can start using it. This lets anyone investigate how the annotations work, and I hope it helps you gain some more knowledge about them.

Simply add the robot with the address BlipDebug@appspot.com to your wave and start editing it. Just be careful not to add the robot to any large waves: it is quite disruptive and should only be used for testing purposes.

The Java code for the robot is in the code.google.com repository.
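If you just want the general idea without checking out the repository, the sketch below shows how such a debug robot could be structured with the v1 com.google.wave.api Java library. The class name BlipDebugServlet and the format of the debug output are my own choices, not necessarily what the repository contains, and like any App Engine robot it would also need a web.xml entry and a capabilities.xml that subscribes it to the blip events.

import com.google.wave.api.AbstractRobotServlet;
import com.google.wave.api.Annotation;
import com.google.wave.api.Event;
import com.google.wave.api.EventType;
import com.google.wave.api.RobotMessageBundle;
import com.google.wave.api.TextView;

public class BlipDebugServlet extends AbstractRobotServlet {

  @Override
  public void processEvents(RobotMessageBundle bundle) {
    for (Event event : bundle.getEvents()) {
      // React whenever a blip changes, so every edit shows its annotations.
      if (event.getType() == EventType.BLIP_SUBMITTED
          || event.getType() == EventType.DOCUMENT_CHANGED) {
        TextView doc = event.getBlip().getDocument();
        StringBuilder dump = new StringBuilder("Annotations:\n");
        for (Annotation annotation : doc.getAnnotations()) {
          dump.append(annotation.getName())
              .append(" = ").append(annotation.getValue())
              .append(" [").append(annotation.getRange().getStart())
              .append(", ").append(annotation.getRange().getEnd())
              .append("]\n");
        }
        // Append the dump to a new blip so the edited blip stays readable.
        bundle.getWavelet().appendBlip().getDocument().append(dump.toString());
      }
    }
  }
}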

The robot is now approved in the sample gallery.


Google Wave annotations

Annotations (com.google.wave.api.Annotation) are a key concept to understand when you are developing robots that need to understand the content of a blip.

I have created a screencast where I show how the annotations change as the blip is edited.

I have found the following types of annotations; a sketch of how a robot could read them follows the list.

  • user/d/key indicates that the user is on the blip and in edit mode.
  • user/e/key identifies where the user's cursor is; only the start ("from") of the range counts.
  • user/r/key identifies the selection the user has created, with start and end. The user will still have his cursor somewhere in the blip.
  • style/fontWeight indicates whether the selection is bold.
  • style/textDecoration can be used to add a line-through.
  • style/color is the color of the selected text.
  • lang identifies the language of a region. There can be multiple different languages in a blip.
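To make the list concrete, here is a small sketch of how a robot could interpret these annotation names when walking over a blip's document. It assumes the v1 TextView/Annotation API; the class and method names (AnnotationInspector, describeAnnotations) are mine, not part of the library.

import com.google.wave.api.Annotation;
import com.google.wave.api.TextView;

public class AnnotationInspector {

  /** Returns a one-line description per annotation on the given document. */
  public static String describeAnnotations(TextView doc) {
    StringBuilder out = new StringBuilder();
    for (Annotation annotation : doc.getAnnotations()) {
      String name = annotation.getName();
      int start = annotation.getRange().getStart();
      int end = annotation.getRange().getEnd();
      if (name.startsWith("user/d/")) {
        out.append("User is editing the blip");
      } else if (name.startsWith("user/e/")) {
        out.append("Cursor position at ").append(start);
      } else if (name.startsWith("user/r/")) {
        out.append("Selection from ").append(start).append(" to ").append(end);
      } else if (name.startsWith("style/")) {
        out.append("Style ").append(name).append(" = ").append(annotation.getValue());
      } else if (name.equals("lang")) {
        out.append("Language ").append(annotation.getValue());
      } else {
        out.append("Other annotation ").append(name);
      }
      out.append(" [").append(start).append(", ").append(end).append("]\n");
    }
    return out.toString();
  }
}

You would call this from processEvents with event.getBlip().getDocument() and, for example, append the result to a debug blip.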

There are probably a number of other style annotations, which you will have to find yourself.
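One way to find them is to set style annotations from a robot and watch how the client renders them, or run the debug robot and style text by hand in the editor. The sketch below assumes the v1 TextView.setAnnotation(Range, String, String) call and a Range(start, end) constructor; check both against the library version you use, and the color value is only my guess at a CSS-style format.

import com.google.wave.api.Range;
import com.google.wave.api.TextView;

public class StyleExperiment {

  /** Marks the first few characters of the document bold, struck through and red. */
  public static void markUp(TextView doc) {
    int end = Math.min(10, doc.getText().length());
    Range range = new Range(0, end);
    doc.setAnnotation(range, "style/fontWeight", "bold");
    doc.setAnnotation(range, "style/textDecoration", "line-through");
    doc.setAnnotation(range, "style/color", "rgb(255, 0, 0)"); // assumed value format
  }
}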

The key is probably a hash of the user's address.
