Google Wave annotations

Annotations are a key concept to understand when you are developing robots, since they are how a robot can understand the content of a wave.

I have created a screencast where I show how the annotations change as a result of editing.

I have found the following types of annotations.

  • user/d/key indicates that the user is on the blip and is in edit mode
  • user/e/key indicates where the user's cursor is; only the "from" part of the selection counts.
  • user/r/key indicates the selection the user has created, with a start and an end. The user will still have his cursor at a place in the blip.
  • style/fontWeight indicates whether the selection is bold
  • style/textDecoration can be used to add a line-through
  • style/color is the color of the selected text
  • lang identifies the language of a region. There can be multiple different languages in a blip.
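
Since the annotation names above follow a path-like convention, a robot can inspect them generically. The sketch below is plain Java with no Wave dependency; the class name and sample values are my own for illustration.

```java
public class AnnotationName {
    // Splits an annotation name like "user/d/abc123" into its parts:
    // a category ("user" or "style"), an optional subtype ("d", "e" or "r"),
    // and the key. Style annotations only have two parts.
    static String[] parts(String name) {
        return name.split("/", 3);
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(parts("user/d/abc123")));
        System.out.println(java.util.Arrays.toString(parts("style/fontWeight")));
    }
}
```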

There are probably a number of other style markings, which you will have to find yourself.

The key is probably a hash of the user's address.
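
If the key really is a hash of the participant's address, it could be reproduced along these lines. Note this is pure speculation: which hash function (if any) Wave actually uses is unknown, and MD5 here is only an assumption for illustration.

```java
import java.math.BigInteger;
import java.security.MessageDigest;

public class UserKey {
    // Hypothetical: hash a participant address the way the key in
    // user/d/<key> might be derived. MD5 is an assumed choice, not documented.
    static String keyFor(String address) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(address.getBytes("UTF-8"));
            return new BigInteger(1, digest).toString(16);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(keyFor("someone@googlewave.com"));
    }
}
```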


Designing robots

I was just trying to design a robot that I would like to show in a demo. I needed some way to describe how the robot would interact with the user.

As so many times before, I found the whiteboard very useful. With the whiteboard it is possible to draw with a different color for each participant. Using this method it is possible to sketch the resulting wave, which gives an impression of whether the content is sufficient.

I would expect that use cases could also be an option, but they would focus more on what the participants interact with, not on how the result looks. Use case design is probably still a good idea to make sure that all interactions are covered.

Designing such an interacting robot might involve many participants. It would be ideal if the users could design the resulting wave by using Google Wave itself as the collaborative tool. They should probably create one wavelet to collaborate on, and one where they could sketch the resulting wave.

Do you know of any modeling tools or methods which could prove useful for designing robots?


Gadget and robot interaction

In some instances you want robots and gadgets to interact. A scenario where this could make sense is when the user wants information about a site. The page shown could be the sales report of a customer, or some other information of interest to the participants.

This tutorial is based on Creating A Simple Inline Gadget To Show External Web Applications, which shows how to inline a page into a wave using a gadget. There is a robot which creates a form for entering the URL of the page. When the user presses submit, the robot adds a gadget which inlines the specified URL.

First, the robot which inserts the gadget:

import com.google.wave.api.*;

public class EmbedUrlRobotServlet extends AbstractRobotServlet {
    public static final String URLFIELD = "URL_FIELD";

    @Override
    public void processEvents(RobotMessageBundle bundle) {
        Wavelet wavelet = bundle.getWavelet();
        if (bundle.wasSelfAdded()) {
            // Build the input form the first time the robot joins the wave
            Blip blip = wavelet.appendBlip();
            TextView textView = blip.getDocument();
            textView.appendMarkup("<p><b>Inline the url</b></p>\n");
            FormView form = textView.getFormView();
            form.append(new FormElement(ElementType.INPUT, URLFIELD, "http://"));
            form.append(new FormElement(ElementType.BUTTON, "submit", "INSERT"));
        }
        for (Event e : bundle.getEvents()) {
            if (e.getType() == EventType.FORM_BUTTON_CLICKED) {
                // Read the submitted URL and insert the gadget that inlines it
                Blip blip = e.getBlip();
                FormView form = blip.getDocument().getFormView();
                FormElement urlElement = form.getFormElement(URLFIELD);
                GadgetView gadgetView = blip.getDocument().getGadgetView();
                gadgetView.append(new Gadget("" + urlElement.getValue()));
            }
        }
    }
}


The robot listens for two events. The first is SELF_ADDED, which creates the URL form with a text box and a submit button. When the button is pressed, a gadget is inserted into the blip, with the URL of the gadget plus the target URL as a query parameter. Something should probably be done to ensure that the submitted URL is valid and can be sent as a query parameter: URL encoding should probably be performed, or the data could be saved in the datastore instead.
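
The URL-encoding step mentioned above could look like the following sketch, using the standard java.net.URLEncoder. The gadget base URL is a made-up placeholder, not the real servlet address.

```java
import java.net.URLEncoder;

public class GadgetUrl {
    // Hypothetical gadget spec location; replace with the real servlet URL.
    static final String GADGET_BASE = "http://example.appspot.com/gadget?url=";

    // URL-encode the target so characters like ':', '/' and '?' survive
    // the trip as a query parameter.
    static String gadgetUrlFor(String targetUrl) {
        try {
            return GADGET_BASE + URLEncoder.encode(targetUrl, "UTF-8");
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(gadgetUrlFor("http://example.com/report?id=42"));
    }
}
```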

    public void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("text/xml");
        PrintWriter out = resp.getWriter();
        out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
          + "<Module>"
          + "<ModulePrefs title=\"inline_external_page\" height=\"400\" width=\"800\"/>"
          + "<Content type=\"html\">"
          + "<![CDATA[ "
          + " <div id=\"main\"><!-- Main container for the iframe -->"
          + " <iframe name=\"check\" id=\"check\" height=\"100%\" src=\"" + (String) req.getParameter("url") + "\" width=\"100%\" frameborder=\"0\">"
          + "<p>Your browser does not support iframes.</p>"
          + "</iframe> <!-- this is where the contents of the link are displayed -->"
          + "</div>"
          + "]]></Content></Module>");
    }

The gadget code is the same as in the original blog post; it has just been placed inside a servlet to make it possible to pass parameters in. The servlet simply inserts the query parameter into the gadget XML.
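
The XML string the servlet emits can be built separately from the servlet plumbing, which makes it easy to sanity-check. A minimal sketch, with the title and sizes mirroring the servlet code above (the class and method names are my own):

```java
public class GadgetXml {
    // Builds the gadget spec around the requested page URL.
    static String build(String url) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
            + "<Module>"
            + "<ModulePrefs title=\"inline_external_page\" height=\"400\" width=\"800\"/>"
            + "<Content type=\"html\"><![CDATA["
            + "<iframe src=\"" + url + "\" width=\"100%\" height=\"100%\" frameborder=\"0\"></iframe>"
            + "]]></Content></Module>";
    }

    public static void main(String[] args) {
        System.out.println(build("http://example.com/"));
    }
}
```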

The code can be found at


Architecture for Google Wave applications.

I’m designing an application for Google Wave which interacts with the user. The following are my considerations on how the architecture of the application would work best.

The application I’m planning should consist of a robot which interacts with the user. The other part of the application should be a web application where users can configure how the robot should interact and view status from the waves. There should also be some analytics capabilities. The backend application needs to be somewhat complex to fulfill these requirements.

The most convenient option would be to use App Engine, since it is currently the platform most closely integrated with Google Wave.


Coming changes to Google Wave API

Tommy Pedersen shared the link to what is coming in the API.

It looks like the developer preview will stop on September 30th, and then we must hope we can get an account on the production system, so we will not be left without Wave access for long.

What is interesting is that Douwe Osinga from Google is sharing what they believe they can achieve before the end of the developer preview.

The things I think are the most exciting news are the following:

  • Robots can use other addresses. That could mean you could have several robots in the same application, making the robots tailored to the user. We will also see robots existing on our own servers and not only on App Engine.
  • Better integration of robots into Wave, so a robot can perform tasks even when the wave is not being actively used by a user.
  • Better gadget support for Google Web Toolkit, which will make it easier for everyone (who is not a JavaScript master) to make gadgets. This will probably lower the barrier to creating gadgets.
  • OpenSocial integration for both robots and gadgets. That is something I’ll need to look more into.
  • Inserting simple HTML into waves. This is not possible now; with it, robots could easily produce better-looking content. Then we would need to consider using CSS on the document.
  • Open-sourcing the robot API. It does not matter much now, but it will probably make adoption easier.

I’m impressed with the plan Google has created for the development of the Google Wave API. I hope they have already made some of the improvements; otherwise they will have a hard time implementing them all.
