It is really nice to see that a lot of new features have been delivered. It has been a long time since the last update to the API, so it looked like development and patching had stopped. But this new release solves many of the problems that have existed. I have written my comments on each of the new features.
Active API: In v2, robots can now push information into waves (without having to wait to respond to a user action). This replaces the need for our deprecated cron API, as now you can update a wave when the weather changes or the stock price falls below some threshold. You can learn more in the Active API docs.
This feature is really useful in enterprise contexts, where you need to be able to update a wave based on external events. With it, a robot can update a wave whenever something happens that you need to react to. From an SAP perspective, this resembles the Universal Worklist, which collects all the events you have to process; with Wave you can achieve some of the same ideas.
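The stock-price example from the announcement can be sketched in plain Python. This is only the event-driven pattern, not the real Active API: the wave id, the `check_price` helper, and the outbox are hypothetical stand-ins.

```python
# Sketch of the pattern the Active API enables: react to an external
# signal (here a price feed) and queue an update for a wave, instead of
# waiting for a user action. All names below are illustrative.

WAVE_ID = "example.com!w+stock-alerts"  # hypothetical wave identifier
THRESHOLD = 100.0

def check_price(price, outbox):
    """Queue a wave update when the price falls below the threshold."""
    if price < THRESHOLD:
        outbox.append((WAVE_ID, f"Alert: price dropped to {price:.2f}"))

outbox = []
for price in [120.0, 105.5, 98.7]:
    check_price(price, outbox)

# Only the last tick crossed the threshold, so one update is queued.
print(outbox)
```

In the real API the queued update would of course be sent to the Wave server; the point here is only that the robot initiates the change.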
Context: Robots can now more precisely specify how much information they want to get back from a particular event. If only the contents of the affected blip needs updating and you want to reduce your robot’s bandwidth, then you can specify the new ‘SELF’ context. On the flip side, if you do need all the information in the wavelet, you can specify the new ‘ALL’ context. You can learn more in the Context docs.
This enhancement makes it much easier for robots to interact with a wave, because they get much better control over which data is returned in each case. Development becomes simpler because you receive exactly the data you need and nothing more.
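To illustrate what SELF versus ALL means for the payload a robot receives, here is a small self-contained sketch. The dict layout is made up for illustration and is not the actual Wave wire format.

```python
# Illustrative wavelet with three blips; a real wavelet payload carries
# much more metadata than this.
wavelet = {
    "blips": {
        "b+1": {"content": "Hello"},
        "b+2": {"content": "World"},
        "b+3": {"content": "!"},
    }
}

def build_payload(wavelet, event_blip_id, context):
    """Return only the affected blip for SELF, everything for ALL."""
    if context == "SELF":
        return {event_blip_id: wavelet["blips"][event_blip_id]}
    if context == "ALL":
        return dict(wavelet["blips"])
    raise ValueError(f"unknown context: {context}")

# An event on blip b+2 with SELF context carries one blip; ALL carries all.
print(len(build_payload(wavelet, "b+2", "SELF")))  # 1
print(len(build_payload(wavelet, "b+2", "ALL")))   # 3
```

With many blips in a wavelet, the bandwidth difference between the two contexts adds up quickly.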
Filtering: In a similar way, with this new API, the robot can specify what events it needs to respond to, conserving valuable bandwidth — and ignore all those that don’t apply. You can learn more in the Filtering Events docs.
With this feature your robot only receives the events it needs. Simply supply a regular expression, and only the events matching the expression are sent. This is quite useful if, for example, you only want events when a large number of capital letters is written.
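The capital-letters example can be shown with a self-contained Python sketch. The pattern and the `wants_event` helper are my own illustration, not the API's actual filter syntax.

```python
import re

# Filter sketch: only react when a blip contains a run of five or more
# capital letters, i.e. someone is "shouting".
SHOUTING = re.compile(r"[A-Z]{5,}")

def wants_event(blip_text):
    """True if the filter matches and the robot should receive the event."""
    return bool(SHOUTING.search(blip_text))

print(wants_event("WARNING: disk almost full"))  # True
print(wants_event("all quiet here"))             # False
```

In the API the expression is registered once per event type, so non-matching events never reach the robot at all.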
Error reporting: Robots are now able to register to receive errors about failed operations, such as insertion on non-existent ranges. You can learn more in the Error Reporting docs.
This feature makes it possible to build better and more stable robots. Using it requires that you work out how the robot should react when an error occurs.
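As a sketch of the registration idea, here is a minimal error-handler registry in plain Python. The error name and the decorator are invented for illustration; the real error types and registration calls are in the linked docs.

```python
# Minimal registry mapping error types to handler functions.
handlers = {}

def on_error(error_type):
    """Decorator that registers a handler for a given error type."""
    def register(fn):
        handlers[error_type] = fn
        return fn
    return register

@on_error("INVALID_RANGE")  # illustrative name for "insertion on a non-existent range"
def handle_invalid_range(detail):
    return f"skipped bad insertion: {detail}"

def dispatch(error_type, detail):
    handler = handlers.get(error_type)
    return handler(detail) if handler else f"unhandled: {error_type}"

print(dispatch("INVALID_RANGE", "range 10-20 does not exist"))
```

The stability gain comes from the dispatch step: a failed operation turns into a handled case instead of a silently broken robot.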
Proxying-For: Robots can now convey to Google Wave that their actions are actually on behalf of a different user, via the proxyingFor field. For robots like the Buggy sample, which connects with the Google Code issue tracker, this means that the wave can be updated with attribution to users on non-wave systems. You can learn more in the Proxying-For docs.
I can see that this feature will be really interesting when you want a single place to monitor all of your solutions. It will also make collaboration with external partners much easier.
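A rough sketch of where such a field could sit in an operation payload: only the `proxyingFor` field name comes from the announcement, while the method name, addresses, and surrounding structure are invented for illustration.

```python
import json

def make_op(robot_address, external_user, text):
    """Build an illustrative operation attributed to an external user."""
    return {
        "method": "blip.append",       # illustrative method name
        "params": {"content": text},
        "proxyingFor": external_user,  # attribution for the real author
        "address": robot_address,
    }

# Hypothetical addresses; a tracker robot posts on behalf of a reporter
# who has no wave account.
op = make_op("tracker-robot@example.appspot.com",
             "issue-reporter-42", "Status: Fixed")
print(json.dumps(op, indent=2))
```

The effect is that the wave shows the external user as the author, even though the robot performed the edit.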
Taken together, it looks like robots have become much more decentralized and act more like regular clients. In principle, the new features allow you to build a client for Wave using the robot API and some proxy functionality.
There is still room for improvement in the API. These are the two things I currently see missing:
Better form handling and the ability to apply styles to forms, so they don't look so '90s.
Integration with gadgets, so robots can update gadgets with new information. (UPDATE: this is also possible.) With the Active API this becomes even more interesting: it is now possible to update the statistics gadget when you get a new entry.
Pamela Fox has created a great presentation describing how the features connect.
I tested quite a few different Google Wave gadgets in my Christmas calendar last month. I tried 40 different apps with a large variety in their complexity and usability. Some of them did not have any useful functions or did not work at all.
I really liked many of the ideas behind the gadgets, and some were really useful. It was fun to see how people wanted to impress with new ideas.
The quality of the gadgets and robots was lousy, mine included. I believe many of my own applications must be improved. But the limited options in the robot API hinder making layouts shine, and the lack of ways for gadgets and robots to work together is another obstacle. If there were a better way to make gadgets and robots cooperate, the layout issues could be solved.
I would say that most of the gadgets, except maybe the simpler voting gadgets, require extra work before they can be used by a wider audience. As for the robots where you need to write commands with # or ! as the only thing in a blip: I doubt my mother would pick that up easily, and she would have to for non-tech-savvy users to get on board.
One key to getting the robots to function better is to pay the developers. I guess most of the developers of the robots and gadgets I have seen just tried the protocol out to see if they could make something useful. And they can. But to feel wave-y, the gadgets need to look better.
To make gadgets and robots complete, we need better APIs and, more importantly, the ability to charge for apps. It may currently be possible to charge for the use of a robot, and with some development also a gadget, but it is a pain and demands too much of the user.
I'm therefore looking forward to a Wave app store being created. It will be interesting to see whether it only supports one-time payments, or also monthly payments and perhaps even corporate multi-seat licenses. A Wave app store would mean that more developers start developing programs for Google Wave, making the platform more attractive.
I'm currently only making my apps available to my mailing list subscribers. We have a new Scrum gadget that just needs to be published. So sign up for the newsletter and try it out.
When I first got access to Wave, I wrote a blog post about how to create a Wave robot using Grails. That post can be read at Graversen.org.
Now David Trattnig has created a Grails plugin for developing Wave applications. I had to try it out and see how it works.
The first project I tried the plugin on was a failure. I got some errors, which I don't think the Wave plugin caused; that was just me fumbling around. So I created a new project to see if it would go better, and the result was much better on the second try: I got a functional robot working without much code.
When you run the command:
grails install-plugin wave
you get all the objects you need to create a robot, which is basically just one file, a service. You can then start coding in this file. The plugin automatically handles the creation of the URL mappings needed for the robot to work.
For a Wave robot you normally have to create the capabilities.xml file as a separate file outside the code. This is irritating to work with, and you have to remember to keep the file updated. With this plugin, all you need to do is change the beginning of the robot file to declare its capabilities there.
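For reference, this is roughly what a hand-maintained capabilities.xml looked like. The namespace and event names are from my memory of the v1 format, so check the official docs before copying anything:

```
<?xml version="1.0" encoding="utf-8"?>
<w:robot xmlns:w="http://wave.google.com/extensions/robots/1.0">
  <w:capabilities>
    <w:capability name="blip_submitted" content="true" />
    <w:capability name="wavelet_participants_changed" content="true" />
  </w:capabilities>
  <w:version>1</w:version>
</w:robot>
```

Every time the robot starts handling a new event type, this file has to change in step with the code, which is exactly the bookkeeping the plugin removes.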
Extensions are an easy way to distribute your Wave applications. Once a user has installed an extension, they can add robots or gadgets just by pressing a button. This is much easier than remembering the URL of the gadget or robot. For robots it is pretty simple, since you can add the robot to your wave directly.
What is even more exciting about extensions is that they can enable an app store for Wave applications. It would be very interesting if you could sell and buy them as easily as on the iTunes App Store. If it were possible to just make applications and have somebody else sell them, more developers would be interested in developing applications. That way you would not need to finance your gadgets with AdWords.
I have been looking at creating my own extensions and installing them. In the sandbox there was a debug menu from which you could install extensions. In the preview system you need to find the gallery wave (search for: Extension Gallery). From this wave it is possible to install the gadgets Google has accepted by clicking a button.
This video shows how the gadget installer and uninstaller work.
If you are a developer and want to test your own extensions, you can do so by installing the "Extension Installer". Once it is installed, you can install new gadgets from a gadget URL. This is the way to test your application before you send it to the app store. I don't know whether you can share the applications with others, or whether they need to install them via the developer gadget.
In the last post I created a screencast of a debug session, which showed which annotations are created.
Now I have published the robot so that others can start using it. This will allow all users to investigate how annotations work, and I hope it helps you gain some more knowledge about them.
Simply add the robot BlipDebug@appspot.com to your wave and start modifying your content. Just be careful not to add the robot to any large waves; it is quite disruptive and should only be used for testing purposes.