It is really nice to see that so many new features have been delivered. It has been a long time since the last update to the API, so it looked like development and patching had stopped. But this new release solves many of the problems that have existed. I have written my comments on each of the functions below.
Active API: In v2, robots can now push information into waves (without having to wait to respond to a user action). This replaces the need for our deprecated cron API, as now you can update a wave when the weather changes or the stock price falls below some threshold. You can learn more in the Active API docs.
This feature is really useful in enterprise contexts, where you need to be able to update the wave based on external events. With it you can update a wave whenever something happens that you need to react to. From the SAP perspective there is the Universal Worklist, which collects all the events you have to process; with the Active API you can achieve some of the same ideas in Wave.
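As a sketch of the idea behind an active robot (the `post_to_wave` callback is a hypothetical placeholder for the actual Active API call, not part of the Wave API), a robot might poll an external source and push an update into a wave only when a threshold is crossed:

```python
def should_notify(price, threshold, already_notified):
    """Decide whether the robot should push an update into the wave.

    Notify only when the price falls below the threshold and we have
    not already pushed a notification for this crossing.
    """
    return price < threshold and not already_notified

def check_and_notify(price, threshold, state, post_to_wave):
    """post_to_wave stands in for the real Active API call."""
    if should_notify(price, threshold, state.get("notified", False)):
        post_to_wave("Price alert: %.2f is below %.2f" % (price, threshold))
        state["notified"] = True
    elif price >= threshold:
        # Reset once the price recovers, so the next drop notifies again.
        state["notified"] = False
```

The point of the `notified` flag is that an active robot, unlike a request-driven one, decides for itself when to write, so it has to avoid flooding the wave with duplicate updates.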
Context: Robots can now more precisely specify how much information they want to get back from a particular event. If only the contents of the affected blip needs updating and you want to reduce your robot’s bandwidth, then you can specify the new ‘SELF’ context. On the flip side, if you do need all the information in the wavelet, you can specify the new ‘ALL’ context. You can learn more in the Context docs.
With this enhancement it will be much easier for robots to interact with the wave, because they get much better control over which data is needed in each instance. This will make development easier, because you receive exactly the data you need and nothing more.
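To illustrate the difference between the two contexts (this is not the exact wire format, which is defined in the Context docs, just a made-up payload showing what the robot receives in each case):

```python
# With context ALL the event carries the whole wavelet; with SELF it
# carries only the blip the event concerns, saving bandwidth.
event_all = {
    "type": "BLIP_SUBMITTED",
    "context": "ALL",
    "wavelet": {
        "waveId": "example.com!w+abc",
        "blips": {
            "b+1": {"content": "First blip"},
            "b+2": {"content": "Second blip (the one that changed)"},
            "b+3": {"content": "Third blip"},
        },
    },
}

def to_self_context(event):
    """Strip the payload down to the affected blip only."""
    affected = "b+2"  # in a real event this id comes from the event itself
    slim = dict(event, context="SELF")
    slim["wavelet"] = {
        "waveId": event["wavelet"]["waveId"],
        "blips": {affected: event["wavelet"]["blips"][affected]},
    }
    return slim
```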
Filtering: In a similar way, with this new API, the robot can specify what events it needs to respond to, conserving valuable bandwidth — and ignore all those that don’t apply. You can learn more in the Filtering Events docs.
With this function your robots will only receive the events they need. Simply create a regular expression, and only the entries matching the expression are sent. This is quite useful if, for example, you only want events when a large number of capital letters is written.
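The capital-letters case boils down to a single regular expression (how the filter is registered with the robot is described in the Filtering Events docs; this just shows the expression itself):

```python
import re

# Match blip text containing a run of five or more capital letters,
# e.g. shouting in ALL CAPS. Only such events would reach the robot.
CAPS_FILTER = re.compile(r"[A-Z]{5,}")

def passes_filter(text):
    return CAPS_FILTER.search(text) is not None
```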
Error reporting: Robots are now able to register to receive errors about failed operations, such as insertion on non-existent ranges. You can learn more in the Error Reporting docs.
This function will make it possible to build better and more stable robots. Using it requires that you work out how the robot should react when an error occurs.
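A sketch of what such a reaction might look like (the handler name, the error dictionary, and the `INVALID_RANGE` type are all hypothetical illustrations; the real registration mechanism is in the Error Reporting docs):

```python
def on_operation_error(error, log):
    """Decide per error type how the robot should react.

    Here a failed insertion is logged and skipped rather than
    crashing the robot; anything unexpected is logged verbatim.
    """
    if error.get("type") == "INVALID_RANGE":
        log("Skipped an insertion on a non-existent range: %s"
            % error.get("message", ""))
    else:
        log("Unhandled operation error: %r" % (error,))
```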
Proxying-For: Robots can now convey to Google Wave that their actions are actually on behalf of a different user, via the proxyingFor field. For robots like the Buggy sample, which connects with the Google Code issue tracker, this means that the wave can be updated with attribution to users on non-wave systems. You can learn more in the Proxying-For docs.
I can see that this function will be really interesting when working with one place to monitor all of your solutions. It will also make collaboration with external partners much easier.
With all of this information it looks like robots have become much more decentralized and act as regular clients. So in principle the new features will allow you to create a client for Wave using the robot API and some proxy functionality.
There is still room for improvement in the API. These are the two things that I see as currently missing.
Better form handling and the ability to apply styles to forms, so they don’t look so ’90s.
UPDATE: This is also possible. Integration with gadgets, so the robots can update the gadgets with new information. With the Active API this is even more interesting: it is now possible to update the statistics gadget when you get a new matching expression.
Pamela Fox has created a great presentation describing how the features connect.
I have updated the WordPress robot (WP-BOT@appspot.com) so it can publish to your own blog. This feature has been requested by many users, so I thought it was time to give it a try. Usage of the robot is fairly simple. The original version of the robot was described in this post, where I just proved the concept was possible.
Simply add the robot to a wave. If you don’t have any credentials connected with your Wave user, you will get a link where you can enter information about your WordPress installation. The next time you add the robot to a wave, the wave will be published to your blog.
To use the plugin, simply add Wavr to your blog, and in Settings -> Writing make sure XML-RPC is activated.
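Behind the scenes the robot talks to the blog over WordPress’s XML-RPC interface, which is what the setting above enables. A minimal sketch in Python of such a call (the URL, credentials, and the `publish` wrapper are placeholders; `metaWeblog.newPost` is the standard method WordPress exposes once XML-RPC is on):

```python
import xmlrpc.client

def build_post(title, content):
    """Build the struct that metaWeblog.newPost expects for a new post."""
    return {"title": title, "description": content}

def publish(blog_url, user, password, title, content):
    """Publish a wave's content as a new post; returns the new post id.

    blog_url is the blog's XML-RPC endpoint,
    e.g. http://example.com/xmlrpc.php
    """
    server = xmlrpc.client.ServerProxy(blog_url)
    # First argument is the blog id ("1" for a single-blog install),
    # last argument True means publish immediately rather than draft.
    return server.metaWeblog.newPost("1", user, password,
                                     build_post(title, content), True)
```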
The usage is also shown in this video.
Do you have any suggestions on how I can make the service even better?
Extensions are an easy way to distribute your Wave applications. Once a user has installed an extension, they can add robots or gadgets just by pressing a button. This is much easier than remembering the URL of the gadget or robot. For robots it is pretty simple, since you can add the robot to your wave.
What is even more exciting about extensions is that they could allow for an app store for Wave applications. It would be very interesting if you could sell and buy as easily as you can from the iTunes App Store. If it were possible to just make applications and have somebody else sell them, more developers would be interested in developing applications. That way you don’t need to finance your gadgets with AdWords.
I have been looking at creating my own extensions and installing them. In the sandbox there was a debug menu where you could install the extensions from. In the preview system you need to find the gallery wave (search for: Extension Gallery). From this wave it is possible to install the gadgets Google has accepted by clicking a button.
This video shows how the gadget installer and uninstaller work.
If you are a developer and want to test your own extensions, you can do so by installing the “Extension Installer”. When this is installed, you can install new gadgets from a gadget URL. This is the way you can test your application before you send it to the app store. I don’t know whether you can share the applications with others, or whether they need to install via the developer gadget.
Today Google posted the new design principles for developing applications for Wave. This list of principles is meant to raise the bar for applications on Wave. It requires developers to think usability into their extensions. The concept is really nice: think about your wave development and create better applications.
The concept “wave-y” is introduced to help developers focus on tools that target collaborative, intuitive and real-time components. Developing applications that support these properties can prove difficult, because it is a new paradigm for development.
One of the areas the guide touches on is writing commands to robots. Robots should be able to detect when they are needed. This definitely poses a challenge: creating robots that listen to the data and work with it. An enterprise robot that does this is the one DJ Adams has created. It listens for mentions of a transport name using a regular expression, and if it finds a match it reacts. With the right text tools it would even be possible to extract some information about the current customer involved.
This text analysis will probably require some better functions for the semantic web.
According to the document it should be possible to subscribe to events depending on what is written in the wave. So if the user writes “address”, the address robot is added to the wave to process the address.
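The trigger for such a subscription again comes down to a pattern match on blip text (the robot name and the wiring that actually adds it to the wave are hypothetical; only the matching logic is shown):

```python
import re

# Word-boundary match so "dressed" or "addressing" does not trigger it.
ADDRESS_TRIGGER = re.compile(r"\baddress\b", re.IGNORECASE)

def should_add_address_robot(blip_text):
    """Would the hypothetical address robot be pulled into the wave?"""
    return ADDRESS_TRIGGER.search(blip_text) is not None
```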
The extensions (gadgets or robots) should be very intuitive. It should be possible for first-time users to use them without pulling out an instruction manual or a wiki. It might be difficult to make an easy-to-use enterprise modeling tool like ARIS from IDS Scheer (if it were a gadget) that could be used without a manual or a course.
The document also promotes the use of the extension installer. I have not used it for any of my robots, but it could prove an interesting task. It would be much easier to use a robot if it were just one click in a menu.
The best thing about the document is that it says extensions have to be fun or useful.
Debugging robots made for Google Wave can be a little difficult, because they can only be tested on the server and it can be hard to see what data is being sent.
A way I have found quite useful is to use the App Engine log. On the log tab, select “Requests only” and you can see the data being sent to the robot. This is the data the robot can see; if it is not here, then the robot API is not receiving the data.
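The request body shown in the log is a JSON blob on a single line, which is hard to read. A small helper using only the standard library re-indents it (the sample payload is made up for illustration, not an exact Wave request):

```python
import json

def pretty(raw_json):
    """Re-indent a one-line JSON request body copied from the log."""
    return json.dumps(json.loads(raw_json), indent=2, sort_keys=True)

raw = '{"events":[{"type":"BLIP_SUBMITTED","modifiedBy":"user@example.com"}]}'
print(pretty(raw))
```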
I hope this helps you debug and get a better understanding of your robot.
Update 19 Aug 2009: If you want to format the JSON, try the JSON formatter.