Completing the plugin – integrating it with Zope

The next tutorial on KSS is up. This time you will learn how to complete the plugin by writing the server side components. In the end you will have a working plugin which you can call from the server just like any other KSS command.

Writing a simple KSS plugin

Developing with KSS should save you from ever having to write JavaScript. That premise, of course, only holds true if the framework is complete enough to do everything you want. Fortunately, writing a plugin for KSS is not that difficult. To make it even easier, I wrote a small tutorial on how to create a KSS plugin, complete with all the related Zope bootstrapping.

Keep watching kssproject.org or this blog for another tutorial on how to create server side code to communicate with this plugin.

Doctest syntax file for Vim

To make reading (and writing) doctests a bit more pleasant in Vim I wrote a small syntax file. You can download it from the Vim scripts section. The screenshot shows it in action on a Zope 3 test. [Screenshot of doctest syntax]

Backgrounds for all

We at Pareto just improved our website. Aside from minor style changes, a new Plone version, etc., there is also something which may be of general interest to the Plone community: wallpapers. For the recently held Baarn sprint I convinced Pareto and a colleague of mine to create T-shirts for the attendees. That design has been used as the base for the freely available wallpapers.

So if you want some Plone on your desktop you can now visit the Pareto website. If your favorite resolution is not available drop me a note so we can include it.

PUN meeting at a farm

The trip to the meeting was almost surreal. Instead of driving there, it felt like a trip through the heavens. Outside, pieces of clouds flew by, while inside the car music set the mood.

Unlike previous meetings, this one was held at a farm near Utrecht. I drove there from work with Jan (a colleague of mine). He drove ahead and I followed behind him. Because of the dense mist we couldn’t see more than a few meters ahead. After a few detours we made it to the meeting location.

Stani and his girlfriend really put in a lot of effort to make people feel at home. They even had food prepared to nourish the hungry.

The meeting was well attended, and after a while a large group (around thirty people) gathered for the first talk. Martijn Faassen introduced us to Grok. This little caveman seems to introduce a new flavour to Zope 3 development, and from what I have seen it might be the yummiest to date. I think Grok is best explained with an analogy to Apple: Grok is to Zope 3 what the Mac is to Unix. Just like OS X, you get a nice interface (no more ZCML and other low-level plumbing). But the best part is that it still allows you to code against the low-level framework whenever needed.

Grok is a huge step forward and one that will attract a lot of new people to Zope.  Just like Apple made Unix accessible for the masses this system enables people from all scripting or programming backgrounds to quickly become productive with Zope.

After a short break we continued with a talk on test-driven development. Doing such a talk always gets some crowd participation (solicited or not) on which framework to use, etc. I think Frank Niessink did a good job of explaining the underlying principles. It was also nice to hear his experiences and point of view.

In line with Frank’s talk, the next presentation was on AOP with a focus on (unit) testing. Remco Wendt told us about his experience with testing complex systems. He showed how to use AOP to write tests against a remote service by replacing the implementation at runtime. It won’t replace my stubbing/mocking habits, but it’s always nice to rethink your choices.

To wrap up the evening, Stani gave a presentation on the Copacabana Cybercafé project. This is one of the things I like best about being in the Python community. Although it may seem (at least to me) that everyone is developing web apps, this is certainly not the case. Stani used Python to create a cyber café in which reality is subtly though drastically changed. Basically, he wrote a proxy which substitutes words on a web page with different words, for example changing “terrorist” to “martyr”. He also showed us some more examples.

It was a great evening and I want to thank everyone involved for making it happen. Etienne was definitely right that too much time had gone by without a meeting.

Crossing Alaska for KiKa

Cancer has devastating effects on people. Children are not spared from its horrors. Fortunately, organisations like KiKa exist which try to make a difference. To raise money for KiKa, a team is going to cross Alaska. They maintain their website using our favorite CMS: Plone. It is nice to work at a company which helps make nice sites like this come into existence. Check out the site: Crossing Alaska for KiKa.

Publishing XML with Plone

Integrating Plone with external systems always seems to cost more effort than it really should. In the past I have integrated Plone with ERP systems using CSV exports, with IBM DB2 databases via text dumps, with vCard imports from a CRM system, and with various relational databases. Fortunately, the story for relational databases is improving. Of course, this only works with databases you can access directly.

Because writing synchronisation scripts is not only very tedious but also very error-prone, I wanted to fix this once and for all. One of the lessons I learned from previous experiences is not to try to sync everything in one step. Converting the data to an intermediate format which you understand and which is under your own control gives a lot of advantages.

This post is not going to go on complaining about all the troubles with external systems, because I made something which should solve them (or at least solves them for me). You can take a look at the code in the collective. What it enables you to do is create a simple piece of content in Plone. In this content you specify a location on the filesystem you want to publish. When you traverse the piece of content, it reads the XML file with the same name from the filesystem and wraps it in a proxy. This proxy is then decorated with a view ready for publishing. The nice thing is that it can publish any piece of XML you can think of.

Well, that is not completely true; there has to be one little thing in there. The XML must declare an interface. This is an identifier for a Zope 3 interface, and it is this interface that is used when looking up views etc.

Another nice thing is that the proxy can also delegate calls it does not understand. This allows it to play nicely with the ZCatalog by diverting all queries for Title etc. to a specific adapter.
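
A minimal sketch of the two ideas above, with plain Python in place of the real Zope machinery. All names here (`XMLProxy`, the `interface` attribute on the root element, the `Metadata` fallback) are my own illustrations, not the actual xmlcontent code:

```python
from xml.etree import ElementTree


class XMLProxy:
    """Wrap a parsed XML document; delegate unknown attribute
    lookups to a fallback object (e.g. a catalog metadata adapter)."""

    def __init__(self, tree, fallback=None):
        self._tree = tree
        self._fallback = fallback

    @property
    def interface_id(self):
        # The XML must name a Zope 3 interface; views would be
        # looked up against this identifier.
        return self._tree.getroot().get("interface")

    def __getattr__(self, name):
        # Delegate anything the proxy does not understand, so a
        # catalog can still ask for Title etc.
        if self._fallback is not None:
            return getattr(self._fallback, name)
        raise AttributeError(name)


doc = ElementTree.ElementTree(
    ElementTree.fromstring(
        '<report interface="my.package.interfaces.IReport">'
        "<title>Q3</title></report>"
    )
)


class Metadata:
    Title = "Quarterly report"


proxy = XMLProxy(doc, fallback=Metadata())
print(proxy.interface_id)  # my.package.interfaces.IReport
print(proxy.Title)         # Quarterly report (delegated to the fallback)
```

The key design point is that the proxy itself knows nothing about the XML vocabulary; it only exposes the declared interface identifier and passes everything else along.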

A word of caution is in order before you get too excited: this code is not used in production. It is not finished yet, and last but not least there is a problem running all the tests simultaneously. This has to do with a hack I stuck in to make PloneTestCase work with Zope 2.9 and pythonproducts. The tests should pass if you run them one at a time.

Comments on the architecture, the system as a whole, etc. are greatly appreciated. I hope some more people will find a use for this (or tell me why I am going about it the wrong way). Oh, before I forget: tips for a better name are also welcome (it is called xmlcontent for now).

PAS human sniffer plugin

After the Seattle Plone Conf I had an idea for how to implement a simple Pluggable Auth Service plugin to make authenticated RSS feeds accessible to desktop clients. The main problem with desktop RSS readers is that they need something like basic auth instead of a login form.

My first attempt was a challenger plugin which differentiated on User-Agent headers. According to wiggy, this was not the optimal solution.

He advised me to create a protocol sniffer. So off I went and changed the plugin; it is now a proper protocol sniffer. It adds a protocol named Human Browser to the set of protocols. Using the challenger chooser you can set this protocol to use cookie auth. The normal browser protocol should then be set to HTTP auth.

The plugin still uses the User-Agent header to detect whether it is dealing with a human-controlled browser. You can add browsers to this list using a subscriber (look at detectors.py).
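
The detection itself can be as simple as matching known browser signatures in the User-Agent header. A rough sketch of the idea; the function and registry names are stand-ins of mine, not the plugin's actual API, and the real extension point is the subscriber in detectors.py:

```python
# Substrings that identify human-controlled browsers. RSS readers
# and other desktop clients will not match and so stay on the
# normal (HTTP auth) protocol.
HUMAN_BROWSERS = ["Mozilla", "Opera", "MSIE"]


def register_browser(signature):
    """Extension point mirroring the subscriber mechanism:
    other code can register further browser signatures."""
    HUMAN_BROWSERS.append(signature)


def is_human_browser(user_agent):
    """True when the User-Agent looks like an interactive browser,
    i.e. the Human Browser protocol (cookie auth) should apply."""
    return any(sig in user_agent for sig in HUMAN_BROWSERS)


print(is_human_browser("Mozilla/5.0 (X11; Linux) Firefox/2.0"))  # True
print(is_human_browser("NetNewsWire/2.1 (Mac OS X)"))            # False
```

A reader like NetNewsWire falls through to basic auth, while Firefox gets the login form via cookie auth, which is exactly the split the plugin is after.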

Second day of the sprint

The second and final day of the sprint is over now. I implemented portlet reloading for the navigation and recent items portlet. This code is now easy to use from any event handler so refreshing other portlets is easy.
Another thing I did was to add information on which fields were modified by the Ajax call. This allows us to check in the event handlers whether we need to update anything. The navigation portlet reloader is an example of how this could work.
At night (after the sprint) I finished the final bits by adding reloading support for the portal tabs and the breadcrumb. This was so easy to do that I made a small screencast as well.

First day of the sprint

The first day of the sprint went pretty well. It was my first sprint, so I can make no comparison. I was a bit anxious to get started. Getting all the people into the auditorium and introducing all the subjects took more time than I would have liked.

There were a lot of subjects to choose from. I chose the Azax sprint. It was not as big a group as the membrane guys had, but still nice. We did a bit of introduction at the beginning as well.

A lot of people had trouble setting up Zope on their Windows systems. Fortunately I was prepared for this and had a custom build ready to go. I supplied all Windows users with an installer.

Then it was time to choose a subject. I chose to work on the event integration. The main idea was to support the following use case.

Let’s say you change the title of a document using Ajax. The title will now be changed in the document view, but usually you also want to update the navigation menu, the navigation portlet, etc.

So the thing I did today was to use Zope 3 events to add more Azax commands to the request. In layman’s terms this means that the Ajax handling code just generates an event. This then activates any number of interested handlers, which can add extra commands, such as reloading the portlets.
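
A rough illustration of that flow, with plain Python standing in for the real Zope 3 event machinery. The names (`notify`, `subscribe`, `DocumentModified`, the command string) are illustrative stand-ins, not the Azax API:

```python
# Stand-in event system; zope.event's subscriber list works much like this.
handlers = []


def subscribe(handler):
    handlers.append(handler)


def notify(event):
    # Fire the event at every interested handler.
    for handler in handlers:
        handler(event)


class DocumentModified:
    def __init__(self, fields):
        self.fields = fields  # which fields the Ajax call changed
        self.commands = []    # extra commands collected by handlers


def reload_navigation_portlet(event):
    # Only reload the portlet when a field it displays was changed.
    if "title" in event.fields:
        event.commands.append("reload navigation portlet")


subscribe(reload_navigation_portlet)

event = DocumentModified(fields=["title"])
notify(event)
print(event.commands)  # ['reload navigation portlet']
```

The point is that the Ajax handling code never knows about portlets; it just notifies, and each handler decides from the modified-fields information whether to append a command.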

At the end of the day I had made a doctest and a working system. Along the way we had discussions on how to implement it and I fixed the API a bit.

I am quite content with the results of the first day. Hopefully I can do some useful stuff tomorrow as well.

Final day of the conference

The final day was really interesting. It started with a keynote by Eben Moglen, which got quite an emotional response from most people. After this, Alexander Limi highlighted the fifteen most exciting features of Plone 3.

After a short break I went to Joel Burton’s talk about making Plone simpler for end users. He had some really good ideas and ready-to-go tweaks. A lot of these tweaks should be automatable, so maybe I will write a burtonizer product.

Phillip had an interesting talk about viewlets in Zope 2. It seemed really nice, but may be a bit difficult for scripters to understand. Then again, the current macro mess is pretty bad as well.

We had some great snacks during the break. This day they served Mediterranean tapas.

One of the day’s highlights for me was Wichert’s talk about PAS (the Zope 2 version). He really made it clear where everything goes and what it does, so when I need to use it I will know what to do. One use case I now have an idea how to implement is RSS feeds that need authentication.

Most RSS readers don’t understand the Plone login form. So if you want to access the feeds from the reader you need to supply the proper credentials via basic authentication. What we want is that normal users go through the login form and RSS readers use basic auth.

To do this you create a challenger plugin, which can differentiate based on the user agent or some detectable property in the URL.

The lightning talks were the closing talks; they are always fun due to the frantic pace. After these, the conference ended. As far as I am concerned, this was the best IT conference I have ever been to. My thanks go out to all the people involved in making it happen.