Trifork Blog


Embedding RSS in Hippo using the pipelines feature

December 13th, 2011

For one of the biggest Hippo projects I have been working on, we created a custom RSS solution. When we started the project, Hippo did not have an RSS solution, and we had caching and reusability requirements that we could not implement with standard Hippo. A few years have passed and Hippo is not what it used to be. Nowadays it has a lot more features and a lot fewer NullPointers (sorry guys, could not resist). About a week ago Jeroen Reijn told me about the pipeline feature in Hippo. This feels like the right time to start thinking about a new solution for RSS.

With this blog post I am going to show you a better way to create RSS feeds with Hippo, using its built-in features. I know there are plugins available for RSS; still, I think mine is better :-). The solution is based on the Rome project and, as mentioned, the Hippo pipelines.

Overview of the solution

There are a few things we need to cover before we have an RSS feed using Hippo. I am not going to explain all the details of Hippo; there are other resources on the web that explain Hippo better than I can.

If you have ever worked with Hippo, you know that it consists of multiple components. The two important ones are the CMS, the environment where the content is created, and the HST (Hippo Site Toolkit), the toolkit used to create the website for the visitors. The HST is based on the Spring framework, and Hippo can be extended by providing additional Spring configuration. We can use Spring to add a pipeline to the HST. Pipelines are used to serve different content for the same requested resource. This way it is easy to provide a preview site for unpublished content, a mobile site or an international site. We use this feature to provide an RSS site.

A pipeline starts with a mount point. A mount directs to a piece of HST configuration, the site. The site consists of the site map and the components.

The following sections discuss the different parts of the solution.

The pipeline

The new pipeline is created using the following Spring XML configuration. You can read more about pipelines on the Hippo wiki. The main parts of a pipeline are its valves. There are valves that run before invocation, valves called during invocation, and post-invocation valves. For our solution we focus on two valves: the aggregation valve, which we copied and cleaned up from the standard Hippo implementation, and our own RssValve. The RssValve is responsible for sending the content, as provided by the HST component, to the client using the Rome framework. The next code block shows the Spring configuration.

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
    <bean class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
        <property name="targetObject">
            <bean class="org.springframework.beans.factory.config.PropertyPathFactoryBean">
                <property name="targetObject" ref="org.hippoecm.hst.core.container.Pipelines"/>
                <property name="propertyPath" value="pipelines"/>
            </bean>
        </property>
        <property name="targetMethod" value="put"/>
        <property name="arguments">
            <list>
                <value>RssPipeline</value>
                <bean class="org.hippoecm.hst.core.container.HstSitePipeline">
                    <property name="preInvokingValves">
                        <list>
                            <ref bean="initializationValve"/>
                        </list>
                    </property>
                    <property name="invokingValves">
                        <list>
                            <ref bean="contextResolvingValve"/>
                            <ref bean="localizationValve"/>
                            <ref bean="subjectBasedSessionValve"/>
                            <ref bean="jcrSessionStatefulConcurrencyValve"/>
                            <ref bean="rssAggregationValve"/>
                            <ref bean="romeRssValve"/>
                        </list>
                    </property>
                    <property name="postInvokingValves">
                        <list>
                            <ref bean="cleanupValve"/>
                        </list>
                    </property>
                </bean>
            </list>
        </property>
    </bean>

    <bean id="romeRssValve" class="nl.dutchworks.rss.RssValve"/>
    <bean id="rssAggregationValve" class="nl.dutchworks.rss.AggregationValve" parent="abstractValve"/>
</beans>

The next step is configuring the HST to use our pipeline for certain requests.

Hst Configuration

The mount

We start off with the mount of the RSS site. The mount is configured in the repository under hst:hst > hst:hosts. Here you can configure your different environments based on domain or IP address. I have configured a mount for localhost, but you can also configure subdomains; you could create a mount for rss.dutchworks.nl, for instance. This mount is important, because this is where we configure the name of the pipeline. This name must be the same as in the Spring configuration: hst:namedpipeline = RssPipeline. We also configure hst:mountpoint = /hst:hst/hst:sites/rss.

Besides these mandatory properties for Hippo pipelines to work, we also add a few properties of our own to the mount. These properties are later read by the component to fill some of the meta-data fields of the RSS feed. The properties defined here are static for all the RSS feeds. Some of the properties are: logoLink, logoUrl, copyright and webmaster.
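To make this concrete, the mount node in the repository could look roughly like the sketch below. The node path, host group name and property values are examples from my own setup, not something Hippo prescribes; only hst:namedpipeline and hst:mountpoint are the properties the pipeline mechanism itself needs.

```
/hst:hst/hst:hosts/dev-localhost/localhost/hst:root/rss
  jcr:primaryType = hst:mount
  hst:namedpipeline = RssPipeline
  hst:mountpoint = /hst:hst/hst:sites/rss
  copyright = (c) Dutchworks
  webmaster = webmaster@dutchworks.nl
  logoLink = http://www.dutchworks.nl
  logoUrl = http://www.dutchworks.nl/images/logo.png
```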

The site

The mount point points to a site; the site configures the content used in that particular site. In our case this is done using a hippo:facetselect with a hippo:docbase pointing to the root of our content. A filter is added to select only published content. The site also configures the location of the HST configuration in hst:configurationpath. In our case this is /hst:hst/hst:configurations/rss.
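As a sketch, the facetselect of the site could look like this. The availability filter shown here is the usual way to restrict a facetselect to published ("live") documents; the exact facet values may differ per Hippo version, and the docbase UUID is of course specific to your own content root.

```
/hst:hst/hst:sites/rss/hst:content
  jcr:primaryType = hippo:facetselect
  hippo:docbase = <uuid of the content root>
  hippo:facets = [hippo:availability]
  hippo:modes = [single]
  hippo:values = [live]
```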

The site map and components

The HST configuration for our RSS solution has two main parts. The site map configures the URL mapping: the name of a site map node is used to map a request to a certain component. In our case we have a /nieuws.rss node with an hst:componentconfigurationid that points to hst:components/nieuwsrss. That brings us to the final bit of configuration before we move on to some code. The second part of our RSS configuration is the configuration of the component. The main property is hst:componentclassname, which configures the class to use when a request for this component comes in. We also use the HST component to configure some of the more specific properties of the RSS feed, such as the title and description. A component is of a different type than a mount point: we cannot just create a property, we have to create a multi-valued hst:parameternames field and a matching hst:parametervalues field. There you can define multiple parameters and their values. Not very intuitive, but it works.
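A sketch of the component node, to show how the parallel parameter fields line up (the parameter values are illustrative; the names match what BaseRssComponent reads later on):

```
/hst:hst/hst:configurations/rss/hst:components/nieuwsrss
  jcr:primaryType = hst:component
  hst:componentclassname = nl.dutchworks.rss.NewsRssComponent
  hst:parameternames = [title, description, categoryName, displayTitle]
  hst:parametervalues = [Nieuws, Het laatste nieuws, nieuws, Nieuws feed]
```

The value at index n of hst:parametervalues belongs to the name at index n of hst:parameternames, which is why the two multi-valued fields must stay in sync.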

The code

Time to show some of the code. The first class I want to show is the component. We have a BaseRssComponent, the parent class for the other RSS components. This class takes care of all the meta-data of the RSS feed. As mentioned, we obtain the meta-data from the mount point and from the component configuration. The following code block shows the complete class: how we obtain the meta-data and call the subclass to create the list of items for the feed.

public abstract class BaseRssComponent extends AbstractOverviewComponent {

    static final String METADATA_PROPERTY = "metadata";
    static final String ITEMS_PROPERTY = "items";

    public void doBeforeRender(HstRequest request, HstResponse response) throws HstComponentException {
        super.doBeforeRender(request, response);

        RssFeedMetaData metaData = createMetaDataForRssFeed(request);
        request.getRequestContext().setAttribute(METADATA_PROPERTY, metaData);

        List<Item> items = createListOfItemsForRssFeed(request);
        request.getRequestContext().setAttribute(ITEMS_PROPERTY, items);
    }

    protected abstract List<Item> createListOfItemsForRssFeed(HstRequest request);

    protected RssFeedMetaData createMetaDataForRssFeed(HstRequest request) {
        RssFeedMetaData metaData = new RssFeedMetaData();
        enhanceMetadataWithFixedRssPropsFromMountPoint(request, metaData);
        enhanceMetadataWithUrlFromRequest(request, metaData);
        enhanceMetadataWithPropertiesFromComponent(request, metaData);
        return metaData;
    }

    private void enhanceMetadataWithFixedRssPropsFromMountPoint(HstRequest request, RssFeedMetaData metaData) {
        metaData.setCopyright(getMountProperty("copyright", request));
        metaData.setLanguage(getMountProperty("language", request));
        metaData.setWebmaster(getMountProperty("webmaster",request));
        metaData.setLogoHeight(getMountProperty("logoHeight", request));
        metaData.setLogoWidth(getMountProperty("logoWidth", request));
        metaData.setLogoLink(getMountProperty("logoLink", request));
        metaData.setLogoUrl(getMountProperty("logoUrl", request));
    }
    
    private void enhanceMetadataWithPropertiesFromComponent(HstRequest request, RssFeedMetaData metaData) {
        metaData.setTitle(extractComponentParam("title", request));
        metaData.setCategoryName(extractComponentParam("categoryName", request));
        metaData.setDescription(extractComponentParam("description", request));
        metaData.setFeedRODisplayTitle(extractComponentParam("displayTitle", request));
    }

    private void enhanceMetadataWithUrlFromRequest(HstRequest request, RssFeedMetaData metaData) {
        HstSiteMapItem siteMapItem = request.getRequestContext().getResolvedSiteMapItem().getHstSiteMapItem();
        HstLink hstLink = request.getRequestContext().getHstLinkCreator().create(siteMapItem);
        String url = hstLink.toUrlForm(request.getRequestContext(), true);
        metaData.setFeedUrl(url);
    }

    private String getMountProperty(String param, HstRequest request) {
        return request.getRequestContext().getResolvedMount().getMount().getProperty(param);
    }

    private String extractComponentParam(String param, HstRequest request) {
        return getComponentConfiguration().getParameter(param,request.getRequestContext().getResolvedSiteMapItem());
    }
}

When implementing an RSS feed you have to create a subclass that obtains items from the repository and maps them to Rome Item objects. The following code block shows an actual component class, which only has to create the list of items.

public class NewsRssComponent extends BaseRssComponent {
    protected List<Item> createListOfItemsForRssFeed(HstRequest request) {
        NewsDao newsDao = getDaoFactory().newNewsDao(request);
        PagedResult<NewsBean> allNews = newsDao.findAllNews(new PageRequest(0, 20));
        RssQueryResultMapper<NewsBean> mapper = new NewsRssQueryResultMapper();
        return mapper.map(allNews.getItems());
    }
}

Going through the DAO stuff is a bit too much for this blog; I guess you can find a good implementation for this yourself. The same goes for the mappers: you just have to create a mapping from the Hippo document beans to the Rome Items.
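To give an idea, a minimal mapper could look like the sketch below. NewsBean and its getters (getTitle, getUrl, getPublicationDate, getSummary) are assumptions standing in for your own Hippo document bean; Item and Description are the Rome RSS classes, and RssQueryResultMapper is the interface used in the component above.

```java
import java.util.ArrayList;
import java.util.List;

import com.sun.syndication.feed.rss.Description;
import com.sun.syndication.feed.rss.Item;

// Sketch of a mapper from a (hypothetical) NewsBean to Rome Items.
// Adapt the getters to your own Hippo document beans.
public class NewsRssQueryResultMapper implements RssQueryResultMapper<NewsBean> {

    public List<Item> map(List<NewsBean> beans) {
        List<Item> items = new ArrayList<Item>();
        for (NewsBean bean : beans) {
            Item item = new Item();
            item.setTitle(bean.getTitle());
            item.setLink(bean.getUrl());
            item.setPubDate(bean.getPublicationDate());

            // Rome wraps the item body in a Description object
            Description description = new Description();
            description.setValue(bean.getSummary());
            item.setDescription(description);

            items.add(item);
        }
        return items;
    }
}
```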

The last piece of code is the RssValve. This valve makes use of the Rome project. Having the list of Items makes the valve very easy to write. The valve first constructs a Channel object with all the meta-data and the items. Then we output the contents using the provided response object.

The following code block shows the complete valve.

public class RssValve extends AbstractValve {
    private final static Logger logger = LoggerFactory.getLogger(RssValve.class);

    protected static final int TIME_TO_LIVE = 5;

    @Override
    public void invoke(ValveContext context) throws ContainerException {
        HttpServletResponse response = context.getServletResponse();

        Channel channel = new Channel("rss_2.0");
        response.setContentType("application/rss+xml");
        channel.setEncoding("UTF-8");

        RssFeedMetaData metaData = (RssFeedMetaData) context.getRequestContext().getAttribute("metadata");
        addMetaDataToChannel(channel, metaData);

        List<Item> items = (List<Item>) context.getRequestContext().getAttribute("items");
        channel.setItems(items);

        writeResponseToClient(response, channel);

        context.invokeNext();

    }

    private void writeResponseToClient(HttpServletResponse response, Channel channel) {
        WireFeedOutput feedOutput = new WireFeedOutput();
        try {
            ServletOutputStream out = response.getOutputStream();
            feedOutput.output(channel, new OutputStreamWriter(out, channel.getEncoding()));
            out.flush();
        } catch (IOException e) {
            logger.error("Problem while outputting the rss feed", e);
        } catch (FeedException e) {
            logger.error("Problem while outputting the rss feed", e);
        }
    }

    private void addMetaDataToChannel(Channel channel, RssFeedMetaData metaData) {
        channel.setTitle(metaData.getTitle());
        channel.setDescription(metaData.getDescription());

        channel.setLink(metaData.getFeedUrl());
        channel.setWebMaster(metaData.getWebmaster());
        channel.setLanguage(metaData.getLanguage());
        channel.setCopyright(metaData.getCopyright());

        List<Category> categories = new ArrayList<Category>();
        Category category = new Category();
        category.setValue(metaData.getCategoryName());
        categories.add(category);
        channel.setCategories(categories);

        Image image = new Image();
        image.setTitle(metaData.getTitle());
        image.setLink(metaData.getLogoLink());
        image.setUrl(metaData.getLogoUrl());
        image.setWidth(Integer.parseInt(metaData.getLogoWidth()));
        image.setHeight(Integer.parseInt(metaData.getLogoHeight()));

        channel.setImage(image);
        channel.setLastBuildDate(new Date());
        channel.setTtl(TIME_TO_LIVE);
    }
}

Concluding

I hope this gives you an idea of what you can do with the pipeline feature of Hippo.

Yesterday Arje Cahn showed me a new Hippo feature that will be available in 7.7, which makes this blog post even more interesting. Hippo 7.7 comes with channels. At the moment these are more directed at different HTML channels, but they should be easy to extend for data or RSS channels. I hope to try this out in the near future as well.

If you have questions about this solution do not hesitate to contact me.
