Neighbourhoods of Winnipeg: A Community Semantic Portal

Introduction

I am proud to announce the new NOW (Neighbourhoods Of Winnipeg) semantic web portal! This new and innovative semantic web portal was publicly announced last week by the Mayor of the City of Winnipeg.

The NOW (Neighbourhoods of Winnipeg) portal is “a new Web portal (the “Portal”) produced by the City of Winnipeg to provide broad, dynamic and interactive access to local and neighbourhood information. Designed for easy access and use by all citizens, businesses, community organizations and Governments, the information on the site includes municipal data, census and demographic information, economic development information, historical data, much spatial and mapping information, and facilities for including and sharing data by external groups and constituencies.”

I suggest you read Mike Bergman’s blog post about this new semantic web portal to get the proper background on this initiative by the City of Winnipeg and on how it uses the OSF (Open Semantic Framework) as its foundational technology stack.

This project has been the springboard that led to Open Semantic Framework version 1.1. Multiple pieces of the framework were developed in relation to this project, most notably the sWebMap semantic component and several improvements to the structWSF web services endpoints and the conStruct modules for Drupal 6.

Development of the Portal

The development plan of this portal is composed of four major areas:

  1. Development of the data structure of the municipal domain by creating a series of ontologies
  2. Conversion of existing data assets using this new data structure
  3. Creation of the web portal, including its design and all of the display templates
  4. Creation of new tools to let users interact with the data available on the portal

Structured Dynamics has been involved in #1, #2 and #4 by providing design and development resources, technology transfer sessions and materials, and by supporting internal teams in creating, maintaining and deploying their 57 publicly available datasets.

The Data Structure

This technology stack does not have any meaning without the proper data and data structures (ontologies) in place. This gold mine of information is what drives the functionality of the portal.

The portal is driven by 12 ontologies: 2 internal and 10 external. The content of the 57 publicly available datasets is described using the classes and properties defined in these ontologies.

The two internal ontologies were created jointly by Structured Dynamics and the City of Winnipeg, but they are extended and maintained by the City alone.

These ontologies are maintained using two different kinds of tools:

  1. Protege
  2. structOntology

Protege is used for the major development tasks, such as creating large numbers of classes and properties or performing significant reorganizations of the class structure.

structOntology is used for quick ontology changes that have an immediate impact on the portal’s behavior, such as label changes or SCO ontology property assignments that change the behavior of some of the tools on the portal.

structOntology can also be used by portal users to understand the underlying data structure used to define the data available on the portal. All users have access to the tool’s read mode, which lets them browse, search and export the ontologies loaded on the portal.

The Data

Aside from rare exceptions such as the historical photos, no new data has been created by the City of Winnipeg to populate the NOW portal. Most of its content comes from existing internal data sources such as:

  • Conventional relational databases
  • GIS (Geographic Information System) systems on top of relational databases
  • Spreadsheets

All of the conventional relational databases and the legacy data from the GIS systems have been converted into RDF using the FME Workbench ETL system. The FME Workbench templates map the relational data into RDF using the ontologies loaded into the portal. All of the geolocated records that exist in the portal come from this ETL process and have been converted using FME.

Some smaller datasets come from internal spreadsheets that were modified to comply with the commON spreadsheet format, which is used to convert spreadsheet (CSV/TSV) data files into RDF.

All of the dataset creation and maintenance is managed internally by the City of Winnipeg using one of these two data conversion and importation processes.

Here are some internal statistics of the content that is currently accessible on the NOW portal.

General Portal

These are statistics related to different functionalities of the portal.

  • Number of neighbourhoods: 236
  • Number of community areas: 14
  • Number of wards: 15
  • Number of neighbourhood clusters: 23
  • Number of major site sections: 7
  • Total number of site pages: 428,019
    • Static pages: 2,245
    • Record-oriented pages: 425,874
    • Dynamic (search-based) pages: infinite
  • Number of documents: 1,017
  • Number of images: 2,683
  • Number of search facets: 1,392
  • Number of display templates: 54
  • Number of links: 1,067
    • External links: 784
    • Internal links: 283

Site Data

These statistics show what is available via the portal: the records, their types and properties, and the quantity of data that is searchable, manipulable and exportable from the portal.

  • Number of datasets: 57
  • Number of records: 425,874
    • Number of geolocational records: 418,869
      • Point of interest (POI) records: 193,272
      • Polygon records: 218,602
      • Path (route) records: 6,995
  • Number of classes (types): 84
  • Number of properties: 1,308
  • Number of triple assertions: 8,683,103

Sharing Content

An important aspect of this portal is that all of the content is contextually available, in different formats, to all of the users of the portal. Whether you are browsing content within datasets, searching for specific pieces of content, or looking at a specific record page, you can always get your hands on the content being displayed to you, in a choice of five different data formats.

Export Page Content

All content pages can be exported in one of the formats outlined above. In the bottom right corner of these pages you will see an Export button that you can click to get the content of that page in any of these formats.

Export Search Content

Every time you perform a search on the portal, you can export the results of that search in one of the formats outlined above. To do so, select the Export tab, then select the format you want to use for exporting the data.

Export Datasets

You can export any publicly available dataset from the portal. Datasets that are too big to be exported all at once have to be exported in slices. The datasets can be exported in any of the formats mentioned above.

Export Census

Users also have the possibility to export census data, from the census section of the portal, as spreadsheets. They only have to select the Tables tab, then click the Export Spreadsheet button.

Export Ontologies

The export functionality would not be complete without the ability to consult and export the ontologies used to describe the content exposed by the portal. These ontologies can be read in the ontologies reader user interface, or exported from the portal to be read by external ontology management tools such as Protege.

Portal Design

The portal uses Drupal 6 as its CMS (Content Management System). The Drupal 6 instance communicates with structWSF using the conStruct module, which acts as a bridge between a Drupal portal and a structWSF web service network.

Here are the main design phases that have been required to create the portal:

  1. Creation of the portal’s design, and the Drupal 6 theme that implements it
  2. Creation of the Search and Browse results templates
  3. Creation of the individual records’ page design and templates based on their type
  4. Creation of the sWebMap search results templates

The portal’s design was created internally by the City of Winnipeg and by Tactica, based on the Citizen DAN demo. Tactica also worked on another Citizen DAN-like portal, MyPeg.ca.

Semantic Components

The NOW Web portal uses a series of tools called the Semantic Components. These are a set of Flash and JavaScript tools that can be embedded within any web page and that can easily communicate with structWSF instance(s). They display information in all kinds of charts, provide document reading widgets, create dashboards of structured data, etc. The initial set of Semantic Components was developed for the MyPeg.ca project back in November 2010. This was before Steve Jobs announced that Apple would not support Adobe Flash, and well before Google announced that it would drop support for it as well.

Since the NOW portal team wanted to re-use as much as possible to lower development costs, they chose the complete OSF stack, which includes these Semantic Components.

However, while participating in the development of this new NOW portal, we did extend the set of Semantic Components by creating the most complex one yet: the sWebMap. Because of the two announcements mentioned above, we chose to create the sWebMap Semantic Component in JavaScript instead of Flash. The other Semantic Component tools developed in Flash have not yet been ported to JavaScript.

Conclusion

The new NOW semantic web portal’s main asset is its data: how it can be searched (with traditional search engines, or using a semantic component to search, browse, filter and localize results), displayed and exported. The portal has been built on a completely free and open source semantic platform that grew out of previous projects that open sourced their code.

I consider this portal a pioneer in the way municipal organizations will provide new online services to their citizens and to commercial enterprises, based on the quality of the data exposed via such Web portals.

New Mapping Semantic Component In JavaScript

I am pleased to announce the release of the new sWebMap Semantic Component in JavaScript. This new mapping component is a standalone JavaScript application that can be integrated into any new or existing web site and that interacts with an Open Semantic Framework (OSF) instance to search, browse, filter and display geographically located information on an interactive map.

Features

The sWebMap is a rich mapping tool that can easily be integrated into any webpage and extensively customized. The sWebMap supports these features:

  • Full text search for searching and displaying results on a map
  • Extensive filtering capabilities
    • Filtering by dataset source
    • Filtering by type
    • Filtering by attribute/value
    • Filtering of records that belong to a specific geographic region
  • Display of records on the map using:
    • Different markers depending on the type of record to display (determined by the ontologies)
    • Polygon shapes for records that refer to a geographic region
    • Polyline shapes for records that refer to a geographically-located path
  • Templating of records in a resultset depending on their type
  • Templating of records’ preview, displayed in an overlay window, depending on their type
  • Persistence of records on the map across searches and filtering operations
  • Supports map sessions
    • Save map sessions
    • Load saved map sessions
    • Delete saved map sessions
    • Share saved map sessions
  • Supports a multiple-maps mode
    • Three focus maps are available under the main map
    • Each focus map focuses on a particular region of the main map
    • Users can switch between focus maps to see different records in different regions

Normal Mode

Here is what the default sWebMap looks like in normal mode, using a few datasets related to Iowa City. You can also interact with this sWebMap instance directly on the Citizen DAN demo website here.

Multiple Windows Mode

Here is what the default sWebMap looks like in multiple windows mode, using a few datasets related to Iowa City. You can also interact with this sWebMap instance directly on the Citizen DAN demo website here.

Under the Hood: The Open Semantic Framework

Each sWebMap component communicates with an OSF (Open Semantic Framework) instance. More specifically, a sWebMap component will send Search/Filtering queries to a geo-enabled structWSF Search web service endpoint.

Depending on the options you specified when you created the sWebMap control, each time you move the map (optional), zoom (optional) or change the filtering criteria, a query is sent to the Search endpoint. The sWebMap control then requests a JSON-formatted resultset and displays the results to the user.
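
To give a rough idea of what such a query looks like from JavaScript, here is a minimal sketch (the endpoint URL and parameter names are assumptions for illustration only; consult the structWSF Search endpoint documentation for the actual interface):

[cc lang='javascript']
// Illustrative sketch of a search request to a structWSF Search endpoint.
// The endpoint URL and parameter names below are placeholder assumptions.
$.ajax({
  url: 'http://localhost/ws/search/',
  type: 'POST',
  data: { query: 'park', items: 10, page: 0 },  // hypothetical parameters
  dataType: 'json',
  success: function (resultset) {
    // The sWebMap would redraw its markers from this JSON resultset.
  }
});
[/cc]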

This means that to implement the sWebMap component on your website, you will need to have:

Download

You can immediately download the entire source code from this GitHub repository:

Installation

Installing the sWebMap component is really easy. In fact, you only have to load a few JavaScript and CSS files, define a <div></div> container for the map, and create a sWebMap component object, which takes a single line of code.

Additionally, you can initialize the sWebMap component with one of the multiple options available.
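
To give you a feel for it, here is a rough sketch of that single line (the constructor name, container ID and option names are illustrative placeholders, not the actual API; the Usage section referenced below documents the real thing):

[cc lang='javascript']
// Hypothetical sketch: the constructor and options are placeholders, not
// the actual sWebMap API. Assumes the sWebMap JavaScript and CSS files are
// loaded, and that the page contains a container such as <div id="map"></div>.
var map = new sWebMap('map', {
  endpoint: 'http://localhost/ws/'  // placeholder structWSF network URL
});
[/cc]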

Refer to the Usage section of the sWebMap component to see exactly how to install and set up a sWebMap component instance.

Resources

Here are some additional resources related to the sWebMap component:

jQuery Cookie Plugin Extended With HTML5 localStorage And Chunked Cookies

Is there a web developer who has never used cookies to save some information in a user’s browser? There may be some, but they are surely not legion. As you probably know, the problem with cookies is that their implementation is inconsistent across browsers: some limit the size of a cookie to 4096 bytes, others limit the number of cookies from a specific domain to 50, others have no perceivable limits, etc.

In any case, if one of these limits is reached, the cookie is simply not created by the browser. This is fine, because web developers expect cookies to fail from time to time, and the systems they develop have to cope with this unreliability. However, this situation can sometimes become frustrating, which is why I wanted to extend the default behavior of the jQuery Cookie plugin with a few more capabilities.

This extension to the jQuery Cookie plugin adds the capability to save content bigger than 4096 bytes using two different mechanisms: HTML5’s localStorage, or a series of cookies in which the content is chunked and saved. The extension is backward compatible with the jQuery Cookie plugin, and its usage should be transparent to users. Even existing cookies created with the normal Cookie plugin remain usable by this new extension. The usage syntax is the same, but three new options have been added.

Now, let’s see how this plugin works, how developers should use it, and what its limitations are.

You can immediately download the jQuery Extended Cookie plugin from here:

Limitations Of Cookies

First, let’s see what RFC 2109 says about the limitations of cookies in web browsers. Browsers should normally have these implementation limits (see section 6.3):

   Practical user agent implementations have limits on the number and
   size of cookies that they can store.  In general, user agents' cookie
   support should have no fixed limits.  They should strive to store as
   many frequently-used cookies as possible.  Furthermore, general-use
   user agents should provide each of the following minimum capabilities
   individually, although not necessarily simultaneously:

      * at least 300 cookies
      * at least 4096 bytes per cookie (as measured by the size of the
        characters that comprise the cookie non-terminal in the syntax
        description of the Set-Cookie header)
      * at least 20 cookies per unique host or domain name

   User agents created for specific purposes or for limited-capacity
   devices should provide at least 20 cookies of 4096 bytes, to ensure
   that the user can interact with a session-based origin server.

   The information in a Set-Cookie response header must be retained in
   its entirety.  If for some reason there is inadequate space to store
   the cookie, it must be discarded, not truncated.

   Applications should use as few and as small cookies as possible, and
   they should cope gracefully with the loss of a cookie.

New Options

Before I explain how this extension works, let me introduce the three new options that have been added to the Cookie plugin. These new options will be put into context and properly defined later in this blog post.

  • maxChunkSize – This defines the maximum number of bytes that can be saved in a single cookie. (default: 3000)
  • maxNumberOfCookies – This is the maximum number of cookies that can be created for a single domain name. (default: 20)
  • useLocalStorage – This tells the extended Cookie plugin to use the HTML5’s localStorage capabilities of the browser instead of a cookie to save that value. (default: true)

How Does This Extension Work?

As I said in the introduction of this blog post, this extension to the jQuery Cookie plugin does two things:

  1. It uses the HTML5 localStorage capabilities of the browser, if this feature is available, instead of relying on cookies. However, if cookies are needed by the developer, this feature can be turned off with the useLocalStorage = false option
  2. If the localStorage option is disabled, or simply not available in a browser, and if the content is bigger than the size limit of a cookie, then this extension will chunk the input content and save it in multiple cookies

If useLocalStorage is true, the plugin checks whether the HTML5 localStorage mechanism is available in the browser. If it is, the plugin uses that local storage to save and retrieve content. If it is not, the plugin acts as if useLocalStorage were false, and the process continues by using cookies to save and read that content from the browser.
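
The availability check itself is a standard feature-detection pattern, along these lines (a sketch, not the plugin’s exact code):

[cc lang='javascript']
// Standard localStorage feature-detection pattern (a sketch, not the
// plugin's exact code).
function localStorageAvailable() {
  try {
    // Accessing window.localStorage can throw in some browsers (e.g.
    // when storage is disabled), hence the try/catch.
    return 'localStorage' in window && window.localStorage !== null;
  } catch (e) {
    return false;
  }
}
[/cc]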

If useLocalStorage is false, or if the HTML5 localStorage mechanism is not available in the browser, the plugin checks whether the content is bigger than the maxChunkSize option. If it is, the content is split into chunks that are saved in separate cookies, up to the limit imposed by the maxNumberOfCookies option.

If cookies are used, then two use-cases can happen:

  1. The content is smaller than or equal to maxChunkSize
  2. The content is bigger than maxChunkSize

If the content is smaller than or equal to maxChunkSize, then only one cookie is created by the browser. The name of the cookie is the value provided to the key parameter.

If the content is bigger than maxChunkSize, then multiple cookies are created, one per chunk. The convention is that the name of the first cookie is the value provided to the key parameter. The names of the other chunks are the value provided to the key parameter with the chunk indicator ---ChunkNum appended to them. For example, if we have a cookie with 10,000 bytes of content and maxChunkSize is set to 4,000 bytes, then these three cookies would be created:

  • cookie-name
  • cookie-name---1
  • cookie-name---2
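
To make this convention concrete, here is a minimal sketch of the chunking idea (for illustration only; this is not the plugin’s internal code):

[cc lang='javascript']
// Sketch of the chunking idea (not the plugin's internal code): split a
// value into pieces of at most maxChunkSize characters and derive the
// cookie name for each piece.
function chunkCookies(key, value, maxChunkSize, maxNumberOfCookies) {
  var chunks = [];

  for (var i = 0; i * maxChunkSize < value.length && i < maxNumberOfCookies; i++) {
    chunks.push({
      name: (i === 0 ? key : key + '---' + i),
      content: value.substr(i * maxChunkSize, maxChunkSize)
    });
  }

  return chunks;
}

// With 10,000 bytes of content and a maxChunkSize of 4,000 bytes, this
// yields exactly the three cookie names listed above.
[/cc]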

Usage

Now, let’s see how this extended jQuery Cookie plugin should be used in your code. Its usage is no different from that of the normal jQuery Cookie plugin; I will simply show how to use the new options along with the plugin in general.

Create a Cookie

Let’s create a cookie that expires in 365 days and where the path is the root:

[cc lang='javascript']

$.cookie('my-cookie', 'the-content-of-my-cookie', { expires: 365, path: '/' });

[/cc]

By default, this value will be persisted in the localStorage if the browser supports it, and not in a cookie. So, let’s see how to force the plugin to save the content in a cookie by using the useLocalStorage option:

[cc lang='javascript']

$.cookie('my-cookie', 'the-content-of-my-cookie', { useLocalStorage: false, expires: 365, path: '/' });

[/cc]

Delete a Cookie

Let’s see how a cookie can be deleted. The method is simply to put null as the value of the cookie. This will instruct the plugin to remove the cookie.

[cc lang='javascript']

$.cookie('my-cookie', null);

[/cc]

With that call, the plugin will try to remove my-cookie both in the localStorage and in the cookies.

Read a Cookie

Let’s see how we can read the content of a cookie:

[cc lang='javascript']

var value = $.cookie('my-cookie');

[/cc]

With this call, value receives the content that was saved in localStorage or in the cookies, depending on whether localStorage was available in the browser.

Now, let’s see how to force reading the cookies by bypassing the localStorage mechanism:

[cc lang='javascript']

var value = $.cookie('my-cookie', { useLocalStorage: false });

[/cc]

Note that if no cookie exists for a key, the $.cookie() function returns null.
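
For the curious, reading back a chunked value simply follows the naming convention described earlier. The plugin does this transparently, but the reassembly logic looks roughly like this sketch:

[cc lang='javascript']
// Sketch: reassemble a value saved across chunked cookies. The plugin
// does this for you; shown here for illustration only.
function readChunked(key) {
  var content = $.cookie(key, { useLocalStorage: false });

  if (content === null) {
    return null;
  }

  // Append the ---1, ---2, ... chunks until one is missing.
  for (var i = 1; ; i++) {
    var chunk = $.cookie(key + '---' + i, { useLocalStorage: false });

    if (chunk === null) {
      break;
    }

    content += chunk;
  }

  return content;
}
[/cc]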

Using Limitations

Let’s see how to use the maxNumberOfCookies and maxChunkSize options to limit the size and the number of cookies to be created.

With this example, the content will be saved in multiple cookies of 1000 bytes each up to 30 cookies:

[cc lang='javascript']

var value = $.cookie('my-cookie', 'the-content-of-my-cookie-is-10000-bytes-long…', { useLocalStorage: false, maxChunkSize: 1000, maxNumberOfCookies: 30, expires: 365, path: '/' });

[/cc]

Limitations

Users have to be aware of the limitations of this enhanced plugin. Depending on the browser, the values of the maxChunkSize and maxNumberOfCookies options should be different. In the worst case, some cookies (or cookie chunks) may simply not be created by the browser. As stated in RFC 2109, web applications have to take that fact into account and be able to cope with it gracefully.

Future Enhancements

In the future, this extension should detect the browser it runs in and set the maxChunkSize and maxNumberOfCookies parameters automatically, depending on each browser’s cookie limitations.
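
Such detection could look something like this sketch (the user-agent test and the limit values are assumptions for illustration, not measured browser limits):

[cc lang='javascript']
// Illustrative only: the user-agent test and the limit values below are
// assumptions, not measured browser limits.
var isOldIE = /MSIE [1-8]\./.test(navigator.userAgent);

var cookieDefaults = isOldIE
  ? { maxChunkSize: 3000, maxNumberOfCookies: 20 }
  : { maxChunkSize: 3000, maxNumberOfCookies: 50 };

var content = new Array(10001).join('x');  // ~10 KB test value

$.cookie('my-cookie', content, $.extend({ useLocalStorage: false }, cookieDefaults));
[/cc]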

Conclusion

I had to create this extension to the jQuery Cookie plugin to be able to store the resultsets returned by some web service endpoints. It is only used to limit the number of queries sent to these endpoints. Since the values returned by the endpoints are nearly static, are loaded at each page view, and are a few kilobytes in size, I had to find a way to save that information in the browser and, if possible, to overcome the size limitation of cookies. I also needed to cope with older browsers that only support cookies. In the worst-case scenario, when neither cookies nor localStorage works, the browser simply sends the request to the endpoints at each page load. But at least my application will benefit from this enhancement for the 95% of users for whom one of these solutions works.
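
For illustration, the caching pattern described here boils down to something like the following (the endpoint URL, cookie name and rendering function are placeholders):

[cc lang='javascript']
// Sketch of the caching pattern: query the endpoint only when no cached
// copy exists in the browser. The URL, cookie name and renderResultset()
// function are placeholders.
var cached = $.cookie('endpoint-resultset');

if (cached !== null) {
  renderResultset(cached);
} else {
  $.get('http://example.com/ws/search/', function (resultset) {
    // Cache for one day; the plugin falls back to chunked cookies when
    // localStorage is not available in the browser.
    $.cookie('endpoint-resultset', resultset, { expires: 1, path: '/' });
    renderResultset(resultset);
  });
}
[/cc]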

Open Semantic Framework Running on Micro Instances

After releasing the new Open Semantic Framework Installer, we started to test it on machines with all kinds of different specifications: different CPU limits, different amounts of memory, etc. One of the setups that caught our attention was Amazon’s EC2 Micro Instance.

The Micro Instance is a virtual server type that was introduced by Amazon a little more than a year ago. As described by Amazon, Micro Instances are:

Instances of this family provide a small amount of consistent CPU resources and allow you to burst CPU capacity when additional cycles are available. They are well suited for lower throughput applications and web sites that consume significant compute cycles periodically.

We were intrigued by this particular type of instance because we wanted to know how the complete Open Semantic Framework stack could operate on such a small server instance.

Micro Instance Specifications

The Micro Instance’s specifications are as follows:

  • 613 MB memory
  • Up to 2 EC2 Compute Units (for short periodic bursts)
  • 32-bit or 64-bit platform
  • I/O Performance: Low

Note that an EC2 Compute Unit provides the equivalent CPU capacity of a 1.0-1.2 GHz 2007 Opteron or 2007 Xeon processor.

Installing The Stack

Installing the stack on the Amazon Micro Instance, using the OSF Installer, is not the fastest experience in the world. In fact, installing the complete stack takes up to 10 hours (5 minutes of your time, but compiling and installing everything takes about 10 hours of CPU time).

The problem is that installing OSF is a CPU-intensive task, and the Micro Instance is not made for sustained CPU load. The Micro Instance can sustain small CPU bursts, but it can’t sustain the compilation and installation of the entire stack. That means the needed CPU cycles won’t be available to the instance, and its CPU consumption will be throttled by Amazon, which significantly slows down the installation process.

However, as you will see below, once OSF is installed on the Micro instance, the complete stack responds perfectly to all queries sent to it.

Creating an AMI

The only time you have to spend 10 hours installing the OSF stack on an Amazon Micro Instance is the first time. After that, you only have to create an Amazon AMI from that vanilla OSF instance for future use. If you proceed that way, you lower the installation time from 10 hours to a few minutes.

Reading and Searching Data

The testing we did for reading and searching data from structWSF shows that performance is as good as what you would get from a Small Instance under a normal workload. The CRUD: Read and Search structWSF endpoints are fully responsive and operational.

Creating, Updating and Deleting Data

The testing we did for creating, updating and deleting entire datasets shows that these operations take more time than on a Small Instance, even if the instance is dedicated to that task alone, with no other queries processed at the same time. The reason for this decrease in performance is the CPU throttling done by Amazon for this kind of more CPU-intensive task. However, since creating, updating or deleting individual records only produces brief CPU peaks, such isolated create/update/delete queries don’t greatly affect the overall performance of the instance.

What Is This Type Of Instance Good For?

We found that such small instances are perfect for data collection activities performed by a single person or a small group of collaborators. We also found that they can serve low-traffic websites such as personal web portals, personal blogs, etc. The complete OSF stack is fully responsive, and our analysis shows that resource usage (CPU and memory) remains stable under a normal workload.

Conclusion

Such a small server instance can easily be used to create a personal data collection endpoint, or a personal or small data presentation portal such as Mike’s semantic web Sweet Tools. It is well suited to data portals that require reading and searching of data, with occasional data changes (addition, removal and modification of instance records).

Volkswagen UK’s Search Engine Powered by structWSF

It is now official: Volkswagen UK’s search engine is powered by structWSF. Their new contextual search engine was released last Friday. I covered the underlying architecture in one of my recent blog posts: Volkswagen’s RDF Data Management Workflow.

John Streit, head of technology at Tribal DDB, described the two key advantages of using structWSF (part of the Open Semantic Framework (OSF)) for their website in an interview with Wired UK:

The first is that it gives you a single place to access data. Streit explains: “Applications often need to retrieve data from multiple sources which adds complexity and development time. By using this technology we can get everything we need from a single place which drastically lowers development time and running costs.” Furthermore the exposure of data improves search and means that it can be repurposed in new and imaginative ways.