3 Buildings showing External Fabric condition changing over a 30 year period
(NOTE: If the blue dots (triangles, squares, circles) do not show up in your browser, try another browser. I have issues with Firefox on one machine but not another, and Chrome works fine.)
The above link takes you to a demonstration site that:
- Has a Google Maps base map
- Has data overlaid onto the map from a JSON file stored on my VPS (so I control that data)
- Has buttons at the top left that change the data depending on the year being inspected. The further you move from the current time, the worse each element's condition gets, as expected; but because each element deteriorates at a different rate, you can see which groups need to be replaced at a specific time (a sketch of this pattern follows the list). (These buttons could be changed to a slider, but I have not yet found a project that has let me develop that.)
- The 2 graphs at the bottom are calculations of maintenance costs on the buildings over time, produced with the SPM Assets software analysis. There is also the current value of the “improvements” on the site (i.e. the building, not the land value), so you can see when accrued maintenance costs start to rise above the actual cost of the property.
- The data for the 2 graphs, and for the condition of the external elements through the years, was prepared using FME (as this was an office-bought product), although I generally use Knime Analytics; it's free and I have found it very good.
- The map can be viewed in MAP or SATELLITE view, and Street View can also be used to have a closer look at the properties.
- The information was initially developed in Excel and was converted at the end to JSON just so the dataset could be read by JavaScript.
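As a rough illustration of the pattern behind the year buttons, here is a minimal sketch of loading the JSON file and restyling each marker for a selected year. The file URL, the field names and the colour scale are all assumptions for illustration, not the actual data served from my VPS.

```javascript
// Minimal sketch: markers from a JSON file, restyled per year.
// Assumed record shape (not the real file):
//   { "lat": -36.85, "lng": 174.76, "element": "Rof",
//     "condition": { "2020": 1, "2030": 3, "2040": 5 } }

let markers = [];

function loadData(map) {
  fetch('https://example.com/assets.json')          // placeholder URL
    .then(response => response.json())
    .then(rows => {
      markers = rows.map(row => {
        const marker = new google.maps.Marker({
          position: { lat: row.lat, lng: row.lng },
          map: map,
        });
        marker.row = row;                            // keep the record handy
        return marker;
      });
      showYear('2020');
    });
}

// Each year button (or a future slider) just calls showYear(itsYear).
function showYear(year) {
  const colours = ['#2ecc71', '#a3d977', '#f1c40f', '#e67e22', '#e74c3c'];
  markers.forEach(marker => {
    const condition = marker.row.condition[year];    // 1 (good) .. 5 (poor)
    marker.setIcon({
      path: google.maps.SymbolPath.CIRCLE,
      scale: 6,
      fillColor: colours[condition - 1],
      fillOpacity: 1,
      strokeWeight: 1,
    });
  });
}
```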
The 2 blogs I have written regarding developing these maps can be found HERE and HERE.
After taking this information to this level of detail, I started to look at a larger dataset. For this I used a database table to store the information and then queried the table. This is a follow-on from using Excel, as databases are more responsive in the internet space.
My first exploration was simply to have a large amount of data in the table and plot it on a map with some information about each record: Map of Housing Assets by Suburb. Initially I looked at just identifying the properties by suburb, colouring them by suburb and giving general information about each property in a pop-up box on the map. This had just over 300 rows of data.
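A minimal sketch of that suburb map, assuming illustrative field names rather than the real table columns: one marker per property, coloured by suburb, with the general details in a pop-up InfoWindow.

```javascript
// Colour each property marker by suburb and attach a pop-up.
// row.suburb, row.address etc. are assumed field names.
const suburbColours = {};
const palette = ['#e6194b', '#3cb44b', '#4363d8', '#f58231', '#911eb4'];

function addProperty(map, row) {
  if (!(row.suburb in suburbColours)) {
    // Assign the next palette colour the first time a suburb appears.
    suburbColours[row.suburb] =
      palette[Object.keys(suburbColours).length % palette.length];
  }
  const marker = new google.maps.Marker({
    position: { lat: row.lat, lng: row.lng },
    map: map,
    icon: {
      path: google.maps.SymbolPath.CIRCLE,
      scale: 5,
      fillColor: suburbColours[row.suburb],
      fillOpacity: 1,
      strokeWeight: 1,
    },
  });
  const info = new google.maps.InfoWindow({
    content: `<b>${row.address}</b><br>Suburb: ${row.suburb}`,
  });
  marker.addListener('click', () => info.open(map, marker));
}
```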
Then I increased the information on each property by processing the data and offsetting the latitude coordinate for specific elements, so that you can differentiate the roof/wall/door/window icons:
Map of Housing split into external elements. You need to zoom in on this map to a group, or just one property, to see the different elements of the same building. This was done without data cleaning on a large dataset, and Knime was used to shift the latitude coordinates for the different elements. I did not try to style the icons, as this is just a demonstration of using a larger dataset and displaying the data on a map. Each external element has a 3-letter code (Rof, Exw, Win, Dor etc.), and clicking an icon opens a pop-up box with data on that element.
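The shift itself is simple; this sketch shows the idea, although in practice the offsets were applied in Knime during processing and the values here are made up.

```javascript
// Nudge each element type's latitude so the four icons stack above
// the property point instead of overlapping. Offsets are illustrative.
const elementOffsets = {
  Rof: 0.00012,   // roof - topmost icon
  Exw: 0.00008,   // external walls
  Win: 0.00004,   // windows
  Dor: 0.0,       // doors - at the property point itself
};

function offsetPosition(row) {
  return {
    lat: row.lat + (elementOffsets[row.elementCode] || 0),
    lng: row.lng,
  };
}
```

At suburb zoom the four icons still read as a single dot; they only separate once you zoom in, which is why you need to zoom right down to one property.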
As this is a dataset with over 3000 rows, and the data is written to XML before being displayed, it is slow to load the whole dataset onto the map.
An alternative is to select an area first and then fetch the data; this way only a subset of the data is displayed, and it loads much faster. There are a couple of ways to do this. One that I researched uses a polyline: you draw a boundary around an area and the query fetches only the properties within that bounded area.
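A sketch of that bounded-area test, using containsLocation() from the Maps geometry library (loaded with &libraries=geometry); the drawn polygon is assumed to already exist. In the approach I researched the filtering happened in the database query itself rather than in the browser, but the test is the same.

```javascript
// Keep only the properties inside a user-drawn boundary.
// 'polygon' is a google.maps.Polygon, e.g. from the Drawing library.
function propertiesInside(rows, polygon) {
  return rows.filter(row =>
    google.maps.geometry.poly.containsLocation(
      new google.maps.LatLng(row.lat, row.lng),
      polygon
    )
  );
}
```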
Another method is the Shops in Cities example. In this interactive example you type in a city, e.g. Boston or Los Angeles, and select a radius, say 25 miles; the search only looks in this area and displays only the shops within that radius, rather than searching the whole dataset, thus speeding up both the query and the display on the map.
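A sketch of the radius filter behind that kind of search, assuming the city name has already been geocoded to a LatLng; the shops array and its fields are illustrative. A real store-locator version would push the distance test into the database query so only matching rows ever leave the server.

```javascript
// Keep only the shops within radiusMiles of the geocoded city centre.
// Uses the geometry library's great-circle distance (in metres).
function shopsNear(shops, cityLatLng, radiusMiles) {
  const radiusMetres = radiusMiles * 1609.34;
  return shops.filter(shop =>
    google.maps.geometry.spherical.computeDistanceBetween(
      cityLatLng,
      new google.maps.LatLng(shop.lat, shop.lng)
    ) <= radiusMetres
  );
}
```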