How the right data can supercharge your geospatial technology

By Brett Madsen, General Manager, MapData Services – 27 September, 2018

The ‘golden triangle’ is a longstanding business theory which suggests the key to operational success lies in achieving the right balance of people, processes and technology. While this concept once underpinned the geospatial strategies of companies globally, savvy decision-makers have adapted the approach somewhat in recent years.

The prolific use of smartphones and smart apps – which has culminated in the creation of smart cities – has generated enormous volumes of data. But while many organisations are now ‘data rich’, they remain ‘insight poor’, because they don’t have their hands on the right content to drive the greatest return from their geospatial tech.

As a result, digital transformation strategies have shifted from considering only people, processes and technology to adding data as a fourth ingredient. Data is a critical – albeit often overlooked – piece of the digital transformation puzzle.


The true value of quality data

Working with quality data – both from within your own organisation and from external sources – gives your geospatial analytics greater currency and more conclusive results.

Put simply, without the right data, software is nothing more than a piece of technology. Data is the key ingredient that fuels the processes and analytics within that technology, giving people the ability to make valuable, informed decisions. This holds true whether you are using a commercial off-the-shelf or an open-source platform.

That said, uncertainty remains in both the commercial and government realms about which data is needed to drive the results you are seeking.


Using data to find the answers you need

When it comes to choosing data, there are millions of datasets available – from foundational information such as 3D buildings, roads and traffic, points of interest, addresses, cadastre and property, and elevation and terrain, to more abstract insights such as geodemographics and human movement.

Drawing on these data sources is enabling government and commercial organisations to do more with GIS technology than many thought possible. For example, we regularly partner with councils that use GIS to analyse their own data alongside external sources – such as commercial property insights, human movement, natural hazard risks and crowdsourced updates – to provide real-time responses to resident concerns, improved community services and internal workflow efficiencies.
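To make this concrete, here’s a minimal sketch of the kind of analysis involved – a spatial join in Python using the open-source geopandas library, tagging a council’s own service requests with an external hazard dataset. The file names and the ‘risk_level’ column are illustrative assumptions, not any real council’s data:

```python
import geopandas as gpd

# Hypothetical inputs: a council's own service-request points and an
# external flood-risk polygon dataset (file names are illustrative).
requests = gpd.read_file("resident_requests.geojson")
flood_zones = gpd.read_file("external_flood_risk.geojson")

# Align coordinate reference systems before any spatial operation.
flood_zones = flood_zones.to_crs(requests.crs)

# Spatial join: tag each resident request with the flood-risk zone it
# falls inside, so responses can be prioritised by hazard level.
joined = gpd.sjoin(requests, flood_zones, how="left", predicate="within")

# 'risk_level' is an assumed column on the external dataset.
high_risk = joined[joined["risk_level"] == "high"]
print(f"{len(high_risk)} requests fall inside high-risk flood zones")
```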

In fact, the internal efficiencies brought about by incorporating external datasets into solutions have been transformative for some traditional roles. For example, authoritative foundation maps delivered as web services are eliminating the need for GIS managers to spend copious hours creating base maps from scratch.
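As a rough illustration of that point, the few lines below pull a ready-made basemap from a public tile service (using the open-source folium library) and overlay an operational layer on top – no base map authoring required. The OpenStreetMap tiles and the marker are placeholders for whatever authoritative service and layers your organisation licenses:

```python
import folium

# Consume a basemap as a web service instead of assembling one from
# raw layers. OpenStreetMap tiles stand in for a licensed provider.
m = folium.Map(location=[-33.8688, 151.2093], zoom_start=12,
               tiles="OpenStreetMap")

# Operational layers are simply overlaid on the ready-made basemap.
folium.Marker([-33.8688, 151.2093], tooltip="Sydney Town Hall").add_to(m)

m.save("basemap_demo.html")  # open in a browser to view
```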

So how do you know which data will help you perform the task at hand, or find the answers you need? This is one of the most common questions I am asked – so much so that we have established an entire data and content team to address the challenge.

3 tips for choosing the right data

My advice is always to consult an expert about your organisation’s specific requirements. However, to give you some initial parameters, here are my three tips on choosing the right data – and ensuring it holds its value for years to come.

  1. Know your audience and your objectives

Like any activity, it pays to start by establishing a clear understanding of what you are trying to achieve and who the main stakeholders are. Knowing the outcome you are striving for allows data selection to be ‘reverse engineered’, so you end up with more meaningful analytical insights. That said, as business needs evolve, so too can the data sources. The data lifecycle in an organisation can be likened to a plant that begins life as a seed and puts down roots that spread deep and wide into the ground. In most cases, organisations acquire data for a particular project or business need, but to truly maximise its benefit, data needs to evolve fluidly across departments and grow deep into the business as a source of truth. This will come in time as you continue to add authoritative data to your systems.

  2. Capture the right balance of free data and paid data

There’s lots of valuable free content you can access online – for example from OpenStreetMap, government open data portals or public websites – however, to undertake really valuable analysis you typically need to complement it with specialised data. For example, if you’re a logistics company seeking to understand travel times between warehouses and stores, you can access data sources that calculate warehouse-to-store distances along truck routes and overlay historic traffic patterns to estimate potential delays – and determine the most efficient route accordingly (the sketch below illustrates the idea). Some of the best authoritative sources of data come from PSMA, HERE and TomTom – however, I recommend speaking with an expert to ensure you are getting the right datasets to achieve your goals.
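Here is a toy sketch of that logistics example, using Python’s networkx to route over a road graph with historic delays applied. The graph, travel times and delay factors are all invented for illustration; a production solution would draw them from a commercial routing and traffic provider:

```python
import networkx as nx

# Toy road graph: each edge carries a free-flow travel time (minutes)
# and a historic delay factor for the chosen departure window.
G = nx.DiGraph()
edges = [
    ("warehouse", "A", 12, 1.5),  # motorway, heavy peak congestion
    ("warehouse", "B", 20, 1.1),  # arterial road, lighter traffic
    ("A", "store", 10, 1.4),
    ("B", "store", 9, 1.0),
]
for u, v, minutes, delay_factor in edges:
    G.add_edge(u, v, expected=minutes * delay_factor)

# Fastest expected route once historic delays are applied. Note the
# free-flow shortest path (via A, 22 min) loses to B once historic
# congestion is factored in (32.0 min vs 31.0 min).
route = nx.shortest_path(G, "warehouse", "store", weight="expected")
cost = nx.shortest_path_length(G, "warehouse", "store", weight="expected")
print(route, f"{cost:.1f} min")  # ['warehouse', 'B', 'store'] 31.0 min
```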

  3. Always maintain commitment to data integrity

A solution is only as good as the data it contains, so keeping data current is fundamental to managing any data ecosystem. Put simply, out-of-date data will result in poor business decisions. Always ensure you are using the most current version of a dataset – and if you’re not automatically notified when a new release is available, I recommend scheduling periodic internal checks to confirm everything is up to date (a simple example follows below). It’s also imperative that processes and management frameworks are put in place to properly manage data. Workflow automation is worth considering, both to improve business efficiencies and to support the people in your team assigned to manage the integrity of your data. Finally, for data integrity to be properly maintained, it requires buy-in from the highest levels of management.
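As a starting point for those periodic checks, here’s a minimal sketch of a freshness check in Python. The dataset register, release dates and update cadences are all illustrative assumptions – the point is simply to compare each dataset’s last release against its expected refresh cycle:

```python
from datetime import date, timedelta

# Illustrative register of datasets, each with its last release date
# and expected update cadence in days (all values are made up).
REGISTER = {
    "addresses": {"released": date(2018, 8, 15), "cadence_days": 90},
    "cadastre":  {"released": date(2018, 2, 1),  "cadence_days": 180},
    "poi":       {"released": date(2017, 11, 20), "cadence_days": 90},
}

def stale_datasets(register, today):
    """Return names of datasets older than their expected cadence."""
    return [
        name for name, meta in register.items()
        if today - meta["released"] > timedelta(days=meta["cadence_days"])
    ]

# Run on a fixed date for a deterministic example.
for name in stale_datasets(REGISTER, today=date(2018, 9, 27)):
    print(f"'{name}' is overdue for a refresh; check the supplier portal")
```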

For further information on selecting the right content to supercharge your GIS technology, request a free consultation with a data specialist.
