Practical Data Governance
Making the most of your mining and exploration data.
By Jeremy Lykke
Senior Database Manager
For many modern organisations, the ability to extract meaningful and timely information from data is a critical success factor. Making effective decisions, in alignment with organisational strategy, depends greatly on the organisation’s ability to ingest, transform and utilise data.
A data governance framework ensures that an organisation effectively manages its data assets via a defined set of systems and policies, designed to meet both the current and future requirements of the business. This framework ensures that organisational data is fit for purpose, trustworthy and appropriately used. It also ensures that those responsible for data capture understand and support the objectives of the organisation, allowing downstream users to obtain maximum value from all collected data.
At maxgeo we understand that data governance is critical, and we strive every day to provide best-practice data governance for our clients via our various data management products and services. A significant reason that clients choose maxgeo is to ensure that their data is managed in a manner that provides maximum value to their organisation, both now and in the future. We recognise that data is a valuable asset, critical to the sustainability and profitability of our clients.
As part of the maxgeo data governance framework, we have developed numerous systems and processes that ensure optimal management of data assets, including:
- With over 20 years in development, the MDS (maxwell data schema) has been specifically designed to store, validate and process geoscientific data. Our web-based mining and exploration platform DataShed5, built on the MDS, allows real-time, global and on-demand access to your geoscientific information. With interactive maps, reports and dashboards, you can get the information you need, when you need it!
- Extensive database and server performance and security management systems, ensuring that your data is secure, scalable and accessible, to the right people at the right time.
- Automated data import and export systems, designed to align with your systems and workflows. For example, files from your lab, downhole survey provider or site operations are seamlessly and swiftly integrated into your DataShed5 database via our automation platforms. From there, data can be downloaded directly from the web or delivered directly to you via scheduled email and download subscriptions.
- maxgeo offers a variety of data capture platforms that integrate directly with DataShed5 and run on numerous operating systems and devices. These configurable systems allow you to capture validated data in the field, online or offline, building on the data integrity checks contained in the MDS.
- DataShed5 contains extensive automated data quality validation checks, reports and dashboards, ensuring that data not only complies with all standard database constraints but also meets the specific requirements of your organisational workflows. This enhanced validation ensures that data is fit for purpose with regard to your reporting and regulatory requirements. All issues requiring attention are automatically communicated to relevant maxgeo staff, who can remediate or escalate them as required.
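Automated validation rules of this kind can be expressed as simple, testable checks. The sketch below is a minimal illustration in Python; the column names, value ranges and record layout are assumptions for the example, not the actual MDS schema or DataShed5 implementation.

```python
# Minimal sketch of automated data-quality checks on drillhole assay records.
# Field names and thresholds are illustrative, not the actual MDS schema.

def validate_assay_row(row):
    """Return a list of issues found in one assay record (empty if clean)."""
    issues = []
    if not row.get("hole_id"):
        issues.append("missing hole_id")
    depth_from, depth_to = row.get("depth_from"), row.get("depth_to")
    if depth_from is None or depth_to is None:
        issues.append("missing interval depths")
    elif depth_from >= depth_to:
        issues.append("depth_from must be less than depth_to")
    grade = row.get("au_ppm")
    if grade is not None and (grade < 0 or grade > 1000):
        issues.append("au_ppm outside plausible range")
    return issues

rows = [
    {"hole_id": "DH001", "depth_from": 0.0, "depth_to": 1.0, "au_ppm": 0.35},
    {"hole_id": "", "depth_from": 2.0, "depth_to": 1.0, "au_ppm": -5.0},
]
for row in rows:
    for issue in validate_assay_row(row):
        print(f"{row.get('hole_id') or '<no id>'}: {issue}")
```

In a production system, checks like these would run automatically on import, with any failures routed to the responsible staff rather than printed.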
Working with maxgeo, clients can proactively improve the quality, completeness and utility of their data by leveraging the data governance framework provided by maxgeo software and systems. To ensure that operational data is optimised for organisational decision-making, mining and exploration teams should routinely document, review and improve their data management workflows. Some specific examples include:
Working to improve the quality and completeness of your data:
- Investigate and populate missing data, and ensure that data entry is consistent across similar records.
- Fix any errors noted in the data while the relevant information or responsible person is still available. Historical data is notoriously hard to remediate!
- Import any outstanding data, or flag data for collation and import at a later stage when team members have capacity.
- Archive redundant or irrelevant historical data. Such data should still be retained, but removing it from active use can benefit both system performance and information relevance.
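A completeness review like the one above can often start with a simple audit script that counts missing values and flags inconsistent code spellings. The sketch below is illustrative only; the field names and records are assumptions, not a real exploration dataset.

```python
from collections import Counter

# Sketch of a completeness audit: count missing values per field and flag
# inconsistent spellings of the same code. Field names are illustrative.

records = [
    {"hole_id": "DH001", "lithology": "GRANITE", "logged_by": "JS"},
    {"hole_id": "DH002", "lithology": "granite", "logged_by": None},
    {"hole_id": "DH003", "lithology": "GRANITE", "logged_by": "AB"},
]

def missing_counts(records, fields):
    """Return {field: number of records where the field is empty or absent}."""
    return {f: sum(1 for r in records if not r.get(f)) for f in fields}

def inconsistent_codes(records, field):
    """Return codes that appear with more than one capitalisation."""
    variants = Counter(r[field] for r in records if r.get(field))
    by_upper = {}
    for code in variants:
        by_upper.setdefault(code.upper(), set()).add(code)
    return {k: v for k, v in by_upper.items() if len(v) > 1}

print(missing_counts(records, ["hole_id", "lithology", "logged_by"]))
print(inconsistent_codes(records, "lithology"))
```

Running an audit like this regularly makes gaps visible while the people who can fill them are still on the project.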
Implement rigour around your data capture and data management systems and processes, including:
- Routinely review and update any data capture systems, including import templates and logging configurations and profiles.
- Ensure that data capture and logging systems are set up appropriately for the data being collected, the person collecting the data, the device and the logging environment. In the mining industry environmental conditions vary greatly and data capture devices may not be suitable in all situations.
- Review any data not currently captured within a managed system and consider possible improvements or solutions.
Rationalise and streamline your library tables, including:
- Populate descriptions wherever possible to ensure that codes have meaning to downstream and future users.
- Improve data categorisation and classification. Simplified coding systems can improve analysis and reporting, allowing users to drill down into the data from an initial high level.
- Ensure that all data capture and logging systems have valid and relevant library lookup tables referenced. Consider opportunities where library tables could be utilised to improve data capture consistency.
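Library-table checks of this kind can be enforced at the point of capture, so that only valid codes ever enter the database. The sketch below is a minimal, hypothetical illustration; the codes and descriptions are invented for the example, not a real maxgeo library table.

```python
# Sketch: validate captured codes against a library lookup table with
# descriptions. The table contents are illustrative only.

lithology_library = {
    "GRNT": "Granite",
    "BSLT": "Basalt",
    "QTZV": "Quartz vein",
}

def check_code(code, library):
    """Return the code's description, or raise if it is not in the library."""
    try:
        return library[code]
    except KeyError:
        raise ValueError(f"unknown code {code!r}; valid codes: {sorted(library)}")

print(check_code("BSLT", lithology_library))  # prints "Basalt"
```

Keeping a description alongside every code is what makes the library useful to downstream users who were not present when the code was defined.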
Review data integration opportunities, including:
- Implement file automation, or automated data management processes, to streamline data entry and transformation, and to improve data quality, completeness and timeliness.
- Remove any data silos and leverage data connectivity. Investigate data integration solutions such as APIs to connect to third-party data systems.
- Consider integration and automation to reduce data management costs and the errors associated with manual processing.
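File-based automation typically amounts to watching a drop folder, parsing and validating each file, loading the results, and archiving the processed file. The sketch below shows the pattern in simplified form; the folder layout, file naming and CSV columns are assumptions for the example, not maxgeo's automation platform.

```python
import csv
from pathlib import Path

# Sketch of a file-automation step: pick up lab result CSVs from an inbox
# folder, parse them, and move processed files to an archive. Paths and
# column names are illustrative.

def ingest_lab_files(inbox: Path, archive: Path):
    """Parse each CSV in `inbox`, return the parsed rows, archive the files."""
    archive.mkdir(parents=True, exist_ok=True)
    loaded = []
    for csv_file in sorted(inbox.glob("*.csv")):
        with csv_file.open(newline="") as fh:
            for row in csv.DictReader(fh):
                loaded.append(row)  # in practice: validate, then load to the database
        csv_file.rename(archive / csv_file.name)
    return loaded
```

A real pipeline would add validation, error quarantine and notifications, but the inbox/archive pattern above is the core that removes manual handling.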
Routinely review data accessibility, including:
- Ensure that all users have access to the data they require, while restricting access to any data not needed for their role.
- Provide users with adequate training for the software and systems necessary for their role.
- Ensure that database, network and server permissions are appropriate for each user.
- Review data file formats to ensure they are appropriate for end users.
- Remove any necessity for data post-processing by the end user, ensuring that outputs are appropriate for the end user’s requirements.
- Automate any downloads or reports required by the user, removing the requirement for them to complete the task manually.
Improve data visualisation tools and usage:
- Visualise data using the spatial analysis tools provided.
- Ensure that data is presented visually in a manner that’s meaningful and valuable to the end user.
- Provide interactive and intuitive dashboards for data analysis, including the provision of reporting solutions compatible with mobile devices.
It should be noted that many of the above are not one-off events; they should be revisited at regular intervals, or whenever a new project or objective commences. All initiatives should be documented, with change management protocols followed to support a successful and sustained transition.
For any data management or data governance initiative, organisations should work with a professional and experienced data management team to ensure an efficient, fit-for-purpose and trustworthy solution that truly maximises the value of their data assets.
At maxgeo, we can assist your organisation in building a culture that values good data management practices, providing significant value now and in the future.