Glencore Coal Australia deployed SAP as its ERP and asset management system in 2016. Since then, Glencore Coal sites have added Maintenance Tasks, Maintenance Plans/Items and Bills of Materials on a site-by-site basis. While some good information was added to SAP, the lack of a consistent approach led to the following issues:
- Not all parts had been catalogued, leading to some parts being ordered outside of the system (direct purchase) and incurring additional administration cost for each order.
- Maintenance budgets could not be prepared entirely from SAP data. Manual intervention took time, and the resulting budget was not linked to the maintenance plan.
- Ad-hoc BOMs were needed for jobs where no BOM existed. This compromised quality and took additional planning time.
AssetOn was engaged to perform an audit and support the data review and update. AssetOn handled the complete project, from the data audit through to the creation of SAP load files to add/extend materials and update the parts catalogue. The project steps were:
- Audit SAP asset master data across 26 sites for Glencore Coal Australia
- Identify gaps in the data and prepare a report
- Build a plan to prioritise and fix issues identified in the audit
- Develop new data to fill the gaps in tasks, plans and BOMs
- Ensure site-validated BOMs were in place for all major maintenance tasks and services
- Reuse good work already created by Glencore where possible
- Where new data was created for one site or piece of equipment, leverage it for other sites with similar equipment
- Catalogue parts added to BOMs (extend parts to the site if they exist in the master catalogue, or create new catalogue items)
- Create SAP load files to update the system with validated data
AssetOn has developed software tools and processes, used by its master data team, to efficiently audit, review and validate client master data against best practice. Clients can then decide which data to keep and which needs to be reviewed. The tools guide clients by highlighting issues and suggesting actions to improve data quality. Once clients have validated the data, SAP load sheets are developed to update the master data.
The main parts of the process are:
- AssetOn provided a list of strategies for each asset to compare client task coverage, on a site and equipment basis, against best practice. Glencore could then choose to add additional tasks from the list.
- An algorithm was used to identify variations at the part-number level between client BOMs and AssetOn-developed BOMs; Glencore then decided which parts should be used. The algorithm also identified superseded and duplicate parts.
- The algorithm compared the BOM data to Glencore's master catalogue and classified parts as catalogued, not catalogued, or needing to be extended to the site.
- The completed master data was sent to Glencore for final review. The algorithm highlighted errors for easy review by Glencore.
- To complete the project, AssetOn built the validated master data into SAP load sheets.
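The part-classification and BOM-comparison steps above can be sketched in a few lines of Python. This is a hypothetical simplification for illustration only: the function names, part numbers and data structures are assumptions, not AssetOn's actual implementation, which would also handle supersession chains, duplicates and site-specific catalogue records.

```python
def classify_bom_parts(bom_parts, master_catalogue, site_extensions):
    """Classify each BOM part as 'catalogued', 'extend' or 'not catalogued'.

    bom_parts        -- part numbers appearing on the validated BOMs
    master_catalogue -- part numbers in the company-wide master catalogue
    site_extensions  -- part numbers already extended to this site
    """
    result = {"catalogued": [], "extend": [], "not catalogued": []}
    for part in sorted(bom_parts):
        if part in site_extensions:
            result["catalogued"].append(part)      # already usable at the site
        elif part in master_catalogue:
            result["extend"].append(part)          # exists centrally; extend to site
        else:
            result["not catalogued"].append(part)  # new catalogue item needed
    return result


def bom_variations(client_bom, reference_bom):
    """Return part numbers present in only one of the two BOMs."""
    only_client = sorted(set(client_bom) - set(reference_bom))
    only_reference = sorted(set(reference_bom) - set(client_bom))
    return only_client, only_reference


# Illustrative run with made-up part numbers:
parts = classify_bom_parts(
    bom_parts={"P100", "P200", "P300"},
    master_catalogue={"P100", "P200"},
    site_extensions={"P100"},
)
# parts -> {'catalogued': ['P100'], 'extend': ['P200'], 'not catalogued': ['P300']}
```

In practice the output of a classification like this would drive the SAP load sheets: "extend" parts become material extensions, and "not catalogued" parts become new material master records.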
The following outcomes were achieved for the project:
- Consistent execution strategy across multiple sites.
- Validated BOMs ensure the right parts are ordered for the job.
- More complete Task and BOM coverage increased the planners' efficiency.
- Catalogued parts reduce the administration cost of ordering parts.
- More accurate maintenance budgets can be created from the better-quality, more complete master data, reducing the risk of cost overruns.
About AssetOn Group
AssetOn brings together purpose-built tools, qualified personnel and a library of existing data to efficiently execute master data projects.
- We have built software tools and refined our processes to efficiently audit and correct large volumes of master data for mining assets.
- We employ a team of master data experts in our Brisbane office, many with specific trade and industry backgrounds, to review and build quality, fit-for-purpose master data tailored to your equipment.
- We have a library of maintenance strategies and other master data that can be used as the starting point for validation and data build. This greatly improves the quality of the data delivered.