The NANOSOLUTIONS–NanoMILE workshop was held in Stockholm, Sweden, on 13–14 September 2016, with the objective of discussing the exchange of data and methods between the NANOSOLUTIONS and NanoMILE projects. The aim was also to discuss the current status of high-throughput and “omics” approaches as alternative methods in nanosafety assessment, with the goal of exchanging ideas, opinions and experiences. The papers agreed upon at the workshop in Stockholm can be the first steps towards convincing regulators to accept the results of NANOSOLUTIONS, says Richard Handy.
One of the main outcomes of the joint workshop in Stockholm between NANOSOLUTIONS and NanoMILE was that two papers will be produced. The first, led by Dario Greco, will be a more technical scientific paper on the use of omics and technology in nanoscience, and will present much of the evidence that has been gathered.
The second paper, which will be produced by Richard Handy of Plymouth University, will be a perspectives paper to engage the regulatory community in using the new approaches and computational methods that have been developed in the consortium. “This has always been a problem in science,” says Handy. “Whenever a new molecular tool or technique comes along, the regulatory process tends to be quite conservative because of its existing standard procedures and methods. When computational methods for genomics and QSARs were first invented, the regulators were years behind the universities before they started to warm up to them. So we need to start having a dialogue with them now about why our approaches could be useful.”
Handy’s experience of working with the OECD on nanomaterials has taught him about the methods and thought processes that regulators go through before accepting new techniques. “There will be a lot of dialogue to be done between the scientists and the regulators, and the process can take five years or so before things get accepted,” he says. “Ultimately, our tools will not be used if there is no regulatory acceptance of the computational approaches. That is why it’s also important to have the other EU projects such as NanoMILE and NanoFASE collaborating on this paper, so that we have a large section of the nanomaterials community coming forward and delivering the same message together. That way the regulators will have a weight of evidence showing the benefits of the technology.”
The other paper, led by Greco, which will present much of the scientific evidence of the project, should also be of interest to the regulators. “Most of the proteomics analysis from animal biology is now done,” says Handy. “We are now at the point where we can test our original question of whether there is a coating effect or whether it’s the core chemistry of a nanomaterial that informs its toxicity. Our initial maps are suggesting the latter, which is more in line with the current paradigm of thinking. If this does prove to be the case, it will probably be a relief for the regulators, as any significant coating effect would mean having to overhaul the regulatory process.”
It isn’t just the science that the regulators must be convinced of. The advanced computational methods using artificial intelligence are probably the first example of any such methods being used for safety assessment, and this will almost certainly cause extra caution. “Our system is essentially teaching itself, and people making regulatory decisions will find this unnerving!” Handy says. “We need to make them realise that the information provided by such AI can be treated like that from a technical expert – you can take it or leave it.”