MAXIMUM MACHINE LEARNING COVERAGE WITH LIMITED DATA
Both global and local robustness play essential roles in designing AI systems with superior performance and adaptability. The key is striking a balance between the two, depending on the specific application requirements.
For achieving local robustness, the amount of data required can be considerably smaller than for global robustness. The training data must be concentrated on the targeted region of interest, providing sufficient coverage of the inputs relevant to the specific task. We call this region the specification set; it contains an infinite number of possible training samples.
Infinite? So how much coverage, i.e. what guarantee of success, can actually be achieved?
No matter how many training samples you collect, the result will always be:
0 % machine learning coverage (MLC), since the number of training samples mathematically required to cover an infinite continuous input space is itself infinite!
This proves our case that for robust neural network training, you never have enough data to achieve 100 % MLC.
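To get a feeling for the scale involved, here is a back-of-the-envelope sketch. The numbers are our own illustrative assumptions (an 8-bit, 28×28 grayscale input space and a trillion training samples), not part of the original argument: even if you discretize the continuous input space, the fraction it covers is vanishingly small.

```python
# Back-of-the-envelope: what fraction of a (generously discretized)
# input space can a finite training set cover?
# Assumed, illustrative numbers: 28x28 image, 8 bits per pixel, 10^12 samples.
from math import log10

bits_per_pixel = 8            # 256 grey levels per pixel
pixels = 28 * 28              # a small, MNIST-sized image
samples = 10**12              # an extremely generous training set

# Number of distinct inputs is 2^(pixels * bits_per_pixel); work in log10
# to avoid astronomically large integers.
possible_inputs_log10 = pixels * bits_per_pixel * log10(2)
coverage_log10 = log10(samples) - possible_inputs_log10
print(f"covered fraction ~ 10^{coverage_log10:.0f}")  # ~ 10^-1876
```

A trillion samples cover roughly one part in 10^1876 of even this modest discretized space; for a truly continuous space, the covered fraction is exactly zero.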
Unless… you apply the Spiki approach, which reduces the number of samples needed to a finite amount and, at the same time, specifies exactly which samples must be measured in order to ensure robustness.
Excited? Get in touch and find out how Spiki’s AI can help boost your application in robotics, transportation, smart home, speech command recognition and other domains!