There are usually several ways to measure a given physical quantity. Sometimes, by making optimal use of the measured information, one can achieve a large sensitivity improvement without changing the measurement hardware. In the field of quantum optics, the most time-consuming measurements can be improved with methods such as machine learning.

We have recently achieved accurate classification of randomly dispersed nanodiamonds containing one or several quantum emitters into “single emitters” vs “ensembles” using machine learning [1]. To classify each nanodiamond, we applied a heuristic threshold of *g*^{(2)}(0) = 0.5 to the second-order autocorrelation function at zero delay. For assessing the value of *g*^{(2)}(0), the machine learning-based method greatly outperformed the conventional exponential fitting, and it also performs well for decision thresholds other than 0.5. In addition, we have demonstrated the regression of *g*^{(2)}(0) values by a convolutional neural network (CNN) [2]. The CNN’s predictions are on par with those of the standard Levenberg-Marquardt fit but require over 10 times less photon correlation data, speeding up the measurement significantly. We aim to extend these and other optimal data-processing methods to various measurements in quantum optics and sensing, enabling scalable assembly and testing of integrated quantum devices.
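The conventional baseline mentioned above can be sketched as follows: fit an exponential antibunching model to measured correlation data with a Levenberg-Marquardt optimizer, evaluate *g*^{(2)}(0), and apply the 0.5 threshold. This is a minimal illustration, not the authors' pipeline; the model `g2(τ) = 1 − a·exp(−|τ|/t0)`, the delay range, the noise level, and all parameter values are assumptions for the sake of the example.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed single-emitter antibunching model: g2(tau) = 1 - a * exp(-|tau|/t0),
# so that g2(0) = 1 - a. (Illustrative form, not taken from the cited papers.)
def g2_model(tau, a, t0):
    return 1.0 - a * np.exp(-np.abs(tau) / t0)

# Simulate noisy correlation data (hypothetical parameters: a = 0.85, t0 = 20 ns).
rng = np.random.default_rng(0)
tau = np.linspace(-100.0, 100.0, 401)               # delay axis, ns (assumed range)
data = g2_model(tau, 0.85, 20.0) + rng.normal(0.0, 0.03, tau.size)

# Levenberg-Marquardt fit (scipy's method="lm" for unconstrained least squares).
popt, _ = curve_fit(g2_model, tau, data, p0=[0.5, 10.0], method="lm")
g2_zero = g2_model(0.0, *popt)

# Heuristic decision threshold at g2(0) = 0.5, as used in the classification task.
label = "single emitter" if g2_zero < 0.5 else "ensemble"
print(f"g2(0) = {g2_zero:.3f} -> {label}")
```

The CNN approach replaces this fit with a learned regression from (much shorter) correlation histograms to *g*^{(2)}(0); the thresholding step is unchanged.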

#### References

[1] Z.A. Kudyshev, S.I. Bogdanov, T. Isacsson, A.V. Kildishev, A. Boltasseva and V. M. Shalaev, Adv. Quant. Technol., **3**, 2000067 (2020)

[2] Z.A. Kudyshev, D. Sychev, Z.O. Martin, S.I. Bogdanov, X. Xu, A.V. Kildishev, A. Boltasseva and V.M. Shalaev, preprint, arXiv:2107.02401, https://arxiv.org/abs/2107.02401