Researchers have developed a physics-informed machine learning approach to address the variability in quantum devices caused by nanoscale imperfections. The approach could enable more accurate predictions of device behaviour and help quantum computing scale to a wider range of applications.