NVIDIA has unveiled a new cloud-based suite of microservices called NVIDIA Omniverse Cloud Sensor RTX, which uses ray-tracing and neural-rendering technologies to create simulated environments for testing autonomous systems. This technology aims to improve safety and reduce costs by allowing developers to test sensor perception and AI software in realistic virtual environments before real-world deployment. The announcement coincides with NVIDIA’s recent victory at the Autonomous Grand Challenge, where its researchers demonstrated an effective workflow for end-to-end driving at scale. The technology is set for release later this year and is already being used by software developers and sensor manufacturers for autonomous vehicle development and validation.