In part I and part II of our ARCore blog series, we shared how you can leverage ARCore for Unity features, like motion tracking to create a Jenga AR game or light estimation to trigger object behavior. In this post, we want to share some cool ARCore for Unity experiments that show what you can do with data generated from the camera feed.
A handheld device’s camera is not just for taking photos and videos
ARCore for Unity enhances the utility of the device camera by bringing contextual data to the user experience. To showcase some of what’s possible, we asked some of our top engineers to create AR experiments, complete with breakdowns of their techniques and code snippets, so you can explore them on your own. Here are just a few things you can start testing today!
World Captures
By Dan Miller