Apple’s AR plans include accurate handling of real and virtual objects

A new Apple patent describes a system for registering when a user is or is not touching a real object in a virtual space, using depth measurements of the fingertips calculated from camera angles.

Future Augmented Reality and Virtual Reality systems, and updates to ARKit from Apple, could display virtual objects alongside real ones and allow both to be manipulated. The difficulty so far has been in assessing whether the user is actually touching such an object, or is merely very close to doing so.

A newly granted Apple patent, “Depth-based touch detection,” US Patent No 10,572,072, details the problems inherent in exclusively using camera positions to calculate whether a touch has occurred.

“Using cameras for touch detection [does have] many advantages over methods that rely on sensors embedded in a surface,” says the patent. It describes Face ID-like systems where “vision sensors [or] depth cameras” provide a measurement of distance between the lens and the touch surface.
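To make that concrete, here is a minimal Swift sketch of what a fixed depth-camera setup measures: compare the camera's depth reading at the fingertip pixel against a pre-captured depth map of the bare surface, and call it a touch when the gap falls below a tolerance. This is an illustration, not Apple's implementation; the `DepthMap` type, the fingertip coordinate, and the 5 mm tolerance are all assumptions.

```swift
import Foundation

struct DepthMap {
    let width: Int
    let height: Int
    let values: [Float]  // per-pixel distance from the camera, in meters

    func depth(x: Int, y: Int) -> Float {
        values[y * width + x]
    }
}

/// Returns true when the fingertip appears to rest on the surface.
/// - surfaceDepth: depth of the bare surface, captured during calibration
/// - frameDepth: depth of the current frame, fingertip included
func isTouching(fingertip: (x: Int, y: Int),
                surfaceDepth: DepthMap,
                frameDepth: DepthMap,
                toleranceMeters: Float = 0.005) -> Bool {
    let surface = surfaceDepth.depth(x: fingertip.x, y: fingertip.y)
    let finger = frameDepth.depth(x: fingertip.x, y: fingertip.y)
    // The fingertip sits in front of the surface, so its depth reading is
    // smaller; a touch is when the remaining gap is within the tolerance.
    return (surface - finger) <= toleranceMeters
}
```

Note that this simple scheme depends on the pre-captured surface depth map staying valid, which is exactly why the patent says such a setup cannot be applied to dynamic scenes.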

“[However, one] approach requires a fixed depth camera setup and cannot be applied to dynamic scenes,” it continues. Another approach works by identifying the user’s finger and then marking surrounding pixels until enough are recorded that the finger must effectively be touching the surface. “However… this approach… can be quite error prone.”
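A hedged sketch of that second, pixel-counting approach, reusing the hypothetical `DepthMap` type from above: examine a window of pixels around the detected fingertip, count those whose depth is close enough to the surface, and declare a touch once the count passes a threshold. The window size and both thresholds here are invented for illustration.

```swift
func isTouchingByPixelCount(fingertip: (x: Int, y: Int),
                            surfaceDepth: DepthMap,
                            frameDepth: DepthMap,
                            windowRadius: Int = 8,
                            toleranceMeters: Float = 0.008,
                            requiredPixels: Int = 40) -> Bool {
    var touchingPixels = 0
    for dy in -windowRadius...windowRadius {
        for dx in -windowRadius...windowRadius {
            let x = fingertip.x + dx
            let y = fingertip.y + dy
            guard x >= 0, x < frameDepth.width,
                  y >= 0, y < frameDepth.height else { continue }
            let gap = surfaceDepth.depth(x: x, y: y) - frameDepth.depth(x: x, y: y)
            // Count pixels where the finger sits within tolerance of the surface.
            if gap >= 0 && gap <= toleranceMeters {
                touchingPixels += 1
            }
        }
    }
    return touchingPixels >= requiredPixels
}
```

Depth readings are noisy at grazing angles and near the finger's silhouette, so per-pixel gaps can flicker in and out of tolerance, which is one plausible reading of why the patent calls this approach error prone.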

