{"id":8179,"date":"2018-06-09T10:00:15","date_gmt":"2018-06-09T10:00:15","guid":{"rendered":"https:\/\/microsites.bournemouth.ac.uk\/cel\/?p=8179"},"modified":"2018-06-07T14:54:41","modified_gmt":"2018-06-07T14:54:41","slug":"heres-how-apple-is-making-ar-objects-far-more-believable","status":"publish","type":"post","link":"https:\/\/microsites.bournemouth.ac.uk\/flie\/2018\/06\/09\/heres-how-apple-is-making-ar-objects-far-more-believable\/","title":{"rendered":"Here\u2019s how Apple is making AR objects far more believable"},"content":{"rendered":"<a href=\"https:\/\/microsites.bournemouth.ac.uk\/flie\/files\/2018\/06\/apple-arkit-2-0-980x620.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-8180\" src=\"https:\/\/microsites.bournemouth.ac.uk\/flie\/files\/2018\/06\/apple-arkit-2-0-980x620-300x190.jpg\" alt=\"\" width=\"300\" height=\"190\" srcset=\"https:\/\/microsites.bournemouth.ac.uk\/flie\/files\/2018\/06\/apple-arkit-2-0-980x620-300x190.jpg 300w, https:\/\/microsites.bournemouth.ac.uk\/flie\/files\/2018\/06\/apple-arkit-2-0-980x620-768x486.jpg 768w, https:\/\/microsites.bournemouth.ac.uk\/flie\/files\/2018\/06\/apple-arkit-2-0-980x620.jpg 980w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a>\n<p>Apple is using machine learning to make augmented reality objects more realistic in iOS 12, using smart environment texturing that can predict reflections, lighting, and more. The new feature, being added in ARKit 2.0, uses on-device processing to better integrate virtual objects with their real-world counterparts, blurring the lines between what\u2019s computer-generated and what\u2019s authentic.<\/p>\n<div class=\"adsense_responsive_box\">\n<p>Currently, if you have a virtual object in a real-world scene, like a metal bowl on a wooden table, the wood\u2019s texture won\u2019t be reflected in the metal bowl. 
Environment texturing picks up details from the surrounding physical textures and maps them onto the virtual objects. So, the metal bowl would slightly reflect the wooden surface it was sitting on; if you put a banana down next to the bowl, there\u2019d be a yellow reflection added too.<\/p>\n<p>Environment texturing gathers the scene texture information, generally \u2013 though not always \u2013 representing it as the six sides of a cube. Developers can then use that as texture information for their virtual objects. Using computer vision, ARKit 2.0 extracts texture information to fill its cube map, and then \u2013 when the cube map is filled \u2013 can use each facet to understand what would be reflected on each part of the AR object.<\/p>\n<p><a href=\"https:\/\/www.slashgear.com\/apple-arkit-2-ios-12-environment-texture-reflections-06533104\/\">READ MORE HERE &gt;&gt;&gt;<\/a><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Apple is using machine learning to make augmented reality objects more realistic in iOS 12, using smart environment texturing that can predict reflections, lighting, and more. The new feature, being added in ARKit 2.0, uses on-device processing to better integrate virtual objects with their real-world counterparts, blurring the lines between what\u2019s computer-generated and what\u2019s authentic&#8230;.
<a class=\"excerpt-read-more\" href=\"https:\/\/microsites.bournemouth.ac.uk\/flie\/2018\/06\/09\/heres-how-apple-is-making-ar-objects-far-more-believable\/\">Read more &raquo;<span class=\"sr-only\"> about Here\u2019s how Apple is making AR objects far more believable<\/span><\/a><\/p>\n","protected":false},"author":525,"featured_media":8180,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[871],"tags":[],"class_list":["post-8179","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-augmented-reality"],"acf":[],"_links":{"self":[{"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/posts\/8179","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/users\/525"}],"replies":[{"embeddable":true,"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/comments?post=8179"}],"version-history":[{"count":1,"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/posts\/8179\/revisions"}],"predecessor-version":[{"id":8181,"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/posts\/8179\/revisions\/8181"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/media\/8180"}],"wp:attachment":[{"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/media?parent=8179"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/categories?post=8179"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/microsites.bournemouth.ac.uk\/flie\/wp-json\/wp\/v2\/tags?post=8179"}
],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
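The cube-map mechanism the article describes can be sketched generically: reflect the view direction about the surface normal, then work out which of the six cube faces (and where on that face) the reflected direction samples. This is a minimal illustration of the general technique, not Apple's ARKit implementation; the face labels and the (u, v) orientation per face are arbitrary conventions chosen for this sketch.

```python
def reflect(v, n):
    """Reflect direction v about unit surface normal n: r = v - 2 (v.n) n."""
    d = sum(vi * ni for vi, ni in zip(v, n))
    return tuple(vi - 2.0 * d * ni for vi, ni in zip(v, n))

def cube_face_uv(d):
    """Map a 3D direction to a cube-map face name and (u, v) in [0, 1].

    The dominant axis of the direction picks the face; the remaining two
    components, divided by the dominant magnitude, give coordinates in
    [-1, 1], which are then rescaled to [0, 1].
    """
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:                      # x-axis dominates
        face, u, v, m = ("+x", -z, y, ax) if x > 0 else ("-x", z, y, ax)
    elif ay >= az:                                 # y-axis dominates
        face, u, v, m = ("+y", x, -z, ay) if y > 0 else ("-y", x, z, ay)
    else:                                          # z-axis dominates
        face, u, v, m = ("+z", x, y, az) if z > 0 else ("-z", -x, y, az)
    return face, (u / m + 1.0) / 2.0, (v / m + 1.0) / 2.0

# Looking straight down onto an upward-facing surface: the reflection
# points straight up, so it samples the centre of the "+y" face.
r = reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0))
print(r, cube_face_uv(r))
```

A renderer would use the returned face and (u, v) to sample the captured environment texture, which is how a virtual metal bowl ends up showing a hint of the wooden table beneath it.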