{"id":769,"date":"2017-04-04T11:34:56","date_gmt":"2017-04-04T11:34:56","guid":{"rendered":"https:\/\/blogs.mathworks.com\/headlines\/?p=769"},"modified":"2018-03-20T15:19:18","modified_gmt":"2018-03-20T15:19:18","slug":"adaptive-vr-adjusts-to-your-vision-even-if-you-wear-glasses","status":"publish","type":"post","link":"https:\/\/blogs.mathworks.com\/headlines\/2017\/04\/04\/adaptive-vr-adjusts-to-your-vision-even-if-you-wear-glasses\/","title":{"rendered":"Adaptive VR adjusts to your vision, even if you wear glasses"},"content":{"rendered":"<p>Virtual reality (VR) is a\u00a0hot topic among tech enthusiasts. With initial success in the gaming and entertainment markets, VR is now finding new applications in everything from <a href=\"https:\/\/www.forbes.com\/sites\/robertadams\/2017\/03\/09\/is-this-the-future-of-real-estate-marketing\/#4468663d3782\" target=\"_blank\" rel=\"noopener\">real estate<\/a> to <a href=\"http:\/\/www.computerweekly.com\/feature\/MWC-2017-How-virtual-reality-could-be-the-next-big-thing-for-healthcare\" target=\"_blank\" rel=\"noopener\">healthcare<\/a>. Some estimates predict VR will be a $160M market by 2020.<\/p>\n<p>But VR can be hard on the eyes and has been <a href=\"https:\/\/www.inverse.com\/article\/7244-why-vergence-accommodation-conflict-threatens-virtual-reality-users-vision\" target=\"_blank\" rel=\"noopener\">linked to eyestrain and headaches<\/a>. Since it takes a \u201cone-size-fits-all\u201d approach, it can be exceptionally hard on consumers that wear corrective lenses. 
Based on data from the Vision Council of America, that\u2019s approximately 4 billion adults who could find VR extra challenging.<\/p>\n<p><div style=\"width: 560px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/www.computationalimaging.org\/wp-content\/uploads\/2017\/02\/gearvr_prototype.jpg\" target=\"_blank\" rel=\"noopener\"><img decoding=\"async\" loading=\"lazy\" class=\"\" src=\"http:\/\/www.computationalimaging.org\/wp-content\/uploads\/2017\/02\/gearvr_prototype.jpg\" width=\"550\" height=\"256\" \/><\/a><p class=\"wp-caption-text\">A conventional near-eye display (Samsung Gear VR) is augmented by a gaze tracker and a motor that is capable of adjusting the physical distance between screen and lenses. Image credit: Stanford University.<\/p><\/div><\/p>\n<p>The problem for many VR consumers is that headset displays sit a fixed distance from the eyes and are designed to work best for people with perfect vision. To address this, researchers at Stanford\u2019s <a href=\"http:\/\/www.computationalimaging.org\/\" target=\"_blank\" rel=\"noopener\">Computational Imaging Lab<\/a>, working with a scientist from the <a href=\"http:\/\/www.emilyacooper.org\/\" target=\"_blank\" rel=\"noopener\">Department of Psychological and Brain Sciences<\/a> at Dartmouth College, published research that shows how VR headsets can be designed to adapt to differences in eyesight.<\/p>\n<p><a href=\"http:\/\/www.digitaltrends.com\/virtual-reality\/stanford-vr-adaptive\/\" target=\"_blank\" rel=\"noopener\">\u201cIf you wear glasses, or know someone who does, this technology could make virtual reality far easier for them to use,\u201d<\/a> stated <em>Digital Trends.<\/em><\/p>\n<h2>Adapting the VR experience<\/h2>\n<p>The research paper, &#8220;<a href=\"http:\/\/www.computationalimaging.org\/publications\/optimizing-vr-with-gaze-contingent-and-adaptive-focus-displays\/\" target=\"_blank\" rel=\"noopener\">Optimizing virtual reality for all users through gaze-contingent and 
adaptive focus displays<\/a>,&#8221; was published in the Proceedings of the National Academy of Sciences (PNAS). It details how the researchers created optocomputational technology that adapts the focal plane for VR users based on their unique vision requirements.<\/p>\n<p>\u201cEvery person needs a different optical mode to get the best possible experience in VR,\u201d said\u00a0<a href=\"http:\/\/web.stanford.edu\/~gordonwz\/cv\/\" target=\"_blank\" rel=\"noopener\">Gordon Wetzstein<\/a>, assistant professor of electrical engineering and senior author of the paper.<\/p>\n<p>The researchers developed and tested two distinct hardware approaches. The software then adapted the hardware to provide the best viewing environment.<\/p>\n<h3>Liquid-filled tunable lenses<\/h3>\n<p>First, they developed a hardware system that uses focus-tunable lenses: liquid-filled lenses that change in thickness to vary their focal power. This happens in real time while the user is viewing the VR content.<\/p>\n<p>\u201cThe lenses are driven by the same computer that controls the displayed images, allowing for precise temporal synchronization between the virtual image distance and the onscreen content,\u201d per the paper in PNAS.<\/p>\n<p>This tunable lens approach used a special table-mounted system, shown in the image below. The team used an <a href=\"https:\/\/en.wikipedia.org\/wiki\/Autorefractor\" target=\"_blank\" rel=\"noopener\">autorefractor<\/a> to obtain measurements of the users&#8217; vision prescriptions. Then, the liquid-filled lenses were adjusted in real time to provide the optimum VR viewing experience. 
The study found that VR-corrected vision provided sharpness comparable to the vision the users achieved when wearing their glasses, showing they could now make full use of VR without their corrective lenses.<\/p>\n<p><div style=\"width: 465px\" class=\"wp-caption alignnone\"><a title=\"https:\/\/2nznub4x5d61ra4q12fyu67t-wpengine.netdna-ssl.com\/wp-content\/uploads\/2017\/02\/virtual-reality-for-different-eyes.jpg (link no longer works)\" target=\"_blank\" rel=\"noopener\"><img class=\"\" title=\"https:\/\/2nznub4x5d61ra4q12fyu67t-wpengine.netdna-ssl.com\/wp-content\/uploads\/2017\/02\/virtual-reality-for-different-eyes.jpg (link no longer works)\" width=\"455\" height=\"300\" \/><\/a><p class=\"wp-caption-text\">The benchtop setup was designed to incorporate adaptive focus via tunable, liquid-filled lenses. Image credit:\u00a0Stanford University in PNAS.<\/p><\/div><\/p>\n<h3>Dynamic focus for moving objects in VR<\/h3>\n<p>When an object moves in depth \u2013 closer to or farther from us \u2013 our eyes naturally adjust to focus on the object. With VR headsets, the focus plane is a set distance from our eyes. The difference between the object&#8217;s distance, near or far, and the set focal plane can cause eye strain.<\/p>\n<p>The researchers\u2019 second hardware approach utilized a mechanically actuated display to accommodate objects&#8217; movement in relation to the natural focal length. Instead of fixing the focus for the predetermined distance between the eyes and the VR screen, this approach moved the VR screen to change that distance. 
This approach also used eye-tracking cameras to monitor where the user was looking.<\/p>\n<p><div style=\"width: 405px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/www.computationalimaging.org\/wp-content\/uploads\/2017\/02\/gaze_contingent_schematic.png\" target=\"_blank\" rel=\"noopener\"><img decoding=\"async\" loading=\"lazy\" class=\"\" src=\"http:\/\/www.computationalimaging.org\/wp-content\/uploads\/2017\/02\/gaze_contingent_schematic.png\" width=\"395\" height=\"300\" \/><\/a><p class=\"wp-caption-text\">A schematic showing the basic components of a gaze-contingent prototype. The stepper motor mounted on top rotates based on position reported from the integrated eye tracker and software, moving the phone back and forth (red arrows). Image credit: Stanford University.<\/p><\/div><\/p>\n<p>This approach did not require the benchtop setup, but rather made use of a commercially available VR headset. The team modified a Samsung Gear VR by adding a stereoscopic eye tracker and a motor that mechanically adjusts the distance between the screen and the magnifying lenses in real time.<\/p>\n<h2>The \u201ccomputational\u201d in optocomputational<\/h2>\n<p>The systems rely on software to adjust the hardware\u2019s operation in real time. Much of the software running the experiment was written in C++, using\u00a0<a href=\"https:\/\/www.mathworks.com\/products\/matlab\/choosing_hardware.html?s_tid=srchtitle#_Graphics_Processing_Unit_1\" target=\"_blank\" rel=\"noopener\">OpenGL<\/a> and hardware libraries. The data analysis and plotting were performed in <a href=\"https:\/\/www.mathworks.com\/products\/matlab.html\" target=\"_blank\" rel=\"noopener\">MATLAB<\/a>. 
This included reading the CSV files output by the autorefractor, as well as the files output by the study-running software.<\/p>\n<p>\u201cThe exact nature of the data analysis we did was Fourier gain analysis of the measured user data, with some thresholding and error rejection,\u201d stated Nitish Padmanaban, the paper\u2019s lead author and a Ph.D. student at Stanford. \u201cWe then combined it with demographic data to produce a set of struct arrays, which we variously slice and filter to plot data for only the desired demographic. From there we run statistics on the differences in populations using, depending on the nature of the data, the ranksum() and signrank() Wilcoxon tests, corr() for Pearson correlation, or a binomial test.\u201d<\/p>\n<h2>Making the virtual world a little easier on the eyes<\/h2>\n<p>These VR display prototypes adapt to the eyes of each user. This can make VR more comfortable for everyone, regardless of age or eyesight.<\/p>\n<p>\u201cIt\u2019s important because people who are nearsighted, farsighted or presbyopic \u2013 these three groups alone \u2013 they account for more than 50 percent of the U.S. population,\u201d said Robert Konrad, one of the paper\u2019s authors and a Ph.D. candidate at Stanford. \u201cThe point is that we can essentially try to tune this into every individual person to give each person the best experience.\u201d<\/p>\n<p>This research has caught the attention of commercial VR providers, giving hope that someday we could all enjoy a personalized VR experience. VR will no longer be a &#8220;one-size-fits-all&#8221; technology. 
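The statistical workflow Padmanaban describes above (rank-sum and signed-rank Wilcoxon tests, Pearson correlation, and a binomial test) has direct equivalents outside MATLAB. As a minimal sketch, the SciPy calls below mirror MATLAB's ranksum(), signrank(), and corr(); all data here are invented placeholders, not measurements from the study:

```python
# Python analogue of the MATLAB statistics described in the article.
# All data below are synthetic placeholders, not the paper's measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Unpaired comparison of two demographics (MATLAB ranksum):
group_a = rng.normal(1.0, 0.3, size=30)  # e.g. sharpness scores, group A
group_b = rng.normal(1.2, 0.3, size=30)  # e.g. sharpness scores, group B
_, p_ranksum = stats.ranksums(group_a, group_b)

# Paired comparison of the same users under two conditions (MATLAB signrank):
before = rng.normal(1.0, 0.2, size=30)
after = before + rng.normal(0.1, 0.2, size=30)
_, p_signrank = stats.wilcoxon(before, after)

# Pearson correlation between two continuous variables (MATLAB corr):
age = rng.uniform(20, 60, size=30)
accommodation = 10.0 - 0.1 * age + rng.normal(0, 0.5, size=30)
r, p_corr = stats.pearsonr(age, accommodation)

# Binomial test, e.g. a hypothetical 22 of 30 users preferring one mode:
binom = stats.binomtest(k=22, n=30, p=0.5)

print(p_ranksum, p_signrank, r, binom.pvalue)
```

Slicing struct arrays by demographic, as in the quoted description, corresponds here to boolean-masking the NumPy arrays before running each test.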
Imagine, it could be possible to have better vision in the virtual world than in the real one.<\/p>\n","protected":false},"excerpt":{"rendered":"<div class=\"overview-image\"><img decoding=\"async\"  class=\"img-responsive\" src=\"http:\/\/www.computationalimaging.org\/wp-content\/uploads\/2017\/02\/gearvr_prototype.jpg\" onError=\"this.style.display ='none';\" \/><\/div>\n<p>Virtual reality (VR) is a\u00a0hot topic among tech enthusiasts. With initial success in the gaming and entertainment markets, VR is now finding new applications in everything from real estate to&#8230; <a class=\"read-more\" href=\"https:\/\/blogs.mathworks.com\/headlines\/2017\/04\/04\/adaptive-vr-adjusts-to-your-vision-even-if-you-wear-glasses\/\">read more >><\/a><\/p>\n","protected":false},"author":138,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/posts\/769"}],"collection":[{"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/users\/138"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/comments?post=769"}],"version-history":[{"count":2,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/posts\/769\/revisions"}],"predecessor-version":[{"id":1416,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/posts\/769\/revisions\/1416"}],"wp:attachment":[{"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/media?parent=769"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/categories?post=769"},{"taxonomy":"post_tag","embeddable":true,"href"
:"https:\/\/blogs.mathworks.com\/headlines\/wp-json\/wp\/v2\/tags?post=769"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}